
United States Privacy Digest | A View from DC: The FTC says, if it can be biometric data, it already is

Imagine, if you will, that you carry with you a special key. Your key is unique to you, unchangeable, and you cannot leave it behind.

Since no one else can carry your key, you use it as a form of identification. Your special key can unlock doors for you, but only if the door knows and expects its exact shape. It’s also possible for sensors to uniquely identify your key from among thousands of others, if they have enough information to do so.

Since every person has their own key, it can also be used for simpler purposes, without even measuring its precise shape. Detecting the mere presence of a key indicates a person is nearby. Estimating the vintage or weight of your key can serve as a proxy to determine your gender, age, race or other characteristics.
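
To make the key analogy concrete: modern biometric identification typically reduces a measurement to a numeric template and searches enrolled templates for the closest match, accepting it only above a similarity threshold. The sketch below shows one common pattern under those assumptions; the embedding representation, the 0.8 threshold and all names are illustrative, not any particular system's design.

```typescript
// A minimal sketch of 1:N biometric identification: compare a probe
// template against an enrolled gallery and accept the best match only
// if it clears a similarity threshold. All values are illustrative.

type Template = { personId: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Returns the enrolled identity most similar to the probe, or null if
// no candidate clears the threshold (an "open set" rejection).
function identify(
  probe: number[],
  gallery: Template[],
  threshold = 0.8
): string | null {
  let bestId: string | null = null;
  let bestScore = -1;
  for (const t of gallery) {
    const score = cosineSimilarity(probe, t.embedding);
    if (score > bestScore) {
      bestScore = score;
      bestId = t.personId;
    }
  }
  return bestScore >= threshold ? bestId : null;
}
```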

Whether your key is your face, your fingerprint, or your gait doesn't matter, so long as it's an identifiable characteristic of your body. Depending on who you ask, measurements of this characteristic qualify as biometric information when used to identify you, or they may count when they could be used to identify you in the future, even if identifying you is not the purpose for which the measurements are taken. Setting aside these semantics, if your key has biometric characteristics, it should be treated with extra care.

That is the takeaway from this week's nonbinding policy statement from the U.S. Federal Trade Commission on the collection of biometrics. Taking an expansive approach to the issue, the agency has made clear it will use its Section 5 unfairness authority to prosecute harmful or surprising processing of biometric information.

Within scope of this guidance is any data that "depict or describe physical, biological, or behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person’s body." Notably, the commission explicitly indicates that a photograph of a person’s face counts as biometric information, even if not processed to identify that person. In fact, any "depictions, images, descriptions, or recordings of an individual’s facial features, iris or retina, finger or handprints, voice, genetics, or characteristic movements or gestures" should be treated as biometric information worthy of extra privacy protections. But wait, there's more. Any data derived from such depictions is biometric information "to the extent that it would be reasonably possible to identify the person from whose information the data had been derived."
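
To see how broad that definition is in practice, consider a rough pre-collection triage check, as in the sketch below. The data categories and the `reasonablyIdentifiable` flag are assumptions made for illustration; they are not drawn from the policy statement itself.

```typescript
// Rough triage sketch reflecting the FTC's framing: raw depictions of an
// identifiable person's body are biometric information even when never
// used for recognition, and derived data stays in scope so long as the
// source person could reasonably be re-identified from it.

type DataItem = {
  kind: string;                     // e.g., "photo", "voice_recording"
  isBodilyDepiction: boolean;       // image/recording of face, voice, gait, ...
  derivedFromBodilyDepiction: boolean;
  reasonablyIdentifiable: boolean;  // could the person be identified from it?
};

function isBiometricUnderFtcStatement(item: DataItem): boolean {
  // Raw depictions of an identifiable person's body are in scope outright,
  // even if never processed for recognition.
  if (item.isBodilyDepiction && item.reasonablyIdentifiable) return true;
  // Derived data is in scope "to the extent that it would be reasonably
  // possible to identify the person" it came from.
  return item.derivedFromBodilyDepiction && item.reasonablyIdentifiable;
}
```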

These definitions set the FTC’s guidance apart from existing laws. For starters, most rules governing the processing of biometric information explicitly exclude raw photos, videos, or voice recordings, with the notable exception of Washington's new My Health My Data Act. And although the FTC policy statement does not explicitly state, as MHMD and the CCPA do, that data is biometric whether it identifies an individual by itself or "in combination with other data," the "identifiable" standard is likely consistent with those rules.

Prior FTC cases have focused on misrepresentations related to the use of biometrics, particularly facial recognition technologies. In the Everalbum case, the company allegedly failed to seek opt-in consent for the facial grouping feature in its online album, despite promising to do so. Similar facts were alleged in the FTC's 2019 complaint against Facebook. In these cases, the FTC did not conclude that the companies were under an affirmative obligation to seek opt-in consent for biometric data collection, but merely that they had deceived their customers by breaking their promises.

The policy statement — and the accompanying remarks from Chair Lina Khan and Commissioner Alvaro Bedoya — adds to the strong signals of a new era for biometric information. Regulators are making it crystal clear that identifiable bodily measurements should be considered sensitive data and treated accordingly. This does not necessarily mean opt-ins are required for all biometric collection, but they should be strongly considered unless there are other contextual factors that make the collection apparent to and avoidable by the consumer.

What does it look like to apply a higher privacy standard to biometric information? The FTC gives us some hints. If we infer best practices as the inverse of the list of possible unfair practices, the FTC expects organizations to:

  1. Conduct impact assessments to identify "foreseeable" risks and harms before collecting biometric data or deploying biometric systems (a sketch of what such an assessment might record appears after this list). Such harms may include privacy and security issues, but also those related to unmitigated discrimination or bias in the biometric technology or its deployment.
  2. Take proactive steps to mitigate any known or foreseeable risks, including, at a minimum, deploying any "readily available" tools such as organizational and technical measures that reduce privacy and security risks or correct for algorithmic bias.
  3. Avoid surprises for consumers, considering the context of how the biometric system is deployed. Engaging in "surreptitious or unexpected collection or use" of biometric information could be an unfair practice, especially when it renders the collection unavoidable by the consumer.
  4. Embrace a mechanism for consumer complaints related to the collection and use of biometric information, then take steps to address them.
  5. Evaluate the practices and capabilities of third parties who have access to biometric information, and take steps that "go beyond contractual measures to oversee third parties and ensure they are meeting those requirements and not putting consumers at risk."
  6. Conduct regular employee training.
  7. Monitor biometric technologies, as deployed, to look for unintended uses and unforeseen harms.
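
Reading items one and two together, a pre-deployment assessment might pair a written record of foreseeable harms with a measured check that error rates do not diverge sharply across demographic groups. The sketch below is one illustrative way to structure that; the field names, group labels and the 1.5x disparity tolerance are assumptions, not FTC-prescribed values.

```typescript
// Illustrative structure for a biometric impact assessment record,
// plus a simple check that false-match rates are comparable across
// demographic groups before the system is deployed.

type GroupResult = { group: string; falseMatchRate: number };

interface ImpactAssessment {
  system: string;             // e.g., "lobby face-matching kiosk"
  purpose: string;            // why biometric data is collected at all
  foreseeableHarms: string[]; // privacy, security, discrimination, ...
  mitigations: string[];      // "readily available" tools applied
  biasEvaluation: GroupResult[];
}

// Passes only if no group's false-match rate exceeds the best-performing
// group's rate by more than the allowed ratio.
function biasWithinTolerance(
  results: GroupResult[],
  maxRatio = 1.5
): boolean {
  if (results.length === 0) return false; // no evidence, no deployment
  const best = Math.min(...results.map(r => r.falseMatchRate));
  return results.every(r =>
    best === 0 ? r.falseMatchRate === 0 : r.falseMatchRate / best <= maxRatio
  );
}
```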

The biometric guidance reflects the emerging consensus around how privacy professionals and others can embrace best practices for AI governance. Like any system powered by advanced algorithms, the development and deployment of biometric technologies should be conducted within a holistic and proactive approach to identifying and mitigating possible harms to consumers. When we have our priorities straight, new technical innovations should bring with them innovative privacy protections.

Here's what else I’m thinking about:

  • The FTC settled a third case in its groundbreaking series of health-related data enforcement actions. A consent order against Easy Healthcare relies on the Health Breach Notification Rule to impose a $100,000 civil penalty against the company related to its Premom Ovulation Tracker app. The complaint also includes seven counts of deceptive and unfair practices, most of which relate to the alleged sharing of data with third parties through the app. As always, the FTC's Lesley Fair has an informative and entertaining summary of the case. Even as the FTC continues the trend of using the HBNR to enforce against unexpected sharing of health data, it also proposed modifications to the rule. Comments on the proposed change will be due in about 60 days.
  • Age verification and assurance techniques remain a hot topic of policy discussion, as state laws and federal proposals consider ever more varied implementations of age-based access restrictions. Cyberscoop’s Tonya Riley wrote about the gap between policymaker expectations and technical reality. First Amendment advocates continue to maintain that many new rules are unconstitutional. Meanwhile, innovators in the age assurance industry, such as Yoti, report that privacy-friendly and bias-mitigated approaches are possible for many age-based categorization requirements.
  • Federal legislators’ efforts to tackle AI issues are seizing headlines, even as we await the latest bipartisan privacy proposals. This week, Senate Majority Leader Chuck Schumer, D-N.Y., advanced his plan to build a bipartisan framework for AI by meeting with Sens. Mike Rounds, R-S.D., Todd Young, R-Ind., and Martin Heinrich, D-N.M.
  • Google made its Privacy Sandbox plans a little more concrete and pledged to work collaboratively toward building post-cookie solutions, with the planned release of relevance and measurement APIs this July. It looks like 2024 is still slated as the year we see deprecation of third-party cookies in the Chrome browser. There is much work still to do, in desktop and mobile environments, before the ad marketplace meets regulators’ evolving privacy expectations.
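
For developers following the Privacy Sandbox rollout, here is a minimal sketch of calling the Topics API, one of the relevance APIs Chrome is shipping. The feature-detection pattern is the main point; the exact shape of the returned topic objects can vary by browser version, so the fields below are a simplifying assumption.

```typescript
// Minimal sketch: feature-detect and call the Topics API in a browser.
// The entry point is document.browsingTopics(); it is not yet part of
// the standard DOM typings, so we access it through a widened type.

type BrowsingTopic = {
  topic: number;           // ID within the Topics taxonomy
  taxonomyVersion: string; // which taxonomy the ID refers to
  modelVersion: string;    // classifier version that assigned the topic
};

async function fetchAdTopics(): Promise<BrowsingTopic[]> {
  const doc = document as Document & {
    browsingTopics?: () => Promise<BrowsingTopic[]>;
  };
  if (typeof doc.browsingTopics !== "function") {
    return []; // browser does not support the API; degrade gracefully
  }
  return doc.browsingTopics();
}
```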

Please send feedback, updates and unique keys to cobun@iapp.org.

