The U.S. consumer privacy laws that took effect in 2023, and those slated to do so later this year, will impact multiple industries and sectors. They will regulate items as diverse as universal opt-out signals, dark patterns and data protection assessments. And despite significant variations in scope, application and enforcement, they all contain a relative constant — biometrics.
Traditionally, biometric systems, which capture certain identifiers and compare them against previously documented references, have relied predominantly on immutable biological traits, such as fingerprints and iris shape.
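For readers less familiar with the mechanics, the short Python sketch below illustrates that capture-and-compare cycle: a reference template is stored at enrollment, and later captures are compared against it. The `enroll` and `match` functions, the feature-vector representation and the 0.95 similarity threshold are illustrative assumptions, not any particular vendor's design.

```python
import math

# Minimal, illustrative sketch of the capture-and-compare cycle behind a
# biometric system. Real systems extract feature vectors ("templates") from
# fingerprints, iris scans, voiceprints, etc.; here a template is just a list
# of floats, and the 0.95 threshold is an arbitrary assumption.

enrolled_templates: dict[str, list[float]] = {}

def enroll(person_id: str, template: list[float]) -> None:
    """Store a reference template captured at enrollment."""
    enrolled_templates[person_id] = template

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match(person_id: str, fresh_capture: list[float], threshold: float = 0.95) -> bool:
    """Compare a new capture against the previously documented reference."""
    reference = enrolled_templates.get(person_id)
    if reference is None:
        return False
    return cosine_similarity(reference, fresh_capture) >= threshold

# Example: enroll a toy fingerprint template, then verify two later captures.
enroll("alice", [0.12, 0.87, 0.44, 0.91])
print(match("alice", [0.11, 0.88, 0.45, 0.90]))  # True: close to the reference
print(match("alice", [0.90, 0.10, 0.05, 0.20]))  # False: does not match
```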
For years, Illinois' Biometric Information Privacy Act, with its expansive application, daunting private right of action and successive case law, has generally been regarded as the gold standard for entities collecting or processing biometric information. However, despite its reputation, the BIPA regulates only biometric identifiers concisely defined as "a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry."
Following the BIPA's outline, the biometric-specific privacy laws enacted in Texas and Washington state similarly define biometric identifiers within the more traditionally understood scope of biological characteristics or patterns. The same is true for Connecticut's Data Privacy Act, Utah's Consumer Privacy Act, Virginia's Consumer Data Protection Act and the entirety of the 2024 class of privacy laws in Florida, Montana, Oregon and Texas.
Such a biology-dependent conceptual framework for biometrics may soon see a deliberate and significant expansion beyond purely fixed characteristics. A close look at the nuances of the mosaic of laws that will govern the collection and use of biometrics in the U.S. suggests behavioral characteristics are poised to play a much larger role.
Similar to the EU General Data Protection Regulation's broad definition of biometric data, which covers an individual's physical, physiological and behavioral characteristics, the California Consumer Privacy Act as amended by the California Privacy Rights Act, the Colorado Privacy Act and New Jersey's Senate Bill 332 each include behavioral characteristics in their definitions of biometric information and biometric identifiers, respectively. Notably, sleep, health and exercise data is included among the physiological or behavioral characteristics that qualify as biometric information in California, while behavior patterns and characteristics are expressly listed as biometric identifiers in Colorado and New Jersey.
Following in those progressive definitional footsteps, Washington state's My Health My Data Act and Nevada's consumer health data privacy law expressly include "behavioral characteristics" as a category of protected consumer health data. Nevada's law goes even further than the MHMDA, sweeping alterable identifiers such as tattoos, scars and bodily marks into its biometrics definition.
Not surprisingly, the federal government has not been sitting idly by while states address biometrics. In its May 2023 policy statement, the U.S. Federal Trade Commission broadly defined biometric information to include "data that depict or describe … behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person's body" and specifically included "characteristic movements or gestures" in that definition. In a recent blog post, the FTC named protecting biometric information a top "priority." The executive branch, by way of President Joe Biden's executive order on artificial intelligence, has likewise expressly acknowledged the power of biometrics, directing that its application to movement-related traits, like gaze direction and hand motions, be carefully considered to allow for the equitable application of AI technologies.
This confluence of pioneering statutory language in comprehensive privacy laws from California and Colorado, and in consumer health data privacy laws from Nevada and Washington, coupled with the breadth of the federal government's definitional position, may have paved the way for a much broader general interpretation of what qualifies as biometric data going forward. Even though more than a dozen state legislatures tried, and failed, to pass biometric data laws in 2023, it will be telling to see if any states, or new regulations, move beyond regulating collection protocols for biometric identifiers and delve deeper into the processing and use of such biometrics, however broadly defined.
Perhaps a more layered approach and risk-based analysis of the actual processing of biometrics will prove a more efficient way to cultivate innovation and develop technologies without sacrificing important privacy and equity protections. Examples of this approach can already be found in states like California and cities including New York, which have imposed human monitoring and control obligations on certain automated decision-making when decisions of consequence are at stake.
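To make that layered idea concrete, the hypothetical sketch below shows one way human monitoring could sit on top of automated biometric matching: routine results are returned automatically, while decisions of consequence are parked for a human reviewer. The risk categories, score threshold and review queue are assumptions for illustration, not requirements drawn from any statute.

```python
from dataclasses import dataclass

# Hypothetical sketch of a layered, risk-based control: low-stakes decisions
# may be fully automated, while decisions of consequence are routed to a
# human reviewer before any action is taken. The risk tiers and queue are
# illustrative assumptions, not requirements of any particular law.

CONSEQUENTIAL_USES = {"hiring", "lending", "housing"}   # assumed high-risk uses
human_review_queue: list[dict] = []                     # stand-in for a real review workflow

@dataclass
class BiometricDecision:
    subject_id: str
    use_case: str
    match_score: float

def decide(decision: BiometricDecision, threshold: float = 0.95) -> str:
    """Return an outcome, deferring consequential decisions to human review."""
    automated_result = "match" if decision.match_score >= threshold else "no_match"
    if decision.use_case in CONSEQUENTIAL_USES:
        # Human monitoring and control: no consequential outcome is final
        # until a reviewer examines the automated result.
        human_review_queue.append({"decision": decision, "proposed": automated_result})
        return "pending_human_review"
    return automated_result

# Example usage
print(decide(BiometricDecision("alice", "building_access", 0.97)))  # "match"
print(decide(BiometricDecision("bob", "hiring", 0.97)))             # "pending_human_review"
```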
Regardless of what the future of biometric regulation holds, given the heightened protections and rights already afforded to biometrics as a universally recognized category of sensitive data, businesses deploying any form of unique identifier collection or processing would do well to add a prompt review of their biometric practices and policies to their New Year's resolution list.