In January, Facebook and the plaintiffs reached the largest privacy settlement in U.S. history: Facebook agreed to pay $550 million to resolve claims that its "tag suggestions" feature, which identifies faces in uploaded photos and suggests the users who match them, violated the Illinois Biometric Information Privacy Act. Apple and Google face similar lawsuits. The settlement has far-reaching implications for any business that uses biometric identification techniques, such as facial recognition. California, Texas, Arkansas, New York and Washington all have some form of state law regulating biometric privacy, and more states are working on legislation.
Many companies already use facial-recognition technology to sell products and services to their customers. Cosmetics giant Sephora has interactive cameras in its stores that let customers virtually apply a variety of eye, face and lip colors. Facial-recognition tech maps the suggested makeup tones onto the customer’s eyes, face or lips in the on-screen image. Sephora’s app lets customers upload a photo for a similar experience.
Facial-recognition tech clearly has many potential business uses, but the sensitive nature of facial data creates legal risk. Privacy practitioners must provide detailed guidance so clients comply with applicable laws or risk expensive consequences. Facebook learned that the hard way in a class-action lawsuit alleging it violated the Illinois law governing biometric information.
Illinois Biometric Information Privacy Act
Facial-recognition tech relies on storing information about facial geometry, which is a type of biometric identifier. One of the most robust state laws on biometric identifiers is the Illinois Biometric Information Privacy Act. The federal sectoral approach to privacy governance, combined with the state-by-state patchwork of consumer, employment and other privacy laws, creates a complex web of rules for organizations to follow. While BIPA governs only in Illinois, it offers insight into how other states might treat biometric identifiers. When evaluating a privacy law, two key questions are: (1) Does it provide a private right of action for violations? (2) Can plaintiffs state a claim without proving actual harm from the violation? For BIPA, the answer to both is yes.
This has been a crucial time for privacy rights in general and BIPA in particular. In 2019, in the closely watched case of Rosenbach v. Six Flags Entertainment Corporation, the Illinois Supreme Court held that a plaintiff who can show a violation of his or her rights under BIPA need not also show actual injury or adverse effect. BIPA covers “biometric identifiers,” including “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” Private entities that collect this information must:
- Maintain a publicly available, written policy that documents a retention schedule.
- Establish “guidelines for permanently destroying biometric identifiers and biometric information when the initial purpose for collecting such identifiers or information has been satisfied or within three years of the individual’s last interaction with the private entity, whichever comes first.”
- Obtain informed written consent covering the collection of biometric identifiers, the specific purpose of collection and the storage period.
Importantly, BIPA provides a private right of action for violations, along with statutory damages and the ability to recoup attorneys’ fees and costs. Each violation carries liquidated damages of $1,000 if negligent or $5,000 if intentional or reckless (or actual damages, whichever is greater). Some state privacy laws allow only attorneys general to bring suit for violations. Attorneys general, who are charged with a variety of duties, such as protecting consumers and advising state agencies, may have limited resources to bring privacy-related suits. The private right of action and the availability of attorneys’ fees allow for much greater enforcement of privacy laws.
Class-action BIPA lawsuit against Facebook
Illinois residents sued Facebook in Patel v. Facebook for violating BIPA through its use of facial-recognition tech. Facebook created and stores a large database of face maps that it uses to identify individuals in uploaded photos. The plaintiffs alleged Facebook violated BIPA because it collected, used and stored biometric identifiers without seeking written consent or providing an adequate retention schedule. A California federal district court considered the plaintiffs’ motion for class certification and Facebook’s motion to dismiss for lack of standing. Facebook argued what many in the privacy community argue: Violating a privacy law does not, in itself, give rise to harm, and the plaintiff must prove something more.
The trial court granted the plaintiffs’ motion for class certification and denied Facebook’s motion to dismiss. Facebook appealed. In August 2019, the 9th Circuit upheld the class certification and the denial of Facebook’s motion to dismiss for lack of standing. Relying in part on the ruling in Rosenbach, the 9th Circuit held that the violation of BIPA was enough to show harm and confer standing. The 9th Circuit quoted the Illinois Supreme Court, stating, “procedural protections in BIPA ‘are particularly crucial in our digital world' because '[w]hen a private entity fails to adhere to the statutory procedures ... the right of the individual to maintain his or her biometric privacy vanishes into thin air.'”
The class action focused on Facebook’s violation of BIPA, so it makes sense that the 9th Circuit relied on the Rosenbach court’s ruling. But BIPA, now more than a decade old, is also one of the strongest state laws protecting biometric identifiers, and other states are likely to consider the Illinois Supreme Court’s rationale and the historical application of BIPA in developing their own biometric identifier protections.
California, Texas, Arkansas, New York and Washington all have some form of state law that regulates biometric privacy, and more states will join them in the coming years. So long as federal privacy law remains sectoral, states will look to one another for examples of workable legal and regulatory frameworks and to existing laws to conceptualize issues such as harm. Federal lawmakers will likewise look to existing laws when assessing private rights of action, the threshold for proving harm and the definition of biometric identifiers. In this environment, privacy practitioners should consider not only current law but also likely future laws when developing privacy programs. Doing so lets them build adaptable programs that can comply with new requirements as they emerge. Robust privacy programs also inspire customer trust and can provide value to organizations in ways that go beyond legal compliance.
Action items that privacy practitioners can work through with clients include:
- Walk through client business processes to inventory the various situations in which they capture biometric identifiers.
- Document the locations of customers and business units and identify which laws apply.
- Develop data maps to track information flows through the organization.
- Itemize the various uses for biometric identifiers, consider the potential risks of each use and weigh those risks against the benefits.
- Create systems to secure informed consent from data subjects, explaining biometric identifier policies in plain language.
- Develop systems to retain and destroy biometric identifiers within appropriate time periods (a minimal sketch of this deadline logic follows the list).
- Identify individuals who will own these tasks within the organization. Establish how clients will maintain these responsibilities in case of staff turnover or reorganizations of business units.
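For the retention-and-destruction item above, a short sketch can make the statutory deadline concrete. The following Python is illustrative only: the record fields and function names (BiometricRecord, destruction_deadline) are hypothetical, and "three years" is approximated as 3 × 365 days rather than with calendar-aware date arithmetic. It simply encodes BIPA's rule that identifiers must be destroyed when the initial collection purpose is satisfied or three years after the individual's last interaction, whichever comes first.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional

# Hypothetical inventory record for one data subject's biometric identifier.
# Field names are illustrative, not drawn from any particular system.
@dataclass
class BiometricRecord:
    subject_id: str
    collected_on: date
    last_interaction: date                 # most recent interaction with the private entity
    purpose_satisfied_on: Optional[date]   # None if the collection purpose is still active

# Outer limit under BIPA: three years after the individual's last interaction.
# 3 * 365 days is a simplification; a production system might use
# calendar-aware arithmetic instead.
THREE_YEARS = timedelta(days=3 * 365)

def destruction_deadline(record: BiometricRecord) -> date:
    """Date by which the identifier must be permanently destroyed:
    when the initial purpose is satisfied or three years after the
    last interaction, whichever comes first."""
    outer_limit = record.last_interaction + THREE_YEARS
    if record.purpose_satisfied_on is None:
        return outer_limit
    return min(record.purpose_satisfied_on, outer_limit)

def records_due_for_destruction(records: List[BiometricRecord], as_of: date) -> List[BiometricRecord]:
    """Flag records whose deadline has passed, e.g. for a scheduled purge job."""
    return [r for r in records if destruction_deadline(r) <= as_of]

# Example usage
if __name__ == "__main__":
    rec = BiometricRecord(
        subject_id="example-001",
        collected_on=date(2019, 1, 15),
        last_interaction=date(2019, 6, 1),
        purpose_satisfied_on=None,
    )
    print(destruction_deadline(rec))  # 2022-05-31 (1,095 days after the last interaction)
```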