
The Privacy Advisor | Will there be federal facial recognition regulation in the US?

New and distinct issues around facial recognition technology continue to make headlines and propel bans, lawsuits and legislation, underscoring that facial recognition was a hot topic in 2020 and is poised to remain one in 2021.

There are federal, state and city-level efforts to regulate the technology and class-action lawsuits taking aim at high-profile companies that use facial recognition. While there is no federal law in the U.S. to specifically regulate the burgeoning technology, numerous bills have been proposed. In the meantime, headline-grabbing use cases of facial recognition will likely spur lawmakers to examine potential facial recognition policies.

Facial recognition in the news

The biggest story for facial recognition has been its use in the wake of the Jan. 6, 2021, Capitol riot. Both law enforcement professionals and amateur sleuths turned to facial recognition to identify insurrectionists.

Facial recognition company Clearview AI saw a 26% jump in searches the day after the riot, and police departments in both Florida and Alabama employed it to help identify wanted individuals. The Federal Bureau of Investigation has also reached out to the public for information on individuals who were present.

Beyond algorithm-based facial recognition, at least one website, called Faces of the Riot, has set out to crowd-source who was inside the Capitol building that day. It is doing so through screenshots from photos and videos scraped from the now-defunct website Parler. There are currently more than 6,000 faces pictured. However, the virtual photo array is unable to verify whether a face belongs to a participant or to a member of the police or the media; the anonymous website creators, who describe themselves as two college students, even specifically warn at the top of the website against “attempt[ing] your own investigation into anyone shown on this website.”

Other news stories about facial recognition have centered on the pandemic. In May 2020, one business proposed creating immunity passports for individuals no longer at risk of contracting or spreading COVID-19, with facial recognition technology used to verify the identity of the passport holder.

City-level bans of facial recognition

Facial recognition regulation, in the form of bans, is on the rise at the city level. Some bans are broader and include private actors, like stores and restaurants, while most only prevent the use of facial recognition by public actors, like police departments.

In September 2020, Portland, Oregon, banned facial recognition use by both public and private entities, including in places of “public accommodation,” such as restaurants, retail stores and public gathering spaces.

Portland, Maine, passed an ordinance in November 2020 banning both the city and its departments and officials from “using or authorizing the use of any facial surveillance software on any groups or members of the public.” The ordinance allows members of the public to sue if “facial surveillance data is illegally gathered and/or used.” Citizens would be eligible to receive “$100 per violation or $1,000, whichever is greater, plus attorney’s fees.”

Nongovernmental organizations are also advocating for city-level bans of facial recognition.

In late January 2021, Amnesty International launched its "Ban the Scan" initiative, which calls for a “total ban on the use, development, production, and sale of facial recognition technology” by police and government agencies. For now, Amnesty is focusing its initiative on New York City, with plans to expand in 2021.

Litigation and enforcement actions

Given the lack of a federal facial recognition law, individuals are turning to state laws to bring suit against the use of facial recognition technologies. 

Illinois

Illinois is a popular place to bring facial recognition lawsuits because the technology could violate the state’s Biometric Information Privacy Act, which allows for a private right of action.

BIPA provides that before a private entity may use a consumer’s biometric information, it must inform the consumer in writing that the biometric information, including a “scan of hand or face geometry,” is being collected and stored and for how long. The entity cannot “sell, lease, trade, or otherwise profit from a person’s or a customer’s biometric identifier or biometric information.” Because BIPA covers scans of face geometry, it generally applies to private entities that use a person’s face for facial recognition purposes.

Facebook lawsuit

Facebook settled a lawsuit brought by Illinois consumers, who claimed the company “illegally collected and stored biometric data of millions of users without their consent.” Illinois users claimed Facebook violated BIPA through the “tag suggestions” feature on photos. The 9th Circuit Court of Appeals agreed, finding “the development of a face template using facial recognition technology without consent … invades an individual’s private affairs and concrete interests.” Members of the class were eligible to receive between $400 and $600 from Facebook in compensation.

Clearview AI lawsuits

Clearview AI has been the defendant in lawsuits in multiple states, including Illinois, Virginia, New York and Vermont. Clearview AI works by scraping photographic content from the internet, including social media sites that expressly prohibit scraping in their terms of service, to create databases that match photos. To date, it has harvested and analyzed billions of photos.

In Illinois alone, there are currently at least three lawsuits related to Clearview AI. The first, filed jointly by the ACLU of Illinois and law firm Edelson, contends that Clearview AI violates BIPA by collecting faceprints from Illinois residents without notice or consent.

The second suit, brought by a single plaintiff and a proposed class, alleges that Clearview AI violated Section 15(c) of Illinois’ BIPA, which states that “no private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information.” Clearview AI removed the case to federal court, but the 7th Circuit Court of Appeals recently remanded it to state court.

The third suit does not name Clearview AI as a defendant but instead brings suit against Macy’s, a national department store chain and one of Clearview AI’s customers. The complaint alleges that Macy’s violated BIPA through the store’s use of Clearview AI’s technology.

Vermont

The state of Vermont is also suing Clearview AI — which is registered as a data broker in the state — for violating both the state’s Consumer Protection Act, which prohibits “unfair or deceptive acts or practices in commerce,” and the state’s data broker law. In September 2020, the Superior Court of Vermont denied Clearview AI’s motion to dismiss, meaning the case will continue.

New Jersey

Focusing more on harmful outcomes, a suit was filed against police in a New Jersey town after officers used facial recognition to identify a suspect but ended up identifying the wrong individual. The wrongfully accused man spent 10 days in jail. A study by the National Institute of Standards and Technology found that, “for one-to-one matching,” there were “higher rates of false positives for … African American faces relative to images of Caucasians,” with the differentials ranging “from a factor of 10 to 100 times, depending on the individual algorithm.” The case highlights the racial bias baked into many facial recognition technologies.

Federal: The FTC’s Everalbum settlement

In addition to state-level litigation, the Federal Trade Commission has started playing an active role in policing the misuse of facial recognition.

In January 2021, it settled with Everalbum, the company behind a photo storage app called Ever. Individuals could upload photos and videos to Ever’s cloud. However, according to the FTC, “Everalbum launched a new feature in the Ever app, called ‘Friends,’ that used facial recognition technology to group users’ photos by the faces of the people who appear in them and allowed users to ‘tag’ people by name.”

The FTC alleged the feature ran afoul of Section 5 of the FTC Act by “misrepresenting the company’s practices with respect to Ever users’ content.” The technology was enabled by default for all app users, even though Everalbum said it “would not apply facial recognition technology to users’ content unless users affirmatively chose to activate the feature.”

The FTC’s consent order states Everalbum must “obtain consumers’ express consent before using facial recognition technology on their photos and videos” and requires Everalbum to “delete models and algorithms it developed by using the photos and videos uploaded by its users.”

Legislation

In the absence of a federal law regulating facial recognition, state and city laws have filled the gap. However, there have been attempts by Congress to pass federal legislation that puts guardrails around the use of the technology.

While several bills in the 116th Congress focused on facial recognition, including 2019’s Commercial Facial Recognition Privacy Act, 2020’s Ethical Use of Facial Recognition Act and the Facial Recognition and Biometric Technology Act of 2020, none of them moved past introduction. The George Floyd Justice in Policing Act and the Advancing Facial Recognition Act progressed a little further in the legislative process but were not adopted.

On the state level, bills have recently been introduced in Alabama, Maryland and Washington state.

Government agency updates

U.S. government agencies have also been studying the use and accuracy of facial recognition. The Department of Homeland Security recently announced that facial recognition at airports could still correctly identify individuals even when those individuals were wearing protective masks because of COVID-19. DHS said the technology could help “reduce the need for people to remove masks at airports or ports of entry.” 

DHS’s announcement corresponds with NIST's finding that facial recognition software is becoming more adept at identifying faces even when the individual is wearing a protective mask. In July 2020, NIST first released a report stating traditional facial recognition technologies struggle to correctly identify masked individuals; however, subsequent testing indicated the technology is getting better at recognizing masked faces.

Conclusion

Could 2021 be the year a federal facial recognition bill is passed in Congress?

The increased use of the technology, state-level lawsuits and city-level bans will certainly add impetus. The IAPP will continue to report on new developments in the facial recognition space.

Photo by Blake Barlow on Unsplash

