Editor's note: This is the final article in a three-part series that tracks class action litigation at the nexus of privacy, artificial intelligence, and other digital technology. The first article looks at how decades-old anti-wiretapping laws have found new significance due to the emergence of automated technologies capable of eavesdropping. The second article explores recent legal theories that have applied the California Invasion of Privacy Act and the Video Privacy Protection Act.

As discussed in the previous parts of this series, class-action suits have found success in tailoring old privacy laws to new fact patterns, often involving technologies that were never imagined when the laws were enacted. The Electronic Communications Privacy Act, for example, remains a viable avenue for plaintiffs to combat non-consensual interception of data, especially as new limitations on its immunity provisions arise. The Copyright Act, meanwhile, has been invoked against AI developers. Finally, a more modern statute, the Illinois Biometric Information Privacy Act, has supplied its own avenue for relief.

A new exception to ECPA immunity: the Bulk Data Transfer Rule

As discussed in the first part of this series, the ECPA provides an exception to liability where one party to a communication consents to the interception. That "party exception" is itself limited by the crime-tort exception, which withholds immunity where the interception is carried out for the purpose of committing a criminal or tortious act. Violation of the new Bulk Data Transfer Rule is one such predicate crime.

In Baker v. Index Exchange, the plaintiff alleged that Index Exchange, a supply-side platform in the digital advertising space, violated the ECPA by intercepting user communications and sharing them with Temu, an e-commerce site owned by a foreign entity. The complaint argues that the party exception under ECPA Section 2511(2)(d) does not apply because Index Exchange's interception was carried out for the purpose of violating the U.S. Department of Justice's Bulk Data Transfer Rule. The rule, which took effect in April 2025, "prohibits U.S. persons from engaging in certain categories of data transactions with six 'countries of concern,'" including China.

Of note, the plaintiff alleged Index Exchange engaged in cookie syncing, the process of matching its own user IDs with those used by platforms like Temu, which enables advertising partners to build more detailed profiles of individual users. The complaint contends that "this data can be weaponized for profiling, coercive targeting, or even blackmail" of U.S. residents by Temu, which is subject to China's National Intelligence Law, Cybersecurity Law and Data Security Law.
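Mechanically, cookie syncing is a simple ID-matching exercise. The TypeScript sketch below illustrates the general technique only; the endpoint, identifiers and data structures are hypothetical and do not describe any party's actual implementation.

```typescript
// Minimal sketch of cookie syncing. All names are hypothetical.
// A "sync pixel" is typically a 1x1 image request carrying a partner's user ID
// as a query parameter; the platform reads its own cookie from the same
// request and records the correspondence between the two identifiers.

type SyncRecord = { sspUserId: string; partnerUserId: string; syncedAt: Date };

// In-memory match table standing in for the platform's ID-graph datastore.
const matchTable = new Map<string, SyncRecord>();

// Invoked when the browser requests e.g. GET /sync?partner_uid=abc123,
// with the platform's own cookie (sspUserId) attached automatically.
function handleSyncPixel(sspUserId: string, partnerUserId: string): void {
  matchTable.set(sspUserId, { sspUserId, partnerUserId, syncedAt: new Date() });
}

// Later, traffic identified only by the platform's cookie can be joined to
// the partner's profile, enriching what each side knows about the user.
function lookupPartnerId(sspUserId: string): string | undefined {
  return matchTable.get(sspUserId)?.partnerUserId;
}

// Usage: the same browser has been seen by both parties under different IDs.
handleSyncPixel("ssp-550e8400", "partner-abc123");
console.log(lookupPartnerId("ssp-550e8400")); // "partner-abc123"
```

The privacy concern flows directly from the join: once the two IDs are linked, browsing activity observed under one identifier can be attributed to the profile held under the other.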

This class action may encounter several hurdles in meeting the requirements of the ECPA as well as the Bulk Data Transfer Rule. Even so, it marks a new frontier in ECPA litigation, one that invokes national security law to circumvent limitations on ECPA applicability. And Baker is not alone: similar facts were recently alleged in Porcuna v. Xandr. Xandr, a subsidiary of Microsoft, allegedly intercepted user communications with third-party websites for the purpose of transmitting them to Temu, meaning that Xandr, like Index Exchange, could not claim immunity under ECPA Section 2511(2)(d).

Indeed, the party exception to the ECPA remains limited. As new data sharing laws and regulations, such as the Bulk Data Transfer Rule, crop up at the state and federal levels, this exception is likely to face further constraints. Entities that record private communications or disclose video viewing data without user consent should monitor this evolving trend to avoid liability under old, but newly invigorated, laws like the ECPA.

Other class-action suits about digital responsibility

Recent class actions at the nexus of privacy, AI and cybersecurity, together with a recent survey, suggest that this space represents the new frontier in multi-plaintiff litigation. What's more, AI implicates adjacent domains like intellectual property law. The proliferation of class-action suits also raises the question of appropriate remedy: what means of redress will adequately compensate the class and meaningfully deter the defendant from future malfeasance? As such, entities operating in this space should take note of a few recent developments.

Clearview AI consumer privacy litigation

Clearview AI provides facial recognition services by scraping publicly available websites for photographs and compiling a database of biometric facial geometry. In 2020, multiple class-action complaints were filed alleging that Clearview violated Illinois' Biometric Information Privacy Act, among other consumer privacy laws. The act prohibits the non-consensual processing of Illinois residents' biometric data. Clearview settled with the plaintiffs in March 2025 by granting the class a 23% stake in the business. This unprecedented agreement balances the need to compensate class members against the fact that monetary damages under the act's private right of action would have bankrupted the company, depriving injured parties of any redress. Ironically, it is now in the class plaintiffs' interest for Clearview to grow its business, thereby increasing the value of their equity stake.

23andMe customer data security breach litigation

In September 2024, 23andMe agreed to a USD30 million settlement of a class-action lawsuit stemming from a 2023 data breach that affected 6.9 million customers. Many of those impacted had opted into a feature that let users share information with their DNA-matched relatives. The breach exposed personal, genetic and ancestral data.

What's more, the biotech company was accused of failing to alert affected customers with Chinese and Ashkenazi Jewish ancestry that their information had been specifically targeted by the cybercriminals, sparking concern that bad actors could purchase ethnicity data for violent or oppressive purposes. 23andMe suggested that the breach resulted from customers recycling passwords already exposed in other breaches, a vector known as credential stuffing, not from inadequate security measures on its part. No matter the root cause, the plaintiffs' attorneys maintain that 23andMe should have foreseen the use of recycled passwords and instituted security commensurate with the sensitivity of the health and genetic data the company retained.
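The plaintiffs' foreseeability theory has a well-known engineering counterpart: screening credentials against known breach corpora at signup and password-reset time. The TypeScript sketch below is one illustrative approach; it uses the public Have I Been Pwned range API, which is a real service, but the surrounding function and usage are assumptions, not a description of any measure 23andMe did or should have deployed.

```typescript
import { createHash } from "node:crypto";

// Check a candidate password against the Have I Been Pwned corpus using its
// k-anonymity "range" API: only the first five hex characters of the SHA-1
// hash leave the machine, never the password or its full hash.
async function isPasswordBreached(password: string): Promise<boolean> {
  const sha1 = createHash("sha1").update(password).digest("hex").toUpperCase();
  const prefix = sha1.slice(0, 5);
  const suffix = sha1.slice(5);

  const res = await fetch(`https://api.pwnedpasswords.com/range/${prefix}`);
  if (!res.ok) throw new Error(`HIBP request failed: ${res.status}`);

  // Response body is lines of "HASH_SUFFIX:COUNT"; a matching suffix means
  // the password has appeared in at least one known breach.
  const body = await res.text();
  return body.split("\n").some((line) => line.startsWith(suffix));
}

// Usage sketch: reject breached passwords at signup or reset time.
isPasswordBreached("password123").then((breached) => {
  if (breached) console.log("Password found in breach corpora; require a new one.");
});
```

Because only a five-character hash prefix is transmitted, the check reveals nothing about the password itself to the remote service.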

Bartz v. Anthropic

In a record-setting proposed settlement, Anthropic agreed to pay a group of authors USD1.5 billion for allegedly infringing the copyrights of some 500,000 books used to train its AI systems. The underlying case, which probes the limits of the fair use doctrine as applied to training generative AI, is consequential for the future of the AI industry, a sector that relies heavily on copyrighted material for training data.

At the summary judgment stage, the court held that training Anthropic's Claude model on copyrighted works qualifies as fair use, but that storing those same works in a central library constitutes fair use only if the works were legally obtained. In a further twist, the U.S. District Court for the Northern District of California denied preliminary approval of the settlement, finding the proposal incomplete; in particular, the method for allocating payments for works with multiple claimants had not been determined.

OpenAI copyright infringement litigation

A copyright infringement lawsuit filed by the New York Times in December 2023 targets OpenAI, Microsoft and other defendants over OpenAI's alleged ingestion of the New York Times' copyrighted articles to train its generative AI models. In that suit, the Southern District of New York ordered OpenAI to preserve all ChatGPT user chat logs going forward, as those records may contain evidence relevant to the litigation. OpenAI objected to the preservation order, pointing to user deletion requests and privacy regulations that mandate data minimization practices, but the court was not persuaded. In a contemporaneous post on X, OpenAI CEO Sam Altman suggested that communications with AI should be privileged, similar to attorney-client or doctor-patient communications.

As the foregoing series demonstrates, the class-action lawsuit remains a powerful tool in the plaintiff's arsenal for enforcing privacy and related rights. This litigation approach may only grow as AI continues to proliferate across goods and services, exposing more people to unforeseen risks. In response, the plaintiff's bar is likely to continue its strategy of repurposing decades-old laws like the ECPA, the California Invasion of Privacy Act and the Video Privacy Protection Act to hold modern-day defendants to account.

Will Simpson, CIPP/US, is a Westin Fellow for the IAPP.