Understanding emerging digital litigation trends in the US

A look at the rise of privacy-related class-action litigation and novel legal issues being litigated in U.S. courts.


Contributors:

Alex LaCasse

Staff Writer

IAPP

There is an ongoing surge in U.S. litigation related to data privacy and cybersecurity matters, with those thorny legal issues being further compounded by a rush among the private sector to integrate emerging technologies like artificial intelligence into operations and services.

Attorneys handling digital litigation, regulators and members of the judiciary are all affected by the growing caseload in one way or another.

At the IAPP Global Summit 2026, Morgan, Lewis & Bockius Cyber, Incident Response and Privacy Partner Hannah Levin discussed how data breach and other privacy-related class-action lawsuits are two of the fastest-growing areas of complex litigation, with more than 3,000 data breach class-action lawsuits filed in 2025 alone. Combined with the fact that more insurers are leaving the cyber insurance marketplace due to the skyrocketing cost of providing coverage, she said companies today are, in some respects, more legally vulnerable than they have ever been.

The recent spike in privacy-related lawsuits is making it difficult for companies to tailor their privacy notices quickly enough to stave off class-action lawsuits when violations occur. Given the high-risk nature of safeguarding sensitive personal data, Levin indicated that when data breaches do occur, class-action complaints are often filed before the organization in question has even mounted its formal incident response.

"Maybe five years ago, a client would ask me what is the risk of being sued for this data breach, this privacy violation, and I would say, 'Well, how many people are involved?'" she said. "I no longer do that analysis anymore because it has become more of a question of, is this going to be a public breach? Are we going to notify an attorney general? And if so, the likelihood of a class-action lawsuit is extremely high."

The plaintiffs

Keller Rohrback Partner Cari Laufenberg highlighted that privacy-related class-action complaints have increased 200% since 2022. Beyond the company that actually experienced the data breach, classes of plaintiffs are now pursuing damages against third-party technology vendors, depending on the underlying circumstances of how the breach occurred.

The Snowflake breach in 2024 is one example of how third parties are being looped into litigation. 

"Quite a bit now we're seeing vendors being implicated, and these cases are what we call hub-and-spoke lawsuits," Laufenberg said. "You'll have an entity that is the hub in the middle that is a vendor that has provided a service to all kinds of other organizations, and they get implicated in all of them."

Additionally, classes of plaintiffs are bringing privacy complaints over a host of emerging technologies, such as tracking pixels and the use of consumer data to train AI. Laufenberg said plaintiffs are relying on older laws, such as the U.S. Video Privacy Protection Act, the Electronic Communications Privacy Act and California's Invasion of Privacy Act, that were designed to regulate analog technologies, and are experiencing varying levels of success in the courtroom.

"It's becoming more of an uncertain environment because it's broadening the array of types of violations, and broadening the array of defendants who could be at issue," she said. "We're really seeing divergent rulings on all of these hosts of issues, because there isn't really well-developed law about how these statutes apply to the contexts in which they're being applied."

View from the bench

For judges, emerging technologies like AI are presenting a set of novel legal issues that were simply not conceivable when laws governing the analog technologies of the past were signed into law.

U.S. District Court for the District of Columbia Chief Judge James Boasberg said that, with advancements in AI, the likelihood of AI-manipulated photos and documents being entered into evidence cannot be overlooked. However, given the legal system's "adversarial" structure, between parties in civil lawsuits and between defendants and the state in criminal cases, the responsibility to determine what evidence is authentic should fall on the respective parties and not a judge, he said.

"It's not up to judges to look at (evidence) and say, 'I think this is a fake photo,'" Boasberg said. "The issue of flagging it at first glance is what you have the adversarial system for. I would hope the other side would say, 'This isn't legitimate, this is AI-generated,' and then maybe you as a judge will have to make a ruling on it."

On the FTC's radar

U.S. Federal Trade Commission Division of Privacy and Identity Protection Senior Attorney Erik Jones offered insight into the agency's privacy enforcement activity over the past year, including its most recent work under the Protecting Americans' Data from Foreign Adversaries Act, as well as its preparations to begin enforcing the TAKE IT DOWN Act.

With PADFAAA, Jones said the FTC recently sent letters to 13 data brokers reminding them of their new obligations. Those warnings "noted that the FTC had identified instances in which some of the letter recipients offered solutions and insights involving the status of members of the armed forces" and how their data was being shared.

Jones also highlighted recent FTC workshops on children's privacy and age verification, noting the agency is likely to take a major interest in age assurance technology providers, including how their solutions work in practice and how they protect sensitive personally identifiable information. He pointed to the USD7.5 million enforcement action against education technology company Illuminate Education and the USD10 million settlement with Disney to resolve Children's Online Privacy Protection Act Rule violations.

To avoid the FTC's attention, Jones recommended organizations strengthen their data security postures and implement data minimization measures, so that data elements that no longer serve a business purpose do not become the point of failure that leads to a breach.

"Failing to implement adequate data security for what is being collected is something to be aware of, and also keeping data for longer than you need (can lead to compliance issues)," he said. "When you make promises to consumers on privacy and security, you have to ensure that those promises are being kept, because if they are not, it could be considered a deceptive trade practice and trigger a violation of Section 5 of the FTC Act."



Tags:

Children’s privacy and safety, Data security, Enforcement, Litigation and case law, U.S. federal regulation, Legal, Cybersecurity law, Privacy, AI governance
