Aristotle, the Greek philosopher and polymath of the fourth century BC, is credited with the principle of horror vacui — nature abhors a vacuum — meaning that in nature, a vacuum or a void isn't a steady state.

Fast forward 2,400 years, and the same holds true for technology regulation. If policymakers in Washington, D.C., take their feet off the gas, don't expect privacy and artificial intelligence laws to slow down. Enter Sacramento, Austin, Albany, Tallahassee, Trenton and Olympia. This land is your land, and it is rife with tech policy initiatives.

No federal privacy law? Introducing 20 states — and counting. No federal AI regulation? Prepare for an avalanche of state action.   

As we enter the early days of 2025, it is an ideal moment to consider what the new year may bring to privacy regulation. We are in a period of continued evolution, marked by the emergence of new technologies, most notably AI, growing awareness of personal data rights and rising public demand for stronger protections.

This year, however, the shift is not only technological but also political, with Republicans consolidating power in Washington and holding control over the executive, legislative and, for the most part, judicial branches.

While some observers suggested this may signal a deregulatory stance toward privacy and AI, the absence of federal regulation in the U.S. may very well open the floodgates of state laws. Blue states, in particular, are expected to double down on protecting consumer health information and imposing antibias and antidiscrimination measures on developers and deployers of AI.

With a diverse and complex regulatory landscape at both the federal and state levels, businesses and individuals alike are navigating the implications of this shift. At the state level, privacy legislation continues to gain momentum, with states like California, Texas and Colorado leading the charge, setting precedents with groundbreaking laws, while other states move toward similar measures.

Here are the key privacy and AI issues that will shape 2025.

Federal regulation  

Consider this curveball: With Republicans in control of both chambers of Congress, as well as the White House, federal privacy legislation could actually get done.

To be sure, historically Republican lawmakers were more circumspect about privacy regulation. At the same time, industry has been clamoring for a federal law that would preempt the growing patchwork of state privacy laws.

While we wouldn't hold our breath waiting for a federal privacy law, such an initiative could gain momentum in Congress. If it happens with Sen. Ted Cruz, R-Texas, at the helm of the Senate Committee on Commerce, Science and Transportation and Rep. Brett Guthrie, R-Ky., chairing the House Committee on Energy and Commerce, we predict a proposal for a new comprehensive federal law that looks more like the Texas Data Privacy and Security Act than the last iteration of a federal privacy bill, the American Privacy Rights Act.

That is, a federal law with strong preemption, no private right of action, less emphasis on civil rights and data minimization, and a stronger focus on security.

Short of a federal privacy law, we may well see kids' privacy legislation pass in this Congress. Recent efforts, such as the Kids Online Safety and Privacy Act, have strong bicameral and bipartisan support. If lawmakers want to show they can get something done, this would be a good place to start.

The new Federal Trade Commission, led by President-elect Donald Trump's choice for chair, sitting Commissioner Andrew Ferguson, and a Republican majority with new Commissioner Mark Meador, will look starkly different from outgoing Chair Lina Khan's agency.

For starters, the agency is all but certain to scrap its Commercial Surveillance and Data Security Rulemaking. In fact, we're unlikely to even hear the term commercial surveillance under this leadership.

Moreover, Ferguson has already stated he intends to roll back the agency's recent efforts to become a de facto AI enforcement agency, focusing instead on its traditional role as a competition and consumer protection regulator. This intent can be witnessed in his Rytr dissent, in which he stated the agency "certainly should not chill innovation by threatening to hold AI companies liable for whatever illegal use some clever fraudster might find for their technology."

On privacy, we expect the FTC to shift from an activist enforcement stance, which often relies on the unfairness prong of its Section 5 authority, to a more conservative approach based on antideception.

State law frenzy

You can surely expect 2025 to mark a pivotal year in state law regulation of privacy as more states implement comprehensive privacy laws that offer robust consumer rights. Many of these laws reflect a trend toward more consumer control over personal data, corporate transparency and greater accountability for organizations that process sensitive information. 

In 2024, the number of comprehensive state privacy laws grew from five to eight, with the laws in Texas, Oregon and Montana coming into force.

This year, the tally will balloon to 16, as privacy laws go into effect in Tennessee, Delaware, Iowa, New Jersey, New Hampshire, Nebraska, Maryland and Minnesota. Some of these laws, including in Delaware, Iowa, Nebraska and New Hampshire, entered into force 1 Jan., and New Jersey's law is imminent, entering into force 15 Jan.

Further, in certain states where such laws are already in effect, new requirements are now kicking in. In Texas and Connecticut, for example, businesses must now respect universal opt-out mechanisms, such as the Global Privacy Control. All in all, 19 states, plus Florida, whose law's scope is somewhat more limited, have passed a comprehensive privacy law, with many others in the pipeline.
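For engineering teams, honoring a universal opt-out signal largely comes down to detecting the GPC header on inbound requests. Below is a minimal TypeScript sketch, assuming a Node.js server; the applyOptOut helper is a hypothetical stand-in for business-specific suppression logic, not part of any framework.

```typescript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal.
// Per the GPC specification, a participating browser sends the request
// header "Sec-GPC: 1" when the user has turned on the opt-out preference.
import { IncomingMessage } from "node:http";

function hasGpcSignal(req: IncomingMessage): boolean {
  // Node lowercases incoming header names.
  return req.headers["sec-gpc"] === "1";
}

function handleRequest(req: IncomingMessage): void {
  if (hasGpcSignal(req)) {
    // Treat the signal as an opt-out of sale/sharing for this visitor,
    // as Texas and Connecticut (among other states) now require.
    applyOptOut(req);
  }
}

// Hypothetical placeholder for business-specific plumbing: suppressing
// sale or sharing of the visitor's personal data, recording the
// preference against a profile, and so on.
function applyOptOut(req: IncomingMessage): void {
  /* ... */
}
```

On the client side, the same preference is exposed to scripts as navigator.globalPrivacyControl, which tag managers can consult before firing advertising pixels.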

In addition to their direct effect on consumer rights and business obligations in the particular state at play, these laws may come to have a broader impact by influencing discussions around federal privacy legislation. Importantly, the new Maryland law adopted the data minimization paradigm from one of the stalled federal bills, marking a shift in U.S. state law from a permissive, largely notice-and-opt-out approach to a stricter, EU General Data Protection Regulation-like regime limiting the collection of data to only what is necessary.

While, for several years, businesses addressed the fracturing U.S. privacy landscape as "California plus," grounding their compliance efforts in the California Consumer Privacy Act while keeping an eye on state-by-state developments, this approach will now change to a more integrative strategy. California is no longer the strictest regime, though it continues to be an outlier in its application to employees' data, and new frameworks, such as Maryland's data minimization principles and Washington state's sector-specific My Health My Data Act, will need to be accounted for.

State laws also diverge in application thresholds; for example, Texas law introduced a small business exemption based not on revenue thresholds but on being defined as a small business by the U.S. Small Business Administration. And while some states provide entity-level exemptions for businesses covered by federal laws such as the Health Insurance Portability and Accountability Act, Gramm-Leach-Bliley Act or Family Educational Rights and Privacy Act, others allow a much more limited data-based exemption. Other differences manifest in state law treatment of specific categories of data, such as kids' information or biometrics, or in the scope of consumer privacy rights. 

Health privacy

Don't be surprised if the red wave in federal elections leads to a blue counterwave in state efforts to protect consumer health information, particularly — though not only — in the context of reproductive rights and gender-affirming care. Already, back in 2023, Washington state enacted the MHMDA, which has been emulated by legislation in Nevada.

Over the past couple of months, we've seen a reproductive data privacy geofencing bill in New York with Assembly Bill 5517, a bill specifically scoped to reproductive data privacy in Michigan with Senate Bill 1082, a bill on AI and mental health data in Texas with House Bill 1265, multiple bills on location health data in California with AB 45 and AB 67, and a bill on privacy in the context of reproductive rights and gender-affirming care in Virginia with SB 754.

At the federal level, we will see significant changes to HIPAA regulations in the near future. Recently, the U.S. Department of Health and Human Services proposed updates to the HIPAA Security Rule to strengthen the security of electronic protected health information.

The proposed rule outlines several major revisions to the Security Rule, including the removal of the distinction between required and addressable implementation specifications, the introduction of compliance deadlines and documentation requirements, and more detailed risk analysis guidelines.

Additionally, the rule proposes new notice requirements for changes in workforce access termination status, updates to contingency planning and security incident response procedures, and the introduction of new security controls. It further mandates the encryption of ePHI both at rest and in transit, along with the required use of multifactor authentication.
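As a rough illustration of what the at-rest encryption control can look like in practice, here is a short TypeScript sketch using Node's built-in crypto module with AES-256-GCM. It is a sketch under stated assumptions, not a statement of what HHS will require: real deployments would pull keys from a managed key store with rotation and audit logging rather than generating them inline.

```typescript
// Minimal sketch: authenticated encryption of ePHI at rest (AES-256-GCM).
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Illustrative only; in practice the key comes from a KMS, never inline.
const key = randomBytes(32);

interface SealedRecord { iv: Buffer; tag: Buffer; data: Buffer; }

function encrypt(plaintext: string): SealedRecord {
  const iv = randomBytes(12); // 96-bit nonce, the recommended size for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decrypt(record: SealedRecord): string {
  const decipher = createDecipheriv("aes-256-gcm", key, record.iv);
  decipher.setAuthTag(record.tag); // verifies integrity as well as secrecy
  return Buffer.concat([decipher.update(record.data), decipher.final()]).toString("utf8");
}
```

GCM is used here because it authenticates as well as encrypts, so tampering with stored records surfaces as a decryption failure rather than silently corrupted data.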

Kids' privacy

Few areas are as contentious — but also draw as much bipartisan convergence — as kids' privacy. Policymakers, educators and parents alike assert that something must be done about kids' privacy, but what? Everyone is anxious about kids' use of devices, apps and social media sites that generate and consume a gushing stream of data, but regulatory protective measures present formidable risks to free speech and online anonymity.

As discussed above, this may be one area where Congress steps up and delivers federal legislation. The latest bipartisan efforts include the Kids Online Safety and Privacy Act, sponsored by Sens. Marsha Blackburn, R‑Tenn., and Richard Blumenthal, D‑Conn., and Children and Teens' Online Privacy Protection Act 2.0, sponsored by Sens. Ed Markey, D-Mass., and Bill Cassidy, R-La.

In this area too, states haven't been sitting on their hands. Just this past year, New York passed two important kids' privacy laws: the Child Data Protection Act, which focuses on data protection, and the Stop Addictive Feeds Exploitation for Kids Act, which focuses on preventing addictive apps and technologies. Meanwhile, Maryland passed an Age Appropriate Design Code Act, modeled after the California law that, for now, is held up in constitutional litigation.

Regulators too have waded into this space. While the FTC's commercial surveillance rulemaking is kaput, its COPPA Rule refresh is very much viable. Furthermore, enforcement agencies from the FTC to the California and Texas attorneys general have demonstrated vigor in enforcing against violations of kids' privacy rights. We expect state-level interest in and enforcement of children's privacy rights to grow in 2025 and beyond.

Sensitive data, ad tech and data brokers

In 2024, we witnessed the regulation of data brokers veer off from a privacy track to one anchored in national security for the first time. As tensions with China heightened over national security and trade policy, policymakers recognized concerns around foreign governments' access to the sensitive data of U.S. persons.

Consequently, the Biden-Harris administration advanced legislation and an executive order intended to protect the sensitive personal information of U.S. persons from being accessed by China or entities under its control. While the "TikTok law" — the Protecting Americans from Foreign Adversary Controlled Applications Act of 2024 — drew a lot of media attention, Congress also passed the Protecting Americans' Data from Foreign Adversaries Act of 2024 as an add-on to a supplemental appropriations bill to support Israel and Ukraine.

In addition, Executive Order 14117 on "Preventing Access to Americans' Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern" was issued and accompanied by extensive rulemaking by the Department of Justice, published just days before the end of 2024.

While PADFA focuses on preventing data brokers from selling personally identifiable sensitive data of U.S. individuals to a foreign adversary country or an entity controlled by one, the executive order has a much broader remit. It applies not only to data brokers but to any entity that makes Americans' bulk sensitive personal data and certain U.S. government-related data accessible in countries of concern, notably China, including Hong Kong. And it extends not only to data selling but also to investment, employment or vendor agreements involving such entities and access to data.

Data brokers attracted the scrutiny of other regulators too. Last month, the Consumer Financial Protection Bureau issued a proposed rule for "Protecting Americans from Harmful Data Broker Practices." The rule would apply the protections of the Fair Credit Reporting Act to a category of data brokers that have traditionally not been viewed as consumer reporting agencies if such businesses sell information about a consumer's income, financial tier, credit history, credit score or debt payments, regardless of how the information is used.

The rule also asserts that advertising and marketing are not permissible purposes for consumer reporting agencies to furnish consumer reports. However, given the November election results, the fate of the rule — and indeed of the CFPB itself — has become unclear.

The FTC too focused on enforcement actions against data brokers. In a series of enforcement actions, starting with InMarket and X-Mode at the beginning of 2024 and culminating with Mobilewalla and Gravy Analytics at the end of the year, the agency cracked down on data brokers selling location data, particularly in sensitive contexts.

Importantly for this industry, the FTC stated brokers cannot simply rely on contractual language with data providers to verify consumer consent; rejected data brokers' creation of sensitive location segments, such as ones tied to health care clinics, places of worship, LGBTQ+ gatherings or political rallies; and cracked down on brokers' practice of monetizing data inappropriately collected and retained from real-time bidding exchanges.

The FTC's decisions bear implications not only for data brokers but also for companies up and down the data supply chain, requiring businesses to ensure consumers provide verifiable consent, honor opt-outs, block sensitive locations, disclose retention practices, and verify data hygiene and accountability. Industry groups also tightened their best practices to prohibit the use, sale and transfer of precise location information related to "sensitive points of interest."
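To make the "block sensitive locations" obligation concrete, the following TypeScript sketch shows one common technique: dropping location events that fall within a buffer distance of known sensitive points of interest, computed with the haversine formula. The POI list, the 250-meter radius and the event shape are illustrative assumptions, not an FTC-prescribed standard.

```typescript
// Minimal sketch: filtering location events near sensitive points of interest.
interface LatLng { lat: number; lng: number; }

const EARTH_RADIUS_M = 6_371_000;

// Great-circle distance in meters between two coordinates (haversine formula).
function distanceMeters(a: LatLng, b: LatLng): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// Hypothetical registry of sensitive POIs (clinics, places of worship, etc.).
const SENSITIVE_POIS: LatLng[] = [{ lat: 40.7128, lng: -74.006 }];

const BLOCK_RADIUS_M = 250; // illustrative buffer, not a legal threshold

// Suppress any event within the buffer before sale or segmentation.
function filterEvents(events: LatLng[]): LatLng[] {
  return events.filter(
    (e) => !SENSITIVE_POIS.some((poi) => distanceMeters(e, poi) <= BLOCK_RADIUS_M)
  );
}
```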

The FTC criticized the advertising technology industry beyond just data brokers. In September, the agency released a lengthy report titled "A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services," with recommendations addressing data minimization, restrictions on data sharing, protection of kids' and teens' personal information, and automated decision-making.

In a November blog post, the FTC criticized data clean rooms, stating they "are not rooms, do not clean data, and have complicated implications for user privacy, despite their squeaky-clean name." The FTC added, "companies shouldn't view DCRs as a way to get around their obligations under the law or the promises they have made to consumers."

States too kept their eyes on the data broker industry. With Oregon's data broker law going into effect and Texas' attorney general announcing an enforcement sweep against unregistered data brokers, registration requirements now figure in four states: Oregon and Texas, along with California and Vermont.

In recent rulemaking under California's Delete Act, the California Privacy Protection Agency narrowed the direct relationship exception in the statutory definition of data broker to situations in which "a consumer intentionally interacts with a business for the purpose of obtaining information about, accessing, purchasing, using, or requesting the business' products or services within the preceding three years."

That means the CPPA would consider a business to be a data broker even if it had a direct relationship with consumers, as long as it sold personal information about such consumers that the business did not collect directly from them. While still a year out, the Delete Act requires the CPPA to establish a "deletion request and opt-out platform" by 1 Jan. 2026. By August 2026, data brokers will have to respond to single-click deletion requests and review the deletion system every 45 days. As a result, opt-out rates are expected to skyrocket.

And, if the intentions of policymakers weren't clear enough, the powerful attorneys general of Texas and California announced broad enforcement sweeps against data brokers, culminating in a set of fines and penalties.

AI

Anyone say AI? No technological or business development has captured the public's imagination over the past few years like the AI revolution. Whether it's generative AI that puts personal data and intellectual property at risk or artificial general intelligence that threatens human existence, everyone is searching for AI governance frameworks.

One of President-elect Donald Trump's campaign promises was to repeal Biden's executive order on AI and, more broadly, his AI policy on day one in office. The Trump administration will address AI policy primarily through the lens of competition with China on innovation, national security and energy policy. While Trump's 2019 executive order on AI paid tribute to civil rights, that issue was nowhere near as prominent as in Biden's 2023 executive order or his 2022 Blueprint for an AI Bill of Rights.

But here, too, the states may have the final say. By stepping back from AI governance and ethics, the administration will effectively cede territory to the states. Already, Colorado is implementing its AI Act, focused on AI systems rendering consequential decisions, and California's AB 2013 requires generative AI providers to ensure training data transparency. California will likely revisit SB 1047, the vetoed bill requiring developers of frontier models to implement prescriptive safety measures. In Texas, Rep. Giovanni Capriglione, R-Texas, who authored the state's privacy law, recently submitted his Responsible AI Governance Act. Indeed, dozens or even hundreds of AI bills are making their way through the legislative pipeline in state capitals.

Litigation

Historically, privacy wasn't a field of vibrant litigation. Enforcement was largely the remit of regulatory agencies, such as the FTC and the HHS Office for Civil Rights. No more. The past couple of years have seen a surge in privacy litigation, with the plaintiffs' bar weaving privacy causes of action out of federal and state laws ranging from the U.S. Video Privacy Protection Act to the California Invasion of Privacy Act to Illinois' Biometric Information Privacy Act, and even trap-and-trace legislation and New Jersey's Daniel's Law.

These cases target business practices ranging from integrating cookies, pixels, session replay or software development kits to using third parties to offer customer service chatbots or track email open rates.

Some recent decisions have been viewed as helpful by industry, such as the Supreme Judicial Court of Massachusetts decision in Vita v. New England Baptist Hospital, which rejected the application of wiretapping laws to the use of pixel technology on websites, and the U.S. Ninth Circuit Court of Appeals decision in Zellmer v. Meta, which held that biometric identifiers must be capable of identifying a person under BIPA. Even so, plaintiffs are likely to continue to pepper businesses with demand letters and claims threatening mass arbitration or class-action litigation.

Looking forward

Privacy regulation and enforcement activity were neither quiet nor predictable last year. Rather, the landscape of privacy and AI law in 2024 continued to evolve at breakneck speed, with both federal and state governments taking a more active role in defining the regulatory framework. Key developments included a growing focus on protecting consumer health data, enhancing children's privacy and addressing the practices of data brokers.

Litigation around privacy violations has surged, highlighting the urgency of robust protections and measures for risk mitigation. As we look ahead to 2025 and beyond, we can expect continued efforts to standardize privacy regulations with increased emphasis on consumer rights, transparency and accountability of companies handling sensitive data.

And, wherever federal policymakers and regulators take a step back, expect state lawmakers and attorneys general to press forward. There's never a vacuum in technology regulation, an area of intense media focus and broad political appeal. The intersection of technology and regulation will remain dynamic, requiring both innovation and adaptation to ensure individuals' privacy and security in an interconnected world. As new challenges arise, the push for stronger enforcement, clearer guidelines for businesses and more comprehensive protections for vulnerable groups like children will be central to the ongoing dialogue. The coming years will shape the future of privacy and AI law, balancing innovation with the protection of personal freedoms.

Omer Tene and Jacqueline Klosek, CIPP/US, are partners at Goodwin Procter.