Last year was a rollercoaster for data protection and privacy professionals, but expect 2023 to call and raise 2022’s activity at the state, federal and international levels. Privacy, as we all know, is recession-proof. These will be some of the highlights:

US state laws go into force

The most important development this year is the entry into force of U.S. state privacy laws, including the California Privacy Rights Act and Virginia Consumer Data Protection Act Jan. 1; Colorado’s and Connecticut’s laws July 1; and Utah’s Dec. 31.

The onslaught of state laws marks a seismic shift in the U.S. Businesses often perceive EU General Data Protection Regulation enforcement as focused on large technology platforms. In contrast, in its first enforcement action under the California Consumer Privacy Act, the California attorney general signaled its intention to pursue a much broader set of targets, such as Sephora, a cosmetics retailer. If Sephora is on the hook, so could you be.

The CPRA amends and adds to the CCPA in many ways. These aspects will be most impactful:

First, the law creates a new dedicated privacy enforcement agency — the California Privacy Protection Agency. While the California attorney general is a formidable regulator, it is charged with enforcing not only privacy law but also a plethora of other laws and regulations. The CPPA is the first real data protection authority in the U.S., and it is headed by Ashkan Soltani, a staunch privacy advocate.

Second, the CCPA will now apply to employee data and business contact information. This is the first time an information privacy law in the U.S. extends to these constituencies, who will now benefit from the full slate of data rights.

Third, the CPRA introduces a consumer opt-out of sharing data for cross-context behavioral advertising, or CCBA. Importantly, companies that acted as service providers under the CCPA will no longer be able to claim that role if their activities advance CCBA. The CPRA clarifies that service providers engaged in CCBA are automatically third parties. And if you’re a third party, you’re involved in a sale and must honor consumer opt-outs, including ones sent by an automated preference signal such as the Global Privacy Control.
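
On the technical side, the GPC signal itself is simple: participating browsers send a Sec-GPC: 1 request header (and expose navigator.globalPrivacyControl to page scripts). Below is a minimal server-side sketch of detecting that header; the use of Flask and the handler logic are illustrative assumptions on my part, not anything the CPRA or the GPC specification prescribes.

```python
# Minimal sketch: detecting the Global Privacy Control signal server-side.
# Flask and the handler logic are illustrative assumptions, not a mandated
# implementation under the CPRA or the GPC specification.
from flask import Flask, request

app = Flask(__name__)

def gpc_opt_out(req) -> bool:
    """Per the GPC proposal, participating browsers send 'Sec-GPC: 1'."""
    return req.headers.get("Sec-GPC") == "1"

@app.route("/")
def index():
    if gpc_opt_out(request):
        # Treat the signal as a valid opt-out of sale/sharing: suppress
        # third-party ad tags, record the preference, and so on.
        return "GPC detected: cross-context behavioral advertising disabled."
    return "No GPC signal: default configuration applies."
```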

When trying to implement the new opt-out rule, businesses will quickly realize the devil is in the details. There is a great deal of uncertainty about whether activities on the periphery of the legal definition of CCBA — such as retargeting, lookalike audiences, measurement and attribution — constitute CCBA at all.

Even if there is consensus on these terms, compliance won’t be straightforward. How should a publisher or advertiser send an opt-out signal downstream? In the absence of deterministic identifiers, how can individuals maintain their opt-out status over time and across websites and applications? What is the distinction between “advertising and marketing services” — which are recognized as legitimate business purposes under the CPRA — and CCBA? These are just some of the questions that will need to be answered now that the law is already in force.

Fourth, the CPRA requires businesses to provide not only a privacy notice but also a robust notice at collection. Importantly, it will no longer suffice to direct consumers to a privacy notice or to require them to scroll to find the notice at collection. If businesses choose to provide notice at collection through a privacy policy, they must hyperlink directly to the section of the privacy policy that includes the requisite disclosures.  

Fifth, the CPRA dictates language that businesses must use in contracts with downstream service providers or contractors. This includes restrictions on combining personal information across customers (an issue that routinely comes up in vendor contracts), a duty to monitor compliance and sub-processor obligations. If you don’t have the requisite language in your contracts, you are, by legal definition, selling data in potential breach of the CCPA. 

As of Jan. 1, the CCPA is no longer the only general state privacy law in the nation. Virginia's privacy law is now in force, and while many of its provisions are similar to the CCPA, some aren’t. Critically, while the CCPA, including CPRA amendments, remains an opt-out law, Virginia's law requires businesses to obtain consumers’ opt-in to process their sensitive data. The shift from an opt-out to an opt-in paradigm is profound. It will present potentially insurmountable obstacles to third parties transacting in consumer personal information, including, primarily, data brokers.

BIPA be BIPA

The privacy law that companies fear most is Illinois' Biometric Information Privacy Act, given its private right of action, which individuals and class-action plaintiffs have deployed with extraordinary zeal.

The BIPA will continue to spur litigation in 2023. More ominously for some, it could motivate copycat legislation across the country, focused on either biometric information (e.g., Montana) or sensitive data more broadly (e.g., Connecticut’s privacy law and Oregon’s draft law).

Enacted more than a decade ago, the BIPA leaves much to be desired in terms of the clarity of its definitions and its adaptability to today’s technological landscape. For example, it lacks the all-important distinction between businesses (controllers) and service providers (processors), spurring a contractual morass between tech vendors and their customers, as well as additional parties up and down the biometric data chain.

Other challenges in the BIPA, as well as additional biometric privacy laws, include:

  • Lack of a crisp distinction between use of biometrics, in particular facial recognition, for identification, authentication, detection or characterization;
  • Tension between biometric data minimization mandates and requirements in emerging artificial intelligence frameworks to mitigate algorithmic bias and discrimination.

Sensitive data: kids and reproductive health

In the absence of comprehensive U.S. privacy law, policymaking efforts will likely focus on high-risk areas — namely, children’s privacy and data about reproductive health.

Last August, California passed the California Age-Appropriate Design Code, which is modeled after the U.K. Age Appropriate Design Code. The CAADC significantly expands the scope of and protections afforded by the federal Children's Online Privacy Protection Act.

COPPA applies only to sites or services directed at children under age 13, or to general-audience sites with actual knowledge that they collect, use or disclose personal information from children under 13. Its main mandate is to require such sites or services to obtain verifiable parental consent.

The CAADC applies to any business that provides an online service, product or feature likely to be accessed by children under age 18. As a mental exercise, try to think of sites or services that are unlikely to be accessed by children under 18. Even adult-oriented services, such as porn or gambling sites, are quite likely to be accessed by teens, and will therefore have to comply with the CAADC.

Importantly, compliance obligations under the CAADC are far broader than just obtaining parental consent. They include (a) meeting age-estimation obligations, which could entail collecting additional personal information from users, including biometrics such as face scans; (b) configuring all default privacy settings to offer a high level of privacy; and (c) conducting data protection impact assessments, which the attorney general may request to see on three days’ notice.

In the context of health-related information, the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization sent shockwaves throughout the tech industry. Several states have already passed laws intended to prevent local courts and businesses from complying with out-of-state law enforcement requests related to anti-abortion laws. Many businesses collect data that could be implicated in such investigations, including not just period-tracking apps and health services, but also apps that collect precise geolocation or communications data. These businesses need to strategize their approach to possible law enforcement requests.

Federal regulators also responded to the decision. The U.S. Department of Health and Human Services' Office for Civil Rights issued "Guidance to Protect Patient Privacy in Wake of Supreme Court Decision on Roe." And in August 2022, the Federal Trade Commission sued Kochava, a data broker, alleging it sold consumers’ precise geolocation information, explicitly mentioning the potential use of such data to track an individual to an abortion clinic. In 2023, expect the FTC to remain focused on preventing businesses from exposing consumers to legal risks attendant to post-Dobbs regulation of reproductive health.

Artificial Intelligence

If, like me, you played with ChatGPT over the holiday break, you too know the feeling that consumer-facing artificial intelligence has arrived. Policymakers around the globe have addressed the emerging tech transformation over the past few years with starkly different approaches. The EU is advancing its AI Act, a highly prescriptive framework that addresses AI as a product liability issue governed by certification schemes and regulatory oversight. Canada has opted for a more principles-based approach in the Artificial Intelligence and Data Act, part of Bill C-27.

In the U.S., the National Institute of Standards and Technology's AI Risk Management Framework is intended for voluntary implementation to mitigate risks in the design, development, use and evaluation of AI products, services and systems.

U.S. states, too, have been active on the AI front, legislating rules on algorithmic decision-making. Even cities such as New York have instituted rules requiring that AI and algorithm-based technologies for recruiting, hiring or promotion be audited for bias before use. This year, expect businesses to charge privacy leaders with handling the auditing and impact assessment of AI systems, to ensure such systems are compliant, ethical and fair.
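
To make the auditing task concrete, the sketch below computes the kind of metric these audits typically center on: per-group selection rates and the impact ratio relative to the most-selected group. The sample data is invented, and the four-fifths benchmark is a long-standing EEOC rule of thumb rather than a threshold set by the New York City rules; treat this as an illustration of the calculation, not a compliance recipe.

```python
# Illustrative impact-ratio calculation of the kind bias audits involve.
# Sample data is invented; the 0.8 ("four-fifths") benchmark is an EEOC
# rule of thumb, not a statutory threshold.
from collections import Counter

# Hypothetical (group, was_selected) outcomes from a screening tool.
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

applicants = Counter(group for group, _ in outcomes)
selected = Counter(group for group, ok in outcomes if ok)

# Selection rate per group, and each group's ratio to the top rate.
rates = {g: selected[g] / applicants[g] for g in applicants}
top_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / top_rate
    flag = "review" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} ({flag})")
```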

Conclusion

Hot on the heels of a landmark year, privacy seems poised to soar to new heights in 2023. The entry into force of a slate of state laws will keep compliance departments busy for the foreseeable future. Signature issues such as kids’ privacy and protection of data concerning reproductive health will gain momentum. And the portfolio of privacy leaders will expand to cover algorithmic fairness and ethical AI.