The California Privacy Protection Agency is not expected to begin the formal rulemaking process on its latest set of regulations, including those related to automated decision-making technology, until September at the earliest. The holding pattern stems from ongoing concerns voiced by the CPPA Board and relevant stakeholders during a 16 July board meeting.

The proposed rules on ADMT, risk assessments and cybersecurity audits have been closely watched as California, typically a trendsetter for large-scale legislation, looks to set standards that will affect technologies such as general software, artificial intelligence and machine learning.

How the agency defines automated decision-making, and what requirements should attach to the impact of technologies that employ it, continue to be sticking points. During the latest board meeting, it was clear those concerns would need to be addressed, along with further analysis of the proposed regulations' economic implications, before the board settles on a final proposed text.

The agency needs board approval to send a regulatory impact assessment to California's Department of Finance, which must be done 60 days before it can start another, more formal period of public comment lasting 45 days. Pushing a potential decision to start the process until September means it is likely the board will not finalize the rules until 2025, and they will not take effect until next summer, according to the California Office of Administrative Law.

The holdup

CPPA Board Member Vinhcent Le said the length of the leadup to this particular rulemaking and the formal process that follows might feel arduous for those looking for certainty around how ADMT can be used. As the CPPA has worked on these proposed rules, an endeavor that began in the fall of 2021, Colorado became the first state to enact rules around ADMT in the context of a privacy law.

Le argued it is worthwhile to get the rules right and lined up with more global regulations, nodding to the long road to finalizing the EU AI Act. The law setting standards for what constitutes "high-risk" AI and how it can be used is set to take effect in August, nearly six years after the European Commission began looking into the topic.

“They’re just finishing this year and they had thousands of people building that; they had high-level working groups, multi-country consultation,” Le said. “The work we’ve been able to do and to end up in a place where (we’re) complementary to the EU AI Act, but in a California context … has been a great accomplishment.”

Hurdles to clear

Aspects of the current draft ADMT regulations did not sit well with board member Alastair Mactaggart, an entrepreneur who led the California Privacy Rights Act ballot initiative that created the CPPA.

Picking up on his skepticism of the proposed rules during the board's April meeting, Mactaggart continued to express frustration at what he said was the broadness of how AI is defined in the rules. He also took issue with the proposed risk assessment requirements for businesses that process personal information in ways that present a significant risk to a person's privacy.

Tying the need for a risk assessment to ADMT could pose a problem down the road, according to Mactaggart, who noted automated technology continues to evolve at a rapid pace. He said regulators should be more concerned with what happens to a consumer's information and how to protect it, not necessarily how a decision is made.

"Basically, if you use this technology, you're going to have to do a risk assessment," Mactaggart said, "and I don’t think that's what we want."

Similarly frustrated was Robert Hertzberg, a former speaker of the California State Assembly and majority leader of the California State Senate who helped pass the California Consumer Privacy Act. Hertzberg told board members they should wait to see which of the dozens of AI-related bills pending in the state legislature pass at the end of the 2024 legislative session, and consult Gov. Gavin Newsom, D-Calif., before trying to write the rules.

The legislature adjourns 30 Aug., after which the governor has 30 days to sign off on bills that cleared the legislature.

Hertzberg said it was also unclear if the CPPA should try to regulate AI through the lens of privacy, saying the technology will "affect every inch of California."

"These guys are trying to jump over an entire mechanism of government when it comes to AI," Hertzberg told the IAPP in an interview. “Who cares if it takes them six months to a year let’s get it right."

Board members agreed they would like to hear more feedback on the rules from the public as well as more detail on their economic impact.

A preliminary assessment from the agency estimated it would cost California businesses USD1.4 billion to comply with the rules. The figure assumes all California businesses will be subject to the rules and is based on estimates of how many hours of software development work would be needed to come into compliance.

Caitlin Andrews is a staff writer for the IAPP.