
The Privacy Advisor | CPPA's draft automated decision-making rules unpacked



The California Privacy Protection Agency's draft regulations on the use of automated decision-making technologies are the first preview of how artificial intelligence processing could be regulated under the California Consumer Privacy Act, as part of a CPPA workstream that began in the fall of 2021.

When formally proposed and adopted, the ADMT rules will join the expanding set of privacy laws that include rules about the use of AI in contexts related to decision-making and profiling. As in other jurisdictions, the regulations would require companies to provide enhanced transparency and choice around the use of certain AI systems.

The draft signals that the CPPA could bring an expansive set of practices within the remit of its AI decision-making rules, which would require risk assessments, new dedicated notices, access to an explanation of the automated processing and a mechanism for consumers to opt out of automated processing. The diverse set of proposed contexts where this applies potentially includes behavioral advertising, public spaces, employment and even the use of personal data to train AI systems.

In context

The rules are an early draft, as the agency is continuing to lay the groundwork for triggering its formal rulemaking process. Clarified rules for automated decision-making technologies are required under the California Privacy Rights Act, which instructed the CPPA to issue regulations "governing access and opt-out rights with respect to businesses' use of automated decision-making technology, including profiling." The law does not mention ADMT anywhere else.

The draft is an initial recommendation from the CPPA's Rules Subcommittee in coordination with agency staff. According to the CPPA's accompanying guidance, the draft is meant to "facilitate Board discussion and public participation and is subject to change."

IAPP Westin Research Fellow Andrew Folks, CIPP/US, published a breakdown of the CPPA's rulemaking process in August after the agency released initial drafts of two other potential regulations covering cybersecurity audits and risk assessments. The CPPA Board is scheduled to discuss the draft at its 8 Dec. public meeting, along with a staff presentation on the regulations. The board is unlikely to begin a formal rulemaking process at that time but could suggest future direction. The CPPA already received public comments on this tranche of regulations, but formal rulemaking will trigger another comment opportunity.

Facilitates human decision-making 

As ADMT is not defined by statute, the CPPA will need to create a definition through its rulemaking process. The draft rules reiterate the definition of ADMT previewed by the Board this summer: "any system, software, or process — including one derived from machine-learning, statistics, other data-processing or artificial intelligence — that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decision making. ADMT includes profiling."

The inclusion of systems that facilitate human decision-making puts the California proposal in line with the broadest portions of the automated processing requirements in Colorado's privacy regulations, described as "human involved automated processing." While Colorado's regulations include some exceptions to the provision of opt-outs for such systems, the draft rules apply the same exceptions regardless of the level of human involvement.

In contrast, the EU General Data Protection Regulation prohibits "fully" automated decision-making without a legal basis. The expanded scope of application of some U.S. state laws, even though they may have less strict requirements for covered systems, will continue to challenge privacy professionals to effectuate opt-out rights within a broadening set of contexts.

Board to discuss when rules apply

The draft lists six possible threshold situations where ADMT would be subject to notice, access and opt-out rights, including use for a decision that produces legal or significant effects, and five different possible uses of profiling.

Legal or significant effects are defined to include a list of situations that are important to consumers' everyday lives. Businesses would need to provide opt-out and access rights if the automated decision results in access to, or the provision or denial of:

  • Financial or lending services, housing or insurance.
  • Education enrollment or opportunity.
  • Criminal justice.
  • Employment or independent contracting opportunities or compensation.
  • Health care services. 
  • Essential goods or services.

The definition of "profiling" is already included in the amended CCPA, but the statute provides that the CPPA can "further define" profiling as part of these regulations. So far it has kept the definition intact, but it has listed multiple contexts where profiling would be subject to the ADMT rules.

The draft rules would cover any profiling of consumers in their capacity as employees, contractors, job applicants or students as well as any profiling that takes place in a publicly accessible place.

The final three thresholds proposed in the draft are flagged for board discussion, which could preview future disagreements both within and outside the agency.

Profiling for behavioral advertising purposes is flagged in the draft as "subcommittee option for board discussion." Flagged as "additional options for board discussion" are profiling of consumers under the age of 16 and "processing the information of consumers to train automated decisionmaking technology." 

The latter would mean that consumers would be due notice and opt-out choices from the use of their personal information to train certain AI systems, which could have a wide impact on the ongoing debate around proper safeguards for AI training.

A series of exceptions

The proposed ADMT regulations would not apply to AI systems that would otherwise be covered, if the use of the system is "necessary to achieve, and used solely for" security, fraud, safety or delivery of the service. To qualify for an exception, the use of ADMT must remain compliant with the CCPA's data minimization requirements and the business must document its reasoning behind the application of the exception.

Further, if a business claims the exception for the delivery of the service, it would be required to document the fact that there is no "reasonable alternative method" to deliver the service without AI.

Enhanced transparency

The regulations would require a new "pre-use" notice to describe the purpose for which the business will use covered ADMT, alongside existing privacy notice requirements under the CCPA. The new notice would include information about the consumer's right to opt out of, and access information about, the use of ADMT, along with a direct link to an opt-out mechanism.

The draft rules would also provide consumers with a "right to access" information about a business's use of ADMT. This would add to the existing California right to know about the personal information a business collects and how it is used and shared.

When a consumer requests additional information about the use of ADMT, the business would be required to describe the logic used in the ADMT, its intended output, how the business intends to use the output to make a decision and whether the system has been evaluated for validity, reliability and fairness.

 Even without a consumer request, a business would be under an obligation to provide a direct notice to consumers if it makes a decision using the ADMT that results in the denial of goods or services. This final element of transparency would include an explanation about the denial along with information about access rights and the right to file a complaint with regulators.

A right to opt out of ADMT

How the proposed rules are applied in practice depends on the timing of a consumer's opt-out request. If it occurs before the processing, the rules would require companies not to process the consumer's personal information using the ADMT.

If it occurs after the processing, the opt-out would trigger a requirement to cease processing within a maximum of 15 days, or as soon as feasible, and to notify service providers so they can effectuate the opt-out.

AI risk assessments

The draft ADMT rules are meant to be read in tandem with the draft risk-assessment regulations that are also scheduled for Board discussion in December.

Under the draft rules, risk assessments would be required whenever the processing of personal information "presents significant risk" to consumers' privacy. As proposed, this would include all contexts of the use of automated systems regulated in the ADMT rules.

AI governance and privacy pros alike will be watching the California rulemaking process closely. Once the formal process begins, there will be renewed opportunity for stakeholders to comment on the details of the draft rules. Regardless of their final shape, the ADMT rules will likely have an outsized impact on data governance processes for years to come.
