In November 2020, a majority of Californians (56.1%) voted to pass Proposition 24 — establishing the California Privacy Rights Act. While the CPRA’s provisions become enforceable in 2023, many aspects of the law come into effect now, including the creation of a new California Privacy Protection Agency and a period of formal rulemaking that could begin as early as July 2021.
While preserving the CCPA’s existing consumer rights, the CPRA establishes a range of new protections, including, perhaps most importantly, the right for consumers to limit the use and disclosure of their sensitive personal information. This new right may yield real benefits for consumers by limiting certain instances where the use and sharing of sensitive personal information creates a risk of harm. Harms associated with the misuse of sensitive personal information have been well documented, including risks of profiling associated with dating platforms; menstruation apps sharing data about pregnancy status with employers; and mental health apps sharing information with advertisers.
However, because the CPRA’s right to limit uses of sensitive personal information is structured as an opt-out, it may not go far enough to protect consumers. In other instances, the opt-out may result in the loss of data needed for important research, such as research on the impact of certain services on minority communities. At the same time, greater clarity is needed for companies that may still be uncertain whether the data they are processing falls under the CPRA’s definitions.
Guidance from the California attorney general or the new California Privacy Protection Agency could help provide greater clarity in cases where distinctions between sensitive and non-sensitive personal information are difficult to draw. The Future of Privacy Forum has recently started working with stakeholders on this issue, since such questions are important to address sooner rather than later, especially if emerging comprehensive privacy proposals at the state and federal levels begin to look to the CPRA as model legislation.
What categories of sensitive personal information are included under the CPRA?
It has long been understood that certain types of data warrant enhanced protection because of their potential for harm. For example, an individual’s financial information may facilitate identity theft, while information pertaining to an individual’s sexuality may lead to disparate access to housing and medical services. This harms-based concept is present in the CPRA, where “sensitive personal information” is defined as any personal information that reveals an individual’s:
- Government ID — a consumer’s Social Security, driver’s license, state identification card, or passport number.
- Finances — a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account.
- Geolocation — a consumer’s precise geolocation.
- Race, religion and union membership — a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership.
- Communications — the contents of a consumer’s private communications, unless the company is the intended recipient of the communication.
- Genetics — a consumer’s genetic data.
- Biometrics — the processing of biometric information for the purpose of uniquely identifying a consumer.
- Health — personal information collected and analyzed concerning a consumer’s health.
- Sexual orientation — personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.
While the above-listed categories have been identified as “sensitive,” others may yet be added through California’s rulemaking process, and the definition of each is similarly subject to change.
For entities considering the ramifications of using and sharing publicly available information, it’s important to note that the CPRA contains a broad exemption for such data — even if that information contains sensitive details about an individual.
Under the CPRA, publicly available information includes information gathered from government records, made available by a consumer, or from widely distributed media. However, this exemption does not extend to biometric information collected by a company about a consumer without the consumer’s knowledge. This provision might prohibit companies from scraping, for example, widely available images of individuals in order to populate facial recognition databases for law enforcement use.
If you’re wondering how these categories map to the “special categories of data” within the EU’s General Data Protection Regulation, there are important differences.
The GDPR’s special category data (under Article 9) includes information revealing one’s racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership; data concerning a person’s health, sex life or sexual orientation; and genetic or biometric data processed for purposes of identification. While the CPRA does not include political opinions in its sensitive personal information provision, it does include all the other categories covered by the GDPR, as well as additional categories, such as government-issued identifiers, financial account information, consumer communications, and precise geolocation.
However, it should be noted that under EU law there are some additional restrictions for accessing communications data and metadata, as well as for geolocation data from devices, under the strict opt-in consent rules of the ePrivacy Directive.
The CPRA’s opt-out procedures for sensitive personal information and obligations on companies
While the CCPA allows consumers to opt-out of the sale of their personal information, the CPRA augments that right — subject to certain exemptions — by additionally allowing consumers the ability to opt-out of the use and disclosure of their sensitive personal information. An important first step for companies regarding CPRA compliance will be mapping out the types of data they collect, use, and disclose — and whether that data falls into any of the categories listed above as sensitive.
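As a first pass at that data-mapping step, a company might tag each collected field against the CPRA’s sensitive-category list and surface the fields that will trigger the new obligations. The sketch below is purely illustrative: the field names, category labels, and tagging are hypothetical and are not drawn from the statute’s text.

```python
# Hypothetical sketch of a CPRA data-mapping pass. Category labels loosely
# mirror the sensitive categories listed above; field names are invented.

CPRA_SENSITIVE_CATEGORIES = {
    "government_id", "financial_account", "precise_geolocation",
    "race_religion_union", "private_communications", "genetic_data",
    "biometric_identifiers", "health", "sex_life_or_orientation",
}

# Example inventory: each collected field is tagged with zero or more
# sensitive categories during the mapping exercise.
DATA_INVENTORY = {
    "email_address": set(),
    "ssn": {"government_id"},
    "gps_trace": {"precise_geolocation"},
    "step_count": set(),  # fitness data; possibly sensitive if used for health inferences
}

def sensitive_fields(inventory):
    """Return fields tagged with at least one CPRA sensitive category."""
    return sorted(field for field, cats in inventory.items()
                  if cats & CPRA_SENSITIVE_CATEGORIES)

print(sensitive_fields(DATA_INVENTORY))  # → ['gps_trace', 'ssn']
```

Borderline fields such as `step_count` are exactly where the guidance discussed later in this article matters: whether fitness data counts as sensitive health data may depend on how it is used.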
Under the CPRA, companies that use or disclose sensitive personal information must (except in limited circumstances): (1) provide notice to consumers, and (2) provide a clear and conspicuous link on the company’s internet homepage(s), titled “Limit the Use of My Sensitive Personal Information,” which enables a consumer, or a person authorized by the consumer, to limit the use or disclosure of their information. In addition, the CPRA places data minimization requirements on the retention of all personal information, requiring companies to keep the information, including sensitive personal information, no longer than is “reasonably necessary” to fulfill the purposes disclosed to the consumer.
Is the CPRA a good model for regulating the use and disclosure of sensitive personal information?
By allowing individuals the choice to opt-out of the use and disclosure of their sensitive personal information, and imposing certain obligations on companies that use this data, the CPRA may limit some harms. However, leading proposed federal bills, including the Consumer Online Privacy Rights Act, introduced by Sen. Maria Cantwell, D-Wash., and supported by other ranking members of the Senate Commerce Committee in late 2019, seek to provide a higher standard of consent for sensitive data, as does the draft Washington Privacy Act.
The CPRA’s opt-out procedure also departs from the GDPR, where a lawful ground is needed for collecting or using any personal data (under Article 6) and where collecting or using sensitive personal information is prohibited as a rule (under Article 9(1)). Under the GDPR, in order to process sensitive personal information, a specific permission listed under Article 9(2) must apply, such as explicit opt-in consent, the provision of medical services, or scientific research purposes, and only as long as necessity and proportionality conditions are met.
Unlike the CPRA, the GDPR’s approach creates a multi-tiered system regarding the protection of sensitive personal information.
In addition to issues tied to the opt-out procedure itself, individuals often disagree on what information they view as sensitive. Personal data can be used to draw incredibly intimate inferences at the individual level (e.g., shopping patterns, personality traits, habits, and other revealing characteristics), and those inferences can be harmful if used in ways that oppress or otherwise wrong individuals.
Furthermore, as technology evolves, sensitive information may also be able to be discerned from categories of information not traditionally thought of as sensitive (for example, emerging advancements in voice analysis that promise to discern an individual’s COVID-19 status through the sound of their cough).
But in a world where any data can, in theory, be used to draw a sensitive inference, treating all data as sensitive is impractical. It’s necessary to draw some bright lines and look to profiling or discrimination protections to fill any gaps where harms may exist. The CPRA’s definition of “profiling” — as well as the new right of access to automated decisions the law enshrines — could act as such a barrier to data-driven harms; however, much will hinge on the rulemaking process. The CPRA also seeks to address issues of overbreadth by allowing an exemption for instances where the processing or collection of sensitive data is incidental.
Questions that may be addressed through rulemaking
During the CPRA’s rulemaking period, either the California attorney general or the upcoming California Privacy Protection Agency will begin to tackle important questions not answered in the text of the law. Those questions may include how to treat cases where sensitive information is collected only incidentally, given that such incidental collection is exempted from certain obligations. Beyond issues of clarity, the following questions will be helpful for the new agency or the attorney general to consider in the coming months:
- How will the CPRA regulate fitness and lifestyle data? Certain data, such as how many steps a particular individual has taken on a given day or how many hours of sleep they have had, may be considered non-sensitive fitness or lifestyle data, but, in some instances, this data could be construed as sensitive health data — for example, when such information is used to make inferences about a person’s physical or mental wellbeing. Additional guidance regarding the distinction between non-sensitive fitness or lifestyle data and sensitive health information may create important distinctions for companies operating in the direct-to-consumer wellness space and consumer care arena, as well as make compliance easier for those entities that may not have considered the data they hold to constitute CPRA-covered health data.
- How will the CPRA deal with data-driven inferences? Inferences, including those drawn from proxy information, could raise difficult areas of CPRA compliance. It’s questionable, for instance, whether the CPRA would disallow the infamous Target case, where a teen’s pregnancy was revealed to her family members on the basis of her past purchases. Under the CPRA, pregnancy status may be considered sensitive as revealing a consumer’s health, but questions remain regarding whether providing product recommendations on the basis of historical purchases constitutes the use of health information. For example, records revealing the purchase of a pregnancy test might constitute consumer health information, but what about purchases of zinc vitamins or other products that may be correlated with pregnancy? Large datasets often hold the capacity to reveal patterns or behavior that wouldn’t otherwise be uncovered, and strict sensitive data categories may be over- or underinclusive for covering data-driven inferences.
- How will sensitive personal information provisions impact existing research efforts? There may be questions about the ways in which the sensitive personal information provisions intersect with other areas of the CPRA, particularly when sensitive information is used to conduct socially beneficial research or to guard against discrimination or bias in data-driven systems. For example, the CPRA’s research exemption allows data to be used for commercial research with “informed consent,” a standard that may lead to consent bias and hinder some ongoing commercial-academic research partnerships. Analyzing data pertaining to an individual’s sensitive information, such as sexuality or ethnicity, may in some cases help solve issues tied to disparate access to treatment and vaccines. Many questions remain on how the CPRA intends to effectuate socially beneficial research that relies on the use of sensitive personal information.
Now that the CPRA has passed, California is the first state to implement a comprehensive privacy law that treats sensitive information as deserving of protections above those afforded to other types of personal data. Regarding the efficacy of the CPRA’s opt-out mechanism for sensitive personal information, it will be important to see whether consumers exercise, and find useful, the right to limit the use or disclosure of their sensitive personal information.
Moving forward, the California attorney general or new California Privacy Protection Agency could help provide more nuance to existing definitions of sensitive data, as well as provide guidance in cases where distinctions between non-sensitive and sensitive personal information are difficult to draw.
This article is part of a 10-part series intended to help privacy professionals understand the operational impacts of the CPRA, including how it amends the current rights and obligations established by the CCPA.