Since its public release on 7 April, the American Privacy Rights Act discussion draft has generated much buzz within the privacy community. Privacy professionals continue to process the published text to better understand the operational implications that might arise if the discussion draft becomes the U.S.'s first comprehensive federal privacy law.

The APRA draft enumerates 18 categories of sensitive covered data, a subset of "covered data" to which heightened protections attach. Under the draft, covered entities may not collect, process, retain or transfer covered data unless such actions are necessary, proportionate and limited to one of the permitted purposes. The draft goes a step further for sensitive covered data, requiring affirmative express consent before any transfer to a third party unless the transfer is necessary, proportionate and limited to a permitted purpose.

Understanding the proposed categories for sensitive covered data requires context, including with respect to other relevant regulatory frameworks at the state, federal or international level.

Government-issued identifiers

The APRA discussion draft's list of sensitive covered data categories begins with "a government-issued identifier, such as a social security number, passport number or driver's license number, that is not required by law to be displayed in public."

California's Consumer Privacy Act is the only state law to include a similar category, though it sets forth an exhaustive list of common government identifiers, such as social security numbers and driver's license numbers. By designating the entire category of "government-issued identifiers" as sensitive, the APRA discussion draft leaves flexibility for future identifiers not yet in use. The APRA draft does not extend these protections to publicly displayed identifiers such as license plates, which would instead qualify as covered data under the draft.

Health information

The discussion draft's broad definition of health information follows the state-level trend of expanding health privacy protections. It includes "any information that describes or reveals the past, present or future physical health, mental health, disability, diagnosis, or health care condition or treatment of an individual."

Including "treatment" as well as "condition or diagnosis" are additions otherwise only featured in New Jersey's comprehensive privacy law. Notably, the definition includes "the precise geolocation information of such treatment," an expansion echoing Washington state's far-reaching health information definition.

Significantly, the discussion draft expressly excludes health information from the permitted purpose that otherwise allows sensitive covered data to be used in response to criminal activity. This provision was likely designed to safeguard women's reproductive health data in response to recent state laws criminalizing such care.

Genetic information

The APRA discussion draft defines genetic information as "any covered data, regardless of its format, that concerns an identified or identifiable individual's genetic characteristics," such as "raw sequence data" or "genotypic and phenotypic information" from an individual's DNA. In practice, this will cover information collected, processed and stored by direct-to-consumer genetic testing companies that offer ancestry tracing and diagnostic genomic reports.

Genetic information and biometric information are the two sensitive covered data categories that carry additional obligations. For both, an entity must obtain affirmative express consent prior to collecting, processing or retaining such data.

This opt-in requirement echoes comprehensive privacy laws in Connecticut, Colorado and Virginia, which require opt-in consent for processing sensitive data. While the CCPA considers a consumer's genetic data sensitive, it only mandates that a business provide consumers with an option to limit the use or disclosure of that data. Separately, California's Genetic Information Privacy Act, which applies to direct-to-consumer genetic testing companies, provides individuals the right to revoke consent, access their genetic data, have companies delete their accounts and data, and request destruction of biological samples.

Financial information

The APRA discussion draft breaks from state consensus here by regarding as sensitive "a financial account number, debit card number, credit card number, or any required security or access code, password, or credentials allowing access to any such account or card."

To date, only California and New Jersey consider such data sensitive, with New Jersey requiring opt-in consent for the collection, processing and retention of financial information. The APRA's inclusion of this category is a holdover from its predecessor, the American Data Privacy and Protection Act. At first glance, this provision appears to overlap in part with the Gramm-Leach-Bliley Act, as data regulated under the GLBA is excluded from the APRA's scope through safe harbor-like provisions. However, the two laws regulate different types of financial information: the GLBA covers a narrowly defined set of financial institutions, while the APRA protects the broader types of financial data listed in this provision.

Biometric information

The APRA discussion draft defines biometric information as "any covered data that is specific to an individual and is generated from the measurement or processing of the individual's unique biological, physical or physiological characteristics that is linked or reasonably linkable to an individual." This includes fingerprints, voice prints, iris or retina scans, facial or hand mapping, geometry templates, and gait. However, digital or physical photographs, audio or video recordings, and associated metadata that cannot be used to identify an individual are excluded. As with genetic information, covered entities must obtain affirmative express consent to collect, process or retain an individual's biometric information, except for limited permissible purposes.

This category has come into recent legislative focus, with Illinois legislators working to amend the state's oft-litigated Biometric Information Privacy Act, arguably the nation's most robust biometric protection law due to its private right of action. Privacy laws in California, Connecticut, Utah and Virginia all consider biometric information sensitive, as does the EU General Data Protection Regulation. Regulators, too, have recently turned their attention to biometrics, bringing enforcement actions against businesses that misrepresent their use of facial recognition technology, among other practices.

Precise geolocation information

Location data has been a point of regulatory emphasis nationwide for its propensity to reveal sensitive characteristics about an individual. High-profile Federal Trade Commission settlements and court battles have spotlighted concerns around the widespread sharing of geolocation data.

Across the board, states consider precise geolocation data sensitive. States do diverge, however, in their definitions of what constitutes "precise." The CCPA uses 1,850 feet as the radius within which a consumer must be located for the data to be considered precise and thus sensitive. Following the Washington Privacy Act model, all other states instead use 1,750 feet. The APRA eschews the general trend and uses 1,850 feet, a measurement also found in its predecessor, the ADPPA.

Communications data

The APRA designates as sensitive "an individual's private communications, such as voicemails, emails, texts, direct messages or mail, or information identifying the parties to such communications, information contained in telephone bills, voice communications, and any information that pertains to the transmission of voice communications, including numbers called, numbers from which calls were placed, the time calls were made, call duration and location information of the parties to the call, unless the covered entity is an intended recipient of the communication."

This section of the APRA draft covers content data, which has long been afforded Fourth Amendment protection from government search and seizure and, in the private sector, has been subject to heightened safeguards under the Federal Communications Commission's jurisdiction. This provision follows California, the only state to consider this category sensitive, in excluding communications for which the covered entity is the intended recipient, such as messages directed to a business. Notably, this section provides that metadata, meaning information indicating the circumstances surrounding communications, should receive the same protection from collection, processing, retention and transfer as content data.

Log-in credentials

The APRA draft likewise treats an individual's log-in credentials as sensitive covered data. With different wording, the CCPA is the only state law to protect this type of data, designating "a consumer's account log-in, password, or credentials allowing access to an account" as sensitive. Both the APRA and CCPA deviate from the GDPR and similar international laws, which provide no heightened protection for such credentials.

Sexual behavior

The APRA draft defines this category as "information revealing the sexual behavior of an individual in a manner inconsistent with the individual's reasonable expectation regarding disclosure of such information." States differ in their terminology for this category in ways that carry distinct legal meanings and practical implications. The APRA draft adopts the term "sexual behavior," consistent with the 2022 ADPPA draft yet a marked departure from most state privacy laws. Many state laws refer to "sexual orientation," which describes an individual's sexual identity within defined categories, or, as in Texas' law, to "sexuality," a broad term that could be interpreted to implicate an individual's sexual desires, acts or health.

While the APRA does not specifically define "sexual behavior," its common meaning suggests it may be understood to reference an individual's specific acts of sexual conduct, similar to but arguably narrower than the GDPR's "sex life" category. Additionally, this APRA provision incorporates a reasonable expectation standard: the information is considered sensitive only if the individual reasonably expected it not to be disclosed.

Calendars, address book information, phone or text logs, photos, audio recordings, or videos intended for private use

The APRA discussion draft expands the sensitive covered data categories to include data typically stored on smartphones and tablets that reveals personal information about an individual's daily habits and social interactions, with the caveat that such data must be intended for private use. Notably, these categories of data are not considered sensitive under any state comprehensive privacy law or under the GDPR, though they have appeared in other federal privacy measures, including the ADPPA, the Consumer Online Privacy Rights Act and the recently enacted Protecting Americans' Data from Foreign Adversaries Act.

Nude or intimate digital media

This provision explicitly designates any "photograph, film, video recording or other similar medium that shows the naked or undergarment-clad private area of an individual" as sensitive, regardless of the individual's consent or reasonable expectation of disclosure. While no comprehensive privacy laws specifically characterize nude photos as sensitive, this addition to the APRA is not without precedent. Many states, including Colorado, have enacted nonconsensual pornography laws that similarly criminalize "posting or distributing the intimate parts of another identifiable person" without their consent.

Nonconsensual intimate imagery also comes up in the discussion draft's reference to publicly available information. The APRA explicitly excludes publicly available information from covered data but excepts from this exclusion "intimate images, authentic or computer-generated, known to be nonconsensual." The addition of "computer-generated" here reflects lawmakers' efforts to address harms from generative deepfake pornography.

Transferred video viewing habit information

This category designates "information revealing the extent or content of any individual's access, viewing or other use of any video programming described in section 713(b)(2) of the Communications Act of 1934 (47 U.S.C. 613(h)(2)), including by a provider of broadcast television service, cable service, satellite service or streaming media service, but only with regard to the transfer of such information to a third party and excluding any such data used solely for transfers for independent video measurement" as sensitive. In the ADPPA, this provision was included under the section prescribing loyalty duties onto covered entities and service providers, and it likewise restricted third-party transfer. In the APRA, it is included as a separate sensitive data category but, again, only affords heightened safeguards with respect to third-party disclosures.

No states have expressly extended such protections to video viewing habit information, but the California attorney general has prioritized protections for such information, announcing an investigative sweep of streaming services to mark Data Privacy Day 2024.

Video viewing habit information collected by other covered entities

This category includes "information collected by a covered entity, that is not a provider of a service described in clause (xii), that reveals the video content requested or selected by an individual, excluding any such data used solely for transfers for independent video measurement." The twelfth and thirteenth categories are complementary in that the former applies to typical broadcast and streaming platforms and services and the latter applies to all other covered entities that handle video viewing habit information. Plaintiffs often bring claims of unlawful disclosure of video viewing habit information under modern interpretations of the Video Privacy Protection Act. Defendants in these cases generally do not resemble traditional video providers, often including social media platforms and news apps, among others.

These categories notably exclude "data used solely for transfers for independent video measurement," leaving information used for attribution and ratings systems to be governed by the obligations for covered data.

Race, ethnicity, national origin, religion or sex

The draft APRA's above list of personal identifiers is marginally narrower than the corresponding lists in the GDPR and many state laws. The GDPR and the CCPA, for example, both include data revealing union membership in this category, while the CCPA also adds citizenship, immigration status and philosophical beliefs, rather than religion alone. All state comprehensive privacy laws include race, ethnicity and religion, while only Maryland and Oregon include national origin.

APRA drafters again inserted a reasonableness standard into this category, qualifying that data fitting the above description is sensitive only if the individual did not reasonably expect such data to be disclosed. The standard gives courts and regulators flexibility in interpreting which data privacy practices amount to disclosure violations.

Online activity

The APRA draft has followed the FTC in declaring as sensitive "information revealing an individual's online activities over time and across websites or online services that do not share common branding or over time on any website or online service operated by a covered high-impact social media company." Visiting a website or app typically results in such information about an individual being collected, retained and possibly sold. Thus far, no states have considered browsing data sensitive.

This provision treats as sensitive browsing data from any "covered high-impact social media company," meaning a covered entity that provides an internet-accessible platform that generates USD3 billion or more in global annual revenue, has 300 million or more global monthly active users and is primarily used to access or share user-generated content. This designation will likely only apply to a small number of online companies, akin to the EU's very large online platform category under the Digital Services Act.

Information about minors

The APRA discussion draft conveys heightened protections for children and teens by setting the age threshold at 17, higher than the prevailing consensus of state privacy laws, which for the most part consider information about individuals under the age of 16 sensitive. This aligns with the proposed Children and Teens' Online Privacy Protection Act, or COPPA 2.0, a separate federal bill currently under consideration.

California and Virginia recently raised or proposed raising their age thresholds to 18, thus prohibiting the nonconsensual collection, use or sale of data related to all minors. The APRA draft stops short in comparison but may end up further extending protections if drafters heed the feedback of Energy and Commerce Committee Ranking Member Rep. Frank Pallone, D-N.J., to include "heightened protections for our nation's young people."

Data identifying other sensitive covered data

This clause ensures inferences are captured within the scope of sensitive covered data, so long as they reveal any of the protected categories of sensitive data. A few states address inferences in other ways. The California attorney general has opined that the CCPA extends protections to inferences drawn from personal information. Thus far, the Oregon Consumer Privacy Act is the only other comprehensive state privacy law to follow suit.

FTC rulemaking

Significantly, this final clause vests the FTC with authority to expand the list of sensitive covered data at a later date through rulemaking, as is granted to state privacy enforcers in California, Colorado and New Jersey, and to member states in limited fashion under the GDPR. Such rulemaking authority offers flexibility in adding, removing or editing categories in light of emerging technologies, business practices or individual sentiment.

Conclusion

The APRA discussion draft will be subject to scrutiny and revision throughout the legislative process, and much work remains before this early draft results in final legislation. However, the draft's sensitive covered data provisions offer important insight into how lawmakers view and value different types of information, and how they have drawn from other privacy legislation and regulation.

Andrew Folks, CIPP/E, CIPP/US, CIPM, and Luke Fischer are IAPP Westin Fellows.