For the first time in over a decade, Congress is seriously considering a comprehensive federal privacy law. While the American Data Privacy and Protection Act has received significant attention — mostly focused on enforcement mechanisms and whether it would preempt the California Privacy Rights Act — privacy experts have not yet publicly discussed other needed improvements. Absent these improvements, the ADPPA would impede innovation, undercut law enforcement’s most effective tools, and hamstring private sector ability to thwart identity theft and fraud.
Here’s what we need to do to improve the ADPPA so it can be enacted:
The ADPPA should be able to adapt to data innovation
While titled “data minimization,” ADPPA Section 101 is unambiguously a data “purpose limitation” provision, allowing newly collected personal data to be used solely for 15 identified purposes (and allowing a few additional uses for previously collected data).
This provides certainty about which data uses are lawful, but it also rigidly locks our privacy law into a single point in time. If a privacy law with that approach had existed in 1995, we likely would never have realized the benefits of the internet.
In contrast to the ADPPA, the European Union takes a more flexible approach to safeguarding data uses in the General Data Protection Regulation. Despite being one of the strictest privacy laws in the world, the GDPR allows for new and innovative data uses. GDPR Article 6 describes the lawful bases for data processing, two of which accommodate innovation: the “public interest” basis and the “legitimate interests” test (Art. 6(1)(e) and (f)).
The GDPR legitimate interests test is a useful model: it allows a controller to develop innovative data uses, but the controller must perform and document a balancing test that demonstrates why its interests are not overridden by the interests of the data subject. Moreover, that balancing test can be challenged before a regulator, thereby allowing data innovation to be subject to regulatory oversight.
The ADPPA would be a better law if it accommodated new data uses, either through a legitimate interests-like balancing test or even by giving a lead government agency authority to approve additional data uses.
The ADPPA should not choose which types of illegal activities can be investigated using data
Section 101 of the ADPPA permits using personal data to prevent or investigate crimes only if a privacy regulator decides that those crimes cause “direct harm” (ADPPA Section 101(b)(6)). Translation: the privacy bill limits which crimes can be prevented or investigated using personal data. Under the ADPPA, personal data could not be used to prevent or investigate crimes that cause only “indirect harm.” Crimes causing only indirect harm likely include:
- Money laundering by oligarchs and drug dealers.
- Border and immigration law offenses.
- Possession of child sexual exploitation material.
- Possession of illegal drugs.
The ADPPA would not apply directly to law enforcement agencies, but it would apply to private sector firms that provide law enforcement with critical investigative tools.
Purpose of the restrictive definition of 'illegal activities'
The goal of a narrow “illegal activity” definition is not clear. In my conversations with staff involved in drafting the provision, they would say only that the provision was heavily negotiated and involved “lots of people.” I suspect that crime victims and the federal, state, local and tribal law enforcement agencies tasked with protecting our country were not included in the “lots of people” category.
A broad coalition of law enforcement associations recently sent a letter about the ADPPA to congressional leaders “to express deep concern about significant public safety consequences that could result if the [ADPPA] were to be enacted as currently drafted.” The coalition expressed concerns about the illegal activity definition and the grave danger that would be posed by letting criminals invoke Section 206 to opt out of law enforcement tools. The ADPPA, they explained, would make their critical investigative tools “unavailable or extremely limited.”
The letter was signed by leading law enforcement groups (e.g., Federal Law Enforcement Officers Association, Major Cities Chiefs Association, and Major County Sheriffs of America). District attorney and assistant U.S. attorney groups also signed the letter.
The FLEOA separately stated its concerns even more strongly: “we ... must overcome schemes to hamstring law enforcement by crippling its most effective digital analysis tools.” The FLEOA concluded that “the ADPPA should not be a backdoor ploy to pick and choose which laws can be enforced.”
Supporters of these ADPPA limitations inexplicably applaud the bill’s promise “to reduce the amount of personal data flowing into … the hands of law enforcement.”
No privacy law should be written with the intent to cripple the overall ability of law enforcement to protect people. If there is an issue with use of a particular data type or with enforcement of the terms of a particular law, democratic processes should be followed to address those specific goals.
Anti-money laundering enforcement
The restrictive definitions in Section 101 also threaten to undercut enforcement of anti-money laundering laws against drug dealers, oligarchs and others. The harm stopped by AML laws is huge, but arguably “indirect.”
Another ADPPA issue arises because financial institutions do not maintain their own AML lists of sanctioned individuals or of "Politically Exposed Persons." Private companies devote resources to compiling carefully vetted lists of PEPs and provide the results to banks through AML services. Firms like mine assist financial institutions with their AML compliance but do not have AML compliance requirements of their own. The ADPPA allows data use for a firm's own compliance but not for assisting others with theirs. All of the enacted comprehensive state privacy laws have clauses authorizing data use to assist others with compliance.
The ADPPA provides criminals with the unfettered ability to 'opt out' of law enforcement tools
ADPPA Section 206 addresses “Third Party Collecting Entities,” creating an unrestricted right for any person to opt out of the processing of their data by any firm that acquired the data indirectly. In its effort to address data brokers that freely buy and sell personal data, Section 206 also threatens to severely affect data analytics firms like mine that acquire data indirectly to prevent crimes and fight fraud.
Section 206 contains no exceptions for data used to prevent or investigate fraud or other crimes. One should ask:
- Why should sexual predators be allowed to opt out of a law enforcement investigative service before committing their planned crime?
- Why should oligarchs be allowed to opt out of AML services?
- Why should individuals be permitted to opt out of investigative services before or after threatening the safety of a lawmaker?
- Why should a covert terrorist planning a mission be allowed to delete information about his past residences?
- Why should a person planning insurance fraud be allowed to opt out of fraud prevention services to hide connections to other participants in the scheme?
The impact of Section 206 would be significant. FLEOA explained in its letter to House leaders that the ADPPA “targets and handicaps services that do not get their data directly from consumers, which are the most effective anti-crime tools available to prevent fraud and assist law enforcement in carrying out their criminal investigations.” They concluded that the ADPPA would seriously impede law enforcement investigations. That outcome is the last thing legislators should allow.
Allowing criminals to opt out would also impact private sector fraud prevention and identity authentication services. Protecting consumers in a digital age requires efficient, low-cost, low-friction identity authentication tools to prevent fraud while letting e-commerce move forward smoothly. I am proud to share that my company alone:
- Processes millions of transactions every day to authenticate identities, stop fraud and prevent fraudulent account openings.
- Helps stop over $23 billion in fraudulent purchases or payments annually.
- Helps stop 120 million fraudulent account applications annually.
Services that protect consumers by throttling identity theft and fraud should not be treated the same as targeted advertising services.
Private sector companies also provide identity authentication services to government agencies to prevent fraudulent efforts to claim pandemic payments, tax refunds or unemployment benefits. Section 206 would hurt the effectiveness of that government-private sector collaboration. The magnitude of such fraud in recent years has been disheartening, and the public interest certainly requires reducing or stopping fraud against government agencies so they can support people in need.
The ADPPA should not conflate the “data broker” issues it seeks to address in Section 206 with data analytics services that protect consumers by preventing fraud, authenticating identities and helping to combat crime. A good privacy law should make tradeoffs so that privacy and consumers are both protected and the country is kept safe.
First Amendment issues in the ADPPA
The five state privacy laws that take effect in 2023 all exempt publicly available information about individuals. They do so to avoid First Amendment issues. The Supreme Court has never upheld a restriction on speech whose content consists of truthful, publicly available information that was lawfully made public. The Supreme Court has also been clear that corporations have First Amendment rights.
In this context, a privacy law's attempt to enforce a data deletion right applied to accurate, publicly available personal data would in effect limit a corporation’s speech. Indeed, enabling individuals to selectively send deletion requests to particular companies would amount to a speaker-based or viewpoint-based speech restriction. For example, an individual might send deletion requests only to firms whose viewpoint differs from their own.
This First Amendment analysis explains the exemption for publicly available data in the ADPPA and other U.S. privacy laws. However, the ADPPA bill would face serious constitutional challenges because it takes away that exemption if the public data is combined with other types of covered data (ADPPA Section 2(27)). The law enforcement and identity authentication services outlined above all combine public data with non-public covered data in order to improve the accuracy of such services.
Congress can readily improve the ADPPA
A path to improving the ADPPA is readily in sight. State privacy laws that had bipartisan input in Colorado, Connecticut and Virginia all exempt personal data used to prevent and investigate illegal activity from certain obligations. For example, Connecticut’s privacy law provides that “Nothing in [this Act] shall be construed to restrict a controller's or processor's ability to: ... (9) prevent, detect, protect against or respond to [] any illegal activity, … or (11) assist another controller, processor or third party with any of the obligations under sections 1 to 11, inclusive, of this act” (Section 10).
Congressional ADPPA drafters could add a similar provision that exempts public interest data uses, such as fraud and crime prevention, from ADPPA obligations that are inconsistent with those uses.
This treatment of public interest data uses could be backstopped in the ADPPA by an obligation that firms using data in this manner (e.g., for fraud prevention or AML compliance) vet and credential all of their customers to ensure that those customers actually use the data for the stated purposes. Such credentialing is already performed by leading companies offering such services, and the ADPPA could make it mandatory.