When, during last year’s IAPP Europe Data Protection Congress in Brussels, I said I could see a path to federal privacy legislation in the 116th Congress, IAPP Vice President and Chief Knowledge Officer Omer Tene called me “optimistic — perhaps ridiculously optimistic.”

But soon afterward, the prospects for legislation quickened when U.S. Senate Committee on Commerce, Science, and Transportation Ranking Member Maria Cantwell, D-Wash., introduced the Consumer Online Privacy Rights Act, Chair Roger Wicker, R-Miss., released the draft Consumer Data Privacy Act, and House Committee on Energy and Commerce staffers circulated a bipartisan discussion draft.

This year, though, energy among members of Congress for bipartisan privacy negotiations has faded — despite subsequent efforts like the Consumer Data Privacy and Security Act from Sen. Jerry Moran, R-Kan. Although the proposals have many promising elements in common — including data minimization and individual rights of access, correction, deletion and portability — they remain far apart on two pivotal and more polarized issues: preemption and private right of action.

Response to the COVID-19 pandemic has necessarily consumed much of the current bandwidth in Congress. But the pandemic response has also been hampered by the absence of clear standards for the privacy of health and location data — a stark reminder of gaps in U.S. privacy laws. Protagonists in the broader privacy debate have stepped in to fill this gap with legislation aimed at protecting data collected for the COVID-19 response. But the Republican bill from Chair Wicker, Moran and Majority Leader John Thune, R-S.D., and its Democratic counterpart from Sens. Richard Blumenthal, D-Conn., and Mark Warner, D-Va., both display the same gulf on preemption and private right of action.

These developments make clear that privacy legislation is more relevant now than ever, but also that legislation cannot progress unless both sides find a way to move off their all-or-nothing positions on these stubborn issues.

Bridging the gaps

Over the past year, my Brookings Institution colleagues and I have spoken with congressional staffers, civil liberties advocates, members of civil society and industry representatives across a broad spectrum of sectors and viewpoints.

Based on these conversations, as well as our own experiences in formulating privacy legislation, we have released recommendations for federal privacy legislation in a report titled “Bridging the gaps: A path forward to federal privacy legislation.” Our report presents detailed recommendations — not only on preemption and private rights of action, but also on many issues that compose comprehensive privacy legislation — based on a line-by-line comparison of COPRA and CDPA, as well as other legislation and stakeholder proposals.

Our recommendations show ways to align these legislative proposals and, where they are far apart, to find a middle ground that addresses key interests on both sides of the debate. Here, we highlight a few of our recommendations.

  1. Graduated approach to risk and obligations

In our report, we propose an adaptive regulatory model that scales privacy and security obligations to the nature of the covered entity and the risk involved. A scaled approach reflects that one size does not fit all when it comes to privacy — organizations are diverse and use data in a wide range of circumstances.

Small and medium entities should be exempt from mechanisms that demand significant engineering or personnel, such as the rights of access, correction, deletion, portability and our proposed “right to recourse.”

Yet, we recommend establishing a general duty of loyalty and duty of care applicable to all covered entities. The duty of loyalty would require covered entities to respect individual privacy by implementing data minimization, fairness and transparency. The duty of care would prohibit covered entities from causing foreseeable injury, such as financial harms, privacy invasions “highly offensive to a reasonable person” and violations of anti-discrimination law. Covered entities should be subject to general obligations to assess risk and protect privacy “appropriate to the size and complexity of the covered entity and volume and nature and intended uses of the covered data processed.”

The goal of this trade-off between flexibility and exposure is to focus attention on outcomes rather than on delineating processes that can devolve into check-the-box compliance exercises.

  2. Heightened focus on obligations of covered entities

Federal privacy legislation must establish a clear shift in the regulatory paradigm toward boundaries on how organizations collect, process and share personal information. To reduce the burden on individuals of protecting their own personal information, legislation should minimize the number of times individuals are asked to review consent requests. Otherwise, the legislation will end up perpetuating the existing failures of notice and consent.

Thus, while we accept requiring organizations to obtain affirmative express consent to process sensitive data, we suggest narrowing the definition of “sensitive data” to avoid over-notification and “consent fatigue.”

We see a need to differentiate more clearly between transparency provisions directed to individuals and those directed to regulators and privacy watchdogs. Privacy notifications to individuals should be contextual and to the point, and should offer clear and actionable choices, with the option to access other publicly available but separate privacy statements about data collection and individual rights. The latter “privacy statements” are distinct from what we term “comprehensive disclosures,” which primarily provide value to regulators and privacy watchdogs.

  3. Tailored preemption of 'inconsistent' state laws

COPRA contains a savings provision for a variety of state statutes of general applicability, a separate one for state rights of action, and a preemption provision aimed at “directly conflicting” state laws. However, the preemption provision is largely negated by a further provision that state law “shall not be considered in direct conflict if it affords a greater level of protection to individuals protected under this act.”

We recommend omitting that bill’s open door for more protective state laws.

Instead, we recommend preempting “inconsistent” state laws that regulate “collection, processing, sharing, and security” of protected personal data. We also propose a limited eight-year sunset on preemption, which would allow Congress to evaluate whether the federal law is working and revisit any need for state laws to supplement a comprehensive federal privacy law.

  4. Targeted private litigation

We recommend scaling a private right of action based on injury and the proposed duties of loyalty and care. We generally recommend limiting recovery to “actual damages,” except in cases of “willful or repeated” violations where statutory damages could be available. Other than for harms specifically identified under the duty of care, which have been commonly compensable under existing laws, we suggest plaintiffs should be required to demonstrate a heightened standard of “knowing or reckless disregard” to sue for violations of most privacy provisions. And for more technical or administrative statutory violations, we suggest a “willful or repeated” standard to sue.

We also propose that a civil action under the federal law be the exclusive remedy for the actions complained of. To give individuals and businesses a way to avoid litigation, an individual plaintiff would need to exercise a “right to recourse” before bringing suit. This form of notice and opportunity to cure is adapted from a variety of state consumer protection or unfair and deceptive acts and practices statutes.

  5. Algorithmic discrimination

As the scale and complexity of machine learning and algorithmic decision-making grow, they generate increasing concerns about the potential effects on individuals. While the debate focuses primarily on whether algorithms can compound existing forms of societal discrimination — the province of anti-discrimination laws — algorithmic discrimination is relevant to information privacy legislation when the discrimination is based on the collection and use of personal information. That is when discrimination implicates privacy interests.

To broaden consideration of the potential effects of artificial intelligence beyond unlawful discrimination, federal privacy legislation should address civil rights and algorithmic decision-making in separate statutory sections. On civil rights, legislation should clearly state that no covered entity may engage in data processing that violates current anti-discrimination laws, and it should preserve the role of the agencies that currently enforce those laws. We suggest combining provisions in COPRA and CDPA to address the interaction of existing anti-discrimination laws with federal privacy legislation and Federal Trade Commission enforcement. This would recognize the primary enforcement role of existing anti-discrimination agencies while bringing to bear the technical expertise of the FTC and privacy law on the misuse of covered personal data.

To address other outcomes of algorithmic decision-making, large data holders should conduct impact assessments and audits when deploying algorithmic decision-making systems that may have “significant effects” on individuals.

  6. Organizational accountability

It is important to have strong processes in place to ensure that covered entities take their privacy obligations seriously and engage executive-level attention on these obligations. We propose that all covered entities — even small and medium entities — conduct baseline risk assessments that analyze “the benefits of its covered data collection, processing, and transfer practices; the potential adverse consequences of such practices to individuals and their privacy; and measures to mitigate any such adverse consequences.” Consistent with our scaled approach, such risk assessments should vary in scope and depth depending on the size and complexity of the covered entity and the potential risks. Large data holders, in turn, should be required to conduct more in-depth and extensive risk assessments and to retain written copies of the assessments for at least five years.

In addition to risk assessments, most covered entities (except small and medium entities) should designate at least one privacy officer and one data security officer who should develop written privacy and security programs to guide compliance with privacy legislation. The CEOs and privacy and security officers of large data holders should annually certify to the FTC that their annual disclosure of privacy practices is accurate and effective.

A path forward

Taken in their entirety, our recommendations reflect a somewhat different regulatory model from most proposals. Some businesses might be anxious about complying with standards that do not translate into predictable checklists. Some advocates may see flexibility as a loophole that unscrupulous companies could exploit. Yet both sides of the policy debate have something to gain from the balance struck — and both have something to lose from continued inaction and stalemate.

The longer we wait, the harder a consistent national standard becomes to achieve. It took nearly two decades for all 50 states to adopt data protection laws as basic as breach notification. A similar state-by-state path, simply put, would provide less comprehensive and meaningful privacy protections over a longer time frame than what we think is achievable at the federal level in the near future — if industry, advocates and political leaders are willing to make some hard choices.

We hope our broad but carefully calibrated compromises can point toward a strong national framework to protect information privacy for Americans.
