People are justifiably excited about the American Data Privacy and Protection Act. It’s the most significant bipartisan privacy legislation introduced in more than a decade, and it represents a sincere attempt to move beyond the ineffective “notice and choice” approach to privacy that has been the hallmark of U.S. privacy law since the days of dial-up modems. The ADPPA has many interesting parts, many of which have been analyzed by others. But possibly the most significant parts of the bill, and of the response bill from Sen. Maria Cantwell, D-Wash., are its “duties of loyalty,” which in theory would require organizations to act in our best interests when processing data and designing services.

Done correctly, duties of loyalty would shift a company’s business incentives away from manipulative and exploitative practices and toward long-term, sustainable and mutually beneficial information relationships between people and companies. Lawmakers seem to be converging on data loyalty as the foundation for a federal privacy framework. The Data Care Act from Sen. Brian Schatz, D-Hawaii, Cantwell’s earlier Consumer Online Privacy Rights Act and, in practice, even the Kids Online Safety Act from Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., are all anchored by duties of loyalty. State lawmakers have also keyed in on data loyalty as a defining value for U.S. privacy frameworks. Lawmakers in New York and Massachusetts have proposed privacy rules built around the concept of data loyalty. California is close to passing an Age-Appropriate Design Code that references a version of loyalty in its introductory findings. Even The Washington Post has explicitly called for U.S. privacy legislation built around the idea of data loyalty.

As academics working on the concept of data loyalty, we’re thrilled by this development. In a series of articles over the past few years, we’ve argued data loyalty is the critical missing piece to America’s regulatory tool kit for privacy. We think data loyalty can bring substance to America’s privacy identity and meaningfully check the excesses of surveillance capitalism while preserving the benefits of information relationships. All the bills that incorporate duties of loyalty into privacy frameworks are moving us closer to holding data collectors meaningfully accountable. They are so close to fully realizing the potential of data loyalty as the anchor of U.S. privacy law. Unfortunately, the ADPPA and other recently proposed bills that contain a duty of loyalty are insufficient.

We are at a critical juncture in the data privacy debate. Like the choice of where to lay roads, the privacy rules we choose now will be with us for decades, if not centuries. If lawmakers are going to create data loyalty rules, it is essential they get them right.

Where the current bills fall short

To be clear, we think the ADPPA and all the bills that propose a duty of loyalty are commendable steps in the right direction and a vast improvement over relying on disclosures and data subject rights to anchor privacy legislation. But all these proposals arrive with diminished duties of loyalty. The current federal proposals either focus on a narrow aspect of loyalty, such as data minimization, or they unnecessarily saddle loyalty rules with harm requirements.

For example, Title I of the ADPPA is subtitled “Duty of Loyalty,” but Section 101 clarifies that this duty is really one of “data minimization,” prohibiting companies from collecting, processing or transferring covered data beyond what is “reasonably necessary, proportionate, and limited to” provide a requested service, a reasonably anticipated communication or an explicitly permitted purpose. We have argued strong data minimization rules are a key part of data loyalty. But data minimization is merely one aspect of acting in the best interests of trusting parties. Data loyalty rules should also cover manipulation, breaches of confidentiality, wrongful discrimination, and reckless and extractive engagement models. Data minimization rules only indirectly confront many of these issues.

Other proposals, such as Cantwell’s COPRA, offer broader duties of loyalty but then hamper the effectiveness of the duty by requiring a showing of some kind of harm other than the disloyalty itself. For example, in addition to a duty of data minimization, the latest version of COPRA also includes a prohibition on deceptive and harmful data practices as part of its duties of loyalty. A report on drafts of COPRA notes “Cantwell’s bill defines harmful data practices to include practices that cause or are likely to cause financial, physical, or reputational injury, or offensive intrusion upon solitude or seclusion of an individual, where such intrusion would be offensive to a reasonable person.” The Data Care Act imposes somewhat similar harm requirements for a breach of data loyalty.

This is unnecessary and confuses the purpose of data loyalty. It is a long-settled principle in American law that harm requirements matter for duties of care, which prohibit companies from unreasonably harming or creating a risk of harm to others. But trusted parties can violate a duty of loyalty without any showing of harm or even a likelihood of harm. The mere betrayal of one’s trust can be enough to hold disloyal parties accountable, as when your lawyer uses your business secrets to make money, even if you are not otherwise harmed. As with rules against deception, the betrayal of a trust by one bound to loyalty is a legal wrong in and of itself. It is the integrity of the relationship of trust that has been corrupted.

Ironically, the statute that comes closest to getting a duty of loyalty right actually mislabels it. Section 3 of the bipartisan Kids Online Safety Act requires covered platforms to “act in the best interests of a minor that uses the platform’s products or services.” But the bill categorizes this obligation as a “Duty of Care” even though a “best interest” requirement is the defining element of loyalty rules. Statutes should obligate companies to both duties of loyalty and care and those two duties should be distinct, even if they overlap in many contexts.

Doing loyalty right

The good news with data loyalty is that to get it right, we don’t need to reinvent the wheel. Duties of loyalty have operated in other areas of our law for centuries, and there is a clear model for how to do it properly: a general duty of loyalty backed up by specific rules in the contexts where the incentive for disloyalty is high or where rules need to be expressly spelled out. For example, lawyers have a general duty to be loyal to their clients, supplemented by specific prohibitions on things like commingling client funds with the law firm’s accounts, going into business with clients and having sex with them. (Some of these things might seem obvious, but most of these rules exist because these are areas with strong incentives for manipulation, self-dealing or significant injury.) This is what we have called the “loyalty two-step”: a general catchall duty for the most extreme cases, supplemented by more specific and stronger rules for particular contexts.

How would this work for data loyalty? In our article, "A Duty of Loyalty for Privacy Law," we propose a rule that would prohibit data collectors from designing digital tools and processing data in a way that conflicts with trusting parties’ best interests. That’s the general rule, and it’s precisely the kind of general rule that is missing from the ADPPA and the Cantwell proposals. It is important to have a general duty as a catchall standard so the law can evolve with the times. It’s no coincidence that both the U.S. Federal Trade Commission’s “unfair and deceptive trade practices” standard and the Fourth Amendment’s “unreasonable searches and seizures” standard have been able to keep up with the march of technology: both are phrased in general terms. A general duty of data loyalty would work in a similar way, and it would give the lie to the frequently heard claim that “law can’t keep up with technology.” Law can keep up with technology, but only if we impose general, flexible standards rather than specific rules that can become outdated as technology changes.

In "Legislating Data Loyalty," we develop this general rule, emphasizing the importance of a “no conflict” rule, the prioritization of people over profits and the collective best interests of trusting parties. At the same time, a good duty of data loyalty must include some specific rules, like the ones the ADPPA calls “loyalty duties.” Such rules can offer the specificity that targeted regulation supplies and can regulate more heavily in the particular contexts where disloyalty is more likely to happen or more likely to be harmful. Using the two-step model from fiduciary law, we suggest subsidiary data loyalty rules targeting the five areas most ripe for disloyal and harmful self-dealing.

We think five separate areas call for such specific rules. First, there is "Collection," the act of collecting, recording and deciding to keep data about a person. Strong data minimization rules would fall in this category. Second, there is "Personalization," the act of treating people differently based on personal information or characteristics. Strict anti-discrimination and anti-subordination rules along with prohibitions on certain kinds of cross-contextual behavioral advertisements like those targeted in the California Consumer Privacy Act would be responsive to this context. Third, there is "Gatekeeping," the extent to which trusted entities allow third parties to access people and their data. Robust data security, confidentiality and deidentification rules would be appropriate here.

The fourth context is "Influence," where companies leverage technologies to exert sway over people to achieve results. Here, anti-dark-patterns rules would be helpful. We recommend adopting a rule based on the Consumer Financial Protection Bureau’s prohibition on abusive trade practices, which bars taking unreasonable advantage of uninformed trusting parties, of people’s inability to protect themselves from exposure, or of trusting parties’ reasonable reliance on an organization to act in their interests. Finally, there is "Mediation," which concerns the way organizations design their platforms to facilitate people interacting with each other. Here we recommend creating anti-harassment and anti-disinformation design rules. These subsidiary rules would not solve all problems of data and platform power, but they would engage with problem areas in a specific enough way to resist inevitable efforts to dilute the general loyalty obligation.

Loyalty skepticism

Many people in industry and Congress are skeptical of loyalty duties for data. They fear that they might be burdensome or might “stifle innovation” and make it harder for companies to bring new products to market. In our article "The Surprising Virtues of Data Loyalty," we argue that nothing could be further from the truth. Properly understood and implemented, loyalty duties for data can promote fair competition and put the right incentives on companies to provide even better products. By taking manipulation, betrayal and self-dealing off the table, loyalty duties allow companies to compete on products that are good for their customers, building trust and sustainable, long-term relationships.

Consider how long you have had a relationship with large companies like Apple or Microsoft through your iPhone or use of Microsoft Word. Loyalty rules reward this kind of long game, ensuring the “I Agree” button isn’t a trap. For new companies, loyalty duties mean they can focus on providing truly valuable products and services without fear that their competitors will be using manipulative or other disloyal models to get ahead of them. And if we really believe in the power of innovation, shouldn’t we have some faith in it? After all, as the old proverb puts it well, necessity is the mother of invention. And if the only way a company can make money is through disloyal data practices, then we should celebrate the failure of this business model.

And we can’t wait to see what kinds of loyal, sustainable, long-term information products companies subject to a duty of loyalty will produce. In this way, by encouraging the development of good products at a good price and by promoting competition on service rather than on manipulation or data extraction, a duty of loyalty can be good for businesses as well as for vulnerable consumers. Lawyers have been subject to a duty of loyalty for centuries; it hasn’t prevented the delivery of legal services, and it has instead built trust in the profession.

As technology companies become ever more intertwined with our lives, as they know our secrets and vulnerabilities even more than our lawyers and doctors do, it’s time to give them the kinds of mature duties those professionals have thrived under for centuries. As the pending bills have recognized, it’s time for a duty of data loyalty. All we ask is that when we impose data loyalty duties, we do them the right way so we can build the kind of trust in our digital society that we deserve — in a way that is good for everyone in the long term. Lawmakers are a stone’s throw away from fully realizing the potential for data loyalty. Let’s help them get it right as we make a historic push for privacy.
