Starting off the year with a bang, the New York State Legislature reintroduced the New York Privacy Act, Assembly Bill 8680, a carbon copy of a bill from the previous legislative session. In many respects, the bill borrows heavily from its coastal companion, the California Privacy Rights Act, as it includes requirements for consumer rights, transparency, deidentification and anti-discrimination.

However, one of the newer and less-talked-about provisions is the requirement for data controllers to apply a fiduciary duty standard to the consumer data they process. 

The fiduciary duty standard

In a fiduciary relationship, one party (the consumer) is dependent on the other (the company) for the provision of a service that can only be performed if the company is so entrusted. In the context of this relationship, the company/fiduciary must uphold a series of duties to its consumers. These typically include a duty of care, a duty of loyalty and, in some cases, a duty of confidentiality.

The duty of care imposes a legal obligation for a company to adhere to a reasonable standard of care while performing any acts that could foreseeably harm the consumer.

The duty of loyalty requires that the company act at all times in the best interest of the consumer it serves and avoid possible conflicts of interest.

The duty of confidentiality means that information a company obtains about a consumer may be confidential and must not be used for the benefit of persons not authorized by the consumer.

Under the fiduciary standard, courts would consider whether, in a given interaction between the parties, the company fulfilled these duties to the consumer. While the fiduciary standard has long existed for professional services and relationships like attorney-client, doctor-patient and trustee-beneficiary, this standard has been relatively nonexistent in the digital world. Instead, the legal paradigm in the digital space has been based on contract law.

Notice and choice

For every website, mobile application and connected device, consumers have been subject to the language written within the terms of service statement, plus the privacy policy statement. It has been considered a best practice (and is often required) to give consumers the “choice” to read through both statements before allowing them to use the service.

In other cases, just opening up a website means that the consumer becomes a party to these statements — whether or not they realize they’ve actually agreed to them. The notice and choice regime, one of the longstanding principles of the U.S. Fair Information Practices, has become the dominant mechanism to regulate and govern a company’s data-processing activities.

The last decade has produced a substantial body of research on the consequences of how these statements are written — that is, in legal jargon that is dense and extensive. One study estimated that it would take the average consumer roughly 200 to 250 hours per year, about a month of working time, to actually stop and read the privacy statements for every website they visit.

These statements are admittedly written with regulators and plaintiffs’ attorneys in mind. Consumers rarely read through all these terms, whether because the gravity the terms carry is not readily apparent, the consumer does not care or does not have the time, or the consumer is simply resigned to this paradigm of digital life. Even well-meaning laws like the CPRA, which gives California consumers “rights” to their data, still put the burden on the consumer to review each company’s individual statement in order to exercise those rights.

As Florida State University College of Law Professor Lauren Henry Scholz declares in her law review paper, "Fiduciary Boilerplate," “When they participate in the information economy, consumers grant companies control over selecting and implementing data protection. The risks and opportunities presented by data processing in the information economy are too specialized, complex, and dynamic for consumers to monitor.”

Whatever the reason may be, companies set the rights and responsibilities owed to their consumers, with whom they have ongoing relationships, through public-facing privacy terms that can change at a moment’s notice, and frequently do.

The 'best interests' approach

At a time when states like New York, Washington and even the federal government are considering comprehensive privacy legislation, our over-reliance on notice/choice via contracts has become clear. We would be remiss to myopically continue with a trend that has done arguably little to benefit consumers. Implementing some form of a fiduciary standard into the digital economy is a potential solution.                 

The application of such a standard would need to differ from the traditional fiduciary relationship; my relationship with my doctor and attorney is quite different from my relationship with Gmail and Instagram. Legal scholars Woodrow Hartzog and Neil Richards argue for the duty of loyalty’s “best interests” approach, which puts the consumer’s well-being first, “even when they don’t understand the technology, the legal terms they are agreeing to, or the full consequences or risks of their actions.” This approach would also relieve consumers of the burden of having to trust companies to do the right thing with their data, as “trust” is automatically inferred via the fiduciary duty requirements.

There are real concerns regarding the shortcomings of applying the fiduciary standard within the digital space. The standard itself already suffers from a “vagueness” problem, so applying it within the context of online relationships may not be so straightforward. When it comes to the “best interests” of the consumer, it’s not always clear what those interests are — or even what the consumer actually wants.

For example, an e-commerce platform might interpret your best interest as one that gains you the most points in its multi-partner loyalty program. Will a court of law have the same interpretation?

Additional challenges

Another challenge is that there may be conflicts of interest between the consumer (beneficiary) and the shareholders to whom companies are traditionally beholden. Imagine this standard applied to a company like Facebook, whose advertising sales across its platforms are its primary source of revenue. Whether this is what its users actually want has been hotly debated since Facebook’s inception.

In addition, there is an open question about whether the fiduciary relationship is transferable. The primary burden of the relationship would traditionally fall on the first-party data collector, but what about the first party’s relationship to its third-party processors, and those processors’ relationships with their sub-processors? Like the data-processing addenda used under the EU General Data Protection Regulation, this standard would require agreements imposing the fiduciary standard on downstream recipients of the consumers’ data. In turn, those recipients would be required to impose the same obligations on any future parties receiving the data.

Would these “chain-link” protections be sufficient? Or would they place too much of the compliance burden on the shoulders of first-party data controllers (including small- and medium-sized businesses that are limited in their data-processing activities) — to the point where those data controllers might not be able to handle the costs of doing business at all? 

Ultimately, there are a host of questions that would need to be answered for lawmakers to come to a reasonable consensus as to whether the fiduciary standard is the appropriate legal mechanism to use within the data protection sphere. What does emerge is the need to move beyond legal reliance on notice/choice, which forces consumers to blindly trust businesses, and shift to a framework that puts the burden on industry.  

As Uncle Ben said in Spider-Man, “With great power comes great responsibility.” Companies that process data have a great power. Transparency and choice are no longer sufficient to foster a real sense of responsibility. Whether it’s a form of the fiduciary standard or another legal mechanism, data companies need to put the best interests of their consumers first — so those consumers will have no reason to doubt the trustworthiness of their actions. 
