In her opening remarks at a Future of Privacy Forum event in February 2021, U.S. Federal Trade Commission Acting Chairwoman Rebecca Kelly Slaughter previewed for the first time her enforcement priorities under the new Biden administration. Slaughter emphasized "meaningful disgorgement" as a key remedy she wants the FTC to pursue in future privacy and security enforcement actions.
What is disgorgement?
Defined by Black's Law Dictionary as "the act of giving up something on demand or by legal compulsion," disgorgement is an equitable remedy used to deprive wrongdoers of their ill-gotten gains and deter future violations. In the securities context, the remedy is authorized by the Securities Exchange Act of 1934 and is primarily intended to prevent the unjust enrichment of one party at the expense of another.
Disgorgement is not a new legal concept by any means; we see it applied in various aspects of the law all the time. The Securities and Exchange Commission, for example, prescribes disgorgement as a statutory remedy for short-swing profits gained by company insiders.
Poisoned fruit
Disgorgement also appears in criminal procedure jurisprudence. The "fruit of the poisonous tree" doctrine, a phrase coined by Supreme Court Justice Felix Frankfurter, makes evidence obtained by illegal means inadmissible in court. If the evidential "tree" is tainted, then so is its "fruit."
So, if law enforcement searches a suspect's home in violation of the Fourth Amendment, finds a key to a gym locker, and recovers evidence of a crime from that locker, that evidence would likely be excluded in subsequent court proceedings.
On the contractual side, many data protection lawyers have negotiated terms regarding the "return of data." Take, for instance, this clause: "Upon request by Company, or upon termination of Agreement, Vendor shall delete or return to Company all Company Data, including any copies of Company Data and any Company Data subcontracted to a third party for processing."
In this instance, the vendor must "disgorge" — or surrender, yield or expunge — data it has processed on behalf of the company. Such language is so common it has even been codified into the EU General Data Protection Regulation's standard contractual clauses, a legal mechanism used to transfer personal data from within the European Economic Area to non-EEA countries. While some of these negotiations settle easily, they become more difficult when the vendor has leveraged company data to improve its overall models and services.
The duty to delete
While the FTC's use of the disgorgement tool is relatively new, the agency has a history of being adaptable in its approach, instructive for the business community and consumer-oriented in its agenda. Through its workshops, reports, best practices and, ultimately, its consent decrees, the FTC is well-known for its work in the consumer protection space.
Disgorgement was put on the map in January 2021, when the FTC announced it had settled with Everbaum, the developer of a now-defunct photo storage service, for deceptively using facial recognition technology and retaining photos and videos from deactivated user accounts.
The FTC alleged Everbaum:
- Kept its facial recognition feature active, even for users who did not affirmatively opt in to the feature.
- Combined millions of facial images extracted from publicly available datasets to improve its facial recognition algorithms.
- Failed to delete or destroy the photos and videos of users who requested their information be expunged.
Among the settlement requirements, Everbaum must delete:
- Photos and videos of users who deactivated their accounts.
- All face embeddings — data reflecting facial features that can be used for facial recognition purposes — that the company derived from the photos of users who did not provide affirmative consent.
- All facial recognition models or algorithms developed with users' photos or videos.
The FTC is not only requiring that Everbaum destroy the data it improperly obtained; it also requires the developer to destroy any algorithms or models that were developed or derived from the tainted data (aka, the poisoned fruit).
Data disgorgement via data lineage
Privacy practitioners may wonder how they can implement disgorgement across their organizations, especially if they lack full knowledge of their organization's data assets. While well-known methods to delete and destroy data already exist, how does one know what data has been "tainted" by originating from some type of unjust action and must, therefore, be disgorged?
As my BigID colleague and vice president of data solutions, Peggy Tsai, and I previously wrote in this Privacy Perspectives article, data that is properly governed can support an organization's privacy management program, down to any disgorgement requirements. One recommended method that is particularly useful and applicable to disgorgement is to leverage data lineage. While data lineage is typically built for specific business processes or to scope specific data elements, it can also be used to show the full context of data, including:
- The original source of the data.
- How datasets are subsequently built and aggregated.
- The potential quality of the datasets.
- Any transformations along the data's lifecycle journey.
By leveraging data lineage, organizations can document the end-to-end flow of data in and out of their various systems — as well as how data has been used, altered or transformed — for instance, in the development of artificial intelligence.
Disgorgement is likely to become an increasingly popular remedy for algorithms derived from tainted data — and as it does, there are significant implications for AI research and development. Knowledge of the data's provenance and ensuing life cycle within the organization can mitigate what could end up being a burdensome obligation, financially and timewise. As Tsai elucidates, "building lineage flows is also about understanding impact and doing impact assessments. So if data needs to be 'disgorged,' you could potentially limit it to a few source models (aka cut off the finger instead of the entire arm)."
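Tsai's point about limiting the scope of disgorgement can be illustrated as a graph traversal: if lineage records which assets were derived from which sources, finding the "poisoned fruit" of a tainted source is a matter of walking the graph downstream. The following is a minimal Python sketch of that idea; the asset names and the `LineageGraph` class are hypothetical illustrations, not any vendor's actual implementation.

```python
from collections import defaultdict

class LineageGraph:
    """Minimal lineage record: edges point from a source asset
    to the assets derived from it."""

    def __init__(self):
        # source asset -> set of assets directly derived from it
        self.derived_from = defaultdict(set)

    def record(self, source, derived):
        """Record that `derived` was built from `source`."""
        self.derived_from[source].add(derived)

    def tainted_assets(self, source):
        """Return every asset transitively derived from `source` —
        the 'poisoned fruit' that would have to be disgorged."""
        tainted, stack = set(), [source]
        while stack:
            node = stack.pop()
            for child in self.derived_from[node]:
                if child not in tainted:
                    tainted.add(child)
                    stack.append(child)
        return tainted

# Hypothetical flow echoing the Everbaum facts: improperly retained
# photos feed an embeddings dataset, which trains a recognition model.
g = LineageGraph()
g.record("user_photos/deactivated", "dataset/face_embeddings")
g.record("dataset/face_embeddings", "model/face_recognition_v2")
g.record("public_images/licensed", "model/face_recognition_v2")

print(g.tainted_assets("user_photos/deactivated"))
# The dataset and model derived from the retained photos are tainted;
# an asset built only from properly sourced data would not appear.
```

With lineage captured this way, an impact assessment reduces to a query: the organization disgorges only the assets the traversal returns, cutting off the finger instead of the entire arm.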
The future of disgorgement
Last week, the Supreme Court of the United States' ruling in AMG Capital Management, LLC v. Federal Trade Commission dealt a significant setback to the disgorgement remedy. In a unanimous decision, the court held that Section 13(b) of the FTC Act does not authorize the commission to seek "equitable monetary relief such as restitution or disgorgement"; the statutory language at issue authorizes only injunctions.
While the commission still has the authority to obtain restitution on behalf of consumers under other sections of the FTC Act, some have taken the ruling as a severe blow to the commission's power to use disgorgement as an equitable remedy. Slaughter sees it as the court depriving the commission of "the strongest tool we had to help consumers when they need it most."
Congress has moved quickly to respond: HR 2688, a bill to restore the FTC's equitable-relief powers, is already circulating on Capitol Hill with 13 Democratic sponsors signed on. The bill explicitly includes language under which the commission may seek "disgorgement of any unjust enrichment that a person, partnership, or corporation obtained as a result of the violation that gives rise to the suit."
Several weeks before AMG Capital, Slaughter had already announced the creation of a new rulemaking group within the FTC, one that will strengthen the commission's ability to prohibit unfair or deceptive data practices. "I believe that we can and must use our rulemaking authority to deliver effective deterrence for the novel harms of the digital economy and persistent old scams alike," Slaughter said. Whether this recent ruling calls that effort into question remains to be seen.
Will all of this include a more consistent, permanent and "SCOTUS"-approved use of the disgorgement remedy? While monetary fines and privacy/security program requirements have been helpful, some regulators, including Slaughter, have argued that such measures have not gone far enough. Strong relief for consumers may mean hitting companies where it hurts the most, requiring them to give up the data that powers their services in the first place.