When The New York Times recently reported that Facebook “failed to police how its partners handled user data,” it seemed like just another in a series of bad privacy headlines in 2018. But the report also noted that the problem was uncovered and dismissed during an assessment pursuant to Facebook’s 2011 consent order with the U.S. Federal Trade Commission.

This order has loomed over many conversations this year about how to grapple with Facebook’s issues, but its continued inadequacy creates a bigger problem for the FTC. Just last week, Consumer Reports’ Justin Brookman pondered on this page what tools the FTC needs to bolster its enforcement.

Unfortunately for the FTC, the United States’ top privacy cop appears to have thought it had done enough back in 2011. But it is likely that new federal privacy proposals will endeavor to empower the FTC. Some proposals are calling for extensive new rulemaking authority, while Sen. Ron Wyden, D-Ore., has already proposed significant new legal and technical staffing for the commission.

However, during a week when the Senate Commerce Committee has taken a critical eye to the FTC, it’s worth taking a minute to consider whether the consent orders themselves have become something of a privacy paper tiger.
The FTC has regularly held up its consent orders as an essential pillar of its privacy enforcement activities. A consent order typically imposes a 20-year period of FTC oversight and requires companies to implement privacy and security programs and perform regular independent assessments of the company’s data practices. For a while, even the most ardent privacy advocates were placated as the FTC brought several major tech companies under consent orders for privacy violations.

But as the FTC’s oversight of Facebook reaches its midpoint, there is growing evidence that these orders simply create box-checking exercises without protecting anyone’s privacy.

To be fair, I don’t mean to discount the progress that’s been made. There is no question that the FTC’s activities have shaped the privacy profession. The IAPP’s FTC Casebook is a tremendous resource for how to think about data privacy requirements. FTC workshops have helped to drive conversations about what should be privacy best practices; its staff has worked to advance the public’s understanding of how technology is driving privacy shifts. Its enforcement work has shamed the worst of the smallest bad actors. But with limited resources and constant legal challenges to its authority, the commission has done as much as it probably can.

Facebook now serves as patient zero in underscoring the basic limitations posed by relying largely on Section 5’s prohibitions against unfair and deceptive practices as the enforcement hook for privacy violations. Consumer advocates, including the Center for Democracy & Technology, have long called on the FTC to more aggressively go after unfair data practices, but the legal reality is that unfairness demands a showing of injury not easily met in privacy disputes and under siege elsewhere, likely making the FTC reasonably skittish. This dynamic is why the FTC needs to be charged with enforcing a concrete and comprehensive federal privacy law.

Instead, we are operating under the so-called common law of privacy at the FTC, and the limitations of consent orders reveal a problem with this form of policymaking. However, so long as they constitute an important element of American privacy “law,” additional transparency efforts in and around a consent order can move existing privacy enforcement in the right direction.

Consent orders are contracts, not privacy law

Privacy professionals might operate under the notion that consent orders form a larger privacy common law, but the reality is that they are ultimately a contractual settlement between a regulator and a company. We should be more open about the fact that they are the careful product of private negotiations between FTC staff and well-trained legal counsel. Each clause and provision is carefully worded to limit its scope and cabin any corporate liability.

When the first headlines broke about Cambridge Analytica, the FTC issued a public statement that it was reopening an investigation into Facebook. David Vladeck and Jessica Rich, former directors of the Bureau of Consumer Protection, have vigorously defended the original consent order, but Chris Hoofnagle argues the FTC faces a difficult enforcement challenge. Eight months post–Cambridge Analytica, it is unclear whether the FTC is in any position to obtain a significant monetary penalty from the company under the terms of its original settlement.

The underlying issue is that the agreement between the FTC and Facebook is typical of what the FTC generally negotiates. The FTC is risk-averse and wants an agreement that will hold up in court, while companies wish to get on with their business. Everyone is incentivized to come to a settlement, but the privacy of individuals might be traded away in the process.
One of the biggest problems identified with privacy enforcement via consent order is that companies are afforded one free “bite at the apple.” The difficulty of enforcing a consent order suggests companies may actually get more than one free bite. The FTC must show how a company violated its order, and, often, the second privacy violation falls outside the four corners of the first settlement, sending everyone back to the proverbial drawing board.

We need to put in place mechanisms to address these potential problems before they arise.

Transparency and structural flaws in the privacy assessment process

Privacy audits are one area where a lack of transparency has hidden serious problems with privacy consent orders. While the FTC has long touted independent “privacy audits” as an important element of consent orders, these are nothing like audits as commonly understood. Instead, the consent orders speak of privacy “assessments,” which are not as rigorous as a formal audit. Audits measure compliance against predefined criteria, while an assessment simply certifies compliance with a standard set by the assessed company itself. As PwC appeared to concede to The New York Times, this distinction matters because it effectively lets companies set their own terms for how their approach to privacy is evaluated.
Privacy advocates have raised this concern with the FTC. In comments on the Uber consent decree, Bob Gellman warned that an “assessment can be worthless if it allows the company being assessed too much control over the review.” Yeoman’s work by Hoofnagle confirmed this fear, and this spring, Megan Gray, currently of DuckDuckGo but previously at the FTC, published a white paper sharply critical of the existing assessment process. She noted that privacy assessments can be circular: If a company asserts it has a reasonable privacy program, an assessor can certify that the company has a reasonable privacy program based on that assertion alone. She warns that this model will fail to identify or uncover a company’s privacy blind spots, which seems to presciently capture PwC’s failure to flag what ails Facebook.

Enforcement would benefit from more transparency into consent orders

Both the technology and privacy advocacy communities desire more insight into and transparency about the FTC’s processes and procedures for investigating privacy problems. We should be cognizant both that FTC staff does much work behind the scenes and that the agency is ultimately being tasked to do a lot with little. Requiring additional transparency risks straining the FTC’s ability to act nimbly. Balancing the need for public transparency against the commission’s legitimate need for flexibility is a difficult challenge.

The nonpublic status of privacy assessments illustrates the extent of the problem.

Hoofnagle, as well as advocacy groups like the Electronic Privacy Information Center, has had to file Freedom of Information Act requests simply to get heavily redacted versions of these assessments. Recently, two FTC commissioners acknowledged the value of proactive disclosure, though Commissioner Rebecca Slaughter noted that FTC enforcement activities “extend far beyond what can be gleaned from an isolated assessment.” While companies have an interest in protecting proprietary information, it is unclear what rationale exists for these assessments not to be made public by default.
When the public has gotten its hands on these assessments, they are “heavily redacted and written in almost impenetrable language,” according to Gray. This limits anyone’s ability to evaluate either a company’s or the FTC’s response to emerging privacy concerns.

The end result is that a company’s privacy blind spots grow larger.

The FTC is perhaps settling for less than it should when disciplining offenders. Advocacy groups have called for orders that more fully operationalize privacy principles, but, at minimum, the FTC could require companies to explain and detail to the public how they are complying with the terms of the settlement. Unfortunately, while the commission seeks public comment on its settlements and consent orders, there is no evidence that any suggestion with respect to privacy has ever been adopted.

FTC consent orders currently come with no admission of wrongdoing, and companies often provide no detail on whether or how any of their business practices will be affected by the FTC’s order. If a consent order merely confirms a company’s existing practices, its ability to address emerging privacy problems in the future is limited. Improving these sorts of front-end disclosures may force companies to make meaningful changes and be more accountable to the public.

However, the common law of privacy ultimately highlights the vital need for a federal privacy law that pairs substantive requirements alongside strong enforcement. The status quo favors process mandates over meaningful controls or prohibitions on companies, which may be a boon to the privacy profession but does little to protect an individual’s privacy.

Consent decrees have been unable to set limits on companies’ data practices. Fortunately, Congress is well positioned to do just that, and I hope the FTC will push lawmakers to act.
