
Privacy Perspectives | What the FTC really needs to deal with Facebook


An incredible series of New York Times articles over the last week paints an ugly picture of life at Facebook, from a troubling corporate culture and ethical lapses to very real questions about the company’s data privacy views and practices. Unsurprisingly, public outrage was swift, with many calling on Washington to finally take action to rein in the company.

As the nation’s foremost consumer protection agency, the U.S. Federal Trade Commission is the first line of defense for consumers when it comes to privacy. This naturally leads to an important question: What does the FTC need in order to grapple with a force like Facebook?


In 2006, the FTC formed the Division of Privacy and Identity Protection to specifically focus on privacy and security issues, and it has since brought hundreds of important cases establishing norms and limitations on what companies can do with your personal information.

Nevertheless, it is difficult to argue that the FTC has done enough to safeguard personal privacy in today’s digital world.

Clearly, what the FTC has today isn’t working. Facebook tracks what its users (and non-users, for that matter) do on the majority of other websites and apps without their knowledge or permission. Six-and-a-half months ago, Facebook promised a “Clear History” control in “a few months” to at least offer some semblance of an opt-out, but it has yet to deliver. In the wake of the Cambridge Analytica debacle, we learned that Facebook has been sharing users’ private profile data with hardware manufacturers without any transparency (the Facebook Data Use Policy vaguely reserves the right to share your data with service providers “and other partners”). When an FTC-mandated auditor asked whether Facebook was monitoring what these manufacturers were doing with this data, it was stonewalled — and subsequent audits simply stopped asking the question.

And this scandal, of course, is just the latest in a long series of terrible privacy decisions. Put simply, Facebook is, from a privacy perspective, a disaster.

The FTC did bring a wide-ranging case against Facebook in 2011, resulting in a 20-year Consent Order. Nevertheless, the threat of substantial fines doesn’t seem to be meaningfully deterring Facebook’s behavior. This is partly because the Consent Order is incredibly narrow — substantively, Facebook is only barred from misrepresenting what it does with your personal information and from offering privacy controls that don’t work, things it is already prohibited from doing by Section 5 of the FTC Act. Facebook is also required to have a privacy program and third-party audits in place, but as we have seen, these audits don’t seem to be particularly effective. While third-party audits might help identify privacy vulnerabilities at a small, unsophisticated company, they are simply a minor inconvenience for a multibillion-dollar company like Facebook, unlikely to root out or deter bad behavior. Facebook gets to choose its auditor and pays the bills: It is difficult to imagine an auditor submitting a report to the FTC detailing serious noncompliance. Even the mild criticism from PricewaterhouseCoopers of Facebook’s failure to monitor its partners was rather surprising.

Yes, there is more the FTC could do today. In addition to sharing private data with hardware manufacturers, Facebook also shared personal data with third-party apps — even after rolling out a control to turn it off. That is a pretty clear violation of the Consent Order, and the FTC should enforce it. Moreover, the FTC probably should have acted on EPIC’s 2016 complaint arguing that Facebook violated promises that WhatsApp users’ data wouldn’t be used for ad targeting.

While important, these behaviors are at the margin and do not get to the core of the problem. The scope of Facebook’s privacy obligations shouldn’t be determined by Facebook — whether in a control they decide to offer, or in a blog post reassuring leery users. There need to be practical outside checks to hold them accountable.


Fundamentally, the FTC needs privacy authority that extends beyond deception or murky unfairness — authority that requires companies’ data practices to accord with consumer preferences and reasonable expectations. I will not attempt to spell out what exact admixture of data minimization, permission, controls, and fiduciary duties should be included in such a law (though we go into substantial detail on those questions in our recent NTIA comments).

Nevertheless, the net result should be that Facebook collects a lot less about what its users do off of the service.

Any new privacy statute will be relatively high level, so the FTC must also be given the authority to promulgate clarifying rules to adapt to evolving technology and business practices. Online tracking in particular has evolved substantially in recent years to follow users by offline identity and across devices; the FTC needs the ability to specify what privacy law means for ever-evolving technologies and business models. Defendants have argued that the FTC’s general-purpose unfairness authority does not offer sufficient guidance as to what privacy and security practices are required; indeed, the 11th Circuit recently held in the LabMD case that an order requiring the company to install a comprehensive data security program was too vague to be enforceable. In order to provide more certainty for both companies and consumers, the FTC should have the same rulemaking capacity possessed by other government agencies.


A new privacy statute should also require companies to be far more transparent about their data practices. Today, some state laws require websites to have a privacy policy, but they don’t actually specify what needs to be in it. While some proposed bills require simpler and easier-to-read policies, a better approach would be to require much more detailed disclosures so that outsiders can meaningfully assess Facebook’s practices. We need to come to terms with the fact that consumers are not the primary audience for privacy policies — rather, these policies are for regulators, the press, academics, and ratings services, like Consumer Reports. We look to policy disclosures in making assessments of companies’ privacy and security practices for products and services under our Digital Standard, but that can be challenging when privacy policies don’t contain very much meaningful information.

Robust transparency requirements would also prevent companies like Facebook from cloaking practices such as sharing profile data with hardware manufacturers under catch-all provisions about “sharing data with partners,” and introduce meaningful accountability for their behaviors. (This is not to argue that consumers shouldn’t be told about what’s happening to their data — just that a privacy policy isn’t a particularly effective place to do it. Certainly, consumers should be told separately — and more prominently — about potentially surprising data practices.) If companies want to justify certain data practices because the data has been de-identified, they should be required to explain in detail what they’re doing to de-identify it. And if the FTC thinks the disclosures aren’t precise enough, there should be an expedited process for asking for more information (outside of any Consent Order framework).

Stronger enforcement will play a key role as well — and the FTC should be given the ability to obtain civil penalties (for all violations of Section 5, while we’re at it). Importantly, this penalty authority cannot be subject to a hard cap as was proposed in the Obama privacy bill of rights; such a cap would only privilege giant companies like Facebook with a greater ability to pay. The right penalty for Facebook is substantially different from the right penalty for a company like LabMD, so penalties must be calibrated to a company’s ability to pay and the severity of the wrongdoing in order to serve as an appropriate deterrent.

Finally, the FTC needs a massive infusion of staff and budget to support its mission. The agency today is significantly smaller than it was in the 1980s, while the economy has grown to three times its size. The development of entire new industries based on internet connectivity poses tremendous consumer protection — and especially privacy — challenges. The IAPP has grown 300-fold in just the past seventeen years, and Facebook has grown to more than two billion active users in even less time.

The FTC needs a substantial boost in resources just to keep up.

Much has been said about the need to bring many more technologists into the FTC, perhaps even creating a Bureau of Technology (possibly built out of my former group, the Office of Technology Research and Investigation, which is itself only three years old). But the agency also needs more privacy attorneys in order to stay on top of a dizzyingly complex ecosystem.

All this might sound ambitious, but in reality, it’s only part one of what the agency needs in order to wrangle Facebook and other giants like Google and Amazon. The law needs to be updated to address a host of problems posed by these platforms, most notably competition and a failure to adequately police fraud, harassment, and propaganda on their networks.

But giving the FTC what it needs to effectively check ravenous and abusive privacy behaviors is a good place to start.

photo credit: eli.pousson via photopin cc
