Last week, the U.S. Court of Appeals for the 11th Circuit issued its long-awaited decision in the latest permutation of the LabMD, Inc. v. Federal Trade Commission case. While not questioning the FTC’s jurisdiction to enforce against privacy and data security violations (the court didn’t uphold that jurisdiction either; it merely assumed it arguendo), the court handed the agency a setback, vacating its cease and desist order against the now-defunct LabMD for lack of specificity. The important question now is what the 11th Circuit’s decision means for the FTC’s regulation of privacy and data security.
In an important 2014 Columbia Law Review article, Dan Solove and Woody Hartzog likened the body of FTC settlements in the privacy and data security space to a common law. Like the common law, they suggested, the FTC’s consent decrees build incrementally from case to case and follow a coherent and consistent approach closely tethered to industry best practices. Following up on Solove and Hartzog’s paper, in September 2014, IAPP Westin Fellow Patricia Bailin, CIPP/US, CIPM, CIPT, published a study titled "What FTC Enforcement Actions Teach Us About the Features of Reasonable Privacy and Data Security Practices." In her study, Bailin pointed out that one of the shortcomings of the common law approach is the difficulty of deciphering what the FTC expects companies to do as a reasonable measure of data security.
Looking at the body of FTC enforcement actions, we can clearly say what companies should not do. They shouldn’t do what got those companies into trouble in the first place. Indeed, this is crystal clear even in the LabMD case itself: Companies processing sensitive patient data should not allow employees to download P2P software (in that case, LimeWire), resulting in the leakage of a 1,718-page file containing sensitive data on roughly 9,300 patients. Doing so is unreasonable (tort law), unfair (FTC law) and even bad (morally) behavior.
But what’s a company supposed to do to avoid the next FTC enforcement action? Allow P2P software in the presence of sensitive patient data? No. But what about allowing P2P software in the presence of non-sensitive data? Or, looking back to LabMD, what if the company hadn’t allowed an employee to download P2P software but had still failed to implement industry-standard security practices?
We recognized back in 2014 that the FTC’s privacy and data security orders were written in broad brushstrokes. This concern obviously weighed heavily in the 11th Circuit’s LabMD decision. To address it, Bailin’s study sought to “reverse engineer” the FTC’s implied standard for privacy and data security. In other words, reading between the lines of numerous cases in which the FTC articulated why certain practices were unreasonable, the Westin Center tried to delineate the zone of activity that the agency would consider reasonable. Skeptics who claim that the FTC’s body of work is too cryptic should look back to the 2014 study, which offers a list of clear guidance recommendations.
We now use this opportunity to update the 2014 study, adding analysis of the rules implicit in the settlement orders between 2014 and 2018, in a brand-new piece by current IAPP Westin Fellow Müge Fazlioglu, CIPP/E, CIPP/US: "What FTC Enforcement Actions Teach Us About the Features of Reasonable Privacy and Data Security Practices: A Follow Up."
An interesting question arising out of last week’s LabMD decision is what effect, if any, it has on the dozens of existing FTC settlement orders in our space. In the past, the FTC has acted several times against parties that allegedly violated a consent decree. Could such parties now argue that the settlements are unenforceable to the extent that they’re overly vague? To be sure, many of the settlements, some of which mandate privacy and data security programs for periods of up to 20 years, include language that lawyers would characterize as standards-based rather than rules-based. For example, instead of requiring 128-bit encryption or 12-character passwords, they call for reasonable, industry-standard security measures. As Solove and Hartzog pointed out last week, ironically, the 11th Circuit’s antipathy toward courts’ micromanaging corporate security programs will inevitably lead to much more micromanagement in FTC injunctive orders, which will become specific and rules-based. Of course, negotiated settlements are not identical to top-down cease and desist orders, so courts may allow the agency more flexibility in crafting them in the future. But under the FTC’s complex administrative law regime, settlements too must eventually be enforced, and at that point a company could ostensibly claim that the injunctive relief lacks specificity.
More generally, the LabMD decision demonstrates the need for U.S. privacy and data security legislation. The FTC is operating against an unconventional legal backdrop: There is no overarching U.S. law setting privacy and data security standards. Instead, the agency must chart its own path using what I once called the narrowest of legal mandates. Compare the 55,000 words of the European General Data Protection Regulation of 2018 to the 14 words of Section 5 of the FTC Act, which dates back to 1914 (“unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful”). There’s an inherent measure of vagueness in rules built on just those 14 words. Interestingly, the court assumes arguendo that the FTC’s authority in data security cases is grounded in “consumers’ right to privacy” (based where?) and that the “legal policy” supporting the unfairness test is the tort of negligence (Restatement (Second) of Torts Section 281).
Importantly, the FTC doesn’t have the power to impose penalties directly. Instead, it implements a two-strike system, first issuing a cease and desist order and then imposing civil sanctions only if that order is violated. This is the case even in blatantly negligent fact patterns such as the Wyndham case, where a court likened the hotel chain to a “supermarket, leaving so many banana peels all over the place that 619,000 customers fall.” Under a streamlined legal regime, the FTC would pursue a more straightforward path: It would determine that, by failing to deploy reasonable data security, the respondent acted unfairly, and it would impose a civil monetary penalty. Case closed.
In the absence of federal legislation, states are rushing in to fill the space. Amid a steady drumbeat of state privacy laws, with Vermont legislating against data brokers, Illinois restricting biometric information and, biggest of all, California weighing a sweeping privacy ballot initiative, businesses may soon be clamoring for a uniform national privacy law overseen by an agency, like the FTC, with a tradition of bipartisanship and careful development of the law.