
Last week, the U.S. Court of Appeals for the 11th Circuit issued its long-awaited decision in the latest permutation of the LabMD, Inc. v. Federal Trade Commission case. While not questioning the FTC’s jurisdiction to enforce against privacy and data security violations (the court didn’t affirm jurisdiction either; rather, it assumed it arguendo), the court handed the agency a setback, vacating its cease and desist order against the now-defunct LabMD for lack of specificity. The important question now is what the 11th Circuit decision entails for the FTC’s privacy and data security regulation.

In an important 2014 Columbia Law Review article, Dan Solove and Woody Hartzog likened the body of FTC settlements in the privacy and data security space to a common law. Like the common law, they suggested, the FTC’s consent decrees build incrementally from case to case and follow a coherent and consistent approach closely tethered to industry best practices. Following up on Solove and Hartzog’s paper, in September 2014, IAPP Westin Fellow Patricia Bailin, CIPP/US, CIPM, CIPT, published a study titled "What FTC Enforcement Actions Teach Us About the Features of Reasonable Privacy and Data Security Practices." In her study, Bailin pointed out that one of the shortcomings of the common law approach is the difficulty of deciphering what the FTC expects companies to do as a reasonable measure of data security.

Looking at the body of FTC enforcement actions, we can clearly say what companies should not do. They shouldn’t do what got those companies into trouble in the first place. Indeed, this is crystal clear even in the LabMD case itself: Companies processing sensitive patient data should not allow employees to download P2P software (in that case, LimeWire), resulting in leakage of sensitive data about 1,718 patients. Doing so is unreasonable (tort law), unfair (FTC law) and even bad (morally) behavior.

But what’s a company supposed to do to avoid the next FTC enforcement action? Allow P2P software in the presence of sensitive patient data? No. But what about allowing P2P software in the presence of non-sensitive data? Or, looking back to LabMD, what if the company hadn’t allowed an employee to download P2P software but had still failed to implement industry standard security practices?

We recognized back in 2014 that the FTC privacy and data security orders were written in broad brushstrokes. This, obviously, weighed heavily in the decision of the 11th Circuit in LabMD. To address this, Bailin’s study sought to “reverse engineer” the FTC’s implied standard for privacy and data security. In other words, reading between the lines of numerous cases where the FTC articulated why certain practices were unreasonable, the Westin Center tried to delineate the zone of activity that the agency would consider to be reasonable. Skeptics who claim that the FTC’s body of work is too cryptic should look back to the 2014 study, which offers a list of clear guidance recommendations.

We now use this opportunity to update the 2014 study, adding analysis of the rules implicit in the settlement orders between 2014 and 2018, in a brand-new piece by current IAPP Westin Fellow Müge Fazlioglu, CIPP/E, CIPP/US: "What FTC Enforcement Actions Teach Us About the Features of Reasonable Privacy and Data Security Practices: A Follow Up."

An interesting question that arises out of last week’s LabMD decision is what effect, if any, it has on the dozens of existing FTC settlement orders in our space. In the past, the FTC has acted several times against parties that allegedly violated a consent decree. Could such parties now argue that the settlements are unenforceable to the extent that they’re overly vague? To be sure, many of the settlements, some of which mandate privacy and data security programs for periods of up to 20 years, include language that lawyers would characterize as standards-based rather than rules-based. For example, instead of requiring 128-bit encryption or 12-character passwords, they call for reasonable industry standard security measures. As Solove and Hartzog pointed out last week, ironically, the 11th Circuit’s antipathy toward courts’ micromanaging corporate security programs will inevitably lead to much more micromanaging in FTC injunctive orders, which will become specific and rules-based. Of course, agreed-upon settlements are not identical to top-down cease and desist orders, so courts may allow the agency more flexibility in crafting them in the future. But under the complex administrative law governing the FTC, settlements too need to be enforced, and, at the point of contention, a company could ostensibly claim that the injunctive relief lacks specificity.

More generally, the LabMD decision demonstrates the need for U.S. privacy and data security legislation. The FTC is operating against an unconventional legal backdrop. There is no overarching U.S. law determining privacy and data security standards. Instead, the agency must chart its own path using what I once called the narrowest of legal mandates. Compare the 55,000 words in the European General Data Protection Regulation of 2018 to the 14 words in Section 5 of the FTC Act, which dates back to 1914 (“unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful”). There’s an inherent measure of vagueness in rules that are based on just those 14 words. Interestingly, the court assumes arguendo that its authority in data security cases is grounded in “consumers’ right to privacy” (based where?) and that the “legal policy” supporting the unfairness test is the tort of negligence (Restatement (Second) of Torts Section 281).

Importantly, the FTC doesn’t have the power to impose penalties directly; instead, it implements a two-strike system where it first issues a cease and desist order and then imposes civil sanctions only if that order is violated. This is the case even in blatantly negligent fact patterns such as the Wyndham case, where a court likened the hotel chain to a “supermarket, leaving so many banana peels all over the place that 619,000 customers fall.” Under a streamlined legal regime, the FTC would pursue a more straightforward path. It would determine that by failing to deploy reasonable data security the respondent acted unfairly, and thereby impose a civil monetary penalty. Case closed.

In the absence of federal legislation, states are rushing in to fill the space. With a steady drumbeat of state privacy laws — Vermont legislating against data brokers, Illinois restricting biometric information and, the biggest one of all, California weighing a sweeping privacy ballot initiative — businesses may soon be clamoring for a uniform, national privacy law, overseen by an agency like the FTC, with its tradition of bipartisanship and careful development of the law.

Photo credit: theilr parallel lines via photopin (license)

2 Comments


  • comment Ken Mortensen • Jun 11, 2018
    Some thoughts on what this means...
    While I too have analogized the FTC Consent Decrees to the "common law" of consumer privacy from the FTC, and while they do work in many respects as such, the analogy only works so far as to show the evolution and policy-shaping of the FTC's approach to privacy and information security. The reality was that each Consent Decree was templated on the most recent one, and that the FTC used the next one to push the boundaries of different areas — such as the scope of the definition of personal information, breadth of controls, level of assessments and review, etc. — depending on its policy directions and the cases presented. Not that these are bad things, but it did lead to some problems, which got us to the LabMD holding and specifically the court's finding that there was not a clear connection between the order and the implementation of the FTC's authorities — as the court notes, the FTC required LabMD's information security program to meet an "indeterminable standard of reasonableness." Unfortunately, there may be an FTC "common law," but without the necessary checks and balances.
    
    To that end, the "vagueness" of the 14 words provides the better balance without the need for a "common law." The FTC is not necessarily responsible for determining the contents of the relevant standard, which can be picked from the many that now exist; rather, the agency can link its determination to those standards knowing that they provide the requisite applicability to the underlying risks. Indicating a reasonable connection to a standard would address the "ambiguity" in the order and also provide a specific basis for the entity on the other side to take issue. While not perfect, this approach of each case being a de novo review will result in better outcomes for consumers (as well as for the agreeing entities), because the focus will be on protections and safeguards with articulable standards tied to the "consumer's right to privacy."
    
    On that thought, I don't believe that this dictates the need for an overarching U.S. privacy law — not that I am against such an outcome, but I do not believe that it would be a necessary result of this particular case. Within the 55,000 words of the GDPR, there is still no language defining the controls that an organization should implement based upon specifically defined privacy and information security risks. What I think would be more useful would be global standards, along the lines of ISO/IEC 27001/2 and ISO/IEC 29100, that provide a standardized control set from which protections and safeguards can be deployed by an organization against defined risks, such that the organization can audit for design of controls and operational effectiveness — better proofs for implemented protections and safeguards.
    
    
  • comment Omer Tene • Jun 11, 2018
    Thanks Ken, super insightful and invaluable feedback. Much appreciated.