
The Privacy Advisor | Study: What FTC Enforcement Actions Teach Us About the Features of Reasonable Privacy and Data Security Practices




One of the major policy questions in the data governance space these days is what constitutes a reasonable level of privacy and data security that could provide a company with a safe harbor from U.S. Federal Trade Commission (FTC) enforcement action. Over the past several months, two companies facing FTC action, as well as a chorus of lawyers and scholars, have complained that enforcement is misguided absent clearer data security standards. Complicating matters, the FTC has been particularly tight-lipped about what data security standards it expects to see, and industry calls for policy guidance documents or workshops have gone unheeded. Essentially, the industry is arguing that the FTC is shifting the goalposts during the game.

Regardless of the merits of such arguments (industry groups fight tooth-and-nail against data security legislation even as they urge clearer guidance), the Westin Research Center has explored FTC privacy and data security consent decrees to try to parse out what an acceptable level of privacy and data security could be. This study is part of the Westin Research Center’s project to provide a comprehensive casebook of FTC privacy and data security enforcement actions.


In at least 47 cases since 2002, the FTC has cited companies for failing either to design or to implement an appropriately comprehensive privacy or data security program. Almost all of these cases have been settled. The settlement requirements include relatively standardized language outlining the parameters of a data security program and begin to chart a path toward the development of similarly standard language for privacy programs. However, aside from requiring the designation of an adequately trained chief data security or privacy officer and the undertaking of regular risk assessments, the standard language that the FTC uses is terse and offers little in the way of specifics about the components of a compliance program. Consequently, anyone seeking to design a program that complies with FTC expectations would have to return to the complaints to parse out what the FTC views as “unreasonable”—and, by negation, reasonable—privacy and data security procedures.

The following analysis is the result of this approach. It suggests possible guidelines for complying with FTC privacy and data security standards based on what the FTC has determined inadequate. In other words, by pointing out what companies did not have in their programs, the FTC provides a peek at what, in its opinion, these companies should have done. The guidelines below are drawn from the 47 cases, loosely organized into seven categories that are not mutually exclusive: Privacy, Security, Software/Product Review, Service Providers, Risk Assessment, Unauthorized Access/Disclosure and Employee Training.

We emphasize that the requirements below reflect neither legal advice by the Westin Research Center nor express guidance by the FTC; rather, they are extrapolated from FTC assertions of corporate wrongdoings. (Ostensibly, there could be a gap between what the FTC views as inadequate and the guidance below; i.e., a company may not be cited for a Section 5 violation even if it does not fix all of its shortcomings under the FTC complaint.)

Privacy
In the four decisions requiring companies to establish a comprehensive privacy program [MySpace, Facebook, Google and Snapchat], companies allegedly failed to respect user choices. According to the FTC, these companies ignored consumer privacy preferences or misled consumers by providing inaccurate or incomplete information about user privacy, notice and control. In all four cases, the companies allegedly violated their own privacy policies and their statements on privacy settings. It is possible to deduce from the FTC’s complaints that companies should:

  • Perform risk assessments during the design and development stage of a new product or service to identify and address privacy vulnerabilities;
  • Conduct regular testing and monitoring of the effectiveness of privacy controls, settings and procedures to ensure that user choices are respected;
  • Conduct regular reviews of privacy statements and product design in order to ensure a match between privacy policies, available user options, disclosures to third parties and product functions, and
  • Obtain explicit user consent to override prior user choices or to apply new privacy policies to previously collected data when privacy options or policies change.

Security
Since the FTC’s settlement with Microsoft in 2002, the commission has made clear that companies handling consumer information must implement a program that contains “administrative, technical, and physical safeguards appropriate to [the organization’s] size and complexity, the nature and scope of [its] activities, and the sensitivity of the personal information collected from or about consumers.” Such a program must set forth procedures not only for data collection, but also for its storage, handling, transport, and disposal.

The commission has reiterated several times that formal data security procedures should employ “readily available” technology and practices for safeguarding consumer information. The programs must consider not only well-known threats but also business-specific vulnerabilities, and they must be tailored to an organization’s specific business model and data needs. Specific language in FTC cases indicates that security policies should include consideration of password procedures, encryption protocol, access limitations and processes for data retention and disposal.

Password Policies
Despite the well-documented importance of complex user credentials, corporate employees, as well as average consumers, continue to generate weak user ID and password combinations. Password policies are therefore fundamental to a company’s data security, as the FTC notes with increasing frequency. A company should:

  • Establish and enforce rules requiring strong [TJX/Cardsystems], hard-to-guess [LifeLock/Twitter] user IDs and passwords. Strong credentials can be achieved through default requirements that prohibit the use of common dictionary words [Reed Elsevier]; forbid the use of the same word or a close variant of the word for both the password and the ID [Lookout], and deny a user the ability to create credentials that he or she already employs elsewhere on the network, especially passwords used to access third-party programs, websites, and networks [Twitter];
  • Require periodic changes of user credentials, such as every 90 days, for customers and employees with access to sensitive personal information [Lookout/Twitter/Reed Elsevier];
  • Suspend user credentials and/or disable administrative passwords [Twitter] after a reasonable number of unsuccessful login attempts [Lookout/Reed Elsevier];
  • Prohibit the use of default passwords [BJ’s/DSW] or the sharing of user credentials [Reed Elsevier], as these practices reduce the likelihood of detecting or accounting for unauthorized activity;
  • Establish and enforce policies to prohibit storage of administrative passwords in plain text on computers [Guidance Software], in cookies [Reed Elsevier], or in personal email accounts [Twitter], and
  • Implement procedures for verifying or authenticating the identity of users who create new credentials for systems or programs that will enable them to access personal information [Choicepoint/Rental Research].
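The credential rules above can be sketched as a simple validator. This is an illustrative sketch only: the 12-character minimum, the seed word list, and the rule set are example choices, not thresholds the FTC has mandated.

```python
import re

# Illustrative seed list; a real deployment would load a full dictionary
# and breached-password list [cf. Reed Elsevier's prohibited common words].
COMMON_WORDS = {"password", "welcome", "letmein", "qwerty"}

def validate_credentials(user_id: str, password: str) -> list[str]:
    """Return a list of policy violations; an empty list means the pair passes."""
    problems = []
    if len(password) < 12:  # example threshold, not an FTC figure
        problems.append("password shorter than 12 characters")
    if password.lower() in COMMON_WORDS:
        problems.append("password is a common dictionary word")
    # Forbid using the user ID (or a close variant) as the password.
    if user_id.lower() in password.lower() or password.lower() in user_id.lower():
        problems.append("password is the user ID or a close variant of it")
    if not re.search(r"[A-Za-z]", password) or not re.search(r"\d", password):
        problems.append("password lacks a mix of letters and digits")
    return problems
```

For example, `validate_credentials("jsmith", "jsmith2024extra")` flags the ID-variant rule, while a long mixed passphrase passes cleanly. Enforcement at account-creation time (rather than advisory warnings) is what distinguishes "establish and enforce" from merely publishing a policy.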

Encryption
FTC attention has regularly focused on data encryption. In more than half (27) of the cases requiring privacy or data security programs, the FTC addressed the defendant’s encryption protocols, which should be compatible with industry standards. ValueClick, for example, was cited in 2008 for “using only an insecure form of alphabetic substitution that [was] not consistent with, and less protective than, industry-standard encryption.” Given the demonstrated risk of security breaches, companies should render personal information “unusable, unreadable, or indecipherable” as the FTC noted in its complaint against CBR. Consequently, security programs should contain protocols to ensure the company will:

  • Transmit sensitive [Credit Karma/Fandango] and personal information [TJX], including user credentials [TRENDnet] and financial account and credit card information [Compete/Upromise], securely in either encrypted format [BJ’s] or through cryptographic protocols (TLS or SSL), and
  • Store sensitive and personal information in encrypted format [Genelink/Guess], including information that was encrypted during transmission [Petco] and any personal information on in-store networks [BJ’s], back-up tapes [Nutter], or other portable media devices [CBR].
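As a concrete illustration of "transmit securely through cryptographic protocols," a client can refuse legacy protocol versions and require certificate verification. This sketch uses Python's standard `ssl` module (3.7+); it shows one transport-layer control, not a complete encryption program.

```python
import ssl

# Client-side TLS context: certificate verification and hostname checking
# are on by default; additionally refuse SSLv3/TLS 1.0/TLS 1.1.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# A socket wrapped with this context fails the handshake unless the server
# presents a valid certificate chain and speaks TLS 1.2 or newer, e.g.:
#
#   with socket.create_connection((host, 443)) as sock:
#       with context.wrap_socket(sock, server_hostname=host) as tls:
#           tls.sendall(payload)
```

Storage-side encryption (the Genelink/Guess and BJ's items) would additionally require a vetted cipher library and key management; a home-grown scheme like ValueClick's alphabetic substitution is exactly what the FTC deemed "not consistent with ... industry-standard encryption."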

Limited Access

Another common security issue raised in a number of cases is lack of sufficient control over access to personal information. Access limitations can be instituted at either a network level, for example, by maintaining multiple servers or installing firewalls or similar solutions, or at the employee level, by restricting access to personal information to personnel of specific departments or to individual employees. Limited access policies curb unnecessary security risks and minimize the number and type of network access points that an information security team must monitor for potential violations. A company is therefore advised to:

  • Segment servers [RockYou] so that unauthorized access to one does not compromise the security of all;
  • Apply readily available security measures such as firewalls or isolated payment card systems [Dave & Buster's] to control and monitor access:
    • to a computer network from wireless access points [BJ’s/Dave & Buster's],
    • between computers on a network, including between computers on one in-store network and those on other in-store or corporate networks [DSW], and
    • between computers on the company network and the Internet [Cardsystems/Genica/TJX];
  • Limit employee access [CBR] to and copying of [Accretive Health] personal information based on such employee’s role;
  • Restrict third party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures [Dave & Buster's];
  • Establish an employee login page that is unknown to consumers and separate from the customer/end user login page [Twitter], and
  • Restrict the number of files on which data can be stored [DSW] in order to simplify compliance with data use limitations and deletion procedures [Accretive Health].
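The employee-level restrictions above amount to deny-by-default, role-based access. A minimal sketch, with hypothetical role and resource names chosen for illustration:

```python
# Hypothetical role-to-resource map; a real system would back this with a
# directory service and audit each lookup.
ROLE_PERMISSIONS = {
    "billing": {"payment_records"},
    "support": {"contact_info"},
    "security_team": {"payment_records", "contact_info", "audit_logs"},
}

def can_access(role: str, resource: str) -> bool:
    """Deny by default: unknown roles and unlisted resources get no access."""
    return resource in ROLE_PERMISSIONS.get(role, set())
```

The design choice worth noting is the default: an unrecognized role or resource yields `False`, so new data stores are inaccessible until someone affirmatively grants access based on business need.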

Data Retention and Disposal

The FTC has proscribed companies' continued storage of data after it has served its business purpose. While the FTC does not specify a precise threshold for data retention, it has stated that data should be stored only for as long as it serves a legitimate business need. Companies should also have policies in place for disposing of data once such business needs have been met. Companies should thus:

  • Formalize policies regarding the length of time consumer data will be stored. Data should not be stored indefinitely, but rather for a period of time commensurate with legitimate business needs [Ceridian/Life is Good]. Information for which there is no longer a business need should be destroyed [CBR], and
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not be practicably readable or reconstructible [CVS Caremark/PLS Financial].
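A retention policy becomes enforceable only when something actually sweeps expired data. The sketch below, using only the Python standard library, illustrates the mechanics; the 180-day period is an example, since the FTC ties retention to documented business need rather than a fixed number.

```python
import time
from pathlib import Path

RETENTION_DAYS = 180  # illustrative period; set per documented business need

def sweep_expired(directory: str, retention_days: int = RETENTION_DAYS) -> list[str]:
    """Delete files whose last modification exceeds the retention period.

    Ordinary deletion alone may not render data unreadable on all media;
    for sensitive information, pair this with secure-erase tooling
    appropriate to the storage hardware.
    """
    cutoff = time.time() - retention_days * 86400
    removed = []
    for path in Path(directory).iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed
```

Running such a sweep on a schedule, and logging what it removed, also produces the compliance-monitoring evidence the second bullet calls for.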

Software/Product Review

Companies are responsible for managing the privacy and security of personal information that is processed by the products they develop and use. Accordingly, they are required to instate checks and controls on the products they use as well as throughout the development process of new products and services [Credit Karma/Fandango]. The goal is to ensure that all products and software function according to an organization’s stated privacy and data security policies as well as any applicable industry standards. More specifically, companies should:

  • Implement appropriate checks and controls on the review and testing of software and products intended for internal use [Eli Lilly];
  • Follow well-known, commonly-accepted secure programming practices, including secure practices that are expressly described in the product’s operating system guides for manufacturers and developers [HTC], and
  • Perform security reviews and testing of software and products at key points in the development cycle:
    • Such procedures may take the form of a security architecture review, vulnerability and penetration testing, and reasonable and appropriate code review and testing to verify that security and privacy protections are consistent with user choices and company policy [TRENDnet];
    • Special procedures should exist to test for any excessive collection of personal information, i.e., information that does not serve a legitimate business need [Compete]; unauthorized collection of personal information [Snapchat], and products or software that will override existing, promised, or standard defaults without employing substitute security measures [TRENDnet].
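One concrete form such a pre-release test can take is diffing the fields a product actually transmits against the fields the privacy policy documents. This is a sketch under assumed names; the field list is hypothetical, not drawn from any cited case.

```python
# Fields the (hypothetical) privacy policy says the product collects.
DOCUMENTED_FIELDS = {"email", "display_name", "zip_code"}

def audit_payload(payload: dict) -> set[str]:
    """Return any fields being collected beyond the documented set.

    A non-empty result flags possible excessive or unauthorized collection
    [cf. Compete, Snapchat] and should block release until either the
    policy or the code is corrected.
    """
    return set(payload) - DOCUMENTED_FIELDS
```

Wiring a check like this into the release pipeline turns "special procedures should exist" into a gate that fails the build, rather than a manual review step that can be skipped.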

Service Providers

Service providers who access or handle personal information on behalf of a company could create liability for the company and must therefore be properly managed. For example, in a recent case, CBR, an umbilical cord blood and tissue bank, settled an FTC enforcement action alleging, among other things, failure to adequately supervise a service provider. In that case, the company’s lack of oversight resulted in two security lapses: the retention of a legacy database for which CBR no longer had a legitimate business need, as well as the storage of personal information in that database in unencrypted, vulnerable form.

Requisite oversight of a service provider will vary depending on the service and the sensitivity of the information; but companies should generally:

  • Require by contract that service providers implement and maintain appropriate safeguards for consumers’ personal information [Genelink/Goal Financial];
  • Ensure reasonable oversight of service providers’ security practices and their employees’ handling of personal information [Genelink/Credit Karma]:
    • Adequately verify, through monitoring and assessments, that service providers implement reasonable and appropriate security measures to protect personal information [GMR];
    • Request and review relevant information about a service provider’s security practices, such as its documented information security program or the results of audits and security assessments conducted on their network [GMR].

Risk Assessment

Each time the FTC has mandated that a company implement a comprehensive privacy or data security program, it has required that such program consider “material internal and external risks that could result in the…compromise of personal information and an assessment of the sufficiency of any safeguards in place to control these risks…in each area of relevant operation.” In addition, the FTC expects to see policies in place to ensure that any issues are addressed and corrected. Risks should be continuously assessed and the program adjusted as a result of ongoing monitoring, material changes to company operations, or any other changing circumstances. Risk assessment policies should include provisions obligating the company to:

  • Record and retain system information sufficient to perform security audits and investigations [Microsoft];
  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network [Genelink], online [EPN] or in paper files [Nutter];
  • Assess application or network vulnerabilities, particularly to commonly known or reasonably foreseeable attacks like “cross-site scripting” [Reed Elsevier], to widely known security flaws like “predictable resource location,” or to design flaws like the ability of users to bypass website authentication procedures [Lookout], and
  • Evaluate the likelihood and risks of third party access [Premier Capital Lending]; for example, in some cases it might be sufficient to provide third parties with access to fictitious data sets rather than real personal information or to provide them with access to only certain categories of personal data rather than unrestricted database access [Genelink].

Unauthorized Access/Disclosures

Preventing and addressing incidents of unauthorized access to and disclosures of personal information are critical components of a comprehensive privacy and data security program. References to incident response appear in more than half (24) of the cases in this study. Extrapolating from the inadequacies described in past complaints, a company should:

  • Implement reasonable measures to assess and enforce compliance with established security policies and procedures, such as by scanning networks for and blocking unauthorized downloads of applications [EPN];
  • Implement policies for the prevention of unauthorized access, including standard procedures, such as regularly checking for and installing security patches and critical updates on the company’s network [LifeLock];
  • Implement policies for the detection of unauthorized access, including:
    • installing antivirus or anti-spyware programs on computers [LifeLock],
    • employing an intrusion detection system [Genica],
    • creating a formalized process to address security warnings and intrusion alerts [TJX], and
    • logging network activity [EPN] and reviewing activity on the network [LifeLock];
  • Monitor and filter outbound traffic [Dave & Buster's] and outgoing transmissions [EPN] to identify and block unauthorized disclosures of personal information;
  • Implement procedures for receiving, reviewing, and addressing security vulnerability reports from third parties [Fandango], including researchers, academics, or other members of the public [HTC], and
  • Prepare an incident response plan that is ready to implement immediately upon detection of a security breach [EPN].
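The detection items above (logging network activity and reviewing it) can be as simple as a periodic pass over authentication logs. This sketch assumes a hypothetical log-line format and an illustrative alert threshold; real deployments would feed an intrusion detection system instead.

```python
import re
from collections import Counter

# Hypothetical log format and example threshold, for illustration only.
FAILED_LOGIN = re.compile(r"FAILED LOGIN user=(\S+)")
ALERT_THRESHOLD = 5

def review_log(lines: list[str]) -> list[str]:
    """Return accounts with enough failed logins to warrant a security alert."""
    counts = Counter()
    for line in lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group(1)] += 1
    return [user for user, n in counts.items() if n >= ALERT_THRESHOLD]
```

The point of the incident response plan bullet is that a hit from a review like this must route to a formalized process [TJX], not to an inbox no one reads.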

Employee Training

Designing and implementing an employee training program has increasingly become a standard industry practice and one that the FTC has required in numerous settlements. The employee training program should be designed to assist employees in understanding and managing privacy and data security safeguards. Once a company has a privacy and data security program in place, the employee training program should focus on ensuring compliance with it. According to the FTC, an employee training program should:

  • Educate employees on the company’s privacy and data security policies as well as on security risks [Upromise] relevant to their jobs:
    • Employees, including software/product engineers [HTC] and executives [EPN], should receive training on information security, collection, handling, transport, maintenance and disposal of consumer personal information;
    • Engineering staff should receive appropriate training and oversight for detecting application vulnerabilities and conducting security testing [Tower Records/HTC];
  • Provide adequate training to employees about timely and effective security incident response [Goal Financial].

Interestingly, in one case, the FTC alleged that a company suffered a security breach by “using personal information in training sessions with employees and failing to ensure that the information was removed from employees’ computers following the training” [Accretive Health]. Of course, the training program itself must not compromise a company’s data security.

Conclusion
This study sets forth the contours of a reasonable privacy and data security program based on analysis of 47 FTC enforcement cases. While providing companies with neither a safe harbor from enforcement nor immunity from a privacy or data security breach, such a program will mitigate risk and strengthen a company’s hand in dealing with any adversity. Although the analysis in this study is informal and based on “reverse-engineering” the FTC’s complaints, it offers a potential starting point for defining “reasonable” measures of privacy and data security, thereby helping to clear the fog of uncertainty surrounding compliance with a constantly shifting legal landscape.
