If you haven’t already read the speech that Federal Trade Commission (FTC) Bureau of Consumer Protection (BCP) Director Jessica Rich gave last week at an advertising industry meeting, take the time to do it now. Direct, sharp and concise, it maps out the state of play at the intersection of technology, law and policy, and charts a path for businesses that wish to avoid legal trouble not only with the powerful federal regulator but also with state attorneys general, class-action plaintiffs and European privacy authorities.
Rich states that “privacy has moved from a simple matter of legal compliance, best left to lawyers and IT professionals, to a C-suite issue—part of a broader bottom-line strategy.” This reinforces the IAPP’s message that compliance and security alone are not enough for data governance in today’s digital economy. Rather, to avert public embarrassment and consumer backlash, businesses must adopt ethical review processes and instill issue-spotting skills in employees throughout the organization.
Speaking about the risks of the digital economy, Rich is not apologetic. While tipping her hat to the benefits of online advertising, she states, “targeted advertising raises consumer privacy concerns, plain and simple.” The FTC is mindful of both sides of the value equation. Quoting the agency’s “unfairness” test codified in Section 5(n) of the FTC Act, she states, “That’s legal mumbo jumbo, but the important point is that by law and practice, the FTC weighs market benefits and harms as part of its enforcement and policy work.”
Rich highlights cross-device tracking, device fingerprinting and onboarding (“taking data collected offline and using it online”) as flashpoints for privacy friction. “We are so beyond cookies,” she says, adding, “Privacy policies—if you have the will and ability to find them—are impenetrable.” She also addresses the difficulty of maintaining individual control over data in an era where “most companies that obtain consumer data are behind the scenes and never interact with consumers.” In its report on the Internet of Things, the FTC staff raised similar concerns about delivering notice and choice in an age of chattering devices without a user interface.
With the dizzying pace of technological innovation and the surge of new business models, Rich heralds regulation that is “flexible by design.” Just like tort law, which for centuries has poured meaning into a “reasonableness” standard, Section 5 of the FTC Act, with its prohibition of “unfair or deceptive acts or practices,” lends itself to contextual interpretation that evolves over time. To anyone questioning the clarity of the FTC’s enforcement standards, Rich cites a long line of enforcement cases, saying: “These are just some examples of ways your data practices could go wrong—the things you don’t want to do.” Simple enough. Don’t do what other businesses did to get into trouble. This month, the IAPP provided its membership with a powerful tool, the FTC Casebook, which helps them search, arrange and study more than 180 cases—tagged, indexed, full-text searchable and analyzed. You no longer have an excuse to plead ignorance of the FTC’s body of work in this area.
However, not doing what previous FTC targets did will not automatically buy you a get-out-of-jail-free card. The FTC has not yet pursued every conceivable infraction; circumstances change, and new fact patterns are sure to develop. Even the most detailed legislation leaves much room for interpretation. It is hard to believe that even those businesses complaining about lack of fair notice and due process would favor regulation that addresses every nook and cranny. In other words, notwithstanding a residual degree of uncertainty, businesses are far from clueless about potential FTC enforcement. The writing is on the wall—and now also in the IAPP FTC Casebook, which Rich’s predecessor as BCP chief, David Vladeck, calls “a game changer.”
But knowing what not to do is only part of the job. What are businesses expected to do in order to be proactive and diligent about privacy? While not offering a checklist, Rich lays out in her speech a clear strategy for responsible data practices:
First, as data collection becomes complex and all-encompassing, corporate disclosures should apply to all of its different forms. “Otherwise,” Rich says, “the protections being offered are illusory, applying only to a small percentage of the practices that are actually occurring.” To be meaningful and nondeceptive, the information and choices that businesses provide to their consumers must cover all of their tracking practices, not just a subset.
Second, disclosures can’t include exceptions that swallow the rules. “For example,” Rich notes, “if they purport to limit tracking based on sensitive data, they shouldn’t play games about what ‘sensitive data’ means, such as defining medical data to mean only official medical records.” These words will resonate in a market where non-HIPAA-regulated health information is becoming ever more prevalent through wearable devices, online and mobile tracking and big data analysis.
Third, businesses must make sure that any claims they make about how they collect, use or share data are truthful and complete.

Fourth, Rich reiterates the agency’s calls in its 2012 report, Protecting Consumer Privacy in an Era of Rapid Change, for Privacy by Design. Originally introduced into policy jargon by former Ontario Privacy Commissioner Ann Cavoukian, and sometimes derided by critics as mere media hype, Privacy by Design is the notion of engineering privacy into products and services from the first development stage. For any remaining skeptics, Rich spells out what she means by Privacy by Design. This includes “reasonable data collection and retention limits, de-identification of data where feasible and sound data security and disposal practices.” While recognizing the technical limits of robust de-identification, which have been discussed extensively in the scientific community, Rich continues to emphasize the value of de-identification as a measure for risk mitigation. She explains, “you should bolster sound technical strategies for de-identifying data with a strong commitment and effective policies not to reidentify it. This means that companies should publicly commit not to seek to reidentify the data and should, through contract, require the same of those with whom they share data.”
Fifth, Rich urges companies to be careful about whom they do business with. Through a series of enforcement actions, the FTC has developed a theory of indirect liability, imposing on companies obligations with regard to selecting, contracting with and overseeing service providers that access their data. For example, in the GMR Transcription case, the agency indicates that subcontractors who handle personal information must be explicitly, contractually required to adopt reasonable data security measures. Additionally, companies should not only monitor but also verify that their service providers are adequately protecting personal information. Rather than simply relying on a service provider’s promise, best practices now require proactive measures such as reviewing third-party audit reports about a service provider and examining a service provider’s written data security policies before hiring it and during the course of the relationship. Moreover, Rich warns, “if you buy information from bad actors, or sell or share it with them, you could find yourself embroiled in a law violation.” In short, today, any transactional due diligence process must include a module focused on verifying your business partner’s privacy and data security credentials. Your own compliance is no longer enough.
Finally, Rich recommends that businesses steer clear of marketing based on sensitive data. “Consider avoiding it altogether,” she says, “but, at the very least, provide opt-in.” This common-sense approach should be evident to companies given the growing list of businesses that have been burned for mishandling sensitive information—consider Target (“pregnancy score”), Office Max (“daughter got killed in car crash”) and Uber (“rides of glory”).
Jessica Rich’s speech provides a road map for companies seeking to put in place a strategy for privacy and data security. To operationalize these recommendations, businesses must employ knowledgeable, well-trained privacy and security professionals who can spot issues, coordinate responses and preempt problems through responsible design of new products and services.
Jessica Rich will speak as part of the IAPP Global Privacy Summit’s Conversations in Privacy, March 5 and 6, in Washington, DC.
Search through nearly 200 cases involving privacy and data security to research the FTC’s historical positions. Use the FTC Casebook here.