Editor's Note: From time to time, the IAPP publishes the opinions of leading figures in data privacy and protection. Chris Jay Hoofnagle is one of them. Chris is director of the Berkeley Center for Law & Technology and is an active researcher and voice in the greater privacy community. We see value in the perspectives that come from voices like his, but these opinions do not reflect those of the IAPP. In fact, the IAPP takes no position on these issues. But we do like to put them out there, occasionally, to be disputed, refuted or otherwise discussed; such conversations have the potential to lead to better solutions in the marketplace. Do you have an opinion about the state of self-regulation? Do you have ideas about how to improve this and other privacy-related matters? Let us know.
Why do privacy advocates remain so opposed to self-regulation? At the most basic level, self-regulatory programs suffer from an enduring credibility problem, established by the short-lived Individual Reference Services Group (IRSG) and the languid Network Advertising Initiative (NAI), and continued today in business practices that disregard consumers' expressed preferences. Having been caught using technical measures to undo those preferences, companies act with indignation over civil liability rather than confront the ethics of the practices in question. Some even try to justify anti-consumer actions by arguing that the public policy approaches of privacy laws were illegitimate to begin with. In this atmosphere, privacy advocates are right to argue that self-regulation is not a viable option. If the industry itself is unwilling to recognize the legitimacy of consumer privacy interests, how can it be trusted to protect those interests absent legal obligations?

From the outside, this skepticism may look like simple anti-business bias. That may be true of some advocates, but for others the anxiety surrounds a simple and, in my view, legitimate problem: how can industry self-regulate when it fundamentally rejects the value of consumer privacy? Here are three recent examples to consider:
  • Over the years, consumers have become savvier about Internet use, and a substantial number have been "tossing" their cookies. This obviously affects Web tracking, and so many Web sites and advertisers adopted Flash cookies. These cookies were relatively unknown to consumers, had very poor controls, offered advantages for tracking over standard cookies and did something very tricky: Flash cookies could "respawn," or re-instantiate, ordinary cookies that consumers had deleted (a sketch of this pattern appears after this list). Adopting Flash cookies allowed Web sites to continue tracking in ways even more persistent than normal cookies and to undo consumers' privacy-seeking behaviors.
  • With Internet Explorer 6 (IE6), Microsoft incorporated the Platform for Privacy Preferences (P3P) into its browser. In its default mode, IE6 blocked third-party cookies that lacked a satisfactory P3P "Compact Policy" (CP). Network engineers noticed this and got around it: merely posting a CP, even an invalid one, caused cookies to be accepted (see the second sketch after this list). Lorrie Cranor's team at Carnegie Mellon recently found, "…thousands of sites using identical invalid CPs that had been recommended as workarounds for IE cookie blocking…98% of invalid CPs resulted in cookies remaining unblocked by IE under its default cookie settings. It appears that large numbers of Web sites that use CPs are misrepresenting their privacy practices, thus misleading users and rendering privacy protection tools ineffective."
  • Recently, the California Supreme Court held in Pineda v. Williams-Sonoma that zip codes can qualify as personal information under the state's Song-Beverly Act. Song-Beverly prohibits merely requesting personal information at the point of sale when a consumer pays with a credit card; the California legislature passed it out of concern that consumers who paid with credit were being tricked into providing their home addresses, which could then be reused for marketing or even for stalking by a sales clerk. Pineda and related cases involved retailers who used technical methods to determine the consumer's home address: they simply asked for the zip code and then ran it, together with the cardholder's name, through reverse-lookup databases to identify the consumer (the third sketch after this list illustrates the join). The holding means that companies that collect zip codes at the register when accepting plastic may be violating Song-Beverly.
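To make the Flash-cookie example concrete, here is a minimal sketch of the respawning pattern. It is illustrative only: the historical implementations kept the durable copy in an ActionScript Local Shared Object (the "Flash cookie"), which ordinary cookie controls did not reach; the stand-in store and the function and identifier names below are hypothetical.

```typescript
// Sketch of cookie "respawning" (all names hypothetical). The durable
// copy historically lived in a Flash Local Shared Object (LSO); a
// stand-in map represents that storage, which sat outside the
// browser's cookie controls.

// Read a cookie value from document.cookie, or null if absent.
function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : null;
}

// Write a long-lived cookie.
function writeCookie(name: string, value: string): void {
  const tenYears = 10 * 365 * 24 * 60 * 60; // seconds
  document.cookie = `${name}=${encodeURIComponent(value)}; max-age=${tenYears}; path=/`;
}

// Stand-in for the Flash LSO.
const shadowStore = new Map<string, string>();

// On each page load: if the user deleted the tracking cookie, silently
// restore ("respawn") it from the shadow copy, defeating the deletion.
function respawnTrackingId(): string {
  let id = readCookie("uid");
  if (id === null && shadowStore.has("uid")) {
    id = shadowStore.get("uid")!; // the user's deletion is undone here
    writeCookie("uid", id);
  }
  if (id === null) {
    id = crypto.randomUUID(); // first visit: mint a fresh identifier
    writeCookie("uid", id);
  }
  shadowStore.set("uid", id); // keep the shadow copy in sync
  return id;
}
```

The point of the pattern is visible in the middle branch: the user's explicit act of deleting the cookie is treated as a bug to be repaired rather than a preference to be honored.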
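The P3P workaround is just as simple to picture. Under IE6's default settings, emitting a CP header, valid or not, was enough to unblock third-party cookies, because the browser did not check the tokens against the site's actual practices. A minimal sketch, assuming a hypothetical Node/Express-style tracking server; the header value here is deliberately nonsensical to make the point:

```typescript
// Sketch of the Compact Policy workaround (hypothetical server).
import express from "express";

const app = express();

app.use((req, res, next) => {
  // An invalid "policy." Per the Carnegie Mellon study quoted above,
  // 98% of invalid CPs still left cookies unblocked under IE's default
  // settings; the header's mere presence did the work.
  res.setHeader("P3P", 'CP="THIS IS NOT A VALID COMPACT POLICY"');
  next();
});

app.get("/pixel.gif", (req, res) => {
  // The third-party tracking cookie that the CP header unblocks.
  res.cookie("uid", "12345", { maxAge: 10 * 365 * 24 * 60 * 60 * 1000 });
  res.type("gif").end();
});

app.listen(8080);
```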
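Finally, the Pineda trick reduces to a database join. The retailer never asks for an address; it captures the cardholder's name from the card, asks only for a zip code and lets a reverse-lookup database do the rest. A minimal sketch with hypothetical data and names (real implementations relied on commercial reverse-lookup products):

```typescript
// Sketch of a reverse zip-code lookup (all data hypothetical).
interface DirectoryRecord {
  name: string;
  zip: string;
  homeAddress: string;
}

// Stand-in for a commercial reverse-lookup database.
const directory: DirectoryRecord[] = [
  { name: "J. Smith", zip: "92101", homeAddress: "123 Example St, San Diego, CA" },
];

// The retailer joins two innocuous-seeming fields, name (from the
// credit card) and zip code (from the clerk's question), to recover
// the home address the law meant to shield.
function reverseLookup(cardholderName: string, zip: string): string | null {
  const hit = directory.find(r => r.name === cardholderName && r.zip === zip);
  return hit ? hit.homeAddress : null;
}

console.log(reverseLookup("J. Smith", "92101"));
// -> "123 Example St, San Diego, CA", now available for marketing mail
```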
In all three of these examples, the relevant industries used technical measures to circumvent consumer privacy. These actions speak to a profound disrespect for consumer preferences and for the law. They suggest that the actors involved, at core, do not respect consumer privacy.

The industry's reaction to getting caught taking these technical measures is instructive, too. With regard to the Pineda example, the reaction has reached a new low. Consider the four reactions to Pineda featured in a recent Privacy Advisor. First, a DMA official opined that zip codes were not sensitive. This ignores the very point of Pineda: the use of trickery to convert seemingly non-personal data into the consumer's home address. In fact, Acxiom markets a product to accomplish this linkage, and it is explicitly promoted as a tool to identify consumers without their realizing the privacy implications of providing the zip code. Second, a lawyer commenting on Pineda in the same article argued, "...the Song-Beverly Act was passed in order to protect consumers from dumpster-diving criminals aiming for carbon copies of credit card slips, which often contained personally identifiable information--such as phone numbers, for example--in addition to the customer's credit card number." This is perfectly true, but irrelevant. Much of Song-Beverly does concern security, but the section at issue in Pineda was not about security; it was about privacy. It was passed to stop the very practice that Williams-Sonoma was engaged in: requesting information about credit card users and adding it to marketing databases. Third, a commenter objected to the public policy approach of Song-Beverly, arguing that attempting to define "personal information" is the wrong approach because technology will always make it easier to link data. This commenter invoked a "freedom to observe" that apparently justifies using technical means to undo California's goal of shielding consumers at the point of sale. Finally, a prominent lawyer bemoaned the return of "the privacy class action."

What is missing from these responses is a serious engagement with the ethics of reverse zip-code lookups. Instead, the industry has reacted to getting caught circumventing a privacy law by spreading confusion about the practices involved, by misstating the purpose of the law, by rejecting the public policy approach of the law and then by bemoaning the consequences of violating the law. This response does not engender trust or credibility. Elsewhere, I have argued that the industry's response is akin to the law of the jungle: a regime in which principle is trumped by might, here, technological might. In this world, law has no moral value. This worldview rejects the idea that we as a society can come to an agreement about the bounds of commercial behavior.

In other areas where self-regulation is successful, the organizations entrusted with setting standards fundamentally believe in the underlying public interest of the rules. Compare the privacy world, where self-regulatory organizations are unwilling to recognize privacy as a legitimate interest. For instance, the recently proposed European Advertising Standards Alliance best practice recommendations for behavioral advertising do not even invoke "privacy" as a policy goal.

The atmosphere created by this response undermines credibility in self-regulatory debates. It is so corrosive of trust that it makes it difficult for advocates to bargain in good faith. Put yourself in the shoes of an actor bargaining with an opponent that fundamentally does not respect the values you are attempting to protect. Given this course of conduct, how could you trust them to honor their word?
