Perspective: Self-regulation’s credibility problem


Contributors:
Chris Hoofnagle
Editor's Note: From time to time, the IAPP publishes the opinions of leading figures in data privacy and protection. Chris Jay Hoofnagle is one of them. Chris is director of the Berkeley Center for Law & Technology and an active researcher and voice in the greater privacy community. We see value in the perspectives that come from voices like his, but these opinions do not reflect those of the IAPP. In fact, the IAPP takes no opinion on these issues. But we do like to put them out there, occasionally, to be disputed, refuted or otherwise discussed; such conversations have the potential to lead to better solutions in the marketplace. Do you have an opinion about the state of self-regulation? Do you have ideas about how to improve this and other privacy-related matters? Let us know.

Why do privacy advocates remain so opposed to self-regulation? Self-regulatory programs suffer from an enduring credibility problem, established by the short-lived Individual Reference Services Group (IRSG) and the languid Network Advertising Initiative (NAI), and continued today in the form of business practices that express disregard for consumers' expressed preferences. Having been caught using technical measures to undo consumers' preferences, these companies act with indignation over civil liability rather than confront the ethics of the practices in question. Some even try to justify companies' anti-consumer actions by arguing that the public policy approaches of privacy laws were illegitimate to begin with. In this atmosphere, privacy advocates are right to argue that self-regulation is not a viable option. If the industry itself is unwilling to recognize the legitimacy of consumer privacy interests, how can it be trusted to protect those interests absent legal obligations?

Why do privacy advocates remain so skeptical of self-regulation? At the most basic level, self-regulatory initiatives suffer from a deep lack of credibility among advocates. From the outside, this skepticism may appear to be some sort of anti-business bias.
That may be the case among some, but for others the concern centers on a simple and (in my view) legitimate problem: How can industry self-regulate when it fundamentally rejects the value of consumer privacy? Here are three recent examples to consider:
- Over the years, consumers have become savvier about Internet use, and a substantial number have been "tossing" their cookies. This obviously affects Web tracking, and so many Web sites and advertisers adopted Flash cookies. These cookies were relatively unknown to consumers, had very poor controls, offered advantages for tracking over standard cookies and did something very tricky: Flash cookies could "respawn," or re-instantiate, ordinary cookies that consumers had deleted. Adopting Flash cookies allowed Web sites to continue tracking in ways even more persistent than normal cookies and to undo consumers' privacy-seeking behaviors.
- With Internet Explorer 6 (MSIE), Microsoft incorporated the Platform for Privacy Preferences (P3P) into its browser. In its default mode, MSIE blocked third-party cookies. Network engineers noticed this and got around it by posting a P3P "Compact Policy" (CP). Merely posting a policy, even an invalid one, caused cookies to be installed. Lorrie Cranor's team at Carnegie Mellon recently found, "…thousands of sites using identical invalid CPs that had been recommended as workarounds for IE cookie blocking…98% of invalid CPs resulted in cookies remaining unblocked by IE under its default cookie settings. It appears that large numbers of Web sites that use CPs are misrepresenting their privacy practices, thus misleading users and rendering privacy protection tools ineffective."
- Recently, the California Supreme Court held in Pineda v. Williams-Sonoma that ZIP codes can qualify as personal information under the state's Song-Beverly Act. This means that companies that collect the ZIP code at the register when accepting plastic may be violating Song-Beverly, which prohibits merely requesting personal information at the point of sale when the consumer pays with a card. The California legislature passed the law out of concern that consumers who used credit were being tricked into providing their home addresses, which could then be reused for marketing, or used by a sales clerk for stalking. Pineda and related cases involved retailers who used technical methods to determine the consumer's home address: they simply asked for the ZIP code and then used reverse-lookup databases to identify the consumer.
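The respawning trick in the first example above can be sketched in a few lines. This is a toy model, not any site's actual code: the browser cookie jar and the Flash local shared object (LSO) store are stood in for by plain dicts, and the function name and "tracking_id" key are hypothetical.

```python
# Illustrative sketch of Flash-cookie "respawning" (all names hypothetical).
# A real site would read the LSO via an embedded Flash movie; here both
# stores are modeled as dicts so the mechanism is visible.

def respawn_tracking_id(http_cookies, flash_lso):
    """If the user deleted the HTTP tracking cookie but the Flash LSO
    survived, copy the ID back into the cookie jar, silently undoing
    the user's deletion."""
    if "tracking_id" not in http_cookies and "tracking_id" in flash_lso:
        http_cookies["tracking_id"] = flash_lso["tracking_id"]  # respawn
    return http_cookies

cookies = {}                        # the user "tossed" their cookies
lso = {"tracking_id": "user-1234"}  # but the Flash store was untouched
respawn_tracking_id(cookies, lso)
print(cookies)  # the deleted tracking cookie is back
```

The point of the sketch is the asymmetry: the user's deletion acted on only one of two stores, so the site could treat the surviving store as authoritative.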
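The CP loophole in the second example can likewise be modeled. MSIE's real compact-policy evaluation parses CP tokens against satisfactory and unsatisfactory categories; this deliberately simplified sketch captures only the failure mode the Carnegie Mellon study describes, in which the mere presence of a CP header, valid or not, unblocked a third-party cookie under the default setting. The function name and the sample header string are hypothetical.

```python
# Toy model of the P3P compact-policy loophole (names hypothetical).
# Under MSIE's default setting, a third-party cookie with *no* CP was
# blocked, while posting any CP header -- even one made of invalid
# tokens -- let the cookie through.

INVALID_CP = 'CP="NOT A REAL POLICY"'  # the kind of workaround seen in the wild

def ie6_default_blocks(third_party, p3p_cp_header):
    """Return True if the cookie would be blocked in this simplified model."""
    if not third_party:
        return False                 # first-party cookies pass regardless
    return p3p_cp_header is None     # any CP header, valid or not, unblocks

print(ie6_default_blocks(True, None))        # blocked: no policy at all
print(ie6_default_blocks(True, INVALID_CP))  # unblocked: invalid policy suffices
```

This is why the study's finding stings: the protection could be defeated not by making a false but well-formed claim, but by posting gibberish.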
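The Pineda practice in the third example reduces to a database join. In this sketch the commercial reverse-lookup database is a stand-in dict, and the names, keys and sample record are all invented for illustration.

```python
# Sketch of the Pineda-style reverse lookup (all data hypothetical).
# The retailer already has the cardholder's name from the card; adding
# the ZIP code at the register is enough to key into a commercial
# marketing database and recover a home address.

MARKETING_DB = {
    ("JANE DOE", "94704"): "123 Hypothetical Ave, Berkeley, CA",
}

def lookup_address(cardholder_name, zip_code):
    """Return the customer's home address if the broker database has a
    match -- without ever asking the customer for an address."""
    return MARKETING_DB.get((cardholder_name.upper(), zip_code))

print(lookup_address("Jane Doe", "94704"))
```

The sketch shows why the court treated the ZIP code as personal information: combined with data the merchant already holds, it is the missing key to the consumer's address.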



