
Privacy Perspectives | So Glad You Didn’t Say That! A Response to Viktor Mayer-Schönberger



In response to my comments on an IAPP story, “Forget Notice and Choice, Let’s Regulate Use,” Viktor Mayer-Schönberger distances himself from views attributed to him by the IAPP and positions taken in an earlier whitepaper.

My first thought when reading Mayer-Schönberger’s response was, “I’m so glad he didn’t mean that!” In sum, Mayer-Schönberger assures me that our views are aligned on the following: individuals have an interest in privacy protection; privacy should be anchored in the OECD Fair Information Practice Principles; the public should have control over their personal information; and privacy does not impede innovation. Allow me to assure all of you that in addition to the IAPP story, I have indeed viewed the video of Mayer-Schönberger’s Brussels keynote and have read the two papers he referenced.

Mayer-Schönberger reaffirms the importance of privacy as a value, while suggesting that to be effective, the mechanisms to ensure privacy must be changed. The answer he offers is shifting the focus away from “consent” to “use” because, according to Mayer-Schönberger as stated in his keynote, “data protection should not rely on an individual’s ability to comprehend what is going on exactly with his or her data and take actions.”

I too am in favour of systems which take the burden away from individuals to protect their own privacy. In today’s mobile world, individuals should not be expected to negotiate overly long and complex user agreements. However, my approach has been to develop the concept of Privacy by Design (PbD), in which organizations are urged to build in privacy measures, right from the outset, so that the individual’s privacy is protected by default. But Privacy by Design was not referenced in the various papers Mayer-Schönberger simultaneously references and distances himself from. I found this somewhat surprising given PbD’s prominence on the global stage, having been unanimously passed as an international framework for privacy in 2010, now translated into 35 languages, included in the draft EU Data Protection Regulation and referenced by the U.S. FTC as forming an essential component of its privacy program.

The changes to privacy protection proposed in the papers cited by Mayer-Schönberger, which are consistent with his keynote, include removing purpose specification and leaving the decision to obtain consent to the discretion of the organization. Determining which secondary uses of the data are acceptable would be left up to the company involved. With due respect, since the OECD principles are interrelated—and were reaffirmed in July 2013—removing such fundamental concepts as purpose specification and use limitation would unhinge the rest of the principles—at that point, one could no longer say the approach was anchored in the current privacy framework. If there is no purpose specification, you cannot ensure openness and accountability to the data subject.

How could such changes not weaken Fair Information Practices?

Mayer-Schönberger suggests in his keynote that consent and purpose specification be replaced with an accountability model in which legal restrictions and regulatory oversight, rather than individual consent, regulate the use of personally identifiable information. This is the antithesis of PbD, in terms of allowing privacy harms to develop and then, after the fact, offering systems of redress. In this day and age, this is too little, too late. I am of course in favour of responsible data use and accountability, but not of eliminating the data subject from the picture, in terms of making the necessary determinations relating to the uses of their personally identifiable information. Also, speaking on behalf of regulators who endeavor to pursue the cases that come before them vigorously, our offices and resources are already stretched to the limit, with no additional resources being allocated for such enforcement. And with the massive growth in online connectivity and ubiquitous computing, we would barely see the tip of the iceberg of the privacy infractions that would arise.

The Seven Foundational Principles of Privacy by Design build upon and raise the bar of Fair Information Practices. We did this by adding elements of proactive privacy protections—embedding privacy into information technologies, business practices and networked infrastructure. An ancillary benefit of employing PbD is that companies can experience a “Privacy Payoff”—gaining a sustainable, competitive advantage. By strongly de-identifying one’s personal data holdings, companies will face far less liability should they experience a breach due to a rogue employee or hacking incident, not to mention the harm involved in cases of identity theft (think Target).

The suggestion to downplay consent and purpose specification is in stark contrast to a growing movement among the private sector (see the Personal Data Ecosystem Consortium) that believes providing individuals with ultimate control over their personal information is the most commercially advantageous scenario for the future. The World Economic Forum has encouraged consensus regarding the rules for obtaining individuals’ permissioned flow of data in different contexts, specifically citing PbD (see “Unlocking the Value of Personal Data: From Collection to Usage”). There is also nothing in this approach that would prevent a Personal Data Ecosystem company from enabling Big Data analytics. Privacy by Design is inherently positive-sum, not zero-sum. For more, see our paper “Big Privacy: Bridging Big Data and the Personal Data Ecosystem Through Privacy by Design.” Big Data will require Big Privacy, enabling both to flourish.

I also take issue with Mayer-Schönberger’s statement in his keynote that purpose specification is “crippling Big Data innovation” and suggest that he explore some of the methods available for de-identifying large data sets for Big Data purposes (see the excellent work of Khaled El Emam, for example, in designing strong de-identification tools). Just as Big Data algorithms have grown in sophistication, so too has our ability to de-identify, encrypt, obfuscate, aggregate, introduce noise, etc., so that the data may be reused in a positive-sum manner for many of the altruistic purposes in healthcare and education that Mayer-Schönberger references. This too is consistent with applying a PbD framework—what is needed is Big Data and Big Privacy.
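To make the de-identification techniques mentioned above a little more concrete, here is a minimal toy sketch in Python. It is not drawn from the article or from El Emam's tools; the records, field names and parameters are all hypothetical. It illustrates two common steps: generalizing quasi-identifiers (bucketing ages, truncating ZIP codes) and adding Laplace noise to an aggregate count, in the style of differential privacy.

```python
import random

# Hypothetical records: (age, zip_code, diagnosis).
records = [
    (34, "90210", "flu"),
    (36, "90211", "flu"),
    (52, "10001", "asthma"),
    (58, "10002", "asthma"),
]

def generalize(record):
    """Coarsen quasi-identifiers: bucket age into a decade, truncate the ZIP."""
    age, zip_code, diagnosis = record
    decade = (age // 10) * 10
    return (f"{decade}-{decade + 9}", zip_code[:3] + "**", diagnosis)

def noisy_count(true_count, epsilon=1.0):
    """Add Laplace noise to an aggregate count.

    The difference of two exponential draws follows a Laplace distribution,
    so only the standard library is needed.
    """
    scale = 1.0 / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

deidentified = [generalize(r) for r in records]
flu_count = sum(1 for r in deidentified if r[2] == "flu")
print(deidentified)
print(noisy_count(flu_count))
```

The point of the sketch is that coarsened records still support useful aggregate analytics (the flu count survives generalization), which is the positive-sum claim being made: de-identification need not preclude Big Data analysis.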

While I am sure that Mayer-Schönberger and I share similar views on the value of privacy, it is nonetheless important to voice other perspectives, such as Privacy by Design, that are not currently reflected in his view of how the OECD principles should be revised. I look forward to doing so in an upcoming paper with Berlin Data Protection and Freedom of Information Commissioner Alexander Dix, and University of Ottawa Professor Khaled El Emam. Stay tuned, and definitely join in!

PS: This matter is of such great importance that we decided to hold a live webinar on Friday, January 24 at 9:00 a.m. EST with Commissioner Alexander Dix, Professor Khaled El Emam, CDT President Nuala O’Connor, CIPP/US, and myself. I hope you will join us! Please click on the above link for additional details.

1 Comment


  • Christopher Vera • Jan 17, 2014
    (my comments are my own and do not necessarily reflect that of my employer)
    
    The Dr. makes great points here. The government and special interests could potentially improve my lifestyle with all kinds of great recommendations if they could kick down my door and inspect my home once per month. That doesn't make it right or good. While "big data" removes the need to kick down my door, the data it collects about me is much the same, and so, therefore, is the risk. Do warrants stifle police efficiency (i.e., law enforcement innovation)? Yes. And there's a reason for that: intrusion simply for the sake of innovation may lead to unintended consequences that could limit or destroy future innovation. Purpose specification and data minimization are mainstays of privacy because they protect the individual's privacy with as little impact to the individual as possible. If you don't collect it, I don't have to worry about protecting it. If I WANT the product or service being offered, I will give you my consent to collect the necessary data about me to make it happen.