
Privacy Perspectives | More Privacy Paternalism: “We Know What’s Best for You”




Scott Charney failed to mention that many of his arguments, which would result in weakening privacy protections, have already been addressed and do not enjoy widespread support. While he indicates that his suggestions will “enhance” data protection for consumers, the opposite will most likely occur. They reflect a paternalistic approach to data protection that, if implemented, will weaken rather than strengthen privacy in the 21st century. The notion that privacy can only pose a negative challenge to new technological trends reflects a dated, zero-sum perspective that has been proven wrong on countless occasions.

Charney raises the point that the Fair Information Practice Principles (FIPPs) were developed “decades ago,” including the OECD Guidelines of 1980, and thus, do not reflect today’s evolving technological and business landscape. He fails to mention that a revised set of OECD guidelines was published in July 2013, based on a comprehensive multiyear review, in which all eight of the original OECD principles were left intact and unchanged—with the exception of using gender-neutral language. Thus, the argument that the OECD principles, and the data privacy laws on which they are based, are overly restrictive and suppress innovation does not appear to be shared by the 34 member states who participated in the review process.

Few of us, myself included, expect individuals to navigate their way through dense and lengthy privacy notices in order to understand how to protect their privacy. However, as stated by the Article 29 Working Party of the European Commission (WP29) in its opinion on purpose limitation, “When we share personal data with others, we usually have an expectation about the purposes for which the data will be used.” It is disturbing that Charney would argue that current practices that fail to provide notice to individuals, such as a bank checking a person’s Facebook profile, should constitute a reason to relax the requirement that individuals have control over how their personal information is used.

I fully agree that greater accountability for the uses of personal data is critical. However, the argument that increased organizational accountability can substitute for privacy essentials is deeply flawed—the two are not mutually exclusive. Removing those essentials strips the individual of the role of empowered participant, able to keep a check on the uses and misuses of their personal data, should they choose to do so. It is individuals, not regulators, who are best positioned to hold data processors accountable, in a way that no law, regulation or oversight authority could ever do.

Big Data analytics, the Internet of Things and cloud computing are all trends that may yield remarkable new correlations, insights and benefits for society at large. While there is no intention to have privacy stand in the way of progress, it is essential that future trends be shaped in ways that are truly constructive, enabling both privacy and advances in technology to develop, in tandem—not one at the expense of the other.

Privacy is often positioned in a zero-sum manner—as having to compete with other legitimate interests, design objectives and technical capabilities, in a given domain. However, when using Privacy by Design to embed privacy and data protection into a given information technology, operational process or business practice, it can be done in a positive-sum, win/win manner that delivers full functionality: All legitimate interests may be accommodated and requirements optimized.

For more information, I invite you to read my formal rebuttal to Mr. Charney and the authors of “Data Protection Principles for the 21st Century and Data Use and Global Impact” in “The Unintended Consequences of Privacy Paternalism.”

1 Comment


  • Richard Beaumont • Apr 24, 2014
    I agree with Ann Cavoukian that consent and purpose limitation should not be traded for 'greater accountability'. Better privacy comes when such accountability builds on the consent-based model.
    I would suggest that if we are to accept change, the first step is for organisations to prove to consumers that they can be trusted. What we need is greater transparency.
    There is often a lot of corporate talk about greater engagement with consumers - and this is one of the drivers and potential benefits of big data. In reality, though, this is often a euphemism for collecting more data, often surreptitiously. Instead, what should be happening is greater engagement about issues of privacy - and about the use of data in new and beneficial ways.
    Obtaining consent for the use of data should not be seen as a barrier - it is an opportunity to engage and convince people of the benefits of new data collection and use. Unfortunately for some, this will mean having to accept that lack of consent is not always apathy; it may be because people do not want the engagement being offered.