
Privacy Perspectives | Old School Privacy is Dead, But Don’t Go Privacy Crazy



When I have the occasion to drive the kids to school, our music selections range almost as widely as our breakfast choices—some Christian, some country and some 80s, to which I alone know the lyrics. Recently, a particularly funny, somewhat concerning country song, “Redneck Crazy” by Tyler Farr, caught my attention. The song includes the following line, “You done broke the wrong heart baby ... drove me redneck crazy.”

I find these lyrics useful in two discrete circumstances. First, my daughters are 17, and I find great joy in potential male suitors seeing me as “redneck crazy.”  I plan to download the song and play it on an endless loop whenever they’re around. Trust me on this. If a 17-year-old boy observes a somewhat unstable dad endlessly playing this song, it’s as good as cleaning the shotgun at the kitchen table.

The second circumstance is perhaps more relevant for this audience. It strikes me that this is how I see many people react in response to those who would dare to collect and use their personal data: “You done used the wrong data baby, drove me PRIVACY crazy.”  And like the 17-year-old boys at my door, I’m a bit wide-eyed looking at the “privacy crazies” and wondering if I should step into the room, or run fleeing, leaving the innovators at the threshold.  But we who can occasionally go privacy crazy need to face something: Privacy is dead.


There is nothing left to debate. Our old-school privacy, as we’ve known it for decades, is dead and buried.

But, there’s good news. Even great news. Some things change for the better. I never watch “live” television anymore and no one in our generation can likely tell you the names of the nightly news anchors. But I watch Netflix and ESPN3 (mobile version) and HBOgo, and I love it. Same with the VCR. I haven’t “video cassette-recorded” for a decade. But boy do I love our whole-house DVR. IU basketball on the DVR is amazing.

These were great changes. And that’s what I believe to be true about privacy. It’s changing for the better.

If your notion of privacy is defined by your personal control over all of the data about you, well, you’re privacy crazy and I have tragic news: That privacy is lost. And no amount of EU directives, HIPAAs or other central, user control-based regulations will change this fact. These types of regulations that default to all “use” of data as being impermissible unless authorized by the individual are trying to protect a version of privacy that no one really wants—not if we have to go back to using VCRs and flip phones. That brand of privacy isn’t only culturally undesirable and impossible to maintain; the regulations that attempt to enable it are unethical.

The central control regulations force individuals to exercise every precaution (which we don’t, because we feel like we should be able to trust reputable companies), scrutinize every privacy notice (no one reads privacy notices, not even those among us who draft them) and then only provide our consent either through our actions or inactions in circumstances we deem wise given the precautions (we haven’t taken) and the content of the notices (we haven’t read)—and failing to make an intelligent decision, we forfeit our privacy. That’s as absurd as requiring my teens to use a cassette tape in the VCR to record the latest American Idol episode. To use my kids’ lingo, that’s sketch.

And the answer isn’t clearer, layered, surveyed, animated or chip-embedded notices. Society and culture have moved on, and as privacy experts we better at least catch up to the tail end. Stop the notice-requiring legislation and regulations. We don’t read the notices, and you can’t make us. So don’t pass more regulations that force us to read notices and give consents we won’t understand.

But, the good news is that, like TV and VCRs, our parents’ brand of privacy is being replaced by a better, more sustainable and meaningful privacy. This new and improved privacy is privacy that really matters. It’s a privacy that doesn’t depend on security breach fear to get internal buy-in and budget because its value is not defined by the latest data security breach. This is privacy that enables innovation. Privacy that, if it’s violated, actually results in real harm to us, harm we can demonstrate.


This new generation of privacy is just beginning to get defined. And none too soon. With the “Internet of Things,” Big Data and constant innovation, the old privacy never had a chance. I don’t read notices on the apps I download; I can’t imagine trying to read them on the grocery aisles I walk down, the cars I drive, the doors I open, the thermostat I set or the pills I swallow. These innovations will bring wonderful convenience, efficiency and quality to our lives, and these uses will not harm our privacy, because we will define appropriate uses of the data.

And while we don’t think any protections exist, the marketplace is a powerful force. Bad ideas get sniffed out, panned and abandoned. It happens all the time. We simply need to understand and harness the market forces and translate them into reasonable use standards. Society, in its common culture and through its market actions, understands that—just as it understands the difference between annoying data collection by marketers and serious data collection by the NSA.

But now, as we start down this new privacy road, we need the deep privacy experts to step through the door, face the old-school privacy crazy, and take the innovators by the hand. These privacy experts among us need to help the innovators understand their obligations relative to real privacy and data security, show them how to use data responsibly and enable them to innovate with that in mind. We also need to step into the policy fray and help define appropriate and responsible use. Help define real harm. THIS is going to be fun. This is a worthwhile pursuit. This can help save lives and drive economies.

It won’t be all country music and 80s rock, however. We will make mistakes.

That’s the caution we need to bear in mind. As we define what we really care about in privacy-appropriate use while defining the harms that really should drive our policies, we need to make sure we get it right. To do this, we need full inclusion of all of the stakeholders—not just those who scream the loudest. In healthcare, that means including the patients and patient advocates, the innovators, researchers, healthcare providers, industry folks and those who care about and understand privacy. We don’t just have cassette tapes and network news anchors to lose. We have economies, innovation, real new generation privacy and, ultimately, our way of life at stake. This debate is real, and this opportunity is why we’re in this profession.



  • comment Bill Wells • Jan 21, 2014
    Define "harm we can demonstrate," please. Does it, for example, include the mental anguish caused by knowing someone has access to information about me and the ability to use it without my consent? Sure, I can retaliate via legal means, if I have the money, the time, and the mental fortitude required to pursue it, but why should this be required of me at all? The measure of privacy should not be based on the amount of harm that is demonstrable, but on the fact that harm of any sort is reasonably possible. Quantification of that harm can only, in practical terms, be determined by the person(s) harmed. Making this determination--the act of being required to actually quantify it--is actually a further form of harm that is perpetrated. 
    Please don't suggest "my parents' privacy" is dead, lest you become simply a herald of this self-fulfilling prophecy. It will only die if such rhetoric is allowed to survive unchallenged and allowed to proliferate.
    You have a right to your opinion, but you need to be clear with readers that this is really just your ONE opinion and that other working privacy professionals have differing views.
  • comment Damon Greer • Jan 21, 2014
    Thanks for your insight, Stan.  What you wrote needed to be aired and hopefully understood.  There indeed are compelling reasons for the public, private experts, and innovators to work together to develop those reasonable use standards that encourage trust and confidence among all parties.  Technological advances demand that everyone works toward common goals.  I know of no one who reads short notices, layered notices, bank privacy policy statements, or even their physicians' privacy practices statement.  I think "our parents' privacy is dead" because it was born in an era when life was much different than in the last thirty years, and data has exploded.  The Europeans are wont to label the rise of big data as a digital tsunami; I would suggest it is a digital opportunity to strengthen joint efforts to reasonably protect and use data.  By the time the European Data Protection Regulation is approved, Google's digital contact lens will be in wide use diagnosing glaucoma around the world and transferring the results to eye specialists...well anywhere.
  • comment Caitlin Pencarrick Hertzman • Jan 21, 2014
    Although I agree with you that no one reads notices, and I agree that a new and technological solution is needed to keep pace with the current corporate and privacy environments, I disagree that the solution rests with the corporations. No, not all bad ideas are panned. Corporations do not have the best wishes of their clients in mind; profit is their sole motivator. I see the EFF and other non-governmental organisations as being more capable of innovation when it comes to the protection of our personal information. Let's hope they continue to be funded and can work unabated on the issue!
  • comment Malcolm Crompton • Jan 21, 2014
    Stan - I think you have pretty much nailed it with your description of just how busted the current privacy models are.
    In terms of what we do next, as you note, we need the robust and informed debate.  Here are two things that need further consideration at an absolute minimum:
    1. The US legal system has made a mess of the concept of 'harm'.  It has set such a high bar that the vast majority of us on this planet who live outside the US consider that the test of 'harm' so applied is irrelevant.  The concept has to include some much softer concepts or it is irrelevant.  The old saying 'knowledge is power' has stood the test of time for good reason: it is the ability to exercise it, not just the actual exercise of it, that can be a harm as Winston Smith understood so clearly in 1984.  That doesn't mean 'all possible adverse impacts in any circumstance' but the debate is being held up because we are not having a sufficiently mature discussion on where the boundary line should be.
    2. Market forces are imperfect as judged by most mature communities worldwide.  The argument that all are equal in the market place is a fallacy.  That is why we have anti-discrimination laws, consumer protection laws and more.  It is why we have sound processes to make sure that only appropriately trained individuals with an appropriate outlook on life can undertake certain functions in society such as doctors, nurses and airline pilots:  most of us can never know enough to sort the good from the bad so we have other processes in place other than the market to provide protection.  Again, it is a delicate balance as to how much legal framework is needed, but most societies have agreed that none is insufficient.
    Roll on the debate - we need it!
    Malcolm Crompton CIPP
    Managing Director, Information Integrity Solutions Pty Ltd
    Privacy Commissioner of Australia 1999-2004
  • comment Carrie Kerskie • Jan 21, 2014
    As someone who has worked with hundreds of identity theft victims over the years, I can say that having an expectation of privacy is not merely old-fashioned thinking.  The problem is that privacy policies are nothing but drawn-out legalese.  So much so, in fact, that many refuse to read them—including you, as you stated in your article. Simply throwing in the towel and saying it's the way of the world is not the solution.  The solution is to keep it simple.  If a company is storing, tracking, selling, etc. customer data, then the company should be required to list what data is being collected and how it will be used.  This does not mean listing it in a complex multi-page document filled with legalese, but in a simple format such as "here is what we collect and here is how we use it" (check the appropriate box: store, share, track, sell, etc.).  That's it!  One simple box instead of page after page of boring, complicated legalese. Consumers will be informed and corporations will be providing notice.  Problem solved.
  • comment Stan Crosley • Jan 21, 2014
    Thanks for the comments, all.  Bill, I'm not suggesting it is dead, I'm observing that it is - which is of course, my opinion!  But the criticisms you level at a use and harm-based model are even worse with the existing notice/consent paradigm.  You still have to identify the harm with either -- but with notice-based privacy, if you don't read the notice closely, the harmful use or access you described could be included as a permissible use and you'd have no chance of a claim.  I'm suggesting a cross-functional task force (not just companies, Caitlin - I agree that we need more than just companies participating) to help us understand appropriate use.  In health research, we have the institutional review boards.  This is an interesting model we should explore for healthcare -- not the use of IRBs, but use of experts in the issues to help assess use and harm.
    Malcolm and Damon -- brilliant as usual.  I agree with most of your premise Malcolm, although the harm model in the US is swinging back around to favor privacy plaintiffs.  At least more than was true even a few years ago.  I don't believe we're that far apart. But we do have to be careful of allowing the anecdotal to drive the regulations created by the legislature rather than the liabilities determined by courts. Over-regulation in health data kills people.  The NIH reports are clear that information errors kill nearly 100,000 every year.  
  • comment Fábio Freitas de Sousa • Jan 21, 2014
    I agree that the concept of Privacy, as it was understood by past generations, has changed dramatically with technology and innovation. And because of the constant and fast innovations and the consequent shifts in perception of what constitutes "privacy", it is difficult to define it as it is today, let alone try to figure out how to protect it through laws and regulations for the present and the future.
    And although innovation brings "convenience, efficiency and quality to our lives" it can also create the need to protect its beneficiaries from its potential harmful effects. You said that "these uses will not harm our privacy, because we will define appropriate uses of the data", but that statement raises many doubts for me: (1) what is the "harm" you're mentioning, (2) who's "we" in that context, and (3) what can or could (now and later) be considered "appropriate uses" of the data.
    I also agree that no one reads notices. But most also don't read terms and conditions or licence agreements. (Not to mention other types of contracts!). And for many, even if they read them they wouldn't understand them. And that's a big issue that needs to be addressed. Because if the people the policies and notices concern don't read them and/or can't understand them, then they're completely ineffective!
    Which brings me to the concept of "consent" which is a big concern in my opinion: if people don't read or don't understand, how can they give informed consent? Should consent have the relevance it has? Then what about clickthrough or opt-ins, are they relevant in that sense?
    Many people don't find privacy that important (e.g. the nothing-to-hide argument) simply because they don't grasp what these new technologies entail.
    On the other hand, one thing I must say I disagree with is your perspective on the markets. First of all, "bad ideas" are only bad depending on the way you look at them. If you meant unprofitable, then they should obviously be abandoned. But any "bad idea" can be good if profitable enough. In that sense, no: the markets can't be left alone to find their way. That's why there are laws and regulations, for instance commercial law, consumer protection law, competition law, privacy and data protection law, etc. It's a constant friction of forces and interests that build and transform society and its standards.
    It should involve society as a whole, just like you said in your conclusion. But information and knowledge are still unbalanced between stakeholders. That needs to change.
  • comment Stan Crosley • Jan 21, 2014
    Carrie - thanks so much for your comment.  I actually agree that there may remain some limited benefit in certain circumstances to continue with notice/consent. And where we do, I like your observations.  But for the overwhelming majority of circumstances, notice and consent is not even colorable -- IoT will be its end, whether we want it to be or not.  And, we have tried for 10 years to create clearer and more readable notices -- even nutrition type notices. No one reads them still. And, as an attorney, it is a rare circumstance when you can craft something succinctly that complies with the regulations created by the hundreds of privacy laws (yes, hundreds is an understatement) in the USA alone.  At IU CLEAR, we documented just a few categories related to healthcare privacy, and reached hundreds.  Look at the health information law database here:
  • comment Name • Jan 22, 2014
    "If your notion of privacy is defined by your personal control over all of the data about you..." That's not a definition of "old-school" privacy--it's a straw man. It will be easier to help define the new privacy norms if we don't describe ourselves as "crazy" before reaching out to the innovators.
  • comment Malcolm Crompton • Jan 22, 2014
    Stan - thank you for staying in the debate.  It really helps.  At the risk of being in heated agreement, I agree very strongly with your observation that "In health research, we have the institutional review boards.  This is an interesting model we should explore for healthcare—not the use of IRBs, but use of experts in the issues to help assess use and harm".
    When I have advocated such a model, it gains a lot of traction.  It makes a lot of sense.  When you look around outside of privacy, the concept to which you allude is a widely applied model.  At its core, it requires an independent group of people with a range of interests/perspectives and a strong focus on end user impact (not just company bottom line) which sets out the rules or parameters and another similarly independent group which undertakes the assessment process.
    It needs to be combined with a strong and credible assurance process that the company is saying what it is doing (even if it is in language only the experts understand) and is doing what it says.
    Examples outside the health industry:  energy supply; motor vehicles; aircraft and airways and pilot safety.  The list is endless.
    The benefit to the company:  permission to do things that would otherwise not be permitted or require a much more onerous proof process.
    Malcolm Crompton CIPP
    Managing Director, Information Integrity Solutions Pty Ltd
    Privacy Commissioner of Australia 1999-2004
  • comment Richard Beaumont • Jan 22, 2014
    I think a large part of the problem with notice and consent is that it is presented to the consumer as a fait accompli.  'Here is what you are going to let us do, if you don't like it, leave'.  Because notices are treated as contracts of agreement - they are of necessity tied up in legalese, designed to empower the organisation, rather than inform the individual.
    What is fundamentally missing is control/choice.  Even where it is presented it is often illusory.
    My thinking, and I realise that it is not an easy approach is:
    Focus on notice as transparency - 'this is what we do with your data' - but don't treat it as a contractual agreement.  You then give meaningful choice to users to easily opt-out of different uses/collections.  As they do this - you can alter the service they receive - and therefore you create a marketplace where value/choice can operate meaningfully.
    I think also trade organisations/professional bodies can take an active role. Which I believe is a model that is used in Australia.  These bodies, can set professional standards in certain areas.  Standards that go above the letter of the law, but which individual organisations can be benchmarked against - with that benchmark data being publicly available.
    As I said - not easy to apply.
  • comment John Kropf • Jan 25, 2014
    Well said.  I fear the law is so far behind that even the reform efforts now underway will be woefully out of date no matter how quickly they might be enacted.  You identify the other problem: many of the old school privacy crazies will not create laws based in reality -- and like prohibition, folks simply won't comply with the laws that may be passed.  I think civil law systems will have an especially hard time coping because it is their nature to be detailed in their level of prescription, which will make it more difficult to adapt to the changes.  You mention the marketplace at the end of your piece.  An implication that private entities may find privacy a selling point?
  • comment Gaylord • Jan 31, 2014
    People who exhibit an unusual enthusiasm for the death of something are fanatics.  OP has lost the balance of his faculties in discussing the troubles we all face in a post-privacy world.  Watching personalized TV hardly compensates, and it's clownish to think that it does.  Talk about shot credibility.