

Presidential campaigns. The Olympics. Game of Thrones. I’ve been reading and viewing a lot lately: newspapers (it is Indiana, we still have paper), documentaries, blogs, Facebook rants, Instagram memes, bathroom walls. It’s a long list. I’ve laughed. I’ve cried. I’ve felt lazy. And I’ve cringed. A lot.

I worry that we seem to take joy in extreme polarizing positions often just because we know it will offend those with whom we can’t stand to agree. 

Unlike the generations before us, we don’t have to see our adversaries face to face. We don’t have intergenerational relationships with each other and with each other’s families. We have “friends,” “followers,” and “links.” And into this fragile society, where we are still finding a way to “trust” one another with our virtual lives, we have introduced a toxic mix: a dismissive, contemptuous lack of respect for the thoughts and beliefs of others, paired with a means to express it easily on a globally public stage.

My fear is not for politics – we’ve always been a bit crazy in that area as a society – but that the ultra-polarization we’re seeing in the presidential election is actually running underneath the surface of our lives in many other areas. And, if we permit it to seep into our wonderful privacy profession, we could have disastrous results.  

I have been involved in privacy for more than 20 years now. I have worked in large law firms, in-house at a large multi-national, in academia, in not-for-profits, on the board of a privacy advocacy organization, and in state and federal government committees and workgroups. Like each of you, I have seen the evidence of the lives ruined with identity theft and the true cost of the disclosure of deeply personal health information to those who would choose to harm. I have seen horrific privacy practices and have read about the well-documented oppressive government surveillance practices – from governments around the world.

But also like you, I know the statistics: more than 100,000 lives lost in healthcare every year to errors that are completely preventable with more information and better information exchange. I’ve heard the stories of family, friends, and loved ones whose lives are lost because we can’t advance cures for deadly diseases fast enough, and of preventable terrorist attacks that succeed because information isn’t exchanged freely enough.

There is no one right answer for all circumstances. There is not one rule that always overrides every other.


Privacy at all costs is just as wrong as data collection and use without limits. We need clear-headed, frosty, dispassionate dialogue when we discuss privacy and data use, not blindly passionate rhetoric. We need ethical privacy every bit as much as we need ethical data use, because I worry about the destabilizing effect of a society where distrust is the norm, irrational privacy the default, data collection an intellectual property right, and trust rare and hoarded.

It has to be about trust.

This isn’t going to be easy. We haven’t exactly engendered trust based on our actions with one another over the past 20 years. Email breaches begat credit card fraud, which led to identity theft, rampant phishing, malware, and a general lack of reasonable security. And, save for a few notable exceptions, we haven’t self-regulated to the point where we have chosen not to collect or use personal information that was clearly legal for us to collect and use. We aren’t willing to take the step to say we won’t, even though we could.

And that is how trust is formed. When we’re willing to sacrifice self-interest for the interest of others. 

This cuts both ways. Our inalienable rights are not unbridled. We cannot freely practice a religion that involves human sacrifice, and we cannot use our freedom of speech to shout “fire” in a crowded movie theater. It is as unethical to demand privacy over sharing data that can save lives and end suffering as it is unethical to demand data when it may cause lives to be disrupted, even if we can get someone to say, “I Accept.”

It is time for Ethical Privacy and Ethical Data Use. Privacy is far too critical to be handled in a way that models current social dialogue.

While we didn’t know it at the time (or at least some of us didn’t know . . . or at least I didn’t), much of what we’ve done in privacy for the past two decades was practice. The real game is about to start.  

Biometrics, ubiquitous sensors, genetics, commercial drone surveillance, predictive health, predictive aptitude, digital determination, telemedicine, mobile devices with entire medical histories, AI to diagnose disease, the internet of everything. We are on the cusp of the greatest advancements in human health, longevity, safety, and well-being in history and on the precipice of rampant distrust, abuse, untraceable harms, and societal chaos. But our privacy programs over the last twenty years have advanced less like the unbelievable pace of record-setting in modern Olympics and more like the choices we have for political office.

So, it’s time to make privacy great again. 

And not the political privacy where we fret about website privacy statements and overly broad consents. But Olympian, big “P” Privacy where, as Sheila Colclasure recently blogged and Marty Abrams has advised for more than a decade, we have ethical data use frameworks with internal oversight data review boards and standards that actually demonstrate that we’ve heard the privacy concerns all around us. Privacy that evokes trust. Earns trust. And an ethical data use framework that actually has a means of assessing and valuing the use of data that permits us to save and improve lives without restrictive regulations: A Trustworthy Ethical Privacy and Data Use Framework.  

And we need to advance these frameworks and data review boards well before the unreasonable passion takes over on both sides of our profession. We need them now. Even if it means we have to step out and create them with little immediate gain. We cannot wait. It is Game On. You best be getting your Ethical Privacy and Data Use Framework and Data Review Board in place. Because like winter to Westeros, make no mistake, we haven’t seen anything close to that which is headed our way. Polarizing passion is coming to privacy.

photo credit: shout it out. via photopin (license)

3 Comments


  • Malcolm Crompton • Aug 18, 2016
    Great article, Stan. We are working on just this challenge in Australia right now with an independent, not-for-profit Health Research Ethics Committee service provider (an IRB, in your language) and will be seeking international input.
    Malcolm Crompton, CIPP, Managing Director, Information Integrity Solutions Pty Ltd
  • Danny Koning • Aug 22, 2016
    Solid and inspiring article, Stanley. I couldn't agree more, and I especially appreciate this section: "And, ... , we haven’t self-regulated to the point where we have chosen not to collect or use personal information that was clearly legal for us to collect and use. We aren’t willing to take the step to say we won’t, even though we could."
    
    As legal counsel/privacy officer for a large social housing association in the Netherlands, I am fortunately noticing a positive change when we move a conversation about a proposed privacy intrusion from "what can I (business unit) do legally?" to "what should we (whole business) do?"
  • Sheila Dean • Aug 24, 2016
    Privacy is a competitive game. If you're too one-sided, too cold, too unipolar, you will drive user-consumers right into your competitors' offering sphere. The market addresses needs. Zero sum creates alternate or sub-hemisphere markets. Principled privacy should mean human dignity is recognized. So makers will create something a user can work with, based on how their children respond and react to problems they face in social environments that now depend on or mandate technical uses. So it's adaptive. For every action there is an equal and opposite reaction. Ethics are the method of common nobility, right? If you create the framework to be inflexible or unyielding, it will be abandoned.
    
    So it has to be mentally cool (analytical) for social innovators who preside over understanding the build, but maintain humility in order to serve long-term interests *before* it chews up your kids.
    
    So there's this call for papers from the FATML list... be advised.  It's a place to project your ideas for ethics frameworks. 
    
    3rd Workshop on Fairness, Accountability, and Transparency in Machine Learning
    
    Co-located with the Data Transparency Lab 2016
    
    November 18, New York, NY
    
    http://fatml.org/
    
    Submission Deadline: September 9, 2016