
The Privacy Advisor | Should the privacy profession adopt a code of ethics?



It’s common for professions to have codes of ethics, particularly learned professions or those requiring high expertise. Think doctors, lawyers, engineers. But one specialized field noticeably does not have a code of ethics: the privacy profession.

There have been clamors here and there for a professional code, and the "Pokémon Go" phenomenon recently renewed some of those calls. But the idea has never gotten off the ground. Part of the reason, insiders say, is the field’s diversity: How could one single code possibly apply to a field comprising technologists, engineers, lawyers, chief privacy officers, and seemingly countless others? But those same insiders also say that, while it would be a significant task to take on, a code may well be worth it, given the sensitivity of the work privacy pros do every day and the specialized expertise required.

Tracy Ann Kosa, who spent 20 years in privacy, four at Microsoft, said some years ago she began to notice a seismic shift in the profession’s makeup and, subsequently, its ideals. 

“What started to grow was the notion of a privacy officer or privacy manager as someone who could run a program that could pull together the technical and the legal piece, and I think everyone in the profession at the time thought that was a really good thing,” Kosa said. “But as the discipline grew, as the domain evolved, a lot more people got interested in it, but a lot of those people got interested not for the same reasons the people who grew the field were interested in it.”

In other words, it turned into a compliance-based exercise.

That shift didn’t sit well with her. What irked her was her sense that the field was losing its strong base of privacy advocates, replaced by professionals who were saying to companies, “I can knock out a privacy impact assessment for you for $50,000, no problem.” 

And that's why she thinks it's time for the privacy profession to develop a code of ethics. She said there has to be a way to ensure the very important work privacy professionals do is done for the right reasons, with the right end-goal in mind.

She's not alone. Plenty of folks agree. But the problem is in the execution. 

Aurélie Pols is, among other things, a member of the ethics advisory group at the European Data Protection Supervisor. While she understands the need, she's skeptical a code of ethics could ever work in privacy. The problem? There are so many lawyers in the field, and lawyers are focused on compliance and are ethically bound to their clients, not the data subject. 

“To be honest, I fight mostly with lawyers [who say], ‘Why would we do that? We’re compliant … Why would we do that?’” Lawyers’ responsibilities are to limit risk for their clients, the organizations, whereas ethics is about “having a balance for society in general,” Pols said. So there’s an inherent tension there.

As a lawyer herself, Irina Raicu understands that tension. Director of the Internet Ethics Program at Santa Clara University’s Markkula Center for Applied Ethics, she notes that, because lawyers already adhere to a professional code of ethics, creating a code for the privacy profession could actually conflict with their existing code, which ethically binds them to their clients, not the data subject. The same could be said for software engineers, who follow a code established by the IEEE and ACM. 

“I think that’s an interesting challenge for developing something like this,” Raicu said. “The privacy profession combines a lot of different professional expertise types.”

Evan Selinger is a professor of philosophy at the Rochester Institute of Technology and formerly a fellow at the Institute for Ethics and Emerging Technologies. While he supports the idea of a code, he agrees the composition of the field means creating one would likely be messy, though not necessarily because of its sizable lawyer contingent. "Largely for principled reasons,” he said, noting the disparity in expertise and background among privacy professionals. For example, there’s likely greater agreement about core ideals among engineering professionals than among privacy pros.

But, Selinger said, by now privacy professionals “have gained sufficient social esteem and are deemed important enough to the functioning of a digital global world that they should willingly follow the lead of others and embrace their own code of ethics, one that details internal and social expectations and articulates them according to designations of a few recognizable categories.” He cited those of the National Society of Professional Engineers Code of Ethics, for example: fundamental canon, rules of practice and professional obligations. 

Katy Lovell is a senior compliance analyst at State Farm Insurance, and her thinking runs along Selinger's lines. Noting her views are her own and not those of State Farm, she said a code of ethics makes a lot of sense; she is even surprised there isn’t one yet. She subscribes to a code of ethics maintained by State Farm’s compliance department and said she doesn’t see much difference between compliance and privacy.

“Privacy is kind of a non-traditional leadership role,” she said. “You’re helping people make difficult decisions and reinforcing desired behaviors. That’s someone you’d want to have good ethics and morals.”

That's where Kosa is coming from, too. It's too important a role to not codify behavior, she said. 

“What’s the difference between a privacy person who gives advice and a doctor who gives advice?” she found herself recently thinking. “What it comes down to, for me, is the fact that … your job is to understand the technical data flows and the potential impact of it all in the ecosystem for the data subject and interpret that,” she said. “To me it seems like a parallel, it seems like what I see in the regulated professions. All these people are relied on as experts to understand something far too complicated, or which has far too many implications, for your Average Joe Public to be expected to grasp.”

A doctor takes the Hippocratic Oath because patients can't be expected to understand the intricacies of the human body; the onus is on the doctor to “do no harm” based on intimate subject-matter expertise. In the same way, she said, the privacy professional should be required to take on the same ethical obligations, based on what data subjects don't know and what privacy professionals do.

But Pols takes issue with that. 

"The problem is, that oath, it's about allegiance. Basically, doctors have their allegiance related to their patients, and if they don't do that, they — certainly in the U.S. — they can get sued and a patient can die," she said. "The problem in the privacy profession is: Where is your allegiance? Where does it lie? What does it mean to take that kind of oath, and can you actually act upon that? What are you promising under that allegiance?" 

In other words: a privacy attorney may not be able to make the kind of promises pros like Kosa would have them make to the data subject under a code of conduct. 

Who decides what 'good' ethics are? 

Besides the profession's composition, privacy as an industry poses unique ethical challenges because ideas around it can differ widely depending on background, social status, geography and other factors, said Raicu. And after all, what are ethics? Ethics isn't a science, and it's most often very subjective.

“Many times people feel privacy is very important but needs to be balanced against other rights, and then you get into a more personal balancing act that people do when it comes to issues like privacy. It makes it really difficult to develop a code of ethics,” she said. Also problematic, she said, would be writing a code that isn’t so vague that it’s simply a marketing tool.

“It’s a problem for any code of ethics,” she said. “Do you keep it broad, or try to get into the nitty-gritty responsibilities?”

Selinger notes even in good faith, privacy professionals can “reasonably disagree” on a host of ethics issues. For example, "data brokers have offered a coherent and thoughtful account of why their job benefits society,” he said. “By contrast, critics of some versions of this enterprise come close to characterizing data brokers as the devil incarnate who bear responsibility for advancing the contemporary surveillance society. Or take the widely debated issue of government surveillance. Ethical judgements vary widely and strongly on the matter.”

Who would create the code?

Pablo Garcia Molina, who teaches graduate courses in ethics and technology management at Georgetown University, said normally codes are developed by two sets of people. One group, the core, would be the members of the profession themselves. The second would be an institution. For example, for lawyers it’s the American Bar Association; for doctors it’s the Board of Medical Examiners. Molina said IAPP members would be the right place to start. But he said it’s important to include people who aren’t part of the profession who might also adhere to a code of conduct and can think more broadly.

In the end, Raicu said, just the exercise of creating a code of ethics can be useful. Or, she offers alternatively, maybe the privacy profession doesn't develop a code at all but instead uses the “Framework for Ethical Decision Making” to advise privacy pros on making their own decisions. “If you can’t necessarily reach some sort of consensus, you can help people make their own decisions,” she said. “How can we help you make a decision that will make you happier with it than you would be otherwise, because you thought it through.”

So, where does this leave us?

"I believe these reflections suggest that no single individual should be authorized to draft a code of ethics for privacy professionals, as opinions vary, and that no existing code from other fields — whether it’s engineering or others — can be copied 100 percent," Selinger said. "But if there are enough people who believe the privacy profession would benefit from such a code, there is a clear pathway forward." 




  • comment Sheila Dean • Feb 28, 2017
    Angelique, nice footwork on this article. I have been in discussions with IEEE concerning this very topic. The misfortune of privacy proponents (possibly different from privacy professionals) is a dislocation of individual rights and individual control over personal data. The purpose of, say, an adapted Hippocratic Oath for AI developers handling PHI would be so their mission would not deviate from the mission of the healthcare profession itself. Machine Learning makers for insurance, healthcare and public information are in different value positions, but they all need ethics coaching and guidance. There are good reasons.
    Governments and large companies (corporations) are both saying to the identified person, your consent over your body of information does not matter.  The law supports individual consent and regard for it.  However, you have a current system of legal academic simpatico who pays token lip service to issues of consent, teaches law students it is proper form to accept parallel construction from your government when it comes to privacy and your individual rights have no advocate in corporate privacy temps or government staffers who can and will ignore your complaints. 
    Look at this Harvard paper. It gives token mention to consent over the terms of making internally protected CIVIC information a public commodity for trade! It's got appropriate professional standards, yes. It's barely legal. Law professionals may treat it like future gold-standard policy. This is how we got an incrementally marginalized HUMAN RIGHT OF PRIVACY. This was meant for "innovation" (as IoT) to stomp over the Privacy Act of 1974, which is being serially ignored and unenforced because there is corruption in legal standardization.
    A code of ethics, independently certified by a diversified professional process, would inject a self-regulatory function where it has been certainly hosed out of the legal process.  AI workers would not be compelled to make digital machine learning or other AI tools to be used to commit acts of genocide, targeted repression or human rights abuses or face loss of professional license, if not legal disgrace.
  • comment Leon Goldman • Mar 1, 2017
    Angelique, well done and well said. If members of the IAPP are to consider themselves members of a "profession" and they wish others to see them as members of a "profession", then it goes without saying that there should be a code of ethics. A code is part of what makes a group of people part of a profession. As both a former physician and compliance officer, a code was the norm. Yes, writing a code is difficult and needs to be done by a group and not by a single individual, but it needs to be done.
  • comment Peter Westerhof • Mar 1, 2017
    Articles 40 and 41 of the EU GDPR explicitly address "codes of conduct".
    Also, the GDPR differentiates two distinct roles for the DPO: an independent role within the organization, reporting directly to the management of the controller or the processor, and an independent role as liaison between the organization and the national supervisory authority.
    So there should be no conflict of interests. GDPR even guards against that.
    See also the overview article by Rita Heimes at
  • comment Joseph Jerome • Mar 3, 2017
    I hope Angelique's piece kickstarts this conversation, which seemed poised for a breakout when Intel's David Hoffman was talking about a "Data Innovation Pledge" a few years back. It's certainly tough to thread the needle of existing legal and engineering ethics codes, but particularly if the IAPP gets its wish and everyone on the planet is a CIPP (and the privacy "profession" includes more and more lawyers and non-technical types), some sort of ethical code will be essential to ensure privacy doesn't become nothing more than a dispassionate compliance exercise.
  • comment Hilary Wandall • Mar 5, 2017
    Angelique, your article has taken the concept of ethics in our field to the next level. In my early years as a privacy pro, I often was called to advise biomedical research scientists on secondary uses of health research data and biological samples. Determining the appropriate guidance always ended up in the gray space and necessitated balancing the interests and risks to individuals, their communities and broader society with our obligations and values. Although I was trained both as a scientist and a lawyer, as genetic information was becoming increasingly identifiable and data was being generated from a broader set of sources and combined more readily, I was compelled to obtain training as a bioethicist in order to provide guidance that factored in not only legal and regulatory compliance, but also uncertain and evolving data-related risks. As multi-source data analytics became more common across the entire enterprise and relevant to a broader scope of business, we found that the principles we applied to biomedical research were valuable to decision-making about customer, employee and compliance analytics as well. It has been encouraging to see growing interest in defining standards and frameworks for data ethics over the past few years, as I believe that it will be difficult to effectively demonstrate compliance with the GDPR's standards for determining the risks of data processing to the rights and freedoms of individuals in the absence of a comprehensive ethical analysis. However, your article has caused me to wonder whether such standards and frameworks will be broadly and timely adopted in the absence of a code of ethics for privacy pros that establishes a standard of practice for ethical data processing evaluation among all data decision-makers. So, I'm now on the side of developing a professional code that helps to drive the adoption of ethical data decision-making on a global scale. Count me in as a supporter.
  • comment Richard Beaumont • Mar 13, 2017
    Some great thinking here.  I completely agree that there is a role for an ethical code, so it is great to see this discussion.
    I think there are perhaps two dimensions to this which could be teased apart: a code of ethics for the IAPP as a membership body, and a code of ethics in the use of personal information are two different things.
    A code for the profession itself can be about standards of behaviour, expectations of honesty and integrity, and might be easier to produce if it does not concern itself with how information is treated/processed, but more about how decisions about such processing are reached.  A responsibility to create an accurate representation of data flows and risks, to advise on legality of an approach to the best of one's knowledge - this is to me the ethics of the profession. This needn't address whether a practice in one company or another is inherently good or bad.
    Then there is a broader 'ethics of data handling' which is more directly about defining what is and is not acceptable practice in the use of information.  I think such codes are best placed within industry bodies themselves - like the car industry, the medical informatics industry, or the software development industry (though trying to find a professional body that represents all of that field is a challenge).  Such codes belong within those professions, with their distinct focus and needs. But those who would help such codes be written are the members right here.