It’s common for professions to have codes of ethics, particularly learned professions or those requiring a high level of expertise. Think doctors, lawyers, engineers. But there is one specialized field that noticeably does not have a code of ethics: the privacy profession.

There have been clamors here and there for a professionalized code, and the "Pokemon Go" phenomenon renewed some of those calls recently. But the idea has never gotten off the ground. Part of the reason, insiders say, is the field’s diversity. How could one single code possibly apply to a field comprising technologists, engineers, lawyers, chief privacy officers, and seemingly countless others? But those same insiders also say that, while it would be a significant task to take on, a code may be well worth it, given the sensitivity of the work privacy pros do every day and the specialized expertise it requires.

Tracy Ann Kosa, who spent 20 years in privacy, four of them at Microsoft, said that some years ago she began to notice a seismic shift in the profession’s makeup and, subsequently, its ideals.

“What started to grow was the notion of a privacy officer or privacy manager as someone who could run a program that could pull together the technical and the legal piece, and I think everyone in the profession at the time thought that was a really good thing,” Kosa said. “But as the discipline grew, as the domain evolved, a lot more people got interested in it, but a lot of those people got interested not for the same reasons the people who grew the field were interested in it.”

In other words, it turned into a compliance-based exercise.

That shift didn’t sit well with her. What irked her was her sense that the field was losing its strong base of privacy advocates, who were being replaced by professionals saying to companies, “I can knock out a privacy impact assessment for you for $50,000, no problem.”

And that's why she thinks it's time for the privacy profession to develop a code of ethics. There has to be a way, she said, to ensure the very important work privacy professionals do is done for the right reasons, with the right end goal in mind.

She's not alone. Plenty of folks agree. But the problem is in the execution. 

Aurélie Pols is, among other things, a member of the Ethics Advisory Group of the European Data Protection Supervisor. While she understands the need, she's skeptical a code of ethics could ever work in privacy. The problem? There are so many lawyers in the field, and lawyers are focused on compliance and ethically bound to their clients, not the data subject.

“To be honest, I fight mostly with lawyers [who say], ‘Why would we do that? We’re compliant … Why would we do that?’” Lawyers’ responsibilities are to limit risk for their clients, the organizations, whereas ethics is about “having a balance for society in general,” Pols said. So there’s an inherent tension there.

As a lawyer herself, Irina Raicu understands that tension. Raicu, director of the Internet Ethics Program at Santa Clara University’s Markkula Center for Applied Ethics, notes that because lawyers already adhere to a professional code of ethics, one that ethically binds them to their clients rather than the data subject, a new code for the privacy profession could actually conflict with their existing obligations. The same could be said for software engineers, who follow a code established by the IEEE and ACM.

“I think that’s an interesting challenge for developing something like this,” Raicu said. “The privacy profession combines a lot of different professional expertise types.”

Evan Selinger is a professor of philosophy at the Rochester Institute of Technology and formerly a fellow at the Institute for Ethics and Emerging Technologies. While he supports the idea of a code, he agrees the composition of the field means creating one would likely be messy, though not necessarily because of its sizable lawyer contingent. It would be messy “largely for principled reasons,” he said, noting the disparity in expertise and background among privacy professionals. For example, there’s likely greater agreement among engineering professionals about core ideals than there is among privacy pros.

But, Selinger said, by now privacy professionals “have gained sufficient social esteem and are deemed important enough to the functioning of a digital global world that they should willingly follow the lead of others and embrace their own code of ethics, one that details internal and social expectations and articulates them according to designations of a few recognizable categories.” He cited those of the National Society of Professional Engineers Code of Ethics, for example: fundamental canons, rules of practice and professional obligations.

Katy Lovell, senior compliance analyst at State Farm Insurance, thinks more along Selinger's lines. Noting her views are her own and not those of State Farm, she said a code of ethics makes a lot of sense, and that she's even surprised there isn’t one yet. She already subscribes to a code of ethics maintained by State Farm’s compliance department, and she said she doesn’t see much difference between compliance and privacy.

“Privacy is kind of a non-traditional leadership role,” she said. “You’re helping people make difficult decisions and reinforcing desired behaviors. That’s someone you’d like to have good ethics and morals.”  

That's where Kosa is coming from, too. It's too important a role to not codify behavior, she said. 

“What’s the difference from a privacy person that gives advice and a doctor who gives advice?” she found herself recently thinking. “What it comes down to, for me, is the fact that … your job is to understand the technical data flows and potential impact of it all in the ecosystem for the data subject and interpret that,” she said. “To me it seems like a parallel, it seems like what I see in the regulated professions. All these people are relied on as experts to understand something far too complicated, or which has far too many implications for your Average Joe Public to be expected to grasp.”

A doctor takes the Hippocratic Oath because patients can't be expected to understand the intricacies of the human body; the onus is on the doctor, with their intimate subject-matter expertise, to “do no harm.” In the same way, she said, the privacy professional should be required to take on the same ethical obligations, given what data subjects don't know and what privacy professionals do.

But Pols takes issue with that. 

"The problem is, that oath, it's about allegiance. Basically, doctors have their allegiance related to their patients, and if they don't do that, they — certainly in the U.S. — they can get sued and a patient can die," she said. "The problem in the privacy profession is: Where is your allegiance? Where does it lie? What does it mean to take that kind of oath, and can you actually act upon that? What are you promising under that allegiance?" 

In other words, a privacy attorney may not be able to make the kind of promises to the data subject that pros like Kosa would have them make under a code of conduct.

Who decides what 'good' ethics are? 

Besides the profession's composition, privacy as an industry poses unique ethical challenges because ideas about it can differ widely depending on background, social status, geography and other factors, said Raicu. And after all, what are ethics? Ethics isn't a science, and it's most often very subjective.

“Many times people feel privacy is very important but needs to be balanced against other rights, and then you get into a more personal balancing act that people do when it comes to issues like privacy. It makes it really difficult to develop a code of ethics,” she said. Another challenge, she said, would be crafting a code that isn’t so vague that it’s simply a marketing tool.

“It’s a problem for any code of ethics,” she said. “Do you keep it broad, or try to get into the nitty-gritty responsibilities?”

Selinger notes that, even in good faith, privacy professionals can “reasonably disagree” on a host of ethics issues. For example, "data brokers have offered a coherent and thoughtful account of why their job benefits society,” he said. “By contrast, critics of some versions of this enterprise come close to characterizing data brokers as the devil incarnate who bear responsibility for advancing the contemporary surveillance society. Or take the widely debated issue of government surveillance. Ethical judgements vary widely and strongly on the matter.”

Who would create the code?

Pablo Garcia Molina, who teaches graduate courses in ethics and technology management at Georgetown University, said codes are normally developed by two sets of people. One group, the core, would be the members of the profession themselves. The second would be an institution; for lawyers, for example, it’s the American Bar Association, and for doctors it’s the Board of Medical Examiners. Molina said IAPP members would be the right place to start. But he said it’s important to also include people from outside the profession who might likewise adhere to a code of conduct and who can think more broadly.

In the end, Raicu said, just the exercise of creating a code of ethics can be useful. Or, she offered as an alternative, maybe the privacy profession doesn't develop a code at all but instead uses the “Framework for Ethical Decision Making” to advise privacy pros on making their own decisions. “If you can’t necessarily reach some sort of consensus, you can help people make their own decisions,” she said. “How can we help you make a decision that will make you happier with it than you would be otherwise, because you thought it through?”

So, where does this leave us?

"I believe these reflections suggest that no single individual should be authorized to draft a code of ethics for privacy professionals, as opinions vary, and that no existing code from other fields — whether it’s engineering or others — can be copied 100 percent," Selinger said. "But if there are enough people who believe the privacy profession would benefit from such a code, there is a clear pathway forward." 
