
The Privacy Advisor | Creating Standards for Privacy Engineers: It Ain’t Easy!



Privacy has never been an easy term to define. Layered upon protean privacy definitions reside other nebulous concepts, including those of “personal information,” “harm” and “reasonable expectations.” In stark contrast, the security engineering world has created a fairly black-and-white standards framework for developers to follow.

As engineers are increasingly tasked with Privacy by Design, they’d like privacy standards to start looking more like security standards. Perhaps it’s no surprise, though, that developing standards around privacy engineering is no easy task.

That’s what stakeholders actively grappled with Monday and Tuesday at the National Institute of Standards and Technology’s (NIST's) second Privacy Engineering Workshop in San Jose, CA, co-located with the IAPP Privacy Academy and CSA Congress and meant to provide fodder for an eventual NIST report on privacy engineering. Other fields—particularly security engineering and safety risk management—have developed technical standards for promoting security and safety, but to date, there has not been an analogous framework in the privacy world. NIST is working to rectify that.

One thing that was clear during Tuesday’s workshops is that, although much work is left to be done, there is no shortage of insight, opinion and desire for a solution among the interested parties, all of whom come from different backgrounds: technology, security, privacy and law, to name a few.

“For those of us who have run in a marathon,” said one of the workshop’s facilitators, “we’re not at the 10K mark yet, so we have a long way to go.”

For example, lawyers and engineers don’t really speak the same language, so much of the conversation revolved around defining terms, including “privacy engineering.” NIST defines it as a “collection of methods to support the mitigation of risks to individuals of loss of self-determination, loss of trust, discrimination and economic loss by providing predictability, manageability and confidentiality of personal information within information systems.” But as one participant pointed out, a privacy engineer in one organization may be considered a developer in another and an architect in yet another.

Under the current draft framework, NIST has highlighted other key terms, including “data actions,” “problematic data actions” and “context,” all of which were debated thoroughly by Tuesday’s participants. What’s more, stakeholders also hashed out a proposed list of potential privacy harms under four main categories: “loss of self-determination,” “dissemination,” “loss of trust” and “economic loss.” Are these definitions for data subjects only, some asked, or should they include potential harms for businesses?

Several participants also noted the importance privacy engineering standards would have for organizations as a whole. Bringing a risk-management prong into the discussion between the legal and technological sides may prove essential for the enterprise as well.

Though some stakeholders were not yet ready to say they agreed with the draft framework thus far, there is energy to move forward and hash out these issues.

What’s next for the process? The public comment period for the NIST Privacy Engineering Objectives and Risk Model Discussion Draft has been extended and will be open until October 10. NIST officials will continue to go through the feedback they have received. If you have comments or feedback, NIST made it clear they want to hear from as many stakeholders as possible.

And though the workshop was not webcast this week, NIST will host a webinar on the privacy engineering framework on Friday, September 26, so if you weren’t able to make it out to California this week, tune in next week for more. Watch the NIST Twitter feed for details.

1 Comment


  • Jon • Sep 17, 2014
    Good post.  Here's a contrarian view:  There's real competition in mHealth and mPayments right now for broad privacy promises (e.g., We will not sell health information.).  The killer app that will transform healthcare and payments the way DRM transformed the music industry will do something relatively simple, i.e., ENFORCEMENT of those promises (using big data surveillance, of course).  Hope to see y'all in San Jose.