Privacy engineering is a growing field, becoming an essential function in many modern businesses as they grapple with the "how" of privacy and trust. In response, the IAPP created the Privacy Engineering Section earlier this year. After a successful debut at the Global Privacy Summit last spring, the Privacy Engineering Section Forum returned for another half-day workshop here at Privacy. Security. Risk. in Austin to help those working in the field think through tough challenges, learn best practices and hear what privacy-engineer peers are doing within their organizations.
Clearly there's interest in the area: the workshop sold out, and the exchange of ideas and audience questions throughout the afternoon demonstrated the community's thirst for answers and insight.
Broken up into three "mini-sessions," the forum covered some of the field's most pressing issues. How do you design the user experience with privacy in mind, for example? How can you make data governance workable for engineers? What are some practical approaches to real-world issues involving cross-border data transfers for global vendors?
The answers are not easy, but Maritza Johnson, for one, had some insights on designing privacy into the user experience. "Think about how you're measuring for success," she offered. Are you measuring it by click-through rates or A/B testing? Do those measurements cultivate trust or bolster your users' privacy?
"Those don't really capture the experience we're looking for," Johnson said. "People don't sit down to 'do privacy' or to 'do security.'" She continued: "We know people don't read privacy policies, but we do need to understand what threats people are interested in and what they want to know more about. Where can you surface the relevant information and controls in areas that people are actually looking for them?"
Johnson suggested teams think outside the box and get creative about how and what they measure for success. Think about the trust of your users, and build a "privacy narrative." But, she cautioned, simply saying "we care about your privacy" is not a privacy narrative.
Google Security and Privacy User Experience Researcher Patrick Gage Kelley agreed that metrics are key, but pointed out that in UX research, "privacy is rarely the primary test." He said his team looks at all kinds of metrics, including those built around trust, comfort, or whether a user could be surprised by something. "Having an excellent privacy notice alone is not enough," he argued. "UX is about thinking about the entire user experience and working with the teams at your companies by thinking about the whole ecosystem."
A team from Microsoft also shared how they've operationalized privacy by design in the enterprise. David Marcos, CIPM, CIPT, principal program manager, cloud and enterprise privacy, was joined by Carrie Culley, senior principal program manager, online privacy, security, and accessibility, and Euan Grant, principal architect, privacy, security and compliance, who discussed how data governance has helped with privacy and how privacy has helped data governance. There were plenty of audience questions for this practical take.
The demand for more use cases in privacy engineering is apparent. Wednesday's forum was rounded out by a use case involving data transfers for a global ICT (information and communications technology) vendor. Led by Nokia Software Head of Privacy Frank Dawson, CIPT, and Nokia Software Digital Experience Program Manager Joy Dion, the session presented attendees with a thorny, complex task that was difficult to tackle in the time allotted, but it definitely got the wheels spinning.
Annie Anton, who serves as chair of the Privacy Engineering Section, also noted she is looking for materials for a book of case studies. So if you have any resources, use cases or templates, be sure to get in touch.
Underlying the day's workshop was the keynote message from Google Principal Engineer Lea Kissner: ethics in artificial intelligence will play an immensely important role in steering technology toward a healthier and safer society. She said this means design should be socially beneficial, avoid bias, be built and tested for safety, be accountable to people, incorporate privacy-by-design principles, uphold high standards of scientific excellence, and avoid societal harm.
These are no small tasks, but ones that will help shape the future.