Regulations and policies alone won’t get us to a world that effectively respects privacy. To get there, we need three things: (1) systems we can trust to robustly implement policy, (2) technology that enables better choices than the ones we have today, and (3) a deep understanding of the wide spectrum of humans who interact with our systems. In short, we need privacy engineering.
Organizations’ understanding of privacy engineering and their need for it is evolving rapidly alongside new technologies, products and services, regulations and even societal trends. These needs range from architecting privacy at scale to enabling varied forms of consent, facilitating data deletion, designing user-friendly privacy interfaces and protecting the data of vulnerable populations. Increasingly, academia and industry are coming together to consider and solve these privacy engineering problems. But more ideas and conversations are needed to improve privacy engineering in practice and deploy it effectively within organizations.
Architecting large-scale systems for predictable and robust data protection, data retrieval and data deletion is more challenging than it looks, as many of us have found out the hard way.
And this is only where privacy engineering starts. Today’s privacy engineers work with privacy professionals across organizations to address myriad new challenges. For instance, privacy engineers are implementing new cryptographic privacy-enhancing technologies to meet data-flow compliance demands after Schrems II. They are exploring privacy threat modeling to reduce organizations’ risk profiles when deploying new machine learning applications. Privacy engineers have played even more critical roles during the pandemic: deploying anonymization techniques to enable clinical trial transparency, enabling privacy-protective data sharing between companies and researchers to identify COVID-19 indicators in pre-symptomatic individuals or to assess the effectiveness of government stay-at-home orders, and building protections into rapidly expanding digital health applications.
They are also helping organizations consider new approaches to traditional privacy needs. Consent has always been tricky to operationalize effectively. Most organizations’ users aren’t attorneys, so meaningful privacy choices require user experience research and technical design. Privacy engineers are at the forefront here, too. They are working across organizations to develop review and design processes that incorporate foundational privacy concepts, user experience research and regular user testing. In the classroom, they are teaching and being taught how to communicate privacy needs and possibilities in creative and accessible ways, illustrating privacy engineering concepts with potty talk as one such example.
Users are not all the same; effective products and systems for some users will be completely inappropriate for others. Privacy engineers must understand this. They must also help their organizations respond by building privacy protections according to their operating context, considering those who will use their products and services, the culture in which they are operating, their organization’s values and their legal obligations. This calls for conversations between technical teams, product teams, legal teams, design teams, HR teams and more. Privacy engineers can contribute to and help steer these conversations within organizations.
On June 10 and 11, privacy engineers grappling with all of these challenges, along with those interested in learning how to address them, will gather for the annual PEPR: Privacy Engineering, Practice and Respect Conference. Due to the pandemic, the gathering will again be virtual this year. If you are interested in contributing to the conversation and the evolution of our field, we hope you will join us and experts from across sectors and around the world to discuss privacy engineering and building respectful systems. PEPR is hosted by the Future of Privacy Forum and Carnegie Mellon University CyLab in cooperation with the IAPP.