Privacy Engineering Section Forum

Monday, April 6
9:30 a.m. – 3 p.m.

With the rising tide of products and services collecting and processing personal data, the time for real solutions is here.

The ever-closer connection between people, privacy and technology is the focus of the Privacy Engineering Section Forum, brought to you by the IAPP’s Privacy Engineering Section as part of the Summit 2020 Active Learning Day sessions. This hands-on half-day event will examine the nuts and bolts of integrating privacy management and product development. Come for in-depth discussions and real-world examples of privacy management at work.

Select the Conference Plus+ Pass (3-Day).

This April event follows the November session at the IAPP Europe Data Protection Congress 2019, which focused on practical solutions for integrating privacy management and product development.

Schedule and Program

  • 8 – 9:30 a.m.
    Check-In and Breakfast
  • 9:30 – 10 a.m.
    Opening Keynote Address: Engineering Formal Privacy Into the 2020 Census

    John M. Abowd, Associate Director, Chief Scientist, Research and Methodology, U.S. Census Bureau

    Being a good data steward in the 21st century means engineering formal privacy protection into the entire data life cycle of a statistical agency. The Census Bureau will be the first statistical agency worldwide to implement a production central differential privacy system to protect the data from a national census. This talk will discuss the policy and engineering challenges of using differential privacy to protect confidentiality in the 2020 Census data publications.
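    The core idea behind differential privacy — adding calibrated noise to published statistics — can be sketched in a few lines. This is a minimal illustration of the Laplace mechanism for a count query; the function and parameter names are illustrative, and the Census Bureau's production system is far more elaborate:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sampling of Laplace(0, scale) noise.
    u = random.random() - 0.5            # uniform on (-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Noisy count satisfying epsilon-differential privacy.

    A count query has sensitivity 1 (adding or removing one record
    changes the result by at most 1), so Laplace(1/epsilon) noise
    suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: publish a noisy count instead of the exact one.
population = list(range(100))
noisy = dp_count(population, lambda r: r < 40, epsilon=1.0)
```

    Smaller values of epsilon mean more noise and stronger privacy; the engineering challenge the talk describes is choosing and defending that trade-off for an entire census.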

  • 10 – 10:30 a.m.
    The Privacy Engineering Mindset: From Design to Launch

    Sha Sundaram, Tech Lead, Privacy and Security, Snap

    The privacy engineering mindset is a set of methodologies, tools and patterns you can employ to create products that protect users and their data. Having done this at scale at companies like Google and Snap, the speaker is eager to share these learnings and practices with you. This talk delves into the nuances of privacy, security, risk and impact to users, and identifies a blueprint for an effective privacy program you can set up at your company. We will discuss instituting the right collaborative practices and incentives to work with partners in engineering, privacy and security, product, and legal teams to develop products that respect user privacy rights. Finally, we'll elaborate on the role a privacy engineer can play in shaping a product from design to launch.

    What you’ll take away:

    • The privacy engineering mindset you need within an organization.
    • The privacy engineering process and how it works alongside software engineering.
    • Privacy engineering design patterns, tools and a program blueprint.
  • 10:30 – 11 a.m.
    Networking Refreshment Break
  • 11 – 11:30 a.m.
    User-Centric Privacy: Designing Effective Protections That Meet Users' Needs

    Florian Schaub, CIPP/US, CIPT, Assistant Professor, University of Michigan School of Information

    Privacy engineering aims to respect and protect users' privacy. User studies provide insights into users' privacy needs, concerns and expectations, which are essential to understanding what a system's actual privacy issues are from a user perspective. Drawing on the speaker's research on privacy notices and controls online, on smartphones and in the context of smart speakers, this talk discusses how and why privacy controls are often misaligned with user needs, and how user studies can inform the design of user-centric privacy protections that more effectively meet users' needs as well as benefit companies.

    What you’ll take away:

    • Privacy is a product design issue, not just a compliance issue.
    • Getting privacy information and controls right requires understanding users' privacy needs and concerns.
    • Privacy notices and controls need to be relevant, understandable and actionable for users.
    • Any privacy notice or control is an opportunity to explain both risks and efforts taken to protect privacy in order to build consumer trust.
  • 11:30 a.m. – 12 p.m.
    Creating Usable Privacy Choices for Consumers

    Lorrie Cranor, CIPT, Professor of Computer Science and of Engineering and Public Policy, Director of CyLab, Carnegie Mellon University

    In a recent empirical analysis of 150 English-language websites, researchers found that privacy choices and outcomes are likely negatively affected by inconsistent placement of privacy notices on organization websites, difficult-to-understand language and incomplete information, among other issues. Follow-up user research confirmed that users struggle to figure out how to opt out or delete their data on many websites. In this session, the co-author of these studies, Professor Lorrie Cranor, will discuss her study findings and suggest guidelines and policy recommendations for improving the usability of choice mechanisms.

    What you’ll take away:

    • Understand factors that make website privacy choice mechanisms difficult to use.
    • Learn approaches to making website privacy choice mechanisms more usable.
  • 12 – 12:30 p.m.
    Designing Usable and Informative Privacy and Security Labels for IoT Devices

    Pardis Emami-Naeini, PhD Student, Carnegie Mellon University

    Despite growing concerns about the privacy and security of IoT devices, studies have shown that consumers lack adequate privacy and security information when purchasing such devices. To effectively inform consumers about the data practices and security mechanisms of IoT devices, our research group at Carnegie Mellon University designed usable and informative privacy and security nutrition labels. We developed the label design through a series of interviews and surveys with privacy and security experts and consumers. We then evaluated the label by conducting a large-scale online study and quantified the impact of the information presented on the label on consumers’ risk perception and willingness to purchase.

    What you’ll take away:

    • Understand how privacy and security labels for IoT devices can help consumers.
    • Learn ways of testing consumer understanding of risk communications in the context of IoT device labels.
  • 12:30 – 1:30 p.m.
    Networking Lunch
  • 1:30 – 2 p.m.
    Engineering Privacy for Biometric Data

    Amy Blumenfield, CIPM, Privacy Senior Program Manager, Windows, Browsers, Devices, Microsoft

    Microsoft and other companies are increasingly building tools and applications that depend on biometric data. The public, as well as privacy advocates and regulators, are increasingly concerned about the collection, processing and usage of this type of information. In this session, we will discuss how Microsoft is moving to address the creation, improvement and troubleshooting of products that integrate biometric data in a rapidly evolving legal and ethical environment.

    What you’ll take away:

    • What falls into the biometric data category beyond fingerprints, facial scans and eye data.
    • Considering intended versus potential uses of biometric data (for good or evil).
    • Security considerations when collecting and storing biometric data.
  • 2 – 2:30 p.m.
    Artificial Intelligence, Real Threat: Privacy Threat Modeling for Machine Learning

    Jonathan Fox, CIPP/US, CIPM, Director, Privacy Engineering, Cisco

    Denise Schoeneich, CIPP/US, CIPM, CIPT, FIP, Privacy Engineer, Intel

    The increasing prevalence of artificial intelligence (AI), the high dependence on large data sets as “training data” for machine learning to reach reliable levels of accuracy, and the ability to derive actionable insights from data sets pose new ethical and potentially unforeseen challenges for privacy. In this interactive session, you will learn how privacy engineering and threat modeling can be used as tools to reduce potential privacy risks and prevent harms to individuals, from data acquisition, retention and transparency in use to the problematic bias that may affect automated decisions in unpredictable or undesirable ways.

    What you’ll take away:

    • Overview of artificial intelligence and machine learning.
    • How threats pose risk to privacy, who are threat actors in a privacy context, and different frameworks for modeling privacy risks.
    • Privacy context diagram: how to build a context diagram and layer in threats and controls.
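    The idea of building a context diagram and layering in threats and controls can be sketched as data: model the system as a set of flows, then attach identified threats and candidate mitigations to each one. This is a toy illustration only — the element names, threats and controls below are hypothetical, not the session's actual framework:

```python
from dataclasses import dataclass, field

@dataclass
class DataFlow:
    """One edge in a privacy context diagram."""
    source: str
    destination: str
    data: str
    threats: list = field(default_factory=list)
    controls: list = field(default_factory=list)

# Toy context diagram for an ML training pipeline (names are illustrative).
flows = [
    DataFlow("User App", "Ingestion Service", "raw events"),
    DataFlow("Ingestion Service", "Training Store", "labeled examples"),
    DataFlow("Training Store", "Model Trainer", "training data"),
]

# Layer threats and candidate controls onto individual flows.
flows[2].threats.append("membership inference against the trained model")
flows[2].controls.append("train with differential privacy; restrict model access")

def unmitigated(flows):
    """Flows with identified threats but no controls yet — the work queue."""
    return [f for f in flows if f.threats and not f.controls]
```

    Walking the diagram flow by flow makes gaps visible: any flow returned by `unmitigated` still needs a control before the system ships.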
  • 2:30 – 3 p.m.
    Minimization at Scale: Classifying, Inventorying, and Minimizing Sensitive Data

    Engin Bozdag, Senior Privacy Architect, Uber

    Derek Care, Director, Privacy Legal, Uber

    Roche Janken, Software Engineer, Privacy, Uber

    The large volume and varied formats of data collected and created by organizations make it hard to measure risk, and without such a measure, risk management is impossible. As Uber matured, we embarked on a strategic initiative to classify our data and build in controls to prevent internal misuse. The effort also includes initiatives to inventory and tag data using various techniques to map it to the aforementioned classification, and then to apply techniques that minimize the data footprint. We are using data tokenization, obfuscation and various access control methods to minimize re-identification of customers and anonymize their location data. This is part of a larger strategic goal to embed privacy not just into product design but into the culture, using data-specific privacy automation early in the data funnel. This workshop will teach techniques for getting to know your data and categorizing it by risk so that you can minimize your data and its attendant risk.

    What you’ll take away:

    • How to define data — both business data and customer data.
    • How business-specific usage determines classification and protection methods used.
    • Examples of how legal and engineering share context to reflect usage and arrive at a practical classification.
    • Learnings from Uber's proof of concept as we move towards wider execution across our data landscape.
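    Tokenization of the kind the session describes can be illustrated with a minimal sketch: sensitive values are swapped for opaque tokens, and the originals live only in a vault behind access controls. This is a hypothetical in-memory example, not Uber's implementation — a production vault would be a hardened, audited service with persistent storage:

```python
import secrets

class TokenVault:
    """Replace sensitive values with opaque tokens; originals stay in the vault."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so joins on tokenized columns still work.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # In production this call would be access-controlled and audited.
        return self._token_to_value[token]

vault = TokenVault()
masked = vault.tokenize("alice@example.com")   # safe to store downstream
```

    Downstream systems only ever see `tok_…` values, which shrinks the footprint of data that can re-identify a customer while keeping analytics joins intact.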