The Fair Information Practice Principles are like a fountain of youth for privacy professionals. When we are confronted with a new technology or a new operational challenge, returning to our core principles can refresh our sense of direction and mission — though maybe not restore our youthful vigor. Yet throughout the history of the profession, not all the FIPPs, or “FIPs” if you prefer, have received equal attention.

Despite serving as the very first principle, collection limitation — the reminder to limit the personal data we collect to what is fair, expected and reasonable in the circumstances — is often honored in the breach. But the idea is experiencing a resurgence among policymakers. Any shift away from a notice-and-choice-focused regime necessitates more activity on this and other aspects of “data minimization.”

In fact, data minimization is the first title of the American Data Privacy and Protection Act. It is also a core theme of the Federal Trade Commission's recent activity, including in its Advance Notice of Proposed Rulemaking on Commercial Surveillance and Data Security. Reinvigorating this principle is bound to benefit the practice. Recent reports from EPIC and Consumer Reports, as well as from Access Now, have highlighted the importance of this principle to the future of data protection. When minimization is built into the design and construction of digital systems, all the rest of data protection is simplified.

But speaking with privacy professionals, I worry that a focus on minimization can leave us fighting a losing battle. Business interests in collection undergird the functioning of our modern data economy. The drive for innovation and ever more data-hungry applications, including artificial intelligence, pushes inexorably toward the collection of data. Though it is deeply important to begin data protection conversations with this principle, and to balance business interests against the privacy risks to individuals and communities, we need a second tool in our tool belt to ensure responsible use of the data we do collect.

Luckily, there is also the principle of purpose specification. The two principles go hand in hand, since the purposes for which we collect data inform the limits on what we should collect. Once we have the data, the purpose of processing should continue to inform how we use, share and dispose of it. This is, of course, easier said than done. The challenge is most extreme when new technologies are involved. Emerging data-heavy platforms, like connected cars or virtual reality, are popular in part because they facilitate future technical innovations. Even as we start to deploy these technologies, we may not know all the uses and purposes to which they will be put. This can lead to fraught conversations about purpose specification.

Reflecting on these challenges, I think privacy teams should consider building operational and technical tools that allow for a nimble, risk-based and forward-thinking approach to purpose specification. If we build systems and procedures in a way that allows us to continue tracking the uses to which data is put, we can continually refine our thinking on new risks and protections. This is different from kicking the can down the road. Instead, a nimble approach to privacy means embracing the fact that systems should allow for ongoing refinements to purpose and use specifications. Building databases, procedures and contracts in a way that allows for future restrictions will help us prepare for the next wave of privacy regulation — and for the privacy-protective norms that are already emerging.
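To make this concrete, here is a minimal sketch of what purpose-aware tooling could look like. It is written in Python with entirely hypothetical data categories, purpose names and functions; the point is simply that approved purposes are stored as data alongside a use log, so they can be narrowed later as norms and regulations evolve, rather than being baked into application logic.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical purpose registry: each data category maps to the purposes
# currently approved for it. The names are illustrative only; the point is
# that purposes live in data, not in hard-coded assumptions, so they can
# be narrowed later without re-architecting the system.
PURPOSE_REGISTRY: dict[str, set[str]] = {
    "vehicle_telemetry": {"safety_diagnostics", "product_improvement"},
    "headset_motion_data": {"rendering", "comfort_calibration"},
}


@dataclass
class AccessRecord:
    """Log entry created each time data is used for a stated purpose."""
    data_category: str
    purpose: str
    timestamp: datetime = field(default_factory=datetime.utcnow)


ACCESS_LOG: list[AccessRecord] = []


def use_data(data_category: str, purpose: str) -> bool:
    """Check a proposed use against the registry; log it only if allowed."""
    allowed = purpose in PURPOSE_REGISTRY.get(data_category, set())
    if allowed:
        ACCESS_LOG.append(AccessRecord(data_category, purpose))
    return allowed


def restrict_purpose(data_category: str, purpose: str) -> None:
    """Tighten the registry when norms, contracts or regulations change."""
    PURPOSE_REGISTRY.get(data_category, set()).discard(purpose)


# Example: a new restriction bars using telemetry for product improvement.
restrict_purpose("vehicle_telemetry", "product_improvement")
assert use_data("vehicle_telemetry", "safety_diagnostics")
assert not use_data("vehicle_telemetry", "product_improvement")
```

A real deployment would hang this logic off a data catalog, consent-management platform or access-control layer, but even a toy version shows the design choice that matters: future restrictions can be applied by editing the registry, not by rebuilding the system.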

Here's what else I’m thinking about:

  • With one month to go, some comments are already trickling in for the FTC’s ANPR. You can submit comments and read others’ comments at Regulations.gov.

  • FTC Commissioner Alvaro Bedoya spoke at the National Advertising Division Annual Conference, highlighting his belief that privacy is not a luxury, but a necessity. His full remarks are worthy of a close read.

  • Meta’s Nick Clegg wrote a post on Medium on how data localization requirements are fracturing the modern internet. Clegg pushes back on the common metaphor of data as oil.

  • The Metaverse Standards Forum published an update on its work to build technical interoperability into virtual and augmented reality platforms. On the policy side, the XR Association published a list of “basic beliefs” to inform privacy best practices among its members.

  • Under Scrutiny:
    • So-called shameware apps are the subject of a disturbing report in Wired on the practice of certain religious congregations inspecting the online activities of their members.
    • Clearview AI, the facial recognition service built using online photos, is the subject of a New York Times report about new uses of the system by public defenders.   
  • Privacy people on the move:
    • Josh Harris has moved from BBB National Programs to serve as Senior Privacy and Data Protection Counsel at Twitter.
  • Upcoming happenings:
    • Sept. 28 at 2 p.m. EDT, IAPP hosts a Diversity in Privacy Section Mentor Q&A (virtual).
    • Oct. 3 at noon EDT, the Georgetown Institute for Technology Law and Policy hosts a conversation with Linnet Taylor on Just Data Governance (500 First).
    • Oct. 6 at 5:30 p.m. EDT, the Center for Democracy and Technology hosts its annual Tech Prom (The Anthem).
    • Oct. 11 at 9 a.m. EDT, the Xterra Institute and the Georgetown Massive Data Institute host a forum on Privacy Preserving Technologies and AI in Education (500 First).

Please send feedback, updates and purpose solutions to cobun@iapp.org.