As always, there was an abundance of high-quality original research on display at this year's PrivacyCon. Hosted by the U.S. Federal Trade Commission, the annual workshop provides a platform for privacy researchers to speak directly to FTC staff and the broader community. The seventh annual iteration of the gathering showcased work exploring a wide range of open privacy questions.

One noteworthy study presented a network-layer investigation of the 150 most popular applications built for Meta's Oculus platform, mapping each app's data flows against its privacy policy. Though the researchers found some under-disclosure of shared personal data, I was most surprised by one finding: more than 25% of the apps reviewed in the study allegedly had no posted privacy policy at all.

The idea that public privacy policies provide meaningful "notice" is probably a legal fiction, but it should go without saying that writing and posting a privacy notice is also a foundational component of any privacy program. As many before me have pointed out, privacy notices are written more for regulators than consumers — although great layered notices can provide value to users too. Privacy policies are particularly valuable to regulators in regions like the United States, where privacy enforcement is driven by consumer protection law. If you make no privacy promises, you can't break them.

As early as 2000, the FTC required a company to post an online privacy policy as a remedial action in its Rennert enforcement action. And since 2003, California law has required that all "commercial websites or online services" post privacy policies. Best practices followed this legal scrutiny, and privacy policies are now the default place to find any mandatory or voluntary notice about personal data practices.

That said, as with many compliance goals, it is dangerous to focus on privacy policies as an end in themselves. When approached as an authentic process, writing a privacy policy requires an organization to reflect on its data practices and goals. When approached as a box-ticking exercise, privacy policy writing often creates inaccurate or insufficient disclosures. 

Platforms like Google and Apple seem to have learned that it's not enough to require third-party developers to link to any old privacy policy. These dominant mobile app stores have embraced robust review processes and extensive developer documentation, including resources to help developers understand privacy best practices (Apple) and how to implement them (Google). Over the past year, as both platforms have begun to implement "nutrition label" type privacy notices, their developer policies have become even more strict, requiring not just privacy policies, but also standardized disclosures of data practices.

The evolution of these efforts showcases the responsibility that platforms share for ensuring a trustworthy marketplace and an app experience that meets users' privacy expectations. Over time, perhaps we'll see more convergence in platform efforts to educate and enforce.

Here's what else I'm thinking about:

  • ADPPA Episode IV: A New Hope. "My boss is committed to getting things done by end of the year," Timothy Kurth said on stage at the Privacy+Security Academy. Kurth serves as chief counsel in the Subcommittee on Consumer Protection and Commerce at the House Committee on Energy and Commerce. His boss, U.S. Rep. Cathy McMorris Rodgers, R-Wash., has been a major force behind the American Data Privacy and Protection Act. The statement reflects widespread sentiment that movement is possible in the lame-duck session after next week's election, though passage in the Senate remains unlikely. Relatedly, in a new op-ed in the San Francisco Examiner, New America's David Morar calls on Californians to stop opposing the "higher level of civil rights and privacy protections" that the ADPPA would enshrine for "hundreds of millions" of Americans.
  • The FTC continued its end-of-year burst of activity with another data security case. Intriguingly, the Chegg consent order includes a requirement to adhere to a retention schedule with granular, documented purpose limitations, just like the recent Drizly case. But unlike Drizly, there is no requirement that Chegg publish its mandatory retention schedule or share it with the FTC. Perhaps the difference is explained by the fact that Chegg, which among other services provides students with a scholarship search feature that requires the collection of sensitive information, is not accused by the FTC of collecting more information than was necessary for its business. Still, the inclusion of the purpose limitation requirement is a notable trend.
  • This week's View from DC is a little deceptive, but not unfairly so. I'm actually writing to you from Seoul in the Republic of Korea, where I spoke at a workshop on the Cross-Border Privacy Rules system. There is still much to do as the participating economies transition from APEC to the Global CBPR Forum, rebuilding and modernizing the multi-layered accountability system. To that end, there were many productive conversations at the workshop among global regulators, accountability agents and certified companies. Of note: a recent report presented by the Centre for Information Policy Leadership found a 61% overlap between the CBPR program requirements and the U.K. General Data Protection Regulation.

Under scrutiny

  • Algorithmic decision-making deployed by Washington, D.C., is the subject of a comprehensive report from EPIC, highlighted in this Wired article.
  • An emerging culture of community surveillance is described in another Wired piece.
  • The practical impacts of the EU's emphasis on "digital sovereignty" are explored in a report from the Atlantic Council's Europe Center.
  • Uber's policy change allowing marketers to target riders based on destination is covered in this Wall Street Journal analysis.

Upcoming happenings

  • Nov. 8, the U.S. holds its midterm elections, which will shape the next two years of legislative activity in D.C. ¡Vamos a votar!
  • Nov. 9 at noon EST, the FCBA Young Lawyers and Privacy & Data Security Committees host a lunch-and-learn on Location Data 101 (virtual).
  • Nov. 10 at 5 p.m. EST, the Washington, D.C., Boston and Des Moines Knowledge-Nets host a panel titled 2023 Countdown: Are you ready for the CPRA and other state privacy law changes? (virtual).
  • Nov. 10 at 5 p.m. EST, Women in Security & Privacy hosts a presentation on Content-Oblivious Trust and Safety Techniques (virtual).

Please send feedback, updates and privacy nutrition facts to cobun@iapp.org