IAPP Privacy. Security. Risk. 2025

SAN DIEGO

28-31 October


Are Censorship and Content Moderation Laws Outpacing Privacy Laws?

Friday, 31 Oct.

11:00 - 12:00 EDT

Intermediate level

BREAKOUT SESSION | PRIVACY | AI GOVERNANCE | AI LITERACY | AI AND MACHINE LEARNING | CUSTOMER TRUST AND EXPECTATIONS | LAW AND REGULATION | REGULATORY GUIDANCE | SURVEILLANCE

Hilary Lane, CIPP/US, Partner, Privacy & Data Security, Ballard Spahr
Eric Opanga, Lead Counsel, Trust and Safety, Epic Games
Susan Rohol, Partner, Privacy, Cybersecurity, AI, Willkie Farr & Gallagher
Bill Shafton, Vice President, Business and Legal Affairs, Grindr

The FTC and state regulators are taking a closer look at censorship by technology platforms, and several states have passed new laws requiring enhanced transparency about takedown processes, “shadow banning,” and other efforts to deprioritize content via artificial intelligence tools and algorithms. This trend coincides with new obligations in California to build reporting processes for AI sexual digital replicas, cyberbullying, and child sexual abuse material, with substantial penalties for noncompliance and significant class action risk. Several of these new content moderation laws also face litigation challenges on Section 230 and First Amendment grounds. Learn about this emerging trend, the status of the litigation challenges, and what enforcement looks like as state and federal regulators become more invested in regulating user-generated content online.


What you will learn: 

  • Overview of new state laws requiring disclosure of AI tools and other processes for reviewing user-generated content and new laws requiring reporting mechanisms for cyberbullying, child sexual abuse material, and AI sexual digital replicas. 
  • Status of current litigation, including Section 230 and First Amendment arguments.
  • Overview of enforcement risk and penalties related to user-generated content.