One consistent point of consensus among policymakers across the past few rounds of the privacy debate has been the need to better protect young people online. After high-profile hearings revealed the unique harms and risks minors can face when using digital services, especially social media, lawmakers got to work on new ideas for strengthening the responsibility and accountability of online platforms that serve this demographic.

While the Senate proposed initiatives like the Kids Online Safety Act and the so-called Children's Online Privacy Protection Act 2.0, the bipartisan lawmakers behind the American Data Privacy and Protection Act added new rules for young people to their proposal, including data sharing restrictions and a ban on targeted advertising for users under 17. At the state level, California passed the Age-Appropriate Design Code Act, importing the U.K.’s approach to making platforms responsible for youth safety, mental health and privacy.

At first, this year seemed no different. Lawmakers in at least eight states proposed bills to either adopt the AADC approach or more directly add privacy rules for kids and teens. Notably, a pair of bills in Virginia, HB 1688 and SB 1026, would amend the Virginia Consumer Data Protection Act to add verifiable parental consent requirements for users under 18, among other things.

But then Utah entered the chat. At the state level, a pair of bills would up the ante on the top-down approach to governing platforms. The bill proposed in the Utah Senate, SB 152, which recently passed the Business and Labor Committee unanimously, would require social media platforms to provide parents or guardians of minors under 18 with “access to the content and interactions" of their ward’s accounts. Its House companion bill, HB 311, would require platforms to ban users under 16 completely and enforce this with a government ID-based age verification requirement.

Now, U.S. Rep. Chris Stewart, R-Utah, is bringing the Beehive State’s ideas to the federal stage. As The Washington Post reported, the congressman is introducing a bill to entirely ban social media for users under 16. Although the text of the bill is not yet available, a news release from Rep. Stewart says his Social Media Protection Act also:

  1. Makes it the social media platform's responsibility to verify age using methods like ID verification.
  2. Requires social media platforms to establish and maintain reasonable procedures to protect the confidentiality, security and integrity of personal information collected from users and prospective users.
  3. Gives the authority to the state to bring a civil action on behalf of its residents.
  4. Gives parents a private right of action on behalf of their children.
  5. Directs the Federal Trade Commission to prevent any social media platform from violating these regulations, including by implementing fines for violations.

The marketplace of ideas is already bustling with activity around youth privacy protections. Only time will tell which new approaches will become law. But one thing is certain: new laws covering youth privacy will pass in 2023.

Here's what else I’m thinking about:

  • The U.S. FTC announced the first-ever Health Breach Notification Rule enforcement. The claim against GoodRx for failing to alert customers about personal health data exposed to third parties is one of eight counts in a complex complaint about the company’s privacy and advertising practices. Detailed IAPP analysis may come next week — but in addition to a $1.5 million fine, the proposed consent order would ban the company from all advertising practices that could expose personal health information to third parties. One major takeaway for privacy pros: sharing of health-related personal information is not allowed for advertising purposes, even for retargeting, without explicit consent.
  • A surprise privacy theme emerged at a hearing on U.S. economic competitiveness with China. Privacy came up over and over in the first hearing of the term of the Innovation, Data, and Commerce Subcommittee of the House Energy and Commerce Committee. Comprehensive privacy rules are, in part, being presented as a competitiveness issue in policy discussions this term. Remarks from the four committee and subcommittee leaders, plus many of the other members on both sides of the aisle, called for the prioritization of passing the ADPPA. Testimony at the hearing from Samm Sacks of New America highlighted how “inaction by the United States means ceding leadership and influence in setting international standards to both Europe and China,” while R Street Institute’s Brandon Pugh said “failing to act on federal data privacy and security legislation would ignore the broader risks posed by data and leave threats from China and other malicious actors unmitigated.”
  • Speaking of China, TikTok scrutiny continues, even as details emerge about its “Project Texas” plans to appease the Committee on Foreign Investment in the U.S., including through a unique arrangement for Oracle to inspect the app’s U.S. data flows. Sen. Michael Bennet, D-Colo., wrote a letter asking Google and Apple to remove the app from their mobile app stores. For a clear-eyed analysis of TikTok’s practices and their potential impact on U.S. national security, see this report from the Georgia Institute of Technology School of Public Policy Internet Governance Project.

Upcoming happenings

Please send feedback, updates, and photo IDs to cobun@iapp.org.