"No child shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence."

So begin the lofty pronouncements of Article 16 of the United Nations Convention on the Rights of the Child. With a tinge of historic irony, it was at the repeated insistence of the U.S. representative that the U.N. included a reference to privacy in the UNCRC, aligning it with the International Covenant on Civil and Political Rights, which limits government interference in "privacy, family, home or correspondence." Although the United States ultimately signed the UNCRC, it is the only U.N. member state that has yet to ratify the convention.

Originally written to protect primarily against government interference, the UNCRC has been re-interpreted in recent years in recognition of the enormous role the digital world plays in the lives of children. In 2021, the relevant U.N. committee released General comment No. 25, adapting the principles of the convention to digital-native generations, who grow up relying on systems designed by private companies. Building on this guidance, the U.K. Information Commissioner’s Office included a "best interests of the child" standard in its Children’s Code, advising that the goal for those implementing the standard in digital systems should be to "ensure the full and effective enjoyment of all rights acknowledged by the UNCRC and the holistic development of the child."

It may be universally acknowledged that children deserve privacy protections vis-à-vis governments and companies, but opinions and cultural norms diverge markedly around the question of minors' rights to have private lives separate from their parents.

This week, Utah’s lawmakers advanced a prime example of a law that privileges the rights of parents over the autonomy of children and teens under age 18. S.B. 152, the Utah Social Media Regulation Act, passed both chambers of Utah’s legislature and awaits only the governor’s signature before becoming law.

The most distinctive section of the bill would require social media companies to provide a means for parents or guardians of Utah minors to access minors’ accounts and "view all posts … and all responses and messages sent to or by the Utah minor account holder." Sixteen- and 17-year-olds would be exempt from this regime only if they are emancipated or married (the latter also subject, under Utah law, to a parent or guardian’s consent).

Other requirements of the bill — not to mention the companion social media design bill that also passed this week — are in line with current trends. By March 2024, the law would require social media platforms to:

  • Verify the age of Utah account holders, subject to future rulemaking by the Utah Division of Consumer Protection.
  • For minor users:
    • Seek parent or guardian consent before allowing access to the account.
    • Prohibit all advertising, not just targeted advertising.
    • Prohibit targeted recommendations.
    • Impose off-hours by default (10:30 p.m. to 6:30 a.m.), subject to parent modification.

The definition of covered social media platforms is quite narrow in the final law, with more than two dozen exceptions and carve-outs. Nevertheless, it serves as a remarkable expansion of social media regulation at a time when age-appropriate design codes are spreading and proposals to ban social media for minors entirely have garnered attention. At the same time, age verification requirements are being challenged as unconstitutional, most notably in NetChoice’s case to block the California Age Appropriate Design Code, a position that professor Eric Goldman defended in an amicus brief.

All of this further complicates the role of platforms and governments in guiding the growth of individual autonomy for teens who may be at risk of abuse or other harms from their own family, especially if using digital systems to explore their identity. It is worth returning to the U.N.’s General comment No. 25 for its thoughtful guidance on this point:

"Parents’ and caregivers’ monitoring of a child’s digital activity should be proportionate and in accordance with the child’s evolving capacities. … Protecting a child’s privacy in the digital environment may be vital in circumstances where parents or caregivers themselves pose a threat to the child’s safety or where they are in conflict over the child’s care. Such cases may require further intervention, as well as family counselling or other services, to safeguard the child’s right to privacy."

Here's what else I’m thinking about:

  • BetterHelp was implicated in another health-related data enforcement from the U.S. Federal Trade Commission. The case shares many features with the recent GoodRx action, continuing the FTC’s crackdown on sharing health-related data with third parties for the purpose of retargeted advertising. But there are plenty of differences, too, including the lack of a charge under the Health Breach Notification Rule, and the use of consumer redress as the genesis of BetterHelp’s USD7.8 million settlement. One notable factor in the complaint: Because BetterHelp exclusively offers mental health services, the FTC alleges that any disclosure of its users’ information to a third party, even just a hashed email address, revealed their health-related data because it “implicitly identified” the user “as one seeking and/or receiving mental health treatment.”
  • The White House released its long-awaited National Cybersecurity Strategy. The strategy doubles down on the government’s support for the development of privacy-enhancing technologies and digital identity solutions. Also notable is “Strategic Objective 3.1” which seeks to “hold the stewards of our data accountable.” The administration calls for legislative efforts to provide “robust, clear limits on the ability to collect, use, transfer, and maintain personal data and provide strong protections for personal data like geolocation and health information. This legislation should also set national requirements to secure personal data consistent with standards and guidelines developed by NIST.”
  • The House Energy and Commerce Committee set the table for comprehensive privacy discussions. At a hearing hosted by the renamed subcommittee on Innovation, Data and Commerce, witnesses made the case for the importance of a federal comprehensive consumer privacy law. As reflected in their written testimonies, CDT’s Alexandra Reeve Givens, former director of the FTC Bureau of Consumer Protection Jessica Rich, and privacy-enhancing startup founder Graham Mudd all focused lawmakers on the need to continue the work from last term that resulted in the draft American Data Privacy and Protection Act. Given the remarkable support across committee leaders, a re-introduction of an updated version of this bill seems imminent. Meanwhile, in a joint letter, California’s governor, attorney general, and executive director of the California Privacy Protection Agency implored Congress not to preempt state privacy protections through federal law.
  • FTC Commissioner Christine Wilson will resign March 31, after submitting an official letter of resignation to President Biden this week. Although this will not impact the FTC’s ability to do business, it will leave the agency with two vacancies and three commissioners affiliated with the same political party.

Upcoming happenings

Please send feedback, updates and your old AIM messages to cobun@iapp.org.