Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.
If you are reading this, you probably care about children's privacy and safety. If you happen to be a U.S.-based policymaker, you almost certainly care about the safety of young people online.
At the federal level, the safety of children and teens online has been one of the most discussed tech policy issues of the past decade. In congressional committees with oversight over commercial and criminal law, dozens of hearings explored everything from the harmful impact of social media to the Children's Online Privacy Protection Act to the ever-spreading scourge of nonconsensual pornography.
Last week, in fact, the House Committee on Energy and Commerce used its first hearing of the year to restart the conversation about the myriad risks young people face on the internet. This time — when not focused instead on the contested removal of U.S. Federal Trade Commission commissioners — witnesses described the growing and sometimes life-threatening risks posed by sextortion and deepfakes. These risks are the subject of the Take It Down Act, which aims to require platforms to remove nonconsensual intimate visual depictions, including deepfakes.
Despite all the prolonged attention, federal policymakers have not been able to muster the momentum to pass youth privacy or safety laws in both chambers.
Notably, last term the Take It Down Act passed the Senate, as did the Kids Online Safety Act and the bill known as COPPA 2.0. None has made progress in the House beyond the committee level, though the recent Energy and Commerce hearing will set the stage for future progress and the new chair of the committee wants to see progress on youth protections "sooner rather than later," according to Politico, even as the committee starts from scratch on data privacy.
Incidentally, the 2024 refusal from House leadership to bring KOSA to the floor was rooted in concerns over the bill's free speech implications.
Meanwhile, state legislatures remain active and engaged on these same issues.
Over the past few years, it has been difficult to keep track of the waves of legislation at the state level. Whether in the form of age-appropriate design codes, social media bans or age verification laws, or some combination of these ideas, states have been acting.
Just as swiftly, industry groups — particularly NetChoice — mounted legal challenges to these same laws, invariably on free speech grounds. This week saw yet another win for NetChoice when the U.S. District Court for the Western District of Arkansas declared a new age-verification law, Arkansas Act 689, to be unconstitutional, entirely invalidating the law before it went into effect.
Judge Timothy Brooks penned the opinion in a precise style with a confident handle on the First Amendment. It serves as a helpful refresher on the many landmines policymakers might encounter when constructing these sorts of laws.
Let us look at those lessons, in turn.
Everyone has free speech rights
In the U.S., the First Amendment provides broad protections to everyone, from non-Americans to corporations to, indeed, children.
Many of the court decisions analyzing challenges to kids' safety laws have found government restrictions to unconstitutionally impede the rights of platforms to host and disseminate speech content. Examining the Arkansas age verification law, Judge Brooks is far more concerned about the rights of users, whether adults or children.
Courts have repeatedly upheld the free speech rights of children, even in the context of the schoolyard, where compelling state interests, such as maintaining order and protecting minors from harm, can provide the government with unusually strong powers to limit speech.
The context is different when regulating online speech. Given that the internet broadly, and social media platforms specifically, have become the primary means by which people access information and engage in discourse in the digital age, even well-intentioned rules can inadvertently impose sweeping restrictions on lawful speech.
The court points to an earlier preliminary injunction order in disposing of questions of NetChoice's standing to bring the lawsuit, not only on behalf of its member businesses — so-called "associational standing" — but also on behalf of the users of members' platforms. This second category, "third-party standing," is important here because the rights of users of all ages determine the outcome of this case.
Access to information is a protected right
The First Amendment protects an important bundle of rights, including the right to free speech but also the corollary right to access information.
The latter right is most directly burdened by the invalidated Arkansas law, according to the judge. "Act 689 forecloses access to social media for those minors whose parents do not consent to the minor's use of social media. It also burdens social media access for all Arkansans — both adults and minors whose parents would allow them to use social media."
The court reminds us that, from a constitutional perspective, age verification requirements are more than just a hassle. Citing American Booksellers Foundation v. Dean, Judge Brooks highlights the fact that they can cause website visitors to "forgo the anonymity otherwise available on the internet," which chills access to information.
Overall, the case is a good reminder that the rights of users could be just as determinative in invalidating youth safety laws as the rights of platforms to host speech. Restrictions on access to information must be narrowly tailored.
'Harmful to minors' is not a type of unprotected speech
In its brief, the Arkansas government argues that social media users engage in both protected activities and those "not protected by the First Amendment, including criminal conduct and that which is harmful to minors."
This does not sit nicely with Judge Brooks, who uses it as a teaching moment to remind us of the very narrow types of speech that are not protected by the First Amendment. These "traditional limitations" are deeply enshrined in how courts approach free speech questions.
As the court analyzes, "It is, however, true that despite these entities' efforts to self-regulate, social media users of any age may still encounter some speech online that is not entitled to constitutional protection, including real threats, child pornography, obscenity, defamation, fighting words, or speech integral to criminal conduct. In addition, minors may encounter speech that is constitutionally protected as to adults, but not as to minors, and some of that speech may be harmful to them."
Future legislatures would do well to remember these categories of less-protected speech when constructing narrowly tailored laws to protect minors on the internet.
Making different rules for different platforms raises a red flag
Probably the most consequential flaw in the Arkansas law was its complicated scoping provision, which included and exempted various social media and communication platforms in ways that unavoidably raised issues around content, at least according to the judge.
Dear policymaker, I beseech you. Though many stakeholders will ask that you exempt them from your youth safety law in various ways, and you may be tempted to do so, drafting complex scoping requirements into your online safety law is not a sustainable strategy.
Rules that distinguish between different types of platforms could be seen as speaker-based distinctions, and those that distinguish between different forms or types of media are likely to be seen by courts as content-based restrictions.
Either one renders a law subject to enhanced judicial scrutiny, requiring its limitations to be narrowly tailored. For example, singling out "short video clips of dancing, voice-overs, or other acts of entertainment in which the primary purpose is not educational or informative," for inclusion or exclusion in a bill will raise the First Amendment hackles of most judges.
Judge Brooks leaves us with a helpful rule of thumb, similar to the scope-based determination we saw in the recent California AADC NetChoice decision. If you "cannot determine whether the website is regulated without looking to the content posted on that website," the regulation creates a content-based restriction and will be scrutinized as such.
This proved another fatal flaw in the Arkansas case. Future efforts to protect youth online will need to navigate these lessons carefully, or they will meet with the same fate.
Please send feedback, updates and other protected speech to cobun@iapp.org.
Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director, Washington, D.C., for the IAPP.
This article originally appeared in The Daily Dashboard and U.S. Privacy Digest, free weekly IAPP newsletters. Subscriptions to this and other IAPP newsletters can be found here.