OPINION

NetChoice v. Fitch: Round two and consequences for online anonymity

NetChoice’s second challenge to Mississippi’s child age verification law highlights the tension between child‑safety mandates and online anonymity.


Contributors:

Alexander Kleinman

CIPP/US

Attorney

Intelligence Federal

Editor's note

The IAPP is policy neutral. We publish contributed opinion pieces to enable our members to hear a broad spectrum of views in our domains. 

In early February, the U.S. Court of Appeals for the Fifth Circuit heard oral arguments in NetChoice v. Fitch. Again. This is the second time NetChoice's challenge to Mississippi's child age-verification law has reached the New Orleans-based appeals court. It returned after Mississippi appealed a federal district court's injunction, issued on remand, against enforcing the law. This circuitous path reflects the case's increasingly complicated procedural history.

Underlying the second round of oral argument is a tension familiar to privacy professionals since long before the initial challenge in 2024. The dispute turns on how the Constitution treats laws that require platforms to verify users' identities to protect minors, potentially at the expense of anonymity. The Fifth Circuit and other federal courts appear inclined to hold this and other state age-verification requirements constitutional. Privacy advocates worry that accepting this premise may gradually erode online anonymity.

The Mississippi Social Media Safety Act requires covered social-media platforms to verify users' ages and implement measures designed to reduce minors' exposure to harmful online interactions. The state argues the law is necessary to address the harms associated with social-media use by minors, including sextortion, and is "narrowly tailored to the State’s powerful interest in protecting children from predators." On the law's privacy implications, NetChoice responds that "there is no age-verification system that is not also a deanonymization and identity-verification system" and warns the law would therefore "all but kill anonymous speech" on affected platforms.

During the oral argument, Judge Cory Wilson suggested the statute may not regulate speech in the same way as earlier social-media laws. He observed that the case appeared "very different" from earlier litigation. He noted earlier cases involved content moderation, while Mississippi's law is "about protecting children from predators ... (through) interactive messaging features." 

This statement embodies the broader shift in social-media litigation following the U.S. Supreme Court's decision in Moody v. NetChoice. In Moody, the court instructed lower courts to conduct a more careful analysis of how platform laws operate in practice. NetChoice argues that Mississippi's statute sweeps in enormous volumes of lawful expression because social-media platforms host "a 'staggering amount' of fully protected speech, across billions of posts or videos." Mississippi largely sidesteps that point, arguing instead that the regulation is justified by the state's compelling interest in protecting minors, particularly given the "unprecedented dangers" social-media platforms pose to young users.

For privacy professionals, this case's biggest consequence is the infrastructure required to enforce the statute. In practice, age verification means building some form of identity-verification infrastructure. NetChoice argues this would have direct consequences for anonymous participation online because users would be forced to "forgo the anonymity otherwise available on the internet."

The Supreme Court has long recognized the constitutional significance of anonymity, noting speakers often rely on it to express controversial or unpopular views. Age-verification mandates complicate that principle. Even when such laws are designed to protect minors, they often require platforms to build identity-verification systems that fundamentally change how any user accesses a platform. These systems may include government identification checks, payment credentials, biometric age estimation or third-party verification vendors. And even if platforms do not retain identifying information, the verification process itself creates identity signals that erode anonymity.

The Fifth Circuit's analysis of the Mississippi law will also unfold in the shadow of another recent Supreme Court decision: Free Speech Coalition v. Paxton. In that case, the court upheld Texas' age-verification requirement for pornography websites, suggesting that age-verification mechanisms may be treated as access controls rather than direct speech restrictions. Mississippi relies heavily on that reasoning in defending its own statute. The state argues that liability stems not from the speech hosted on a platform, but from the platform's failure to comply with statutory safeguards. As Mississippi puts it, the law "imposes liability for failing to adopt a strategy — not for harm from third-party content."

Thus, round two of Fitch may represent an early example of this new, reframed inquiry. Instead of asking whether the government is regulating speech directly, the Fifth Circuit could instead hold the law is regulating the design of digital platforms that facilitate interactions among users. If courts adopt that framework more broadly — beyond the pornography context — age-verification requirements could become easier for states to defend, even when applied to speech activities on social media platforms.

Fitch illustrates the tradeoff. States increasingly view social-media regulation as necessary to protect minors from harmful online interactions. Protecting minors may require platforms to determine who their users are. Yet requiring users to identify themselves may undermine anonymity. The Fifth Circuit has not yet issued a decision in this case. But the direction of the court's questioning suggests it may uphold the Mississippi law under Paxton. As the Fifth Circuit and other federal courts continue interpreting the Supreme Court's recent platform-regulation decisions, the most important question may not be whether governments can regulate social media at all. Instead, the question may be whether the infrastructure created to protect minors will gradually transform the internet from a largely anonymous space into one where persistent identity verification becomes routine. 



Tags:

Children’s privacy and safety, U.S. state regulation, Law and regulation, Privacy
