In August 2024, the U.S. Court of Appeals for the Ninth Circuit issued its decision upholding in part and vacating in part the district court's preliminary injunction that blocked the California Age-Appropriate Design Code Act, known as the AADCA, from going into effect. More specifically, the federal appellate court upheld the preliminary injunction against the law's data protection impact assessment, or DPIA, report requirements on constitutional grounds, while vacating the remainder of the order against the other challenged portions of the law. Both NetChoice and California state officials hailed the mixed decision as a victory, even as questions remain about its practical impact.

This is the latest decision in NetChoice v. Bonta, a case that highlights the friction between free speech principles and youth online privacy and safety laws.

Background

The AADCA was signed into law in September 2022. It was quickly met with constitutional and other legal challenges when NetChoice filed a lawsuit in the Northern District of California a few months later. NetChoice's motion for a preliminary injunction in February 2023 sought to block the law from being enforced as planned beginning in July 2024.

In September 2023, Judge Beth Labson Freeman of the U.S. District Court for the Northern District of California granted the preliminary injunction, previewing the constitutional issues and finding various provisions would likely violate the First Amendment under a form of intermediate scrutiny. In a prior article, the IAPP provided a detailed analysis of the district court decision. For those new to U.S. free speech law, the earlier article also provides a roadmap for how judges approach First Amendment challenges.

The AADCA is one of a handful of similar youth online privacy and safety laws that state policymakers have considered in recent years. Many of those that have become law, including the related wave of social media safety laws, have faced constitutional challenges. Nonetheless, policymakers across the U.S. have continued pursuing the passage of such bills, with states like Minnesota, New Mexico and Vermont introducing them and Maryland enacting its own age-appropriate design code. The Ninth Circuit's decision could serve as a helpful blueprint for legislators considering future youth safety legislation, and it may help answer whether and how newer laws regulating children's online privacy can withstand constitutional scrutiny.

Mandatory assessments for determining harmful content are unconstitutional

The first step in a U.S. free speech challenge is determining whether First Amendment review applies because the law restricts or compels speech. The U.S. Supreme Court has long struck down statutes that compel speech, such as laws requiring compulsory flag salutes in public schools or requiring citizens to display a state motto on their vehicle's license plate. Under this precedent, courts will strike down content-based statutes that compel speech unless they survive strict scrutiny review.

The DPIA report provision in the AADCA requires each covered business to submit a report to the attorney general's office disclosing any potential risks that children would access harmful content while using its products, services or features. Since the law does not define what type of content is considered harmful, each business is required to reach its own conclusions on what could be materially detrimental to children and disclose this to the state in the mandatory report.

The Ninth Circuit found the DPIA report requirement likely triggered First Amendment review because it "clearly compels speech by requiring covered businesses to opine on potential harm to children." In doing so, the court clarified that even nonpublic documents that remain confidential once submitted to the state can count as compelled speech, citing U.S. Supreme Court precedent that the "First Amendment may apply even when the compelled speech need only be disclosed to the government."

The appellate court also reasoned that the DPIA provision triggered First Amendment review because it "deputized covered businesses into serving as censors for the State." The court cited Interstate Circuit v. City of Dallas, a case in which the U.S. Supreme Court applied First Amendment review to a city ordinance that required film exhibitors to propose a classification of whether a movie was "suitable" for children and report it to the government.

Content-based laws are held to a high standard of strict scrutiny

The next step in a free speech case is determining whether the law is content-based or content-neutral. Depending on the type of speech at issue in a First Amendment challenge, the court will apply different levels of scrutiny in deciding whether the regulatory requirement is proportionate to the harm it seeks to address. Upon determining a law is content-based, the court will apply strict scrutiny review.

Although the appellate court agreed with Judge Labson Freeman's conclusion that the DPIA report provision likely violated the First Amendment, it found the lower court applied an incorrect, lower standard of scrutiny because DPIA reporting is not commercial speech. Instead, the court found "in requiring covered businesses to opine on and mitigate the risk that children are exposed to harmful content online," the provision required subjective, content-based speech that triggered the highest level of First Amendment scrutiny.

The DPIA report provision presents a unique mixture of both compelled and restricted speech because it requires certain disclosures, while also requiring businesses to censor content without directly specifying the precise type of content to be restricted. The Ninth Circuit concluded California had created a content-based regulation for both aspects of this requirement. Specifically, the court reasoned, since the DPIA report provision "compels speech with a particular message about controversial issues" and "deputizes private actors into censoring speech based on its content," strict scrutiny was the proper standard to apply.

Requirements to report harmful content policies are not the least restrictive means of protecting children online

To survive the high standard of strict scrutiny, a law must be "narrowly tailored" to meet the state's compelling goals such that it does not unnecessarily restrict speech. There is no dispute that the safety of minors online is a compelling state interest, but the parties strongly disagreed about whether the AADCA's chosen mechanisms are the least restrictive means to achieve this goal.

In applying strict scrutiny in its preliminary assessment of the substance of the case, the appellate court found the DPIA report requirement would likely violate the First Amendment because California did not show that the provision was the least restrictive means of achieving its goal of protecting children online. The court suggested various less restrictive alternatives California lawmakers could have pursued, such as encouraging voluntary content filters, providing digital literacy resources for parents and kids, and relying on existing criminal laws.

The court rejected the state's argument that the required DPIA report achieves its goal by encouraging companies to proactively consider how their data management practices and product designs could negatively impact potential child users by exposing them to harmful content.

The DPIA report provision provided eight factors for businesses to use when creating DPIA reports, with two factors focusing on the businesses' product designs and data management practices. The remaining factors focused on the risks of children accessing and viewing harmful content, as well as their exposure to harmful contacts while using a company's services, products or features.

The court first assumed, for the sake of argument, the state had shown its interest in protecting children online was a compelling goal. However, the court found the DPIA report provision was not narrowly tailored because the factors not focused solely on content harms were "worded at such a high level of generality that [they] provide little help to businesses in identifying which of the practices or designs may actually harm children." Since the majority of the factors focused on disclosing harmful content, the two design and data factors were not sufficient to show the provision was narrowly tailored.

Instead, the Ninth Circuit reasoned the statute creates a disclosure regime for the "forced creation and disclosure of highly subjective opinions about content-related harms." The DPIA provision is thus not the least restrictive means of achieving the state's purported goal of creating a proactive environment for businesses to work with the state to protect kids' online safety.

The court even provided a hypothetical of a narrowly tailored provision, stating legislators could have developed a disclosure regime focused solely on defined data management practices and product designs, without reference to whether child users will be exposed to harmful content while using the company's product. Instead, the court found "the State attempts to indirectly censor material available to children online, by delegating the controversial question of what content may harm children to the companies themselves," and concluded the provision cannot survive strict scrutiny review.

It is important to note the Ninth Circuit went out of its way to distinguish this DPIA requirement from the California Consumer Privacy Act's DPIA requirement. The court characterized the CCPA's DPIA requirement as an obligation to collect, retain and submit factual information to the state, not a compelled determination and disclosure of the type of content hosted on a company's service. The Ninth Circuit appears to conclude DPIAs can be constitutional if they do not require the entity to determine whether content on its service is harmful.

What's left of the AADCA?

Despite the ruling against the DPIA report provision, the AADCA is still alive, at least for now. The Ninth Circuit sent the merits of the remaining First Amendment challenges back to the district court after finding the record insufficient to decide NetChoice's facial challenges.

Now, NetChoice must convince the district court that the remaining portions of the law compel speech in a substantial majority of their applications in order to trigger First Amendment review.

The Ninth Circuit expressed doubt that NetChoice could meet this burden, and even if it were successful, the court further hinted it was not persuaded that the remaining provisions would fail constitutional review. The court reasoned the other portions of the law only compelled speech that would be "purely factual and non-controversial," and therefore could survive First Amendment scrutiny. Nonetheless, the attorney general's office will need to defend the remaining provisions, which include age estimation requirements and prohibitions on the use of dark patterns, from facial attacks once again at the lower court.

However, the Ninth Circuit did not leave the statute unmarred. Upon finding the DPIA provision unconstitutional, the court had to conduct a severability analysis to determine whether the rest of the law would remain in force. Applying California's severability test, it struck down unchallenged portions of the law that explicitly mentioned the DPIA report requirement, because a provision must be invalidated if it does not make grammatical sense without the unconstitutional provision. For all other provisions, the district court will need to decide whether they are severable, that is, whether they can remain in force without the unconstitutional provision.

When will the remaining provisions become enforceable?

In vacating part of the preliminary injunction, the Ninth Circuit opened the door for the attorney general's office to begin enforcing non-DPIA provisions, as the statute was scheduled to go into effect in July 2024. However, NetChoice and the attorney general's office reached an agreement, delaying enforcement until 6 March 2025, according to a motion filed with the court 28 Aug. 2024. The attorney general's office will refrain from retroactively enforcing any of the remaining provisions before this date. NetChoice also included notice of its intent to file an additional motion for a preliminary injunction to prevent the enforcement of the remaining provisions until a decision has been issued on the merits of the case.

Businesses now have a set date to assess whether their platforms, services and features comply with the non-DPIA provisions of the AADCA. With the March 2025 deadline only six months away, businesses operating in California should work diligently to bring their privacy and safety programs up to speed with the law's remaining provisions and similar laws. Meanwhile, the lingering legal questions and NetChoice's previewed new challenges make this docket one to watch for all businesses that must comply with the AADCA before the deadline.

Kayla Bushey is the Westin Research fellow for the IAPP.