Policymakers, industry members and parents continue to focus their attention on strengthening online protections for children and teens. Although federal legislators have so far been unsuccessful in updating U.S. children's privacy laws, states have taken the initiative since 2022 to create their own youth online safety frameworks, with a few taking effect this year.
After California adopted its Age-Appropriate Design Code Act in September 2022, many states followed suit, introducing or enacting similar rules designed to enshrine default privacy protections and enhance safety measures for minors. However, California's AADCA was quickly met with constitutional challenges, and its data protection impact assessment, or DPIA, requirement was struck down on free speech grounds in August.
Undeterred, other states, including Maryland and Connecticut, passed children's privacy laws that took effect this year without facing facial challenges.
New rules in the Old Line State
Maryland's Age-Appropriate Design Code Act, also known as the Maryland Kids Code, is one of the more notable children's online privacy laws passed in 2024. Although Maryland legislators took inspiration from California's AADCA, the two laws differ in ways meant to remedy some of the more constitutionally precarious provisions, such as the DPIA requirement.
The first distinction between the Maryland Kids Code and California's AADCA is the Kids Code's definition of the "best interests of children" standard. California's AADCA mentions the best interests of children but does not specify how the standard applies. The Maryland definition creates a duty of care framework entities must follow when considering whether their services pose a risk to children who may access their products.
The best interests framework prohibits businesses from designing their products or using children's personal data in ways that could cause "(I) reasonably foreseeable and material physical or financial harm to … children; (II) severe and reasonably foreseeable psychological or emotional harm to … children; (III) a highly offensive intrusion on … children's reasonable expectation of privacy; or (IV) discrimination against … children based on race, color, religion, national origin, disability, gender identity, sex, or sexual orientation."
DPIAs are still required under the Kids Code. However, the law does not explicitly require entities to disclose in their DPIAs what potentially harmful content may exist on their products. The provision instead requires entities to assess their data management practices, such as the steps they have taken and will take to protect minors, using the best interests of children framework. Entities must include these findings in their DPIAs and submit the reports to the state attorney general upon request.
Critics of the Kids Code believe these distinctions are not sufficient to survive free speech challenges. They argue the DPIA provision still impermissibly compels and regulates speech, even without explicitly mandating disclosures of potentially harmful content in the DPIAs. Maryland Attorney General Anthony Brown raised similar concerns in a letter to Gov. Wes Moore prior to the bill's passage, stating a court could reasonably find that these reporting requirements regulate or compel speech.
As discussed in part one, statutes that compel or regulate speech based on its content must meet heightened scrutiny under the First Amendment. The Kids Code requires businesses to use the best interests of children standard to determine what data management practices may pose "reasonably foreseeable" physical, financial, psychological or emotional harm to children who are likely to access the product. This could inherently require entities to disclose what content or conduct they consider harmful when applying the framework. The entities' DPIAs must include these findings and be provided to the attorney general's office upon request.
However, the explicit and granular harm framework is likely to help the Kids Code survive constitutional scrutiny, if the law is ever challenged. The law's explanation of the precise harms it is designed to stop, including privacy harms, makes its requirement to conduct a DPIA significantly more tailored. This addresses the U.S. Court of Appeals for the 9th Circuit's chief concern with the AADCA: that requiring businesses to curtail harms with a "high level of generality" leads to subjective, inconsistent and possibly overly censorious results.
Further, the Kids Code does not require businesses to implement age estimation measures on their products, as the AADCA does. The law still obligates businesses to publish privacy policies but, unlike the AADCA, contains no explicit provision requiring businesses to enforce those policies. It also mandates stricter default settings for minors: entities must provide a high level of privacy by default unless they can show a compelling reason for a change that is in the best interests of children.
Stricter processing requirements under the Kids Code prohibit businesses from processing children's data when it is not in the children's best interest. Processing is broadly defined as "collecting, using, storing, disclosing, analyzing, deleting, or modifying personal data." The processing exception available under the AADCA, which exempts a business if it provides a compelling reason for processing children's personal data, is not available under the Kids Code.
Although the 9th Circuit deferred ruling on the constitutionality of the AADCA's non-DPIA provisions, it hinted that opponents of the law would have a difficult time showing those provisions to be unconstitutional.
The Maryland Kids Code went into effect 1 Oct. Covered businesses operating in the state should be actively working to ensure their products meet this new standard.
Constitutional balance in the Constitution State
Connecticut took an alternative approach to passing youth online privacy legislation by amending its comprehensive state data privacy law to include provisions for individuals under the age of 18. While the youth-specific amendment may have taken inspiration from California's AADCA, the two differ in key respects.
Similar to the Maryland Kids Code, the amendment to the Connecticut Data Privacy Act requires covered entities to create a DPIA, referred to in this instance as a data protection assessment, for each product that is likely to be accessed by children. As in Maryland, covered businesses will not need to assess whether children will be exposed to harmful content while using their products. Instead, the DPIA requirement focuses on the purposes for collecting and using minors' data. This distinction may align with the recommendations of the 9th Circuit, which suggested a narrow DPIA requirement focused on data management practices may be more consistent with the First Amendment.
A new duty of care under Connecticut's privacy law requires businesses to "use reasonable care to avoid any heightened risk of harm to minors" caused by their services or products. This heightened standard of care applies to all users that the business "has actual knowledge, or willfully disregards, are minors." This standard is distinct from the "likely to be accessed by children" standard that applies in California's AADCA and the Maryland Kids Code.
Age estimation measures and stringent default privacy settings are also absent from Connecticut's amendment. Yet in other ways it goes further than California's AADCA and Maryland's Kids Code, placing additional restrictions on social media platforms. The amendment requires social media platforms to provide a heightened level of digital safety for minors and to delete minors' accounts and data upon request.
The youth-specific amendment to Connecticut's privacy law went into effect 1 Oct., making it another statute businesses operating in the state should already have on their radar.
The genie is out of the bottle: More states are tuning in
The Maryland and Connecticut laws remain unchallenged in court, and their potential enforcement in the next year could serve as a blueprint for robust children's online privacy and safety frameworks in the U.S. However, Maryland and Connecticut were not the only states to pass and begin enforcing youth online safety laws this year. Texas passed the Securing Children Online through Parental Empowerment Act in 2023, and after the law survived a court challenge this fall, parts of it went into effect 1 Sept. The Texas attorney general has already filed a lawsuit against TikTok for alleged violations of the SCOPE Act.
Additionally, Florida and New York passed laws regulating children's access to social media platforms and the use of manipulative algorithms targeted at children and teens. Opponents of the Florida law filed a lawsuit 28 Oct. to prevent the law from going into effect 1 Jan. 2025. Utah repealed and repassed similar social media laws earlier this year; however, one of the modified laws was enjoined by a federal court in September. Colorado and Virginia passed youth-specific amendments to their state data privacy laws, following Connecticut's approach.
At the federal level, the U.S. Congress has introduced numerous bipartisan, youth-specific bills this congressional session, and a few retain a lingering possibility of passing before the end of the legislative year.
As state governments continue developing their own children's online safety frameworks and enforcement mechanisms, legislators should remain diligent in addressing First Amendment concerns over content moderation, compelled speech and censorship. To be sure, free speech advocates will continue to bring these challenges as more states propose and pass their own children's privacy frameworks. Eventually, the right balance will take shape. With more state laws taking effect next year, 2025 will be a year to watch for new children's privacy laws, their effective dates and pending litigation.
Kayla Bushey is the Westin Research fellow for the IAPP.