Editor's note: The IAPP is policy neutral. We publish contributed opinion pieces to enable our members to hear a broad spectrum of views in our domains.
In policymaking, as in video gaming, blink and you can be eliminated without warning. Like a teenage Fortnite player hot dropping into the most active combat zone on the map, South Carolina's Legislature came into 2026 with a bang, skipping the looting phase and going straight to the passage of a modernized age-appropriate design code held over from the first year of the state's legislative cycle. Despite quiet but concerted industry opposition, Gov. Henry McMaster, R-S.C., swiftly signed the bill known as H.3431 or, somewhat cryptically, the "Age Appropriate Code Design."
With an effective date of 5 Feb., the law offers no warning or cure period for its compliance obligations. Yes, that was last Thursday. Sorry, I don't make the rules. Rather than an on-ramp, this is what our Fortnite friend might call a ramp rush.
Crucially, South Carolina's code is not at all like an AADC of yesteryear. As the Future of Privacy Forum's Daniel Hales writes in his comprehensive analysis of the new law, it could signal a "paradigm shift" toward a "new model for online protections entirely."
This is not one for privacy and safety teams to sleep on, for several reasons. The law:
- Covers all minors under age 18, using a modified and debatable actual knowledge standard.
- Provides for personal liability for officers and employees of covered companies.
- Imposes treble damages for actual financial harms.
- Includes a private right of action for violations of its dark patterns provision.
Some of the substantive updates from other AADCs are tweaks designed to color within the lines of constitutional limits, clarified somewhat in the U.S. Supreme Court's recent NetChoice v. Paxton decision. Ironically, however, these adjustments generally expand, rather than restrict, the scope of the law's requirements. By pivoting away from content-based triggers, which are almost always fatal under the First Amendment, drafters opt for content-neutral obligations like general product design mandates.
Further, some of these design mandates may even apply for all users, not just minors. This includes the law's expansive and indeterminate prohibition on dark patterns and the requirement that covered services provide "a user or visitor," without age qualification, with easily accessible tools to disable design features like infinite scroll, auto-playing videos and gamification, plus a handful of other safety- and privacy-protective tools, including user settings to:
- Implement a usage timer and quiet hours.
- Implement spending caps.
- Block interactions from nonconnected accounts.
- Hide engagement metrics.
- Disable search engine indexing.
- Block other users from viewing connections.
- Restrict geolocation visibility.
- Opt out of personalized recommendation systems, "except for optimizations based on the user's expressed preferences."
For minors, the listed tools must default to the protective setting, but the law explicitly requires that they be available to all users of covered services. And while other listed mandatory tools explicitly reference minors, those listed above refer more generally to users, or do not specify. And though some of these are only applicable to services with social media functions, others could be interpreted to apply much more broadly.
That said, the law's scope of applicability is narrowed in a few ways. It applies to any data controller that conducts business in South Carolina and "owns, operates, controls, or provides an online service" that "is reasonably likely to be accessed by minors." One of three disjunctive scoping provisions must also apply. Adopting a similar structure to many state privacy laws, a company is subject to the law only if it (1) exceeds an inflation-adjusted $25 million gross revenue threshold, (2) "annually buys, receives, sells, or shares the personal data of fifty thousand or more consumers, households, or devices," or (3) derives at least 50% of annual revenue from the sale or sharing of consumers' personal data.
But in another plot twist, as David Stauss, CIPP/E, CIPP/US, CIPT, FIP, and Shelby Dolen, CIPP/US, point out in their analysis: the term "consumer" is undefined in the law and thus could be potentially interpreted to mean any consumers, not just South Carolina residents.
And before you go hanging your hat on the "reasonably likely to be accessed by minors" scoping language, remember this includes all users under 18. Unlike children, teens are everywhere on the internet. What online services are not reasonably likely to be accessed by teenagers?
Beyond the generally applicable obligations, the new South Carolina law includes many recognizable features from prior AADCs. These include a duty of care, though this time with a seemingly strict requirement to "prevent" listed harms to minors; data minimization rules; tools and settings for parental monitoring; and prohibitions on profiling and targeted advertising.
There are also novel audit and public reporting requirements. In lieu of the usual expanded obligations for impact assessments found in most AADCs, the South Carolina law adopts a new rule requiring covered companies to "issue a public report prepared by an independent third-party auditor." The report generally must include an accounting of the company's governance program vis-à-vis this new law, with some interesting twists, including a "description of algorithms used by the covered online service."
Finally, if you are wondering about the private right of action for dark patterns, this comes about by reference to the state's consumer protection statute. So, though the standard limits apply, the definition and application of dark patterns under the law adopt the standard but indeterminate language of other similar laws.
On cue, NetChoice has already sued to block the South Carolina law. However, given the unpredictable nature of these lawsuits and the fact that the law is already in effect, privacy and safety teams would do well to review and fortify their practices.
Please send feedback, updates and loot containers to cobun@iapp.org.
Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director, Washington, D.C., for the IAPP.
This article originally appeared in The Daily Dashboard and U.S. Privacy Digest, free weekly IAPP newsletters. Subscriptions to this and other IAPP newsletters can be found here.

