The pace of digital change is staggering. Five years ago, children's online privacy was not yet a mainstream issue among regulators or policy makers. Now even the wider world has started to understand the real-life consequences of an internet that was not designed with children in mind.
Young people's personal data has been drawn into the attention economy via the business models of online platforms. Those growing up today face dangers that did not exist when they were born. We need an internet that allows children to safely develop, learn, explore and play in a manner appropriate to their age. The importance of tackling this challenge only grows as artificial intelligence becomes central to shaping the online experience.
The U.K. was first out of the blocks in addressing this challenge with the statutory Age-Appropriate Design Code, which came into full effect in 2021. It was developed by the U.K. data protection regulator, the Information Commissioner's Office.
The U.K.'s breakthrough was followed by California adopting the California Age-Appropriate Design Code Act in 2022. The state's legislators recognized the U.K.'s leadership and expertise in implementation. The legislative findings of the CAADCA stated: "It is the intent of the Legislature that businesses covered by the California Age-Appropriate Design Code may look to guidance and innovation in response to the Age-Appropriate Design Code established in the United Kingdom when developing online services, products, or features likely to be accessed by children."
Today, other countries are taking inspiration from the U.K.'s model and engaging in the global conversation about how best to protect children's digital privacy.
But in September 2023, progress suffered a significant and concerning setback. In NetChoice v. Bonta, the U.S. District Court for the Northern District of California granted a preliminary injunction against the CAADCA on First Amendment grounds. The judgment was sweeping, declaring legal incompatibilities ranging from the age-assurance provisions through to the requirement for impact assessments. Even while putting these impediments in place, the issuing judge accepted "the State's assertion of a concrete harm to children's well-being, i.e., the use of profiling to advertise harmful content to children."
California took inspiration from the original U.K. Age-Appropriate Design Code, so perhaps the state's legal experts can also benefit from our first-hand experience in developing and implementing it. As U.K. Information Commissioner and Deputy Commissioner at the time, we had direct involvement in and oversight of this work at the ICO. Having seen the U.K.'s AADC come into effect, we believe age-appropriate design codes are compatible with freedom of expression, and we want to share why such protections will make the digital realm safer for children. Regulatory design and thoughtful implementation each play a key role in ensuring the right balance is struck between free expression and children's privacy.
In the U.K., we needed to address significant concerns raised by the media about freedom of expression. The media's reaction surprised us, because we had expected resistance to come mostly from the major technology companies. But surprise often attends innovative concepts. So we listened to the concerns and made changes while upholding the essence of the code.
Key components of the U.K. AADC
The requirement for the ICO to prepare the AADC was contained in Section 123 of the Data Protection Act 2018. That piece of legislation completed the U.K.'s implementation of the EU General Data Protection Regulation, filling in where the GDPR allowed for national discretion. After Brexit, the GDPR was retained in U.K. law as the U.K. GDPR.
The requirement to draft the code was added to the Data Protection Act 2018 during its passage through Parliament by Baroness Beeban Kidron, a member of the House of Lords, advocate for children's digital rights, and chair of the NGO 5Rights Foundation, who gathered a powerful cross-party grouping behind her amendment. After a great deal of input and debate in both houses of Parliament, the U.K. government supported the amendment. The U.K. minister for data protection, Margot James, became a vocal supporter of the code.
What were the key requirements in Section 123? The ICO was to draft a code that recognized children have different needs at different ages. Standards in the code had to weigh the best interests of children, considering the U.K.'s obligations under the United Nations Convention on the Rights of the Child. And the AADC would apply to any online service that was "likely to be accessed by children." This last, broad-scope provision was vitally important. It recognized the challenges caused by laws that focused only on services specifically targeted at or explicitly directed to children. The "likely to be accessed" test would ensure the AADC's standards applied wherever children actually went in daily online life. Crucially, under the law, regulated companies had a duty to provide privacy to children by design and default. That obligation recognized that parents and children were outmatched by complex and innovative technological systems. The AADC also allowed companies a great deal of creativity in how to provide privacy.
The code is statutory. This means it was laid before the U.K. Parliament under the negative-resolution procedure. That status also means the ICO, along with courts and tribunals in the U.K., must consider the code's standards when enforcing the GDPR in relation to children and online services.
The AADC itself contains 15 standards. They are complementary, together building a holistic approach to protecting children's privacy online. The standards are proportionate and risk-based, ensuring organizations take responsibility for implementing them in the context of their unique services. While the code is detailed in its guidance, it equally allows online service providers choice in how to develop practical solutions.
Explaining its standards one by one is beyond the scope of this article, but a few are important to note. The first standard focuses on the best interests of the child, which must inform the design and development of online services. While the phrase "best interests" may seem nebulous on paper, in practice a decision taken with children in mind looks quite different from one that considers only commercial outcomes. The second standard concerns data protection impact assessments, the pivotal process that enables companies covered by the code to understand the risks and design their mitigations. Further AADC standards mandate data protection by design and high-privacy default settings.
Importantly, the AADC promotes age assurance, not age verification. It requires online services to take a risk-based approach to recognizing the age of individual users and to apply the code's standards effectively to children and young people. Services can either establish age with a level of certainty commensurate with the risks or apply the standards to all users. The risk-based approach therefore does not require age verification for all online services. In low-risk scenarios, the age assurance provided by simple self-declaration can be appropriate; alternatively, companies can apply all the standards to how they treat every user.
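To make that decision logic concrete, here is a minimal, purely illustrative sketch in Python of how a service might reason through the standard's two options. The risk tiers, names and return values are our own assumptions for illustration, not anything prescribed by the code itself.

```python
from enum import Enum


class Risk(Enum):
    """Hypothetical risk tiers, as might emerge from a service's own
    data protection impact assessment (not defined by the AADC)."""
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


def choose_age_assurance(risk: Risk, apply_standards_to_all: bool) -> str:
    """Sketch of the AADC's risk-based choice: either establish age with
    a level of certainty commensurate with the risk, or simply apply the
    code's standards to every user."""
    if apply_standards_to_all:
        # Option 1: no age assurance needed, because all users are
        # treated as though they may be children.
        return "apply all of the code's standards to every user"
    # Option 2: pick an age-assurance method proportionate to the risk.
    if risk is Risk.LOW:
        # For low-risk services, simple self-declaration can suffice.
        return "simple self-declaration of age"
    if risk is Risk.MEDIUM:
        return "age estimation proportionate to the risk"
    return "higher-certainty age assurance commensurate with high risk"
```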
The AADC and freedom of expression
Freedom of expression is of crucial importance to U.K. society and is enshrined in the U.K. Human Rights Act 1998. As a public body, the ICO must account for the Human Rights Act in all its decisions and dealings. Like the U.S., the U.K. has a long history of jurisprudence related to free expression.
The process for conducting a data protection impact assessment is a flexible one. Assessments can accommodate other rights, including freedom of expression, in the trade-offs that must be made. The AADC explains that when children's interests and commercial value are finely balanced, concern for the child should prevail. But that does not mean freedom of expression cannot be considered as well. An impact assessment can weigh the benefits of using personal data, guided by a risk-based approach toward proportionality.
The U.K. AADC does not require content moderation. The only focus on content relates to profiling using children's personal data. The code restricts how personal data can be used to profile and recommend content to children. It also ensures that recommendations are not detrimental to their health and well-being. The AADC specifically states "the ICO does not regulate content." It also makes clear that the ICO uses its regulatory discretion to defer to other established guidance and expertise when considering harms to well-being linked to profiling. Without objective evidence, the ICO cannot act and the AADC does not enable the office to become a content regulator.
As a public body subject to the Human Rights Act, the ICO is also bound to consider the relevance of freedom of expression when taking any AADC-related enforcement action under the GDPR. The ICO's wider Regulatory Action Policy further ensures the enforcement approach is proportionate to the risk of harm. The code is therefore not an absolute set of standards and allows for a measure of regulatory discretion.
ICO's strategy behind the AADC
The AADC came out of significant consultation and engagement. During development and implementation, we:
- Issued an open call for evidence before drafting the code.
- Released a draft of the AADC for open consultation.
- Held 57 roundtable meetings with stakeholders.
- Socialized and explained the code to major technology firms in Silicon Valley.
Engagement with stakeholders marked the AADC's development. When the media raised concerns about the code's possible impacts on access to their digital content and on their advertising models, we listened carefully. The news media also supported the AADC's aims; one newspaper, The Daily Telegraph, regularly set out its support as part of a wider campaign about children's online safety.
We held many meetings with the U.K. News Media Association. After hearing their concerns, we gave reassurances that the expectations around age assurance would be risk-based and proportionate. Given the importance of protecting free expression, we developed a dedicated set of Media Frequently Asked Questions about the AADC. We also ensured the FAQs would have additional status: they were laid in Parliament alongside the AADC itself, as part of the explanatory memorandum.
Our FAQs sought to explain how the media could implement the code's standards. The document recognized the importance of free expression, analyzed the likely risk profile of the media sector, and suggested that formal age verification should not be necessary.
Our FAQs also explained that the U.K.'s implementation of the EU ePrivacy directive already set rules around consent for cookies, under which nonessential cookies must be off by default. We also referenced the ICO's guidance on data protection and the journalism exemption under the GDPR. That carve-out recognizes a broad definition of freedom of expression, covering citizen journalism and not just mainstream media.
Since the code came into force in 2021, the ICO has continued to support implementation with additional resources and supplementary guidance. Those resources address data protection impact assessment requirements for different sectors, as well as what child-centered digital design means in practice.
International trends in children's privacy and safety
The safety-by-design principles in the U.K.'s AADC align with international instruments, such as the Organisation for Economic Co-operation and Development's "Recommendation of the Council on Children in the Digital Environment." Recall that the U.S. supported its adoption at the OECD Council. Alongside that recommendation, the OECD Guidelines for Digital Service Providers were developed in recognition of the essential role those providers play in ensuring a safe and beneficial digital environment for children.
The guidelines set out the following:
- Take a child safety by design approach when designing or delivering services.
- Ensure effective information provision and transparency through clear, plain and age-appropriate language.
- Establish safeguards and take precautions regarding children's privacy, data protection and the commercial use of such data.
- Demonstrate governance and accountability.
This international direction is complemented by the IEEE Standards Association's IEEE 2089-2021, Standard for an Age Appropriate Digital Services Framework. The standard encourages global adoption of privacy-engineering expectations, which in turn will accelerate practical implementation.
Other jurisdictions are planning to introduce age-appropriate design codes. Australia recently announced privacy law reforms that will include a code similar to the AADC.
Lessons learned and the way forward
After two full years, the initially controversial AADC has not been legally challenged in the U.K. We believe the ICO's policy of engagement, support and guidance created reassurance for companies as they made investments in children's privacy. The AADC is an agile, thoughtful piece of regulation that will deliver effective protections for young people online, who deserve a digital world where they can develop, learn and play.
Regulators must address the dangers to the public and reassure those who fear unintended consequences. Both messages will advance the implementation of age-appropriate design codes in practice. The California AADC envisions establishing a working group charged with advising the legislature on practical implementation, mirroring what the ICO has done in the U.K.
The concern that risk assessments, such as DPIAs, constrain freedom of expression should be challenged. Assessments of risk are fundamental to driving default protections for children. The ICO's October 2023 preliminary enforcement action against a large social media platform highlighted the importance of risk assessments: the company allegedly rolled out generative AI chatbots for children without first conducting an impact assessment. Such enforcement actions do not prescribe solutions or constrain the free choice of technology companies. Rather, they place responsibility on companies to be accountable for how they prevent harms that stem from their use of children's personal data.
Recall that privacy impact assessments originated in North America in the early 2000s. The EU subsequently imported the practice into the GDPR. Today, U.S. companies conduct impact assessments week in and week out as a central component of their corporate governance. Mitigating risk is also a central feature of President Joe Biden's October 2023 Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. The order references both privacy and children, and we struggle to envision its implementation without impact assessments as a regulatory tool.
Privacy-engineering approaches can help provide technical solutions that mitigate the risks for children. In doing so, they need not undermine the fundamental role the internet plays in helping people find and receive information, a role that relies on free expression, including the airing of children's views. We lose when we treat the trade-offs as a zero-sum game.
The U.K. AADC has already made a fundamental difference to how platforms protect children's privacy. While more remains to be done, the following changes were introduced in light of the AADC:
- Facebook and Instagram limited targeting based on age, gender and location for users under 18.
- Instagram launched parental supervision tools, along with new features like "Take A Break" to help children manage their time on the app.
- YouTube turned off autoplay by default, and turned on a "Take A Break" feature and bedtime reminders by default, for Google Accounts owned by those under 18.
- TikTok changed its default privacy setting to private for all registered users ages 13-15. With a private TikTok account, only people the user approves can view their videos.
There is no evidence these changes have undermined freedom of expression. They have, however, made a meaningful difference to children's privacy online. In fact, a major technology company shared with us that the AADC had a greater impact than GDPR enforcement actions, highlighting the code's importance. Furthermore, many U.S. companies are already implementing the code's standards globally, because doing so increases trust and bolsters confidence in their services. U.S. children and families are benefiting from these protections. It would seem appropriate for them to have a voice in the ongoing development and enforcement of an age-appropriate internet.
Freedom of expression is not a trivial matter. Nor is privacy, particularly as it affects children. Along its regulatory journey in the U.K., the AADC established beyond all doubt the close-knit relationship between children's privacy and their safety. Policy makers around the world have gone on record to testify to that link. In the U.S., no one has yet clearly explained how legislation that protects children's privacy is incompatible with First Amendment-protected speech. We hope our lessons from the U.K. can show a path forward for achieving both ends.