Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.

While privacy is undoubtedly central to the conversation around age assurance, parents, technologists, youth, lawmakers and educators all bring unique perspectives to discussions on how to best protect young people online.

The age assurance issue requires more than technical fixes — it requires listening to young users, thoughtful questioning, and the design of secure, understandable, protective and empowering platforms.

As one privacy regulator noted, age assurance is a tool, not an end in itself, for positive online youth experiences. At its best, it protects; at its worst, it bars young people from essential information, expression and connection.

The legislative landscape for age assurance across North America, Europe, the Middle East and Africa, Latin America, and Asia-Pacific is shifting — with evolving regulatory efforts impacting technical and policy questions.

North America

While the United States lacks a comprehensive consumer privacy law, the recently proposed American Privacy Rights Act would have prohibited the transfer of a minor's sensitive information without affirmative, express consent and, through later-added provisions, would have prohibited targeted advertising to covered minors.

The APRA's knowledge standard would have applied when a covered entity had knowledge fairly implied based on objective circumstances that an individual is a child, teen or covered minor. As such, the APRA was not necessarily an age verification bill, but it would have directed the U.S. Federal Trade Commission to evaluate the feasibility of a common verifiable consent mechanism.

With the APRA's future unclear, the authors of the Kids Online Safety Act have attempted to fill the void. While the KOSA's sponsor denies that the bill is an age verification measure, it would indisputably give federal agencies the ability to examine the feasibility of age verification systems.

While federal laws are pending, states have stepped into the child privacy and age verification fray, with social media and children's privacy laws enacted in dozens of states, including Georgia and New York.

Meanwhile, age-appropriate design laws in California, Maryland and Mississippi have faced intensive litigation over their constitutionality. The U.S. Supreme Court, which has seen several constitutional challenges to age assurance laws, recently upheld an age verification law for websites hosting "obscene" content, but social media age verification laws continue to face legal challenges. Briefs to the Supreme Court, including a July 2025 brief from LGBT Tech, have noted the impacts that restrictive age verification laws, like Mississippi's, would have on LGBTQ+ youth in particular.

Similar to the U.S., Canada does not yet have a comprehensive legal framework mandating age verification for children across digital services, but pressure is mounting. The Protecting Young Persons from Exposure to Pornography Act is a key legislative development that would make it an offense to make sexually explicit material available to minors online without effective age verification.

The bill explicitly contemplates the use of privacy-preserving technologies. However, it has raised significant privacy concerns, specifically around mandatory identity or biometric verification, the potential for data breaches, and the chilling effect on the operations of mainstream services.

In parallel, the Office of the Privacy Commissioner of Canada launched a consultation in 2024 on the use of age assurance technologies and endorsed global best practices focused on proportionality, data minimization and privacy-by-design. The OPC continues to call for a balanced approach, particularly for services "likely to be accessed by children," that ensures meaningful protection without compromising children's or adults' digital rights.

The OPC also emphasized that meaningful engagement with children, parents, educators and children's rights advocates remains paramount to reflect lived experience. However, the OPC highlights key questions that remain unresolved — namely, who should bear responsibility for age assurance? Proposed models for assigning responsibility vary, including at the individual device level, the website or online service level, and the app store level.

While age assurance laws are still emerging in Canada, it has become clear that engagement with impacted communities remains a critical next step in crafting effective and proportionate frameworks.

Europe, the Middle East and Africa

The European Union's Digital Services Act requires online platforms to implement age assurance mechanisms for content inappropriate for minors or where national laws set age limits, such as for social media platforms.

The European Commission recently released guidelines and a blueprint for an age verification mechanism, designed to be compatible with future EU Digital Identity Wallets.

Individual member states, in addition to implementing EU law, are actively pursuing their own legislation. Italy's Decree-law n.123 mandates "double anonymity" age verification for adult content. France's standard for age verification systems to access pornographic sites requires the same, building on recommendations from its data protection authority, the Commission Nationale de l'Informatique et des Libertés, for balancing privacy and the protection of minors with online age verification. In an effort to make progress on this complex issue, the CNIL partnered with a researcher to complete an exploratory study for more "privacy-friendly" age assurance that uses zero-knowledge proofs. Meanwhile, Spain's new draft organic law to protect minors in digital environments includes an obligation for manufacturers of mobile devices to have effective, free and accessible parental control systems.
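The "double anonymity" model can be made concrete with a short sketch: an age-verification provider attests that a user is over the required age without learning which site the user intends to visit, and the site verifies the attestation without learning the user's identity. The Python sketch below is illustrative only; the provider, token format and function names are hypothetical, and it uses a shared-key HMAC purely to stay dependency-free where real deployments would rely on asymmetric signatures or the zero-knowledge proofs the CNIL study explores.

```python
# Illustrative sketch of a "double anonymity" age attestation flow.
# All names here are hypothetical; real systems use asymmetric
# signatures or zero-knowledge proofs, not a shared HMAC key.
import hashlib
import hmac
import json
import secrets
import time

# Held by the (hypothetical) age-verification provider. In a true
# double-anonymity deployment the site would hold only the provider's
# public key, so it could verify tokens but never mint them, and
# tokens would be unlinkable so the provider cannot tell which site
# a given token was redeemed at.
PROVIDER_KEY = secrets.token_bytes(32)

def issue_age_token(user_is_over_18: bool) -> dict:
    """Provider checks the user's age (e.g., against an ID document),
    then issues a token carrying ONLY the boolean claim -- no name,
    no birth date, and no record of the destination site."""
    claim = {
        "over_18": user_is_over_18,
        "nonce": secrets.token_hex(8),          # prevents token reuse
        "expires": int(time.time()) + 300,      # short-lived: 5 minutes
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def site_accepts(token: dict) -> bool:
    """The site verifies the tag and expiry. It learns that the user
    is over 18 -- and nothing else about them."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False
    return token["claim"]["over_18"] and token["claim"]["expires"] > time.time()

token = issue_age_token(user_is_over_18=True)
print(site_accepts(token))  # True: age attested, identity never shared
```

The design choice the model encodes is the separation of knowledge: the party that knows who the user is never learns where they go, and the party that knows where they go never learns who they are.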

The U.K. Information Commissioner's Office has advocated for collaborative approaches, encouraging engagement with civil society, children's development experts and technologists to inform evolving expectations around age-appropriate design. The U.K.'s Age Appropriate Design Code, which sets standards for how the General Data Protection Regulation applies in the context of children's use of digital services, similarly prioritizes input from young users, recognizing their capacity to articulate the privacy and safety outcomes they value most.

Additionally, the U.K.'s Online Safety Act introduces new regulatory obligations for platforms to assess and mitigate risks to children, including through proportionate and privacy-preserving age assurance mechanisms. The Office of Communications, commonly known as Ofcom, is responsible for enforcing these obligations and has issued detailed guidance requiring platforms — particularly those hosting pornographic content — to implement highly effective age assurance and conduct children's access assessments to reduce the risk of harm.

Across Africa, countries are beginning to introduce child data protection measures, though most frameworks emphasize parental control more than children's autonomy. Kenya's draft Industry Guidelines for Children's Online Protection and Safety gesture towards age-appropriate design principles but lack clear consent rules for minors and detailed industry requirements to enforce. Nigeria and Rwanda have general data protection laws that require parental consent for processing children's data, but neither framework acknowledges children's agency or their differentiated needs by age.

Latin America

Established in 1990, Brazil's Statute of the Child and Adolescent — also known as ECA — provides a comprehensive framework for the full protection of children and adolescents. While not explicitly detailing specific online age verification methods, ECA emphasizes children's fundamental rights, dignity and protection from all forms of exploitation and violence, including in the digital environment. This overarching principle places a significant responsibility on service providers to ensure content and services are age-appropriate and to prevent harm.

Brazil's General Data Protection Law also requires that platforms obtain explicit parental or guardian consent before processing the personal data of minors under 18. Recently, members of the Brazilian National Data Protection Authority participated in the Global Age Assurance Standards Summit 2025, reinforcing the DPA's commitment to finding workable age assurance methods.

Asia-Pacific

The Asia-Pacific region is adopting a variety of digital regulation approaches. Australia is among the most active jurisdictions in regulatory efforts around age assurance.

In 2023, Australia's eSafety Commissioner submitted a comprehensive roadmap on age verification. To study the effectiveness of age assurance methods across a variety of deployment contexts, the Australian government commissioned an age assurance technology trial in 2024. Australia's Online Safety Act, which places the responsibility on social media platforms to verify users' ages and restrict those under 16 from accessing their services, will go into effect 10 Dec. 2025.

Meanwhile, South Korea's Youth Protection Act focuses on instituting protective measures around "environments harmful to juveniles."

Vietnam's updated Law on Protection of Consumer Rights 2023 and Decree 147 — replacing Decree 72 — introduce stricter regulations for online services, including content moderation and child protection; both largely prohibit cross-border online gaming without a local entity.

India's Digital Personal Data Protection Act, which is in effect but not fully enforced, establishes a comprehensive framework for digital personal data processing, requiring parental consent for the processing of data from minors up to age 18.

Lessons from around the globe

Current proposals for age verification vary significantly in scope and design, with legislators often focusing on environments such as social media platforms, gaming spaces and adult content services. These are dynamic spaces where users interact, create and consume content.

While the intention to protect is well-founded, the technical specifications included in many legislative efforts often lack definition: they rely on evolving, highly complex technologies and would benefit from greater precision and additional input on technical feasibility.

Discussions around these questions have taken place between industry leaders, regulators and other stakeholders. For instance, the Centre for Information Policy Leadership and the WeProtect Global Alliance have convened a number of meetings as a part of their Multi-Stakeholder Dialogue on Age Assurance, bringing together industry, regulatory bodies, NGOs and more.

Broadly crafted age verification measures can affect users' ability to access content and may have significant implications for the privacy of all users, not just youth. The debate around children's access to digital spaces is quickly evolving, and organizations offering youth-accessible content should prepare for greater legal and regulatory scrutiny.

Melanie Selvadurai, CIPP/C, CIPM, is privacy program manager, privacy risk at TikTok; Katelyn Ringrose, CIPP/E, CIPP/US, CIPM, FIP, is privacy and security attorney at McDermott Will & Schulte; Bailey Sanchez, CIPP/E, is deputy director of the Future of Privacy Forum's U.S. Legislation Team; and Basia Walczak is privacy and product counsel at Trulioo.