OPINION

The importance of involving neurodiverse teens and kids in privacy feature testing

For privacy features to be genuinely safe, effective and equitable for children, especially neurodiverse youth, diverse young users must be included in testing and design.


Contributors:

Basia Walczak

Privacy and Product Counsel

(Axiom - Airbnb Secondee)

Melanie Selvadurai

CIPP/C, CIPM

Privacy Program Manager

TikTok Information Technologies UK Limited

Candela Yatche

Psychologist, Research Fellow

Stanford University

Digital technologies have become a central part of children's lives, as social interactions, education, government services and commerce increasingly rely on online systems, shaping how children access relationships, opportunities, services and protections. 

As a result, when considering how children use technology — particularly privacy features — it is essential that products are designed with young users in mind and reflect their full diversity. 

This begins with testing. Testing populations should mirror real-world users so designers and engineers can understand how users actually experience features and refine them accordingly. When neurodiverse users are excluded, designers fall back on untested assumptions about comprehension and intent, creating significant risk, especially as attention to nondiscrimination and support needs continues to grow.

This risk is compounded by the fact that privacy tools actively shape children's emotional experiences, often provoking anxiety, fear of making mistakes, avoidance or over-trust. Recognizing this emotional dimension highlights the critical intersection of mental health, lived experience and design.

From a psychological and developmental perspective, many privacy features implicitly assume levels of attention, abstract reasoning and language processing that do not consistently align with how adolescents engage with digital environments, particularly given the wide variability across neurodivergent profiles.

When neurodiverse young people are left out of testing, their needs and ways of interacting with technology may be overlooked. This is particularly significant for privacy features intended to protect and connect young users and trusted guardians. Without their inclusion, these tools may be less accessible and effective for neurodiverse youth.

Young users often make privacy decisions under cognitive load, emotional activation or social pressure. For some neurodiverse youth, these conditions can intensify challenges in interpreting defaults, understanding consequences or distinguishing between protective and performative privacy features. As a result, a tool may be formally available while functionally inaccessible.

This gap highlights why inclusion in user testing is not merely a matter of representation, but of safety and effectiveness. These insights cannot be assumed by designers; they must emerge through direct observation and participation of neurodiverse young people themselves. Only then can design risks that remain invisible in neurotypical testing populations be meaningfully addressed.

These risks are particularly pronounced for children who already face structural disadvantage, including those with learning disabilities, children from ethnic minority backgrounds, and unaccompanied minors. Moreover, neurodiversity encompasses a broad spectrum of cognitive profiles, many of which may be invisible, situational or never formally diagnosed.

For children with disabilities, nondiscrimination requires access to inclusive communication technologies and accessible formats that support independent living, full participation in society, informed decision-making and protection from harmful or discriminatory content. 

Anti-discrimination frameworks should be extended to the design and operation of digital features that shape children's online experiences. Under international law, the right to nondiscrimination obliges states to ensure all children can access the digital environment in ways that are meaningful and equitable. This right is codified in Article 2.1 of the United Nations Convention on the Rights of the Child, which affirms that the rights of the child apply to all children without discrimination on any grounds. 

In practice, EU member states are encouraged to take all feasible steps to respect, protect and fulfill the rights of every child in the digital environment. Beyond state actors, the international community has also urged businesses to safeguard children through their products and services, including by restricting access to harmful or age-inappropriate content while ensuring that all measures align with international standards, including the principle of nondiscrimination. Technology companies have an outsized role to play in how these rights are realized. 

Furthermore, when privacy features confuse or exclude young users, engineers should work closely with product and design teams to quickly refine flows, language and defaults so protection works in practice, not just on paper. This feedback loop can turn children's real-world experiences, including those of neurodiverse users, into concrete improvements that make privacy tools simpler, more inclusive and more effective.

An overarching trend in technological development is ensuring that diversity is reflected both in how products are designed and in how their outcomes are experienced in practice. Achieving this begins early in the software development life cycle, particularly in testing, and in the design choices that shape privacy features and products.

Privacy by design helps embed privacy into systems from the outset, and a natural next step is to consider how neurodiverse users actually perceive and navigate these privacy tools. Ensuring that their experiences, needs and interpretations are woven into the design process can help make privacy technologies genuinely usable, supportive and protective for all young users and their guardians.

Changran Wei, privacy product lead engineer for teen and kids experience, contributed to this article. 

The views expressed in this article belong solely to the authors. 


