Age-verification tools are increasingly viewed as a silver-bullet solution for improving children's online safety, supporting organizations' regulatory compliance efforts and building trust among parents and minors alike. During the U.S. Federal Trade Commission's Jan. 28 age-verification workshop, stakeholders outlined competing visions for how age-assurance tools should be deployed and to what extent companies should verify users' ages without undermining privacy.

Verification systems are shaping how organizations assess children's online safety requirements, including those under the Children's Online Privacy Protection Act. As assurance technologies become more common, concerns are shifting toward whether these systems can be deployed without extensive data collection.

Graham Dufault, general counsel at the App Association, noted organizations should not "lose sight of the fact that conducting age verification in particular does itself still pose privacy risks, and even to the extent that there are technologies that allow you to pass along a signal that doesn't include an entire identity." He added deployment risks still exist, requiring companies "to match that against whatever risks are presented by the app itself or the content that's provided or the services that you're giving access to."


Risk-based age verification

Deploying verification first requires proportionality considerations. The level of risk posed by a particular platform or service will be top of mind in those pre-deployment assessments.

Google's Child Safety Policy Lead Emily Cashman Kirstein said the company has adopted a "principled approach" grounded in risk assessment and data minimization using its age inference AI model.

Google asks users to declare their age at account creation, though its age inference model relies on existing signals rather than collecting additional data to determine whether a user is over 18. Kirstein argued that the level of verification should match the platform's potential risks.

Adequate implementation of verification is important "to ensure that users have the right experience for products and services," according to Kirstein. "If the user's an adult, we don't want to block them from access to critical information or put restrictions on their account to hamper their ability to use those services as an adult. We don't want to ask for privacy-intrusive information if the risk doesn't warrant it."

Meta's Global Head of Safety Antigone Davis highlighted the company's comprehensive approach to age verification through its Teen Accounts, which apply additional safeguards for underage users through parental controls.

Meta also joined the OpenAge Initiative, a global interoperability scheme established by age-verification provider k-ID. Davis said the initiative aims to create a "privacy preserving age key that can be shared with many different apps."

Apple has made similar efforts to protect users from potentially inappropriate content, introducing App Store restrictions and advanced parental controls. Apple Federal Government Affairs Director Nick Rossi indicated the company avoids framing age verification as a compliance trade-off.

"We believe in advancing technologies and policies that protect children from online harms and protecting their privacy," Rossi said. "We want users of all ages, really, to be able to have a great and safe experience with our products and services. And that's why a core part of our design is focused on keeping our users safe, and that's especially true for kids."

Apple's risk mitigation efforts include its age-range sharing tool that allows parents to share a child's age range with developers without disclosing their birthdate. Rossi said Apple recognizes that its App Store plays a key role in age verification and believes the implementation of assurance tools could help with “mitigating the risks of scams, frauds, or other harms that could result from children's personal information being collected or retained or shared unnecessarily with folks who don't need it.”

Organizational and app store responsibility

Disagreement continues over where responsibility for age assurance should reside within the digital landscape. Davis argued establishing age verification and parental consent at the device or app store level could reduce the burden on platforms and parents.

"We need to be providing a mechanism to ensure there's a way for someone to prevent teens from having access,” said Davis, noting the millions of apps available to minors on their mobile devices. He also argued that having parents individually approve parental controls for individual apps would likely make consumer safety more challenging than allowing verification one time at the app store level.

A universal verification signal could also be shared in a privacy-preserving manner, allowing developers to apply age-appropriate protections without collecting sensitive information.

SuperAwesome Chief Privacy Officer and Head of Legal Amy Lawrence observed that assumptions about where minors are present online are increasingly outdated. If underage users are likely to show up on a platform, she said, "age-aware design should be the safety and compliance baseline instead of just avoiding the idea that there might be kids showing up."

Age awareness can help organizations assess legal obligations, reduce enforcement risk and apply age-appropriate protections. From an advertising perspective, Lawrence noted that knowing the approximate age of users can help companies ensure their advertising and marketing is appropriate.

"If safe spaces for young people are making money, if they are profitable, it encourages more and higher quality safe spaces for kids, and that improves the entire ecosystem around kids on the internet," she said.

Lexie White is a staff writer for the IAPP.