Kids' online privacy and safety has never been a timelier topic. Policymakers around the globe are grappling with how to keep kids — and their data — safe and secure. In the U.S., there have been unprecedented efforts to pass new state laws — some privacy-specific and others targeting online safety, social media access and "age appropriate" content. There are also numerous new and revived federal efforts on the child online privacy and safety fronts.

The Entertainment Software Rating Board's Privacy Certified program keeps track of these developments. As one of a handful of organizations authorized by the Federal Trade Commission to serve as a self-regulatory Safe Harbor under the Children's Online Privacy Protection Act, we work with member companies in the video game and toy industries every day to help them comply with COPPA and the variety of other laws and best practices covering children, teens and adults.

COPPA, of course, governs the online collection, use and disclosure of information from children under the age of 13. It requires companies to provide notice and obtain verifiable parental consent (VPC) before collecting, using or disclosing a child's personal information. Although the implementing COPPA Rule does not mandate the mechanism, most companies rely on the seven FTC-approved methods as a "cautionary measure."

Apart from VPC, COPPA contains strong, enforceable provisions on data minimization, retention and security. Even so, families, lawmakers, businesses, privacy advocates, academics, government regulators and even safe harbors have called for COPPA changes and criticized existing VPC methods as outdated, burdensome, easily circumventable, inequitable and even privacy-invasive. (Maybe those printed consent forms that parents return by "postal mail, facsimile, or electronic scan" are exhibit A?) The ESRB also asked the Federal Trade Commission to modernize VPC to make it more user-friendly for parents and less frictional for businesses.

To that end, in June, Privacy Certified joined SuperAwesome and Yoti, two trusted global companies, to seek the FTC's approval for a new VPC method based on privacy-protective facial age estimation. The FTC published our application 19 July.

SuperAwesome is a "kidtech" company delivering services that enable safe, privacy-compliant digital engagement with almost half a billion kids and teens every month. SuperAwesome, owned by Epic Games, is a member of ESRB Privacy Certified. Yoti is a leading digital identity and age assurance provider which delivers millions of facial age estimations for regulated industries and consumer brands. 

The pending VPC application

Under the COPPA Rule, ESRB Privacy Certified has the authority to approve members' use of a VPC method as long as it meets the rule's requirements. Given intense discussions about VPC, not to mention the FTC's pending COPPA Rule review, we decided instead to seek FTC approval. Public comments are due 21 August, 30 days after the notice is published in the Federal Register.

In our application, we explain how privacy-protective facial age estimation works based on Yoti's implementation. This technique uses proven computer vision and machine-learning technology to estimate a person's age based on an analysis of patterns in an image of their face.

The system scans the face, extracts patterns, compares those patterns to those of known ages and estimates age. The processing usually takes less than one second and produces only a yes or no result on whether the individual in the image meets a designated age threshold.

Yoti deletes the images immediately and permanently, and does not use them for training purposes. Yoti also includes a liveness test to prevent spoofing.
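The flow described above — extract facial patterns, estimate an age, return only a yes/no against a threshold, and immediately discard the image — can be illustrated with a toy sketch. This is not Yoti's implementation; the feature extractor and "model" here are stand-ins for a real computer-vision pipeline, included only to show the shape of the privacy-protective design: the image and its features never leave the function, and only a boolean comes out.

```python
import hashlib

def extract_patterns(image: bytes) -> list[float]:
    # Toy stand-in for a feature extractor. A real system runs a
    # computer-vision model over the face; here we just derive three
    # numbers deterministically from the image bytes.
    digest = hashlib.sha256(image).digest()
    return [b / 255 for b in digest[:3]]

def estimate_age(patterns: list[float]) -> float:
    # Toy stand-in for a trained age-estimation model: real systems
    # compare facial patterns to patterns learned from faces of known
    # ages. Here, a fixed weighted sum plays the model's role.
    weights = [10.0, 25.0, 5.0]
    return sum(w * p for w, p in zip(weights, patterns))

def meets_age_threshold(image: bytes, threshold: int = 25) -> bool:
    """Return only a yes/no answer — never an identity or exact age."""
    patterns = extract_patterns(image)  # facial patterns, not identity
    age = estimate_age(patterns)
    del image, patterns                 # mirror immediate deletion of inputs
    return age >= threshold
```

The design point the sketch makes is that nothing identifying is retained or returned: the caller learns only whether the threshold was met, which is what makes the technique suitable for confirming an adult is present without building a profile of that adult.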

For COPPA compliance, the technology can confirm that the person from whom VPC is sought is an adult, without requiring the user to submit additional personal information. It doesn't require collecting any information from the child, apart from the parent's email address, which the COPPA Rule permits for VPC.

We then show how the proposed method meets the rule's requirements. First, we explain how facial age estimation is "distinct" from the currently authorized VPC methods. Second, we explain how it is "reasonably calculated, in light of available technology," to ensure the person providing consent is the child's parent. Third, we demonstrate how facial age estimation promotes children's privacy and does not present any substantial risk to parents' privacy.

We also provide evidence that facial age estimation can alleviate access and equity issues: everyone has a face, but not everyone has a payment card, driver's license, passport or Social Security number.

To support these points, the application contains extensive evidence from Yoti, and from SuperAwesome's integration of Yoti's technology into its Kids Web Services consent platform. Since 2022, Yoti and SuperAwesome have delivered more than 4.8 million age estimations for parental consent required by laws other than COPPA. Yoti's methodology, which has also been validated by external sources, shows the method is highly accurate, quick and cost-effective.

In addition, evidence shows variations in age, gender and skin tone do not materially impact the accuracy of its estimations of whether a user meets the designated adult age threshold.

Going beyond VPC

Our application for recognition of facial age estimation as a VPC method has implications for new child online privacy and safety laws and codes. As the Future of Privacy Forum explained in its recent report on the status of VPC, some of the "same technologies used to establish VPC also undergird age estimation technologies required by these new laws." In turn, these new age verification requirements "will invariably impact the use of VPC methods."

To date, though, many of the new and proposed laws have been written without considering the state of VPC, age assurance technology or the potential secondary consequences of mandating age verification for all users, especially for privacy. Right now, drafters of new age verification requirements risk creating new privacy risks for consumers, compounding equity issues or imposing unreasonable liabilities for businesses.

Here, our experience with COPPA and new methods of VPC can be instructive.

COPPA's VPC requirement, of course, is not an age assurance mandate. It requires companies only to obtain verifiable consent from parents, not to use a method of age assurance that verifies the age of a child.

However, it allows operators of "mixed audience" services to use a neutral age gate to identify users under 13. By contrast, new state laws that require platforms to verify the ages of children and minors also effectively require companies to verify the age of adult users too. Like the COPPA VPC requirement, many of these laws, in practice, ask companies to obtain parental consent and verify that the consenter is the parent or could be a parent.
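A neutral age gate of the kind COPPA permits for mixed-audience services is simple to picture: the service asks for a birthdate without hinting at a cutoff (so children aren't nudged to lie), then routes under-13 users into a parental consent flow. The flow names below are hypothetical labels for illustration, not terms from the COPPA Rule.

```python
from datetime import date

def neutral_age_gate(birthdate: date, today: date) -> str:
    """Route a user based on a neutrally collected birthdate.

    The gate itself never reveals that 13 is the cutoff; it only
    asks for a birthdate and branches internally.
    """
    # Compute age in whole years, adjusting if the birthday
    # hasn't occurred yet this year.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    # Under COPPA, users under 13 go to a flow that collects a
    # parent's email to initiate VPC; others proceed normally.
    return "child_flow_vpc_required" if age < 13 else "standard_flow"
```

The contrast with the new state laws discussed above is that this gate only classifies self-declared minors; it makes no attempt to verify anyone's age, adult or child.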

It's important to note there are currently no methods to verify a child's age that are not privacy-invasive (requiring payment or other financial information), inequitable (requiring access to government documents) or inherently unscalable (requiring a live video call). Here too, age estimation using facial analysis may be a promising approach.

One advantage of privacy-protective age estimation for COPPA and emerging laws is that people like it. SuperAwesome reports that, whenever facial age estimation is available as an option for parental consent outside the U.S., more than 70% of parents choose it over other methods. Recent evidence from the Family Online Safety Institute bears this out. In a cross-jurisdictional survey on the attitudes of parents and children regarding age assurance methods, FOSI found a high level of support for methods like facial age estimation that can be used "without a connection to identity or personal information."

Untangling age estimation and biometrics

Any discussion of facial age estimation for VPC or age assurance eventually gets tangled up with biometrics. Many new and emerging technologies for age assurance use biometric information, such as facial or voice recognition technology, which in turn may incorporate algorithms and/or artificial intelligence techniques.

In its recent Biometric Information Policy Statement, the FTC outlined its concerns about the use of biometric information with respect to consumer privacy, data security, and the potential for bias and discrimination. These concerns are certainly valid, but not all biometric technologies raise such issues. This is especially true when technology is not designed to identify a unique individual.

In this sense, privacy-protective facial age estimation differs from most uses of facial recognition technology. Facial recognition technology looks for unique geometric measures of the face, such as the distance and relationship among facial features, and tries to match these to an existing set of measurements already recorded in a database, i.e., in a photo, along with unique information to identify a person.

By contrast, facial age estimation takes a live facial image, converts it into numbers and compares the numbers to patterns in its training data set that are associated with known ages. Here, even though the method works by processing a photograph of a person's face, the only output is a nonidentifying age estimation. 

The U.K. Information Commissioner's Office recognized this distinction, explaining facial age estimation "can be distinguished from other facial recognition technology . . ." It explained Yoti "is not using the tool for the purpose of uniquely identifying the individuals whose images are captured using the age estimation tool. Instead, it is being used to categorise them by age without uniquely identifying them."

Similarly, France's data protection authority, the Commission nationale de l'informatique et des libertés, opined the use of facial age estimation for pornography sites, based on facial analysis without facial recognition, could comply with EU data protection law.

Facial age estimation scans may still raise privacy concerns. These must be balanced against factors such as efficacy, accuracy, equity, and the child privacy and safety goals policymakers are trying to achieve. Many of the existing solutions for VPC and age assurance, such as photo ID, Social Security number or payment card, are often unnecessarily invasive, enabling the collection of name, address, birthdate and unique ID number. And they might deter children from accessing safe and compliant services.

If children can't pass through a VPC flow because their parent doesn't have a credit card, or if they are barred from game play by an age verification system, they may end up lying about their age or accessing less privacy-protective services.

FPF recently released an infographic, Unpacking Age Assurance: Technologies and Tradeoffs, looking at common and emerging age assurance methods, including biometric identifiers, such as face scans, and algorithmic methods, such as using multiple data points or signals from a VR game. In considering the tradeoffs between accuracy/efficacy and privacy/security, FPF cautioned it is "important to consider context to determine a proportionate method of age assurance for each specific use case."

Indeed, there is a wide range of different use cases and corresponding solutions for age assurance, ranging from keeping kids out of adult services, like gambling or porn, to adapting a service to the age of a kid or teen, for compliance with the U.K. children's code and the California Age Appropriate Design Code, to confirming someone is an adult so they can provide privacy consents for a child under a privacy law like COPPA. 

Although it might be appropriate, in some cases, to use a more privacy-invasive method for higher-risk, regulated or age-restricted services, a less accurate method, such as self-declaration, might be appropriate for lower-risk services. For most use cases, though, facial age estimation is sufficient. Unfortunately, poorly written laws, which do not adequately differentiate between mere collection of biometric identifiers and their use case, risk limiting innovation and leaving kids' privacy unprotected.

Conclusion

To date, the FTC has not commented on age-assurance requirements in state laws, and it is unlikely to do so. But the FTC can influence the development of such laws and requirements by approving the pending application to authorize facial age estimation for VPC under COPPA and by requiring companies that use facial age estimation to implement strict privacy and security safeguards.

To protect kids fully beyond COPPA, lawmakers, privacy regulators and other privacy professionals can collaborate to ensure any new laws distinguish between age assurance use cases and biometric approaches, and build in privacy protections that address foreseeable privacy risks. They can focus on creating effective, implementable, and equitable VPC and age-assurance mechanisms to protect kids in online spaces.