Popular discussions of children's data protection are often framed as a zero-sum game between those who want children protected in the digital age and those who fear a chilling effect on content creation. But is this a false choice? Is it possible to do both? Proper safeguards, including embedding privacy-by-design principles and obtaining verifiable consent, allow organizations both to abide by the laws and regulations that protect children and to facilitate children's access to entertainment, education and information through new and innovative content.

US regulatory approach: FTC workshop

In September 2019, the U.S. Federal Trade Commission announced that Google's YouTube had agreed to pay a $170 million fine for violating the Children's Online Privacy Protection Act, a federal law designed to protect children under the age of 13. The FTC found that YouTube had illegally harvested personal information from children and used it to target them with ads. As part of the settlement with the FTC and the New York attorney general, YouTube must now require channel owners to specify whether their content is intended for children. Furthermore, YouTube will treat data from anyone watching child-directed videos as "coming from a child, regardless of the age of the user" and stop serving personalized ads to viewers of such videos.
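To make the settlement's mechanics concrete, here is a minimal sketch of the rule it describes; the names and data structures are hypothetical illustrations, not YouTube's actual systems:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Video:
    video_id: str
    made_for_kids: bool  # designated by the channel owner, as the settlement requires

@dataclass
class Viewer:
    viewer_id: str
    declared_age: Optional[int]  # unknown for signed-out viewers

def allow_personalized_ads(video: Video, viewer: Viewer) -> bool:
    """Return True only if personalized ads may be served for this view."""
    if video.made_for_kids:
        # Data from anyone watching child-directed content is treated as
        # "coming from a child, regardless of the age of the user."
        return False
    if viewer.declared_age is not None and viewer.declared_age < 13:
        return False
    return True
```

The key design point is that the gate keys off the content's designation, not the viewer's profile, so no age signal about the viewer can override it.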

Shortly after the FTC fine and settlement (which sent shock waves throughout the high-tech industry), the FTC held a workshop Oct. 7 to explore whether and how COPPA should be revised. The discussions addressed how new technologies affect children's privacy, how education technology vendors should handle parental consent, how technology companies should deal with audio files, and whether the FTC's 2013 revisions to COPPA have worked as intended.

However, the workshop has been widely viewed as an attempt by high-tech industry advocates to weaken the COPPA Rule. Indeed, the director of the FTC's Bureau of Consumer Protection reiterated the industry's main argument that increased privacy protection would degrade consumers' internet experience: without targeted ads that enable creators to monetize their creativity, YouTube would become "a desert of crap."

Before the workshop, a bipartisan group of senators sent a letter to the FTC "strongly caution[ing] against undertaking a process that ultimately weakens children's privacy instead of improving it." The senators expressed concern that the "FTC is at risk of favoring the interests of giant tech companies over the interests of parents and children," advising the agency that if any changes come out of the workshop, children's privacy and well-being must take priority.

The wider legal landscape on children’s data protection

These FTC developments come at a time when considerations of how to better protect children are taking shape in other fora. Both the EU General Data Protection Regulation and the new California Consumer Privacy Act, which comes into effect Jan. 1, 2020, emphasize special protection for children. Under the GDPR, parents or guardians must provide consent before the personal data of a child under 16 years of age is collected or processed, although member states have discretion to lower this threshold to age 13. However, when online preventive or counseling services are offered directly to a child, parental or guardian consent is not required.
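As a rough illustration of how a service might encode these thresholds, the sketch below uses a small country map; the entries shown reflect well-known member-state choices under Article 8, but the map is an example, not a maintained legal reference:

```python
GDPR_DEFAULT_AGE = 16  # the GDPR's default digital age of consent

# Non-exhaustive examples of member-state choices under Article 8.
AGE_OF_DIGITAL_CONSENT = {
    "DE": 16,  # Germany kept the default
    "FR": 15,  # France lowered the age to 15
    "IE": 16,  # Ireland kept the default
    "UK": 13,  # the U.K. adopted the 13-year floor
}

def parental_consent_required(age: int, country: str,
                              counseling_service: bool = False) -> bool:
    """Must a parent or guardian consent before this child's data is processed?"""
    if counseling_service:
        # Preventive or counseling services offered directly to a child
        # do not require parental consent.
        return False
    return age < AGE_OF_DIGITAL_CONSENT.get(country, GDPR_DEFAULT_AGE)
```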

In California, a business with "actual knowledge" that a consumer is under the age of 16 may not sell that consumer's personal information unless the consumer, if between 13 and 16 years of age, opts in to the sale. Under the CCPA, a business that willfully disregards the consumer's age is deemed to have actual knowledge of it. For children under the age of 13, California businesses must obtain affirmative authorization from the child's parent or guardian before selling the child's personal data.
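These tiers translate into straightforward conditional logic. The sketch below is illustrative only, not legal advice, and assumes the business already has actual knowledge of the consumer's age:

```python
def ccpa_may_sell(age: int, teen_opted_in: bool, parent_authorized: bool) -> bool:
    """May the business sell this consumer's personal information?"""
    if age < 13:
        return parent_authorized  # affirmative parent/guardian authorization
    if age < 16:
        return teen_opted_in      # the 13- to 15-year-old must opt in
    return True                   # 16 and over: the default opt-out regime applies
```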

Under COPPA, "actual knowledge" is not defined, but the FTC has explained that an operator of a website acquires it when the operator asks for and receives information that allows it to determine the user's age. For example, the operator may ask for a date of birth when a user registers on a site or app, or use "age-identifying" questions, such as, "What grade are you in?" Moreover, as YouTube found out, a website or online service that hosts content directed at children under the age of 13 and collects their information without parental consent may be in violation of COPPA.
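A date of birth collected at registration is enough to compute a user's age, and with it, actual knowledge. A minimal sketch of that computation (function names are hypothetical):

```python
from datetime import date

def age_in_years(birthdate: date, today: date) -> int:
    """Compute a user's age from a date of birth collected at registration."""
    had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
    return today.year - birthdate.year - (0 if had_birthday else 1)

def operator_has_actual_knowledge_of_child(birthdate: date) -> bool:
    """Having asked for and received a birth date, the operator can
    determine the user's age -- i.e., it can no longer claim ignorance."""
    return age_in_years(birthdate, date.today()) < 13
```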

Like the GDPR and CCPA, COPPA requires entities that collect children’s information to make reasonable efforts to provide direct notice to parents and obtain verifiable consent prior to collecting, using or disclosing personal data of children under the age of 13.
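In practice, this means collection itself must be gated on consent status. A hedged sketch of such a gate, with hypothetical function and exception names:

```python
class ConsentRequired(Exception):
    """Raised when collection is attempted without verifiable parental consent."""

def collect_personal_data(age: int, consent_verified: bool, record: dict) -> dict:
    """Gate collection of a child's personal data on verified parental consent."""
    if age < 13 and not consent_verified:
        raise ConsentRequired("obtain verifiable parental consent before collecting")
    return record  # in a real service, the record would be persisted downstream
```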

The biggest difference between the U.S. and EU regimes is that the EU offers no exception for a controller that is unaware it is collecting a child's personal data or offering services to children. In the U.S., however, as the FTC's case against Google's YouTube shows, having child-directed content on a website, whether placed there actively or passively by permitting third parties to upload it, can constitute "actual knowledge."

Balancing children’s data protection and business growth

While U.S. authorities are discussing substantive ways to change COPPA to balance children's privacy against business growth, countries like the U.K. are working to raise current child privacy standards. In April 2019, the Information Commissioner's Office published a draft code of practice that sets out 16 standards of age-appropriate design for online services. These standards are designed to strengthen children's data protection while preserving their freedom of expression, access to information and ability to engage in recreational activities appropriate to their age.

Similarly, the Irish Data Protection Commission published a preliminary report (Stream 1) in early September, following a public consultation on children's data protection rights under the GDPR. Issues highlighted in the consultation included the role of parents and guardians in the context of children's personal data rights, the feasibility of verifying a child's age through alternative methods, and procedures to better operationalize data protection by design and by default.

The $170 million fine against YouTube, the $5.7 million fine against Musical.ly (now TikTok) for illegally collecting personal information from children, the $32.5 million settlement with Apple for allowing children to make in-app purchases without parental consent, and the $20,700 fine against a school in Skellefteå, Sweden, for experimenting with facial recognition on high school students show that the issue is serious and gaining traction both in legislative circles and with the general public.
