
The future of youth privacy is here

A new era of children’s privacy rules has arrived. New laws focusing on the safety and mental health of kids and teens will have wide-ranging implications for the design of digital systems. California’s Age-Appropriate Design Code Act, which will go into effect July 1, 2024, is the first law in the United States to reform privacy practices through youth-focused design standards. The law is part of a trend of proposed bills and regulations that reach far beyond traditional privacy harms, introducing new considerations, standards and liabilities that privacy professionals should become familiar with as soon as possible.

Are the kids and teens alright?

For more than two decades, the privacy of U.S. children online has been shaped primarily by operation of the Children’s Online Privacy Protection Act and the associated COPPA Rule, kept up to date by the U.S. Federal Trade Commission. COPPA was a landmark bill in 1998, at a time when only 41% of adults browsed the internet. Now, consumers of all ages seem to always be online, and digital devices and services have become ubiquitous in every aspect of our daily lives.

Policymakers have noticed this trend and are scrutinizing the plethora of potential harms to consumers. In October 2021, after a whistleblower brought attention to ongoing risks on Meta’s platforms related to misinformation, mental health and safety — with particular attention to the disproportionate risks of harm to children and teens — the Senate Commerce Subcommittee on Consumer Protection, Product Safety and Data Security convened a hearing with representatives from Snapchat, TikTok and YouTube. At that time, IAPP Senior Westin Research Fellow Müge Fazlioglu, CIPP/E, CIPP/US, reflected on the enhanced bipartisan scrutiny flowing from the whistleblower complaint and the lived experience of many parents: The digital world may be too complex for parents to hold sole responsibility for protecting their kids online.

When applied to the vulnerable population of kids and teens, the regulatory response to this bundle of concerns has coalesced around the idea of “age-appropriate design.” This concept has been famously spearheaded by Beeban Kidron, a British baroness and U.K. House of Lords member who started the 5Rights Foundation, which is pushing the adoption of this regulatory approach in both the U.K. and the U.S. Although the concept of “privacy by design” has long been an influence on the practice of privacy, age-appropriate design asks those who develop digital products and services to go further in two ways: considering potential impacts on specific vulnerable populations and thinking broadly about the harms that can be perpetuated through design features.

New kids’ rules on the block

The U.K. Information Commissioner’s Office was the first to put these ideas into practice through the promulgation of its Age Appropriate Design Code, a “code of practice” that essentially serves as an implementing regulation under the Data Protection Act 2018. The U.K. Children’s Code focuses on implementing 15 standards, which are “not intended as technical standards, but as a set of technology-neutral design principles and practical privacy features.” These include a duty of loyalty for the “best interests of the child,” a duty of care to avoid “the detrimental use of data,” privacy-protective defaults (including for geolocation and profiling) and data minimization.

Since the U.K. Children’s Code went into effect, the ICO has been reviewing the compliance efforts of over 50 entities, with a focus on social media, gaming and streaming organizations. In its one-year anniversary update, the ICO described the positive changes it has witnessed in that time. Noted changes include Nintendo’s update allowing only users older than 16 to create their own accounts and set their own preferences, Google’s update allowing anyone under 18 to request their images be removed from Google image search, YouTube’s changes to default settings for autoplay videos and bedtime reminders, and Meta’s updated policy limiting ad targeting based on age, gender and location for users under 18. Active enforcement of the Children’s Code is also expected: just this month, the ICO issued a notice of intent against TikTok over alleged children’s privacy violations, with a proposed fine of £27 million.

With the passage of California’s Age-Appropriate Design Code Act, this style of regulatory scrutiny has now arrived in the U.S. On Sept. 15, the California AADC was signed into law by Gov. Gavin Newsom, D-Calif., after passing unanimously in both state legislative chambers. When it enters into force on July 1, 2024, the law will apply to any entity that qualifies as a “business” under the California Privacy Rights Act and operates “online products, services, or features” that are “likely to be accessed” by children. Note that when the law refers to “children,” it includes all minors under 18.

Who is a likely child?

Likelihood of access by children under the California AADC is determined in a much broader fashion than the traditional COPPA “actual knowledge” standard. A business will need to determine whether it is “reasonable to expect” that its online product, service or feature will be accessed by youth “based on the following indicators,” which appear to be a disjunctive list of elements (modeled in the code sketch after the list):

  1. “The online service, product, or feature is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.).
  2. The online service, product, or feature is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children.
  3. An online service, product, or feature with advertisements marketed to children.
  4. An online service, product, or feature that is substantially similar or the same as an online service, product, or feature subject to subparagraph (B).
  5. An online service, product, or feature that has design elements that are known to be of interest to children, including, but not limited to, games, cartoons, music, and celebrities who appeal to children.
  6. A significant amount of the audience of the online service, product, or feature is determined, based on internal company research, to be children.”
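Because the indicators operate as a disjunctive list, the test can be thought of as a boolean OR: satisfying any single indicator can make a service “likely to be accessed” by children. The following TypeScript sketch is a simplified, hypothetical model of that structure; the interface and function names are shorthand invented for illustration, not terms from the statute.

```typescript
// Hypothetical, simplified model of the California AADC
// "likely to be accessed by children" test. All names here are
// illustrative shorthand for the six statutory indicators.
interface AccessIndicators {
  directedToChildrenUnderCoppa: boolean;       // indicator 1
  routinelyAccessedByChildren: boolean;        // indicator 2
  adsMarketedToChildren: boolean;              // indicator 3
  similarToCoveredService: boolean;            // indicator 4
  childAppealingDesignElements: boolean;       // indicator 5
  internalResearchShowsChildAudience: boolean; // indicator 6
}

// The list reads as disjunctive: any one indicator being true
// suggests the service is "likely to be accessed" by children.
function likelyAccessedByChildren(i: AccessIndicators): boolean {
  return (
    i.directedToChildrenUnderCoppa ||
    i.routinelyAccessedByChildren ||
    i.adsMarketedToChildren ||
    i.similarToCoveredService ||
    i.childAppealingDesignElements ||
    i.internalResearchShowsChildAudience
  );
}
```

In practice, several of these indicators (such as audience-composition evidence) require factual analysis rather than a simple flag, so a real assessment would document the evidence behind each determination.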

Although age verification is not required under the California AADC, businesses must estimate the age of minor users with a “reasonable level of certainty appropriate to the risks.” In the alternative, the law provides that the business can treat all users as though they are children under the AADC. In practice, this means that a business must either provide two versions of its service — one for youth and one for adults — and implement measures to properly sort its users or simply conform its overall design to the privacy-protective requirements of the AADC.
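As a rough illustration of that choice, the sketch below models the two paths: estimate age with risk-appropriate certainty and sort users into experiences, or skip estimation and give everyone the child-protective experience. The function, types and threshold are invented for the example; the AADC prescribes no particular implementation or numeric standard.

```typescript
// Illustrative sketch of the two AADC compliance paths described above.
// The types, names and threshold are hypothetical placeholders.
type Experience = "child_protective" | "adult";

interface AgeEstimate {
  isLikelyMinor: boolean; // under 18, as the AADC defines "children"
  certainty: number;      // 0..1; must be "appropriate to the risks"
}

function selectExperience(
  estimate: AgeEstimate | null,
  requiredCertainty: number
): Experience {
  // Path 2: no age estimation at all -> treat every user as a child.
  if (estimate === null) return "child_protective";

  // Path 1: rely on the estimate only when it meets the
  // risk-appropriate certainty threshold; otherwise fall back
  // to the protective experience.
  if (estimate.certainty < requiredCertainty || estimate.isLikelyMinor) {
    return "child_protective";
  }
  return "adult";
}
```

Note the conservative fallback: when the estimate is missing or not certain enough, the user receives the protective experience, mirroring the statute’s treat-all-users-as-children alternative.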

Privacy (and safety and mental health) by design

California’s AADC broadly tracks the requirements of the U.K. Children’s Code. The operational takeaways for businesses designing covered systems include the following:

  • Data minimization. No collection without necessity (for the service the child knows about) and no precise geolocation collection, except when strictly necessary and with obvious signals.
  • Duty of loyalty. Consider the best interests of children and, if there’s a conflict, prioritize this over commercial interests.
  • Duty of care. Avoid using personal data in a manner materially detrimental to the physical health, mental health or well-being of a child.
  • Expanded data protection impact assessments. In addition to privacy risks, analyze whether content, features or algorithms could harm children or facilitate harm to them, and whether design features could lead to addictive use.
  • Privacy-protective defaults. Set defaults to “settings that offer a high level of privacy” and disable features that profile using a child’s previous behavior, browsing history, or assumptions of their similarity to other children (with some exceptions); see the configuration sketch after this list.
  • No dark patterns. Avoid manipulative design features that encourage youth to disclose data beyond that reasonably expected to provide the service, forgo privacy protections, or take an action materially detrimental to their physical health, mental health or well-being.
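In engineering terms, several of these duties reduce to default configuration choices. The sketch below shows what AADC-style defaults might look like in code; the field names are assumptions for illustration, as the statute describes outcomes (“a high level of privacy”) rather than specific settings.

```typescript
// Hypothetical default-settings object reflecting the AADC's
// high-privacy-by-default requirement. Field names are illustrative;
// the statute does not enumerate specific settings.
interface ChildDefaultSettings {
  preciseGeolocationEnabled: boolean;   // off unless strictly necessary
  geolocationCollectionSignal: boolean; // obvious signal when collecting
  behavioralProfilingEnabled: boolean;  // off by default (some exceptions)
  dataCollectionScope: "necessary_only" | "extended";
}

const aadcStyleDefaults: ChildDefaultSettings = {
  preciseGeolocationEnabled: false,
  geolocationCollectionSignal: true,
  behavioralProfilingEnabled: false,
  dataCollectionScope: "necessary_only",
};
```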

For interpretive questions and future guidance, California’s AADC creates a new multistakeholder rulemaking entity, the California Children's Data Protection Taskforce. Although early versions of the bill would have been enforced by the California Privacy Protection Agency, the final version empowers the California attorney general to enforce its provisions, with a 90-day cure period and a financial penalty of up to $7,500 per violation per child.

Federal rules could follow suit

In the style of the age-appropriate design codes, a pending bill at the federal level known as the Kids Online Safety Act aims to strengthen data protection and online safety for minors up to 16. KOSA would apply to “a commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” A covered entity would need to address “physical, emotional, developmental, or material harms to minors … including: 

  1. promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor;
  2. patterns of use that indicate or encourage addiction-like behaviors;
  3. physical harm, online bullying, and harassment of a minor;
  4. sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material;
  5. promotion and marketing of products or services that are unlawful for minors, such as illegal drugs, tobacco, gambling, or alcohol; and
  6. predatory, unfair, or deceptive marketing practices.”

Although KOSA was voted out of the Senate Commerce Committee unanimously on July 27, it has been the subject of critical feedback from some industry and civil society groups, including the Electronic Frontier Foundation, which argues the bill would require too much data collection and give too much power to parents. Nevertheless, the Biden administration has continued to signal a strong interest in enhancing digital protections for kids and teens after featuring the topic in the 2022 State of the Union Address. In a set of “Principles for Enhancing Competition and Tech Platform Accountability” released Sept. 8, the administration again highlighted the importance of “prioritizing safety by design standards and practices for online platforms, products, and services” including by “restricting excessive data collection and targeted advertising to young people.”

Age-appropriate considerations are here to stay

The types of considerations embodied in the California AADC and U.K. Children’s Code are here to stay. In fact, a similar bill was recently introduced in the New York state senate. Even those organizations not covered under existing laws would do well to begin modifying processes to adapt to this new normal. Traditional impact assessments often consider the risks of privacy harms to “average” consumers. Moving forward, privacy professionals should engage with those who design and build digital systems to ensure the unique risks to vulnerable populations are also considered throughout the data life cycle. This starts with children and teens, but it may not end there. Policymakers are increasingly embracing an understanding of privacy rights that is responsive to the needs of other at-risk groups, from seniors to persons with disabilities to minoritized groups. Adapting early to this trend may pay dividends for privacy teams everywhere.

