
The Privacy Advisor | Still growing up: Top takeaways from the FTC's proposed COPPA rule update




Since the last Children's Online Privacy Protection Rule update in 2013, the conversation over regulation of children's online activity has echoed throughout the halls of capitals across the U.S. and around the world. On 11 Jan., the U.S. Federal Trade Commission issued a notice of proposed rulemaking to begin its latest update of the COPPA Rule, picking up where it left off in its 2019 request for comment. With this proposed rulemaking, the FTC adds to the attention given to children's privacy protection.

The COPPA statute, passed in 1998, authorizes the FTC to issue clarifying regulations, which it does in the form of updating what is known as the COPPA Rule. By posting its proposed updates in the Federal Register, the FTC commenced a 60-day public comment period. The FTC expects healthy interest in the rule update — the 2019 review saw over 175,000 comments submitted — and will review public comments and incorporate changes as it sees fit. Comments on the proposed rule must be received by 11 March.

The specificity of consent

Under existing rules, and absent an applicable exception, an operator of a commercial website or online service directed to children under 13 years old may collect, use or disclose children's personal information only upon obtaining verifiable parental consent. A single consent could cover all of an operator's privacy practices.

The January proposed rules would require a business to obtain two additional layers of verifiable parental consent beyond the consent required for the collection of children's data: one for (a) disclosure of that data to third parties and one for (b) its use to maximize user engagement.

This proposal reflects a continued shift toward specificity of consent and away from its bundling. Under the EU General Data Protection Regulation, consent must be specific. Consent is invalid if not obtained in relation to "one or more specific" purposes of data processing, giving the data subject a choice about each purpose and preserving a degree of control and transparency. Similarly, Washington state's My Health My Data Act requires separate, affirmative consents "for any collection, use, disclosure, or other processing of consumer health data beyond what is necessary to provide a consumer-requested product." In insisting that operators obtain verifiable parental consent for specific uses of children's data, the FTC aims to cultivate more informed — if not fatigued — parents.

Back to school: Expanding the universe of who can provide consent in the classroom

In the educational context, whether schools as well as parents or legal guardians may consent on a child's behalf has been subject to debate, competing policy goals and overlapping legal obligations. Careful to not overburden districts by requiring parental consent for every educational technology use, and to not rely upon ad hoc decisions from teachers, the FTC has advanced a flexible middle ground in the draft. It proposes a school authorization exception from COPPA's notice and consent requirements, instead mandating a detailed written agreement between the edtech provider and school to govern the relationship.

Under the proposal, this agreement must identify the individual providing consent and specify that the school has authorized them to do so. This departs from previous COPPA guidance, which vested decision-making in the school or school district, favoring trained, institutional staff over teachers. The contract must also include other information, including limitations on the use and disclosure of student data, the school's direct control over that use and disclosure, and the operator's data retention policy.

To help students and parents understand the contours of school privacy, the operator would also be required to provide notice on its site or service that it has obtained authorization from a school to collect a child's personal information; it will use and disclose that information for school-authorized purposes only; and the school may review information collected from a child and request its deletion. Operators would be required to allow schools to review the personal information collected, a right previously afforded only to parents.

These changes continue the recent spate of FTC oversight of the growing edtech industry. Through a policy statement and settlement, the agency has emphasized edtech operators' obligations to ensure that parents consent — and not assume a school has done so — and to limit use of that personal information to the purpose for which consent was given. In requiring the establishment of a detailed contractual relationship between provider and educator, the proposed rules appear aimed at mitigating these same issues.

Going beyond the business-to-consumer relationship in adtech

Children's privacy protections as written have largely grown outdated and maladapted to handle today's restless and increasingly complex advertising technology ecosystem. Regulators crafted the 2013 COPPA Rule update with more traditional notions of online advertising in mind, predominantly wary of operators' first-party relationships with children. Presently, many businesses processing children's data do not have direct relationships with them, instead exchanging children's data across a network of third parties. Throughout the proposed rules, the FTC seeks to rewrite the rules of children's privacy for the modern online advertising economy, expanding their scope to include business-to-business scenarios.

In particular, the FTC set its sights on the use of children's personal information in advertising under the guise of internal operations. COPPA includes an exception to its notice and consent requirements for operators that collect persistent identifiers for the "sole purpose of providing support for the internal operations of the website or online service." Businesses handling children's personal information for purposes such as evaluating advertising campaign effectiveness, measuring conversions, detecting fraud and ensuring compliance have found refuge under the internal operations exception.

The proposed rules restrict operators' use of personal information to a clarified, narrow interpretation of the exception, emphasizing the rule's purpose limitation. Notably, while ad attribution remains a permitted internal operation under the exception, the FTC noted that purposes such as behavioral advertising or profile building will not qualify and will require verifiable parental consent.

The proposed rules call for transparency in this area as well, requiring operators relying on the internal operations exception to "specifically identify the practices for which the operator has collected a persistent identifier and the means the operator uses to comply with the definition's use restriction" in their privacy notices.

In a similar vein, the FTC proposed expanding the categories of evidence it will consider in analyzing whether an operator's website or online service is directed to children — and thus, the extent of that operator's COPPA obligations — to include "marketing or promotional materials or plans, representations to consumers or to third parties, reviews by users or third parties, and the age of users on similar websites or services." Where the rule initially focused chiefly on representations made on an operator's site or service, the proposal accounts for situations in which an operator makes one statement of audience composition or intended audience to adtech service providers, while indicating differently to users.

Developing the mixed audience framework and diving into the age verification discourse

Distinguishing whether a website or online service is directed at children or a general audience has never been an exact science — such determinations result from a multifactor "totality of the circumstances" test that assesses a site or service's intended audience to determine whether it must comply with COPPA. If a site is child-directed, COPPA always applies. If it is directed at a general audience, COPPA only applies if the operator has actual knowledge about a user's age. But the test's outcome is not strictly binary: the rule provides a narrow exception for a site or service that may be directed to children but does not target children — also known as a "mixed audience" site or service.

Currently, the mixed audience category sits within the child-directed category. Websites within this group may screen users based on age and need only adhere to COPPA's notice and consent obligations for those users who identify themselves as under 13. Building on its YouTube settlement, the FTC now proposes a clarifying, standalone definition of a "mixed audience website or online service" as "one that meets the criteria of the Rule's multi-factor test but does not target children as the primary audience."

This clarification aims to establish the mixed audience label not as a broad exception, but rather as an incentive for companies to know the age of their users. Through a neutral age gate, operators can determine whether each user should receive the COPPA-compliant or the general audience version of their service.
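As a rough illustration of the routing decision a neutral age gate drives — function and label names here are hypothetical, not drawn from the proposed rules — an operator might branch users like this:

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def route_experience(birth_year, today=None):
    """Neutral age gate sketch: ask for a birth year without hinting at a
    'correct' answer, then serve either the COPPA-compliant or the
    general-audience version of the service."""
    today = today or date.today()
    age = today.year - birth_year  # approximate; ignores month and day
    if age < COPPA_AGE_THRESHOLD:
        # Child user: trigger COPPA notice and verifiable parental
        # consent flow before collecting any personal information.
        return "coppa_compliant"
    return "general_audience"
```

The key design point is neutrality: the prompt must not signal which answer unlocks the full service, since a leading prompt invites circumvention and may undermine the operator's claimed lack of knowledge of child users.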

The means through which operators may collect age information remain fluid. Paralleling work by the U.K. Information Commissioner’s Office, the FTC has increasingly disfavored self-declaration mechanisms that children can easily circumvent. Mindful not to impede innovation in age screening, the FTC's proposed definition permits operators to collect age information or use "another means that is reasonably calculated, in light of available technology, to determine whether the visitor is a child."

Combatting nudging

In these rules, the FTC attempts to rein in companies using children's data to maximize engagement. In addition to requiring verifiable parental consent when using or disclosing children's data to nudge users toward greater usage, the proposed rules limit the internal operations exception by prohibiting "using or disclosing personal information in connection with processes, including machine learning processes, that encourage or prompt use of a website or online service." Operators intent on nudging children toward greater usage of a site or service must therefore make online disclosures such that parents have sufficient notice and can provide informed consent.

In affirmatively looping in parents and injecting friction into operators' pursuit of engagement, the FTC follows the lead of lawmakers diligently working to regulate young people's use of social media. California has been most active in this space, passing 2022's now-enjoined Age Appropriate Design Code Act, which inter alia limited businesses' use of dark patterns and nudging, and recently proposing the Social Media Youth Addiction Law. Several other states have passed like-minded legislation, which litigants have frequently challenged on First Amendment grounds. Considering many online services' business models beget UX designs that increase screen time, this proposed COPPA Rule adjustment may face similar resistance.

Giving detail to data security requirements

For much of the FTC's time spent regulating data security, businesses determined reasonableness of safeguards for personal information by looking at evolving case law and industry standards. The FTC was forced to develop a new approach after a 2018 appellate court ruling held that the term "reasonable" was "devoid of any meaningful standards" and that the FTC's approach said "precious little about how this is to be accomplished."

FTC settlements have since included far more prescriptive technical safeguards. This round of COPPA Rule changes would formalize such requirements, at least within the realm of kids' data, requiring operators to tailor their more detailed security programs "to the sensitivity of children's information and to the operator's size, complexity, and nature and scope of activities."

These proposed rules find parentage in recent consent decrees, including Retina-X and Unixiz, focused on alleged failure to provide adequate safeguards for children's personal information. The agency also attempts to harmonize COPPA data security rules with those prescribed by the Gramm-Leach-Bliley Act's Safeguards Rule, which governs the data security practices of certain financial institutions.

Additional takeaways

While the proposed rules contain plenty more substance to grapple with, some points deserve further mention.

Data minimization. In no uncertain terms, the FTC reiterates that COPPA prohibits operators from collecting more personal information than reasonably necessary for children's participation in a game, the offering of a prize, or another activity, even if an operator has validly obtained verifiable parental consent. This emphasis comes at a time when data minimization has increasingly been the forgotten Fair Information Practice Principle, and reinforcing that not even verifiable parental consent can overcome these limits should send a clear warning to operators of the FTC's intent to mitigate overcollection.

The mutable definition of personal information. The proposed rules expand the definition of personal information to broadly include biometric identifiers that "can be used for the automated or semi-automated recognition of an individual." While broad and functional, this definition remains narrower than the one offered in the FTC's 2023 guidance on biometrics. Due to statutory constraints, the FTC opted not to align with other jurisdictions in defining personal information to include inferences.

Audio file exception. Codifying a 2017 enforcement policy statement, the FTC created an exemption from COPPA's verifiable consent requirement for operators that collect an audio file of a child's voice as a replacement for written words, provided the file is retained only for as long as necessary to fulfill the request for which it was collected. Necessary to adapt to increasing reliance on voice-assist technology, the change comes with several restrictions, including an online notice provision and strict purpose limitation.
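The retention constraint in the audio file exception can be sketched in code — a minimal illustration, assuming a hypothetical `transcribe()` speech-to-text call that is not any real API:

```python
import os
import tempfile

def transcribe(path):
    """Hypothetical speech-to-text stand-in; a real service call goes here."""
    return "play my favorite song"

def handle_voice_command(audio_bytes):
    """Under the proposed audio file exception, a child's voice recording may
    be collected only as a replacement for written words, and must be retained
    no longer than necessary to fulfill the request it supports."""
    fd, path = tempfile.mkstemp(suffix=".wav")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(audio_bytes)  # hold the recording transiently
        text = transcribe(path)  # convert voice to text for the request
    finally:
        os.remove(path)  # delete immediately once the request is served
    return text
```

Deleting the file in a `finally` block ensures the recording is discarded even if transcription fails, which matches the spirit of the exception's strict retention limit.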

Methods of parental consent. The FTC proposed adding knowledge-based authentication and facial recognition technology, which were both previously approved through a different process, to its enumerated list of approved methods. Meanwhile, an application for the use of facial age estimation as another method of verified parental consent is currently awaiting approval.

Avatars. The FTC seeks comment on whether, among other possible identifiers, an avatar generated from a child's image constitutes personal information. Such a change may have broad implications for the metaverse and virtual and augmented reality platforms forecasted as social venues for children in the not-so-distant future.

Safe Harbor updates. The proposed rules include updated criteria for approval and revocation of Safe Harbor participation, as well as reporting and recordkeeping requirements for Safe Harbor entities, including publishing lists of their participating operators. Aligning with its bent toward more detailed data security requirements, the FTC proposes additional "reasonable procedures" that a Safe Harbor must require its participating operators to establish and maintain, and expands the scope of Safe Harbor assessments to include comprehensive review of both privacy and security policies.


