
Takeaways from Epic Games settlement: Teen privacy arrives at the FTC

After years of developing under the surface, teen privacy safeguards in the United States may have finally reached puberty. Although headlines about the Federal Trade Commission’s enforcement action against Epic Games are likely to focus on the high price tag — $275 million in administrative penalties and $245 million in consumer refunds — privacy professionals should zoom in on the operational takeaways for any organization that runs an online site or service used by individuals under 18, whether children or teenagers.

The genesis of teen privacy

A teen privacy action at the FTC has been a long time coming. Over a decade ago, in its report on "Protecting Consumer Privacy in an Era of Rapid Change," the FTC pointed to numerous comments from “consumer advocates and others” calling for “heightened protections for teens between the ages of 13 and 17,” which highlighted the particular vulnerabilities of this demographic. The agency’s FAQs for business compliance with the Children’s Online Privacy Protection Act were later updated to address multiple questions about teen privacy best practices, despite COPPA’s purview stopping at the age of 12. 

Over the past few years, it has seemed the FTC has taken every opportunity to mention concerns about teenagers. In a subtle but substantial shift in discourse, speeches and reports that until recently would have mentioned only “children’s privacy” also incorporated references to teens. In 2020, when the FTC sent inquiries to nine social media and streaming companies under its Section 6(b) investigatory authority, it requested information about how each company’s practices “affect children and teens,” among other questions. The move reflected calls from advocates during the 2019 COPPA Rule Workshop, and in their filed comments, for the FTC to exercise its Section 6(b) powers. Advocates also doubled down on comments addressing the particular privacy risks of teens.

Most recently, in its Advance Notice of Proposed Rulemaking, the FTC dedicated an entire section to answering the question, “To what extent do commercial surveillance practices or lax data security measures harm children, including teenagers?” Meanwhile, lawmakers at the federal level have continued to propose — and, in California, have passed — legislation that would apply specifically to the teenage demographic.

But despite all the ink spilled about teen privacy, there has yet to be a case demonstrating how the FTC will approach the unique risks visited on this demographic in its enforcement work. Until now.

Epic fail, allegedly

Enter Epic Games and its wildly popular cross-platform multiplayer video game Fortnite. This week, Epic settled with the FTC over allegations of privacy violations and deceptive billing practices. The stipulated consent order was approved by all four current FTC commissioners, with Commissioner Christine Wilson filing a separate concurring statement. Epic Games posted its own takeaways about the case along with a warning to other game developers: “The old status quo for in-game commerce and privacy has changed, and many developer practices should be reconsidered.”

The proposed settlement is the culmination of two separate complaints. The first complaint, which leads to proposed consumer refunds of $245 million, alleges a slew of dark patterns designed to enable automatic billing sign-up and frictionless, at times unintended, in-game purchases. In a second count of unfair practices, Epic is alleged to have punished customers who instituted chargebacks on their — often validly disputed — credit card transactions by denying them access to their Fortnite accounts.

The other complaint concerns alleged privacy missteps; first, failure to comply with COPPA, and second, unfair default privacy settings for both teens and kids.

Dilly-dally and find out

Epic’s most direct alleged violations relate to its approach to COPPA. The facts here are a good reminder for privacy professionals to think early and often about COPPA compliance, even if you think you may be exempt. Why? One good reason: Epic’s penalty for its alleged COPPA Rule violations is a record-setting $275 million.

As with any COPPA case, the company’s compliance obligations turn on the extent to which Fortnite was directed to children and/or included known children among its users. On both prongs of this analysis, the FTC marshals considerable evidence that the company should have been aware it fell within COPPA territory. The FTC complaint analyzes the game’s visual style, gameplay and features, and relies on internal Epic documents showing that the company chose a “living room safe, but barely” theme, intentionally marketed Fortnite-branded products to children, and learned through news reports and internal studies that children were one of its main demographics.

Nevertheless, for the first two years Fortnite was operational, the FTC alleges the game included “no parental controls and minimal privacy settings,” that Epic “took no steps to seek parental consent before collecting children’s personal information or explain how the company handled it,” and that Epic included a standard disclaimer in its privacy policy claiming that its games were not directed to children. As the FTC explains, “even when Epic obtained actual knowledge that particular Fortnite players were under 13, Epic took no steps to comply with COPPA. Indeed, Epic went to great lengths to pretend it never obtained actual knowledge at all.”

By the third quarter of 2019, when Epic began to adopt a compliance posture more in line with a child-directed product, it implemented what the FTC calls “dilatory COPPA measures”; essentially, too little, too late. A prime example was the company’s effort to seek parental consent for children on its platform, a core COPPA requirement. While it implemented an age gate for new users and required parental consent for those who self-identified as under 13, Epic allegedly failed to do enough to identify existing users who fell within COPPA. The company did implement customer service processes to flag existing accounts that appeared to be underage based on interactions with agents, including by analyzing a trove of old customer service files. But this was not enough. Now, under the FTC order, Epic must treat all of its current Fortnite users as children until they successfully pass a neutral age gate.
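
For teams facing a similar remediation, the operational core is straightforward to sketch: a neutral age gate asks for an age or date of birth without hinting at a qualifying answer, and any account that has not completed the gate defaults to the most protective treatment. The TypeScript below is an illustration only, assuming invented names such as `classifyAge` and `effectiveAgeBand`; it is not Epic’s implementation or language from the order.

```typescript
// Hypothetical sketch of a neutral age gate and its default treatment:
// accounts that have not completed the gate are treated as children's accounts.
type AgeBand = "child" | "teen" | "adult";

function classifyAge(ageInYears: number): AgeBand {
  if (ageInYears < 13) return "child"; // COPPA applies
  if (ageInYears <= 17) return "teen"; // 13 through 17, inclusive
  return "adult";
}

interface Account {
  id: string;
  // undefined means the user has never completed the age gate
  selfReportedAge?: number;
}

function effectiveAgeBand(account: Account): AgeBand {
  // Users who have not passed the gate fall back to the strictest
  // treatment rather than the most permissive one.
  if (account.selfReportedAge === undefined) return "child";
  return classifyAge(account.selfReportedAge);
}

function requiresVerifiableParentalConsent(account: Account): boolean {
  return effectiveAgeBand(account) === "child";
}
```

The key design property is the fallback: absent an age-gate result, the code defaults to child treatment instead of assuming an adult user.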

Unfair default settings

Far more groundbreaking than the COPPA takeaways are those related to the final charge in the FTC complaint: unfair default settings in Fortnite that led to “matchmaking children and teens with strangers while broadcasting players’ account names and imposing live on-by-default voice and text communications.”

Privacy is contextual, so to best understand this charge, it’s important to understand a bit about Fortnite. Standard gameplay involves matching the player with 99 other players in a melee-style “battle royale” set on an expansive map. Building forts and eliminating others, players compete to be the last one standing. The game is free to play (with paid in-game extras available), so each game session is open to anyone with an account.

Communication platforms pose youth privacy risks

Much of the focus of the FTC’s unfairness charge is rooted in the use of Fortnite as a live communication medium among players, for in-the-moment audio and text-based discussion. To understand the privacy risks the way the FTC sees them, it is important to consider the particular risks that could be visited on various demographics of players.

There are real harms for children and teens flowing from live communication with a room full of adults. The FTC’s analysis seems to focus on three general types: (1) exposing information about minor users to others without clarity about what is exposed, (2) exposing minor users to the unfiltered communications of adults, which could include harassment and abuse, and (3) enabling the minor to connect with adults in a “public” setting, which could facilitate sexual exploitation and other abuse. Commissioner Wilson includes heartbreaking real-world stories of some of these harms in her concurrence.

The lesson here is clear: on an interactive platform, other users are third parties that can observe information about each other. If minor users are present, ensure their personal information is private by default and that valid consent is given before open communication is enabled.
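
In product terms, one way to apply that lesson is to key default communication settings to the user’s age band, keeping communication off and personal information hidden for minors until consent is captured. The sketch below is a hypothetical illustration under those assumptions; the setting names are my own, not a mandated configuration.

```typescript
// Illustrative, privacy-protective defaults for an interactive service
// with minor users. Setting names are hypothetical.
type AgeBand = "child" | "teen" | "adult";

interface CommunicationDefaults {
  voiceChat: boolean;              // live voice with other players
  textChat: boolean;               // live text with other players
  showNameToOtherPlayers: boolean; // expose account or display name
  matchWithStrangers: boolean;     // open matchmaking with unknown users
}

function defaultsFor(band: AgeBand): CommunicationDefaults {
  if (band === "adult") {
    return { voiceChat: true, textChat: true, showNameToOtherPlayers: true, matchWithStrangers: true };
  }
  // Children and teens start private by default; these flags flip only
  // after affirmative express consent (parental consent for children).
  return { voiceChat: false, textChat: false, showNameToOtherPlayers: false, matchWithStrangers: false };
}
```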

Privacy settings should be granular and easy to use

The FTC’s complaint focuses heavily on the on-by-default nature of audio communication within the Fortnite environment. Despite repeated advice from the company’s UX team to make voice chat an opt-in feature — and similar feedback from users, including parents — Epic kept voice chat on by default. When it eventually implemented an opt-out toggle for voice chat, the FTC alleges the company “did not inform players of the setting’s availability and placed the control in the middle of a detailed settings page.”

Over time, Epic began to introduce more privacy-protective features into Fortnite, including settings that prevent the user’s name from being displayed to other users. When it introduced parental controls, it incorporated these settings and others, like the ability to block all friend requests, into its parental tools. These were important steps, but the FTC was still dissatisfied with the default settings given the nature of the platform and the demographics of the users.

Exposure of minors’ information requires opt-in consent

Under the terms of the consent order, Epic must change the default settings of the game, at least for certain users, and ensure that it receive affirmative express consent before doing any of the following:

  • Disclosing a minor’s personal information to other users.
  • Enabling a minor to disclose their personal information to other users.
  • Enabling a minor to converse with or be party to conversations between or among any other users.

Affirmative express consent means “any freely given, specific, informed, and unambiguous indication of an individual’s wishes demonstrating agreement by the individual, such as by a clear affirmative action, following a clear and conspicuous disclosure to the individual.” A disclosure is only clear and conspicuous if it is “difficult to miss, i.e., easily noticeable, and easily understandable by ordinary consumers.” The FTC includes detailed guidance in the order about the meaning of “clear and conspicuous” in various media, including video and audio contexts.
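
In engineering terms, one plausible way to honor this is to record each consent as an explicit event tied to the disclosure the user actually saw, and to gate covered features on a valid, unwithdrawn record. The sketch below is an assumption about how that might look, not a requirement of the order; all field names are invented.

```typescript
// Hypothetical consent ledger: consent counts only if it followed a specific
// disclosure and was given by an affirmative action (no pre-checked boxes,
// no inferring consent from silence or continued play).
type CoveredFeature = "voice_chat" | "text_chat" | "share_personal_info";

interface ConsentRecord {
  userId: string;
  feature: CoveredFeature;
  disclosureVersion: string;  // identifies the clear and conspicuous disclosure shown
  givenBy: "parent" | "teen"; // parental consent for children; teen or parent for teens
  affirmativeAction: boolean; // e.g., an explicit opt-in toggle the user flipped
  withdrawnAt?: Date;
}

function hasValidConsent(ledger: ConsentRecord[], userId: string, feature: CoveredFeature): boolean {
  return ledger.some(
    (r) => r.userId === userId && r.feature === feature && r.affirmativeAction && r.withdrawnAt === undefined
  );
}

// Example gate: live voice chat stays disabled for a minor without consent.
function canEnableVoiceChat(ledger: ConsentRecord[], userId: string): boolean {
  return hasValidConsent(ledger, userId, "voice_chat");
}
```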

The mandated disclosure in this case must include:

  • Each type of personal information that will be disclosed.
  • Each category of persons, i.e., other users or types of third parties, to which each type of personal information will be disclosed.
  • Each type of communication the minor will be able to make or receive.
  • Each category of person to, or from which, the minor will be able to make, or receive, each type of communication.
  • A simple, easily located means for the individual to withdraw consent.
  • Any limitations on the individual’s ability to withdraw such consent.
  • All other information material to the provision of consent.

This can serve as a roadmap for the delivery of notice and choice for any service that includes interactive elements among users, particularly users of mixed ages. Even for services that do not include interactive elements, Epic’s up-front disclosure requirements appear to apply to any third-party sharing of personal information. At a minimum, extra efforts to keep teens informed about privacy practices and risks are likely warranted.
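
Treated as a checklist, those elements can be captured in a structured record that a product or privacy team reviews before a disclosure ships. The shape below is my own illustration of that roadmap, not text from the order.

```typescript
// Hypothetical structure mirroring the elements the order requires a
// pre-consent disclosure to cover for minor users.
interface MinorDisclosure {
  personalInformationTypes: string[];    // e.g., "display name", "voice audio"
  recipientCategories: string[];         // other users, categories of third parties
  communicationTypes: string[];          // e.g., "live voice chat", "text chat"
  communicationCounterparties: string[]; // who the minor can talk to or hear from
  withdrawalMechanism: string;           // a simple, easily located way to withdraw consent
  withdrawalLimitations: string[];       // any limits on the ability to withdraw
  otherMaterialInformation: string[];    // anything else material to the consent decision
}
```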

Teens can give their own consent

For children, the consent required under the order falls within processes for securing parental consent under COPPA. But what about for teens? The FTC makes clear that either a teen or their parent can provide affirmative express consent under the consent order.

This is largely consistent with existing and proposed standards for teen privacy controls, although it is worth noting that the age range for “teen” in the FTC’s order covers all individuals between the ages of 13 and 17, inclusive. Some federal proposals exclude 17-year-olds.
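
A small illustrative helper makes the consent-eligibility rule concrete; the function below reflects my reading of the order’s age bands rather than any prescribed implementation.

```typescript
// Hypothetical: who may provide affirmative express consent under the
// order's age bands (child under 13; teen 13 through 17, inclusive).
function eligibleConsenters(age: number): Array<"parent" | "self"> {
  if (age < 13) return ["parent"];          // verifiable parental consent, COPPA-style
  if (age <= 17) return ["parent", "self"]; // either the teen or a parent may consent
  return ["self"];                          // adults consent for themselves
}
```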

Users should be informed of what other users can see

Although the FTC’s complaint repeatedly notes that Fortnite exposed players’ display names to other users by default, the consent order does not mandate that this setting be changed. That is, display names do not count as “covered information” under the order and are therefore exempt from the consent-for-disclosure requirement.

However, Epic is required to ensure parents receive a direct notice to inform them that their child’s display name will be exposed in a “multiuser game or other interactive multiuser experience.” For teens, there is no similar requirement in the order, except that Epic must also describe the disclosure of display names in its privacy policy any time it impacts minor users.

Stay tuned

This is a groundbreaking case, but the takeaways, at least when it comes to teens, are somewhat limited by its singular facts. Facilitating interaction and communication between users, especially of mixed ages, is a fraught activity that should be carefully considered against the lessons from this case. Other teen best practices are still emerging; it is unlikely we have seen the end of teen privacy actions at the FTC.


