
United States Privacy Digest | A view from DC: Kids Online Safety Act could pass — and it applies to you


There has been a major shift in the wind behind proposed U.S. youth privacy and safety laws this week. Before we think of the children, we should mark the unique context of this moment.

The U.S. Congress is experiencing an era of legislative paralysis that has reached historic proportions. While the first half of most congressional terms usually sees between 50 and 150 bills become law, the first half of the 118th Congress passed fewer than 30 bills. As we enter a major election year, the federal government is operating under a temporary budget extension and many hard-fought bipartisan legislative frameworks are withering on the vine.

Nevertheless, the Senate is considering a package of measures to protect young people online, all of which were updated yesterday with new language, after being voted out of committee last fall. It includes three proposals, rolled up into two bills:

  1. The proposed Children and Teens' Online Privacy Protection Act, known colloquially as COPPA 2.0, which U.S. Sen. Ed Markey, D-Mass., replaced with a substitute amendment while announcing that the bill is now co-sponsored by Senate Committee on Commerce, Science and Transportation Chair Maria Cantwell, D-Wash., and Ranking Member Ted Cruz, R-Texas, along with Markey's original partner, Sen. Bill Cassidy, R-La. COPPA 2.0 is primarily designed to provide targeted privacy protections for teenagers, though it also changes the knowledge standard under COPPA.
  2. The proposed Kids Online Safety Act, which was replaced with much fanfare through a substitute amendment co-sponsored by 62 senators, more than enough to pass the measure if it were brought to the floor for a vote. You can find a redline comparison of the updated text of KOSA against the prior version here. Changes to the enforcement section of KOSA, limiting the powers of state attorneys general to enforce only a subset of the bill's requirements, appear to have allayed the concerns of some of its longstanding opponents.
  3. The Filter Bubble Transparency Act, which was first introduced in 2021 and has been included as Title II of KOSA since the markup of the bill in the Commerce Committee late last year.

Each of these bills, if enacted, would apply to different sets of websites and online services. All would be significantly broader than existing federal kids' privacy requirements. The somewhat dizzying overlapping rings of scope are worthy of closer examination.

Let's start at the edges, the broadest set of covered entities, and work our way in.

First, any online platform that delivers a feed of content would be subject to new transparency requirements about the "opaque algorithms" that rank and sort such content. These protections, which were first proposed in the separate Filter Bubble Transparency Act, now incorporated into KOSA, apply to users of all ages. That's right; these are not targeted only at young people. They also apply to all types of organizations, without listed exemptions, if they operate an "online platform" using an "opaque algorithm."

  • The term "online platform" means any public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user generated content, such as sharing videos, images, games, audio files, or other content, including a social media service, social network, or virtual reality environment.
  • The term "opaque algorithm" means an algorithmic ranking system that determines the selection, order, relative prioritization, or relative prominence of information that is furnished to users of the online platform "based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose."

Notably, the requirements under this part of the law could represent the first major transparency obligations of the AI age, requiring explicit notices and options for users of all ages to exercise control over how their feeds are sorted, including the option to switch to an "input-transparent algorithm."

The rest of KOSA is focused on a subset of online platforms that are used or "reasonably likely to be used" by minors under age 17, with explicit exemptions for common carriers, broadband providers, email services, video conferencing services, wireless messaging services, nonprofits, schools, news media, libraries, cloud providers and VPNs.

Well, actually, we should slow down. Title I of KOSA is also arguably broader than Title II in a different respect. Please picture a Venn diagram, if you would. The circle on your left represents all online platforms, without the above exemptions or age target. This is the scope of KOSA's Title II. The overlapping circle on your right (Title I) may be smaller — due to its age-based scope — but it also covers an area that does not overlap with the first circle.

This special new area that we've uncovered consists of companies that operate an "online video game, messaging application, or video streaming service" used or reasonably likely to be used by a minor. Video games are only covered under certain conditions, including those that allow for user-generated content or include targeted advertising.

For all such services, KOSA creates a duty to "exercise reasonable care" that any "design feature" on the platform is created and implemented to "prevent and mitigate" harms to minors, including mental health issues, addiction, physical violence and sexual exploitation.

But wait, "design feature" is also a defined term in the act. The above duty of care only applies to features like infinite scrolling, notifications, or in-game purchases that "will encourage or increase the frequency, time spent, or activity of minors on the covered platform."

Other requirements in KOSA apply to the entire covered platform, not just to its potentially addictive "design features." These requirements include safeguards for "personalized recommendation systems," geolocation sharing, communications with minors, and the viewing of minors' data by other users or visitors to the platform, as well as default privacy-protective settings, options to limit screen time and other parental tools.

When do these rules apply? KOSA's privacy and safety provisions are generally triggered when the company "knows" that an individual is a minor. But this is a more expansive standard than the one privacy pros are used to under the Children's Online Privacy Protection Act.

Both KOSA and COPPA 2.0 apply a new standard of knowledge about the age of users on covered services; under COPPA 2.0, that standard would also extend to the existing COPPA rules. Instead of just "actual knowledge," the new knowledge standard also includes "knowledge fairly implied based on objective circumstances."

This knowledge standard is familiar in consumer protection law as the level of knowledge required for a company's violation of Section 5 of the FTC Act. It is somewhere in the middle of what courts have called the "knowledge continuum," which is marked by "constructive knowledge at one end and actual knowledge at the other with various gradations, such as notice of likelihood, in the poorly charted area that stretches between the poles."

Taken together, these overlapping rings of scope would represent a major expansion of the types of entities that must consider the privacy and safety of consumers under 17. Of course, these federal proposals come at a time when states are experimenting with a variety of controversial new youth protections, the next wave of which will likely see a few new state laws this year. Notably, the updated KOSA language includes a floor preemption provision, which would nullify any contradictory state laws, unless they provide comparatively stronger protections.

Privacy pros should consider revisiting the protections they provide to kids and teens who might use their services.

That said, even if the Senate takes the time to prioritize and pass youth safety and privacy bills on the floor, their enactment remains uncertain. None of these bills has yet been introduced in the House of Representatives.

Upcoming happenings:

  • 21 Feb., 17:00 ET: The monthly Tech Policy Happy Hour takes place (Brasserie Liberté).
  • 22 Feb., 11:00 ET: The Centre for Information Policy Leadership hosts a webinar on best practices in accountable governance of AI: new research about the experience of leading practitioners (virtual).
  • 25 Feb.: The deadline to submit speaking proposals for the IAPP Privacy. Security. Risk. conference in Los Angeles, scheduled for 23-24 Sept.
  • 27 Feb., 17:30 ET: The Future of Privacy Forum hosts its 14th annual Privacy Papers for Policymakers event (U.S. Capitol Visitor's Center).
  • 29 Feb., 9:00 ET: The Data Transfer Initiative hosts a summit on Empowerment through Portability (National Union Building).

Please send feedback, updates and Venn diagrams to cobun@iapp.org.

Children's Privacy and Safety

Children’s Privacy and Safety is intended to help practitioners and advocates on their journey to protect children’s data privacy and safety. The privacy laws and safety protections impacting children vary from country to country and across industries and data types, making it hard to apply a singular global approach. This comprehensive treatise provides reliable and substantive background on children’s privacy, data protection, and safety issues.


The Kids Are All Rights: The Conflict between Free Speech and Youth Privacy Laws

This article analyzes the complex relationship between First Amendment rights and the future of youth privacy and safety laws.


