With the midterm elections (mostly) behind us, D.C. speculators have started their speculative engines and turned their soothsaying sensors to full power. Recent analyses of the prospects for federal legislative activity on data privacy either focus on the remainder of the year — the lame-duck session — or look to the upcoming 118th Congress.

Next year’s Congress will see power split between a Democratic Senate and a Republican House, placing the fading art of bipartisanship again at the core of any meaningful action. With limited prospects for full passage of the American Data Privacy and Protection Act in the remainder of the year, the bipartisan draft bill will likely form the foundation of future deliberations on comprehensive data privacy. But Sen. Roger Wicker, R-Miss., one of the champions of the compromise legislation, will no longer be at the table as he moves to become the top Republican on the Armed Services Committee. In the meantime, successful navigation of the rumored path to House passage of ADPPA is unlikely given competing political priorities — like the need to fund the government.

Most senators who sponsored privacy legislation are instead indicating that they will focus their energy in the lame-duck session on youth privacy. The best, or perhaps only, path for this would be to attach a proposed bill to a piece of “must-pass” legislation, like an omnibus spending bill. At the moment, the Kids Online Safety Act represents the clearest candidate for such an endeavor. Axios reports that both sponsors of the bill, Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., intend to give this a shot.

As a bill that blends data privacy restrictions with concerns for safe and healthy digital design practices, KOSA bears a striking resemblance to California’s Age-Appropriate Design Code (AADC). The current KOSA draft was advanced by the Senate Commerce Committee July 27, with substantial amendments. Rather than applying to any online service that is likely to be accessed by children under age 18, as the AADC does, KOSA would apply only to certain types of services likely to be used by someone under 17, specifically:

  • Social media services.
  • Social networks.
  • Video games.
  • Messaging applications.
  • Video streaming services.
  • Educational services.
  • Any service that “primarily provides a community forum for user generated content.”

This list largely corresponds to the types of services the U.K. Information Commissioner’s Office spotlighted as it seeks to guide business behavior under its Children’s Code — the first age-appropriate design regulation. One exception: the inclusion of educational services, which raises questions about the interaction of KOSA with the Family Educational Rights and Privacy Act. The updated draft clarifies that the bill would not preempt FERPA, but some have raised the prospect of tricky conflicts such as new youth choice requirements running up against software controlled by school administrators.

While KOSA is not as game-changing as the California AADC, it would still represent a substantial shift in privacy practices for users under 13, currently covered by COPPA, plus those between 13 and 16. Some things to know:

  • Not a COPPA amendment. KOSA is a safety and design bill, so it supplements COPPA without amending it.
  • No content restrictions or “age-appropriate” language. KOSA does not explicitly distinguish between children and teens or other discrete age groups. It applies across age groups. And prior language restricting the inclusion of illegal content has been revised to ban only the delivery of ads about such content.
  • Transparent audit requirements for the largest platforms. Covered entities with more than 10 million monthly active users must publish annual reports on their compliance activities, based on external audits.
  • Duty of care. An explicit duty for a covered entity to act in the best interest of children by implementing “reasonable measures in its design and operation of products and services to prevent and mitigate enumerated harms.”
  • Opt-outs for algorithmic recommendation systems. A requirement to provide settings allowing youth users (and parents) to adjust timelines or newsfeeds or the like — or opt out entirely.
  • Privacy-protective defaults. For young users, covered entities would be required to have default privacy settings tuned to “the most protective level of control over privacy and safety for that user.”
  • Parent and child controls. Requirements for a number of safety controls, if relevant, that both youth and parents can adjust — with notice to kids when their parents are involved. Most relevant to privacy professionals is a requirement to restrict geolocation tracking.

More to come. Please send feedback, updates and KOSA thoughts to cobun@iapp.org.