The U.S. Congress' 2024 pivot on consumer privacy efforts is moving forward, but with a range of uncertainties attached.

The Senate and House have mostly turned their attention away from comprehensive privacy legislation this year in favor of children's online safety bills, which are now closer to final passage after a House Committee on Energy and Commerce markup 18 Sept. Amended versions of the proposed Children and Teens' Online Privacy Protection Act — also known as COPPA 2.0 — and the Kids Online Safety Act were approved for House floor consideration. However, lawmakers' comments suggest both bills require more time and thought than the 118th Congress has left.

Out of 16 bills included in the full committee markup, the children's bills were the only ones to receive voice votes instead of roll call votes. That decision highlighted the diminishing support for each measure as currently drafted, which Energy and Commerce Ranking Member Frank Pallone, D-N.J., acknowledged in his opening remarks and more pointedly during the respective debates over each bill.

"The filing of substantial and substantive replacements for (COPPA 2.0 and KOSA) the day before markup has left members and key stakeholders insufficient time to identify and try to address the consequences of those changes," Pallone said. "Particularly as they will impact some of our most vulnerable children and teens. Fundamental issues have simply not been hashed out."

In addition to wavering House support, Congress is likely to face the challenge of reconciling both bills between the chambers. The current House bills do not align with the Senate-approved Kids Online Safety and Privacy Act, an omnibus package that includes COPPA 2.0 and KOSA, meaning House passage would lead to a conference committee and concurrence votes from both chambers on a negotiated final bill.

Beginning 23 Sept., the Senate and House have 29 overlapping working days remaining in the current congressional term.

Differing approaches

The leading contrast between the House and Senate bills for both children's proposals is the House's decision to use a tiered knowledge standard as opposed to the broad actual knowledge standard passed by the Senate.

The House's three-tier approach creates knowledge requirements for "high-impact social media companies"; companies that have "an annual gross revenue of USD200,000,000 or more, collects the personal information of 200,000 individuals or more"; and organizations that fall under neither scope. Each tier brings a different standard for knowledge of a child or teen user on a given website.

Additionally, the House removed the Federal Trade Commission's authority to determine knowledge standards and produce guidance around its definition of knowledge.

Each House bill contains its own issues, which drew additional amendments aimed at improving the legislation. All of the amendments offered were withdrawn, with the intent of flagging issues to be addressed in the lead-up to a floor vote.

Pallone applied that amendment strategy to COPPA 2.0's pressure points. Notably, he offered an amendment that would reintroduce the proposed American Privacy Rights Act. That pitch included foundational data minimization while leaving out the contentious provisions on civil rights protections, a private right of action and preemption.

"I filed this amendment with the goal of sparking renewed bipartisan conversation about how to find common ground and support for a strong privacy foundation we can build on," said Pallone, who earlier in the markup was critical of House Republican leadership's intervention on the APRA that led to a cancellation of an anticipated markup of the comprehensive bill in June.

Pallone also submitted and withdrew amendments seeking to add data minimization standards and data broker rules. He noted the proposed APRA included both provisions and, as a result, protected minors more effectively than the amended COPPA 2.0.

Energy and Commerce Chair Cathy McMorris Rodgers, R-Wash., indicated she would be open to "looking for any opportunity" to improve COPPA 2.0 while noting Big Tech lobbying "pivoted" to COPPA 2.0 from the APRA, hinting at a rationale behind the nature of the amended children's bill.

The children's online landscape

Both COPPA 2.0 and KOSA aim to address growing issues around minors' online presence. In the absence of legislation, those problems are only growing.

A day after the House markup, the Federal Trade Commission released a scathing bipartisan report on the privacy and security practices of nine prominent social media and video streaming platforms. The agency's review, ordered in December 2020, generated allegations that the platforms "engaged in vast surveillance of consumers in order to monetize their personal information while failing to adequately protect users online, especially children and teens."

"While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from (identity) theft to stalking. Several firms’ failure to adequately protect kids and teens online is especially troubling," FTC Chair Lina Khan said in a statement. "The Report’s findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices.”

The growing issues around social media platforms' privacy and security for minors are not new, and minors themselves are beginning to better understand those problems.

The Bipartisan Policy Center recently polled more than 2,000 individuals ages 13-26 about social media use and safety. While 75% of respondents indicated they have learned social media privacy and safety strategies, 40% said they would like to deepen their knowledge of online safety.

Additionally, the BPC published a policy brief urging Congress to take a range of steps to help or require online platforms to improve youth privacy practices. Recommendations included establishing "responsible data practice requirements" and "acceptable secondary uses of data" along with implementing strong age verification and parental consent policies.

Children's privacy risks connected to artificial intelligence are another area of concern, given the data scraping issues cropping up around the training of AI models.

Common Sense Media released a study analyzing trends in generative AI use among minors. The study revealed seven out of 10 teens have used a generative AI tool, with respondents acknowledging they use those tools for, among other things, "a joke, planning activities, and seeking health advice."

The survey also looked at parental awareness, noting 49% of surveyed parents have not discussed generative AI with their kids and six out of 10 parents said their child's school does not or may not have concrete rules around the use of generative AI in an educational setting.

"The findings tell us that young people are quickly understanding the potential of generative AI platforms, perhaps without fully grasping the pitfalls, which underscores the need for adults to talk with teens about AI," Common Sense Media Head of Research Amanda Lenhart said in a statement. "We need to better understand their experiences so we can discuss the good and the bad — especially around bias and inaccuracy."

Joe Duball is the news editor for the IAPP.