Waiting for congressional action can feel, sometimes, like waiting for Achilles, sulking in his tent, soured on the promise of the spoils of war, while the battle rages on. It has been 143 days since the U.S. Congress convened for its 118th term, but there is no updated draft of the American Data Privacy and Protection Act, the legislative vehicle with the most momentum in the last term.
Of course, Congress has bigger problems now, as legislators cancel their holiday weekend plans, preparing for the still seemingly distant possibility of a vote on a debt ceiling deal. Meanwhile, matters of governance, privacy only a small one among them, sit on the back burner. Not even wily Odysseus could talk Achilles back to the battlefield to help stop the bleeding.
Rumor has it, when we do see the new ADPPA, it could even have a new name. What other elements might shift? Only time will tell.
As luck would have it, even with all this waiting on comprehensive rules, we are far from bored. Sectoral rules continue to keep us on our toes.
This was the week of education privacy. For starters, the FTC made good on its earlier threat to hold education technology vendors and schools to refined privacy expectations related to applying the Children’s Online Privacy Protection Act Rule in the educational context.
In a settlement with the now-defunct Edmodo, the FTC prohibited the company from collecting more information from children than necessary to deliver its service, should it ever resume operating in the U.S. The complaint reiterates the limitations of school authorizations of vendors, as previewed in the agency’s policy statement. In a quick-hitting blog post, Amelia Vance helped to break down the nuanced takeaways from this settlement for both edtech providers and schools.
FTC watchers also noted the algorithm disgorgement term in the proposed order requires Edmodo — which has already deleted the personal data of U.S. students — to also delete any algorithms it trained using such data. This is the second time this remedy has been deployed in a COPPA enforcement action.
At the same time, we learned the Biden administration is again prioritizing the protection of “youth mental health, safety and privacy online” through a series of coordinated actions. Continuing the student privacy theme, the Department of Education will soon initiate a rulemaking update to the Family Educational Rights and Privacy Act. Afterwards, it will also “update its model FERPA notification and consent forms to ensure that they are clear and concise and will also provide best practice guidance to schools and school districts regarding FERPA and contracting with third-party vendors.”
Highlighting its existing engagement on privacy-adjacent work, the Department’s Office of Educational Technology released a new policy report this week on artificial intelligence and the future of teaching and learning.
On the related subject of “kids online health and safety,” the White House briefing announced a new task force, to be led by the Department of Health and Human Services and the Department of Commerce, in coordination with over a dozen other agencies.
The task force will “recommend best practices and technical standards for transparency reports and audits related to online harms to the privacy, health, and safety of children and teenagers.” The White House describes a two-step process, this year focusing on a review of “the status of existing industry efforts and technologies to promote the health and safety of children and teenagers vis-à-vis their online activities” as well as compiling “best practices to assist parents and legal guardians in protecting the privacy, health and safety of their children who use online platforms.”
Next, “by Spring 2024, the Task Force will develop voluntary guidance, policy recommendations, and a toolkit on safety-, health- and privacy-by-design for industry developing digital products and services.”
A final youth privacy nugget in the White House statement was the mention that the FTC “is also undertaking a review” of its COPPA Rule. The FTC began this review almost four years ago and collected stakeholder comments, but has been quiet about the status of the process ever since.
Though this may have been our week for student privacy developments, it adds to the theme of strengthening sectoral rules that we have seen time and again this year. From expanded health privacy scrutiny, to reproductive privacy protections, to new biometric policies, to state-level youth privacy developments, already-regulated areas of data privacy are being modernized, even as comprehensive rules lag behind.
Here's what else I’m thinking about:
- The Irish Data Protection Commission's decision on Meta’s data transfers dominated the conversation in U.S. privacy circles this week. My insightful colleagues published a detailed analysis of the decision, which invalidates Meta’s reliance on standard contractual clauses plus additional safeguards. The decision essentially gives the U.S. and EU a new deadline of approximately six months to finalize the Data Privacy Framework so that a legal mechanism for receiving EU personal data in the U.S. is again in place, though the question of precisely what to do with already-transferred data remains a head-scratcher.
- President Joe Biden nominated telecom attorney Anna Gomez to fill the Federal Communications Commission seat that has been vacant for over two years.
- The White House Office of Science and Technology Policy opened a formal request for information on National Priorities for Artificial Intelligence, which includes questions about the standards, practices and tools that can identify and mitigate AI risks. Responses are due 7 July. Microsoft has its own blueprint for regulating AI, as showcased by Brad Smith at a D.C. event this week. The plan includes a licensing regime for AI companies overseen by a new agency. If you have ideas too, why not share them with the community at IAPP’s AI Governance Global in Boston on 2-3 Nov.? The call for proposals is open through 11 June.
- The Electronic Privacy Information Center released a report on the harms of generative AI. The clear-eyed analysis describes a plethora of both privacy and nonprivacy harms, including data breaches, intellectual property theft, labor manipulation and discrimination, while highlighting the potential for data privacy rules to be part of the solution. EPIC uses the opportunity to again call for passage of the ADPPA, which would limit the collection of personal data that could be used to train AI systems or otherwise profile users.
- 31 May at 12:00 PM EDT, the Family Online Safety Institute hosts a briefing on Online Safety in the Free State (Annapolis Waterfront Hotel).
- 1 June at 4:00 PM EDT, IAPP’s Baltimore KnowledgeNet hosts a Women in Privacy discussion (virtual).
- 6 June at 1:00 PM EDT, ITIF’s Center for Data Innovation and R Street host a webinar titled Does the U.S. Need a New AI Regulator?
- 10 June at 1:00 PM EDT, the Future of Privacy Forum and friends will march in the 2023 Capitol Pride Parade. All are welcome to join, but you must register.
- 11 June at 11:59 PM EDT, speaker submissions are due for IAPP’s AI Governance Global in Boston on 2-3 Nov.
- 12 June at 5:30 PM EDT, the Internet Law and Policy Foundry presents its annual Foundry Trivia (Terrell Place).
Please send feedback, updates and gold talents to firstname.lastname@example.org.