Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.
As legislative sessions in the states wind down, policymakers are making mad dashes to finish concocting their privacy and AI governance brews for the year.
Many such potions will remain incomplete. It looks increasingly likely that 2025 will be a year without a new comprehensive consumer privacy law on the books. But across every other area, from kids' safety to sectoral privacy rules to AI, it has been a busy session. This includes quite a few significant tweaks to existing state privacy laws, informed by enforcement experience and comparisons with the evolving patchwork.
Connecticut has not disappointed in this regard. This active state for tech policy legislation sent a bill this week to its governor’s desk that, among other things, modifies its existing comprehensive consumer privacy law.
Sections 5 through 14 of the bill, as passed by Connecticut’s Senate and concurred by the House, would replace the current privacy law on the books. If signed by the governor, the changes to Connecticut's law will be effective as of 1 July 2026.
Broaden the scope, why not?
Sometimes in the world of data privacy, more is in fact more. Perhaps after reflecting on its relatively small state population, Connecticut is updating its threshold for processing personal data to 35,000 consumers instead of the current 100,000. The current rule puts it roughly in the middle of the pack of states, measured as a percentage of population, requiring that a business reach just under 3% of its residents before privacy requirements are triggered.
Moving forward, the amendment would give Connecticut a hair-trigger scope of less than 1% of the state's population. This is a lower percentage-of-population trigger than in every state except California and Maryland, not to mention Nebraska and Texas, which have no volume thresholds at all.
Even more importantly for enforcement potential, the amendment would also add two more prongs to the scope determination. If a business offers any amount of Connecticut consumers’ personal data "for sale in trade or commerce" or processes any amount of sensitive data from Connecticut consumers, it will be subject to the law. No volume threshold required.
But not so fast. The updated scope would also add exemptions to remove some highly regulated industries, including specific types of financial, insurance and health companies — plus political committees — from coverage under the law. And an exemption for data subject to the Gramm-Leach-Bliley Act would align Connecticut with the approach in most states.
A swiftly tilting planet of sensitive data
As ever, policymakers cannot resist tinkering with the list of sensitive data types, which remains one of the most divergent areas among comprehensive state privacy laws. Connecticut would add to the patchwork with new covered data, largely consistent with at least one other state, while also creating a new substantive requirement for when the processing of sensitive data is allowed.
The updated list of sensitive data types would also:
Modify health data to include treatment as well as disability status, in addition to the existing "condition or diagnosis" language, cobbling together language from a few states to make a relatively broad health data definition.
Broaden the knowledge standard for children's data to include any individual for whom the controller "has actual knowledge, or wilfully disregards, is a child," rather than the existing "known child" language.
Drastically expand the biometric and genetic data categories, eschewing the existing purpose-based limitation so that this data is covered regardless of its purpose of collection, and also covering "information derived therefrom," that is, inferences created from genetic or biometric data.
Add neural data to the list, joining a small but growing list of states, including California and Colorado, with extra protections for this emerging frontier of personal data. In the amendment, neural data is defined as "any information that is generated by measuring the activity of an individual's central nervous system." As the Future of Privacy Forum's Jameson Spivack has explained, this is a narrower definition than those in the other states, which also cover at least some peripheral nervous system data.
Add "status as nonbinary or transgender," joining Delaware and Oregon in recognizing this data as sensitive.
Add financial data and government ID numbers to the list, not to be outdone by California.
On substance, if given effect, the amendment will require not only the consumer's consent to process sensitive data, but a minor data minimization element too: a showing that the processing is "reasonably necessary in relation to the purposes for which such sensitive data are processed." The legislature considered and rejected stronger data minimization language in this draft bill, leaving us with this tweak as well as the addition of "proportionate" to the existing "reasonably necessary" requirement in the general data minimization provision.
A unique blend of other updates
If states are the laboratories of democracy, state Sen. James Maroney, D-Conn., continues to embrace a mad scientist image, crafting new creative compounds as he observes the trajectory of other states and the broader policy conversation.
His 2025 amendment would also incorporate some other notable tweaks, which often appear inspired by the legislative innovations in other states from recent years, including:
New privacy notice requirements, including a description of how consumers may appeal decisions about their requested exercise of data rights, disclosures about targeted advertising and data sales, and a "statement disclosing whether the controller collects, uses, or sells personal data for the purpose of training large language models."
New rights to review and question automated decisions based on profiling, reminiscent of language in Minnesota's privacy law. In the specific situation of a housing decision, the amendment also includes the ability to request a correction and re-analysis. The definition of automated decisions is also broadened slightly to include decisions impacting anyone, not just the specific consumer involved, and new impact assessment requirements are outlined whenever profiling provisions are triggered.
An Oregon-inspired, but much narrower, requirement to respond to consumer requests for a list of third parties to which a controller has sold personal information.
Aligning with many other states on the definition of publicly available data by adding "widely available media," while excluding biometric data from the definition.
Clarity on when privacy impact assessments are required, with a list of situations that trigger the "heightened risk of harm" to consumers standard.
It is worth flagging that the legislative vehicle containing the amendment to Connecticut's privacy law also includes new rules for social media platforms as well as some tweaks to kids' safety requirements, but those are a subject for another day.
Please send feedback, updates and potion recipes to cobun@iapp.org.
Cobun Zweifel-Keegan, CIPP/US, CIPM, is the managing director, Washington, D.C., for the IAPP.