As of 28 June this year, the oldest comprehensive U.S. state privacy law, the California Consumer Privacy Act, will be old enough to enter the second grade. Yet even as fresh-faced newcomers, U.S. state privacy laws can struggle to account for changes in the rapidly developing technological landscape.

To address shifting needs and new issues, lawmakers have proposed a flurry of amendments to the 19 comprehensive U.S. state privacy laws. While some amendments mirror language in other states' laws, others respond to developments in public priorities or to high-profile enforcement actions.

Some commentators have said the law trails behind technology, but these amendments look forward, aiming to keep legislation flexible enough to adapt alongside it. Even bills that fail signal legislators' values, priorities and the focus of their attention. Taken collectively, proposed amendments offer a window into the key trends shaping the future of U.S. state privacy legislation.

The opt-out era

Several proposed amendments aim to make exercising consumer privacy rights easier. One way they do this is by requiring that consumers be able to use an automated tool to opt out of the sale or sharing of their personal information, or to exercise similar privacy rights depending on the consumer's jurisdiction. This idea goes by many names, including universal opt-out mechanisms, opt-out preference signals, consumer choice mechanisms and privacy preference tools. There are technical differences between these terms, but the general idea is the same: a UOOM allows the consumer to automatically indicate their privacy preferences instead of clicking through cookie banners on a site-by-site basis.

The most widely known UOOM is the Global Privacy Control, a protocol available as a browser extension or as an option built into some browsers. To simplify, if a consumer has GPC enabled, their web browser automatically sends a signal with each website request telling the site the consumer wants to exercise the privacy rights their jurisdiction grants. A site that recognizes the GPC signal can then voluntarily comply with the user's preferences. The IAPP has a resource page with more information about GPC.
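Mechanically, the GPC proposal works by having participating browsers attach a `Sec-GPC: 1` header to each outgoing request. The sketch below, a minimal illustration rather than any framework's real API, shows how a site's server-side code might detect that header; the helper name and the application flag are hypothetical.

```python
def gpc_opt_out_requested(headers: dict[str, str]) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC proposal, participating browsers attach a `Sec-GPC: 1`
    header to each request; any other value, or the header's absence,
    means no opt-out signal was sent.
    """
    # HTTP header names are case-insensitive, so normalize keys first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Hypothetical usage: route the decision into an application setting that
# suppresses sale/sharing of this visitor's data where the law requires it.
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}
suppress_sale_and_sharing = gpc_opt_out_requested(request_headers)
```

The GPC proposal also exposes the signal to client-side scripts via a `navigator.globalPrivacyControl` property, so the same check can be made in the browser.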

Most of the laws on the books implement UOOMs by allowing consumers to designate an "authorized agent," which can be a person or a "technology, including a link to an internet website, an internet browser setting or extension, or a global setting on an electronic device, that allows the consumer to indicate the consumer's intent to opt out of (personal information) processing." Slight variations on this wording appear in all of the state privacy laws and rules except for Indiana, Iowa, Kentucky, New Jersey, Tennessee, Utah and Virginia. Rhode Island mentions authorizing an agent but does not say whether the agent can be a technology. In states with this provision, businesses must comply with UOOM requests.

Several amendments would incorporate or increase UOOM applicability in their respective laws. Tennessee's legislative session has ended, but Senate Bill 663 would have added the "authorized agent" provision into its law, mimicking the language seen across the other states' laws. California's Assembly Bill 566 would require all businesses that "develop or maintain a browser" to incorporate a UOOM setting, including mobile browsers. Texas House Bill 5495, which failed, would have added a similar provision requiring controllers that operate browsers to "automatically recognize and comply with a consumer's global privacy control choices."

Additionally, although Texas state privacy law already includes the "authorized agent" provision, HB 5495 would have added a section that defines "global privacy control" and requires controllers to "treat the consumer's use of a global privacy control as a valid request submitted by the consumer not to sell, share, or disclose the consumer's personal data." Violators of that provision would be subject to civil penalties of up to USD5,000 per violation, with repeat offenders fined an amount that is "sufficient to deter future violations of that section, as determined by the court."

These amendments show lawmakers hear and understand their constituents' complaints that managing consent for privacy choices is difficult and tedious. Around the world, users overexposed to repeated data processing consent requests have developed consent fatigue, also called cookie fatigue. Some banners pop up in the middle of the screen and obscure the content the user was trying to view, which can cause frustration and lead the user to click whichever option makes the pop-up go away fastest, or sometimes to exit the page entirely. Other consumers don't read, or don't trust, promises not to process their information, so they may not indicate their actual preferences, which can skew marketing data or analytics.

To combat cookie fatigue, the European Commission has coordinated a voluntary initiative to simplify the cookie consent process by, for example, allowing users to indicate their cookie preferences with a UOOM. These amendments take that concept a step further by imposing legal obligations on businesses to comply with the consumer's signals. As states refine how adults assert control over their data, they are also turning their attention to the privacy of younger users.

Guardrails for growing up

Many of the U.S. state privacy amendments proposed this year would increase privacy protections for children. This reflects a broader trend in the privacy field; for example, this year the FTC published final amendments to the Children's Online Privacy Protection Rule, which went into effect 23 June 2025. At IAPP's Global Privacy Summit, FTC Commissioner Melissa Holyoak explained in her keynote speech that the COPPA amendments seek to give parents "meaningful choices to protect their children, not mere fig leaves to insulate the operator from liability for design choices that expose kids to sexual predators, obscene content, violence or other such harms."

State legislators are addressing concerns about children's privacy on social media platforms via amendments like Virginia's SB 854, now passed into law, which requires social media platforms to age-screen users and limit any minor's use of the platform to one hour per day by default, with parental consent needed to increase or decrease that limit. Connecticut's SB 1295, which also passed, requires social media platform operators to create an online safety center, a cyberbullying policy and a plan for mitigating or eliminating any heightened risk of harm to minors, which it defines expansively. It also requires platforms to refrain from using features designed to keep a minor scrolling, and to provide a setting that, by default, bars adults from sending unsolicited communications to minors.

Other bills come at this from a different angle: they regulate collecting and processing children's data. Oregon's HB 2008, which passed, entirely prohibits controllers from selling a minor's data or processing their data for purposes of targeted advertising or to make legally significant decisions. It also raises the upper age limit from 15 to 16, similar to Iowa's failed Senate File 143, which would have changed the definition of "child" from those under 13 to those under 18. In Colorado, controllers must obtain a parent or guardian's consent before processing a child's data, and a rider in SB 25-276 amends the Colorado Privacy Act to require consent before selling a child's data as well.

Laws that regulate how controllers process a minor's data set a knowledge standard for when the controller can be liable, and that standard varies by jurisdiction. For example, the Montana Consumer Data Privacy Act has specific provisions that apply when a controller "has actual knowledge" that a consumer is a minor. Montana's SB 297 would expand that standard to when a controller "has actual knowledge or willfully disregards" that a consumer is a minor. Connecticut's newly passed SB 1295 expands the standard even further: currently, the Connecticut Data Privacy Act covers situations in which a controller "has actual knowledge, or willfully disregards," that a consumer is a minor, but SB 1295 changes it to protect "consumers whom such controller has actual knowledge, or knowledge fairly implied based on objective circumstances, are minors."

Texas, which has positioned itself as a strong enforcer of its privacy laws, proposed a bill broadly limiting AI development and deployment that includes provisions to protect children. HB 149 prohibits developing or distributing an AI solely intended to produce sexually explicit deepfakes or child sexual abuse material. It also forbids intentionally developing or distributing conversational AI systems that can simulate or describe sexual conduct while impersonating or imitating a child.

These are far from the only approaches legislators are taking to strengthen children's privacy. For example, some states have passed standalone age-appropriate design codes, social media addiction laws or laws strictly limiting the collection and processing of children's data. Kids aren't the only ones getting new layers of protection, though; some amendments zero in on what kinds of data deserve special treatment.

Redrawing the location privacy map

States are also expanding protection for sensitive data, with some specifically focusing on geolocation data. For example, Colorado's SB 25-276 expands the definition of "precise geolocation data" and adds precise geolocation data as a subcategory of sensitive data. This imposes additional obligations on collecting and processing geolocation data, including requiring consent before collection.

Texas's HB 4636 adds "real-time driving data" to the definition of sensitive data. It also requires controllers that sell precise geolocation data to include "NOTICE: We may sell your precise geolocation data" in their privacy notices. This bill follows in the wake of enforcement actions by Texas Attorney General Ken Paxton against vehicle manufacturers and insurance providers that allege widespread data collection about drivers.

Similarly, Oregon's HB 3875, now passed, changes the law's applicability to specifically include "motor vehicle manufacturer(s) and any affiliate of a motor vehicle manufacturer that controls or processes any personal data obtained from a consumer's use of a motor vehicle or any component of a motor vehicle," regardless of their size. Moreover, HB 2008 from the same state would completely bar selling precise geolocation data. Oregon doesn't stop there, though: HB 3899 would forbid processing any category of sensitive data for the purpose of targeted advertising or for profiling for a legally significant decision "whether the controller has the consumer's consent or not." It also proscribes selling sensitive data for any reason, with or without consent.

Minnesota's failed SF 2940 would have taken a narrower approach, allowing sensitive data to be processed and shared with consent but requiring a controller to acquire "separate and distinct" consent to process a consumer's health data. A controller would also have needed separate and distinct valid authorization to sell any category of personal data.

Connecticut's SB 1356, now passed into law, amends "sensitive data" to include similar categories as those found in California and Colorado: neural data, data revealing a disability or treatment, status as nonbinary or transgender, financial information data and government-issued ID numbers. Genetic and biometric data were already included in sensitive data, but the amendment adds information derived from genetic and biometric data.

It also significantly changes the Connecticut Data Privacy Act's applicability thresholds. Instead of the law applying to entities that process the data of more than 100,000 consumers per year, SB 1356 lowers the threshold to 35,000 per year. Connecticut's law also applies to entities that process the data of more than 25,000 consumers per year and derive more than 25% of their gross revenue from selling personal data. SB 1356 reduces that to 10,000 consumers per year and 20% of gross revenue. The bill also adds blanket inclusion for entities that control or process sensitive data or offer consumers' personal data for sale in trade and commerce.

SB 1356 is just one of several amendments with the goal of tweaking what kinds of data the law protects, limiting access to that data or reducing what a controller may do with it. This current wave of bills signals a deliberate shift toward more granular and assertive data governance frameworks. Many proposals will fail, but their cumulative direction is clear: states are filling perceived regulatory voids with increasingly sophisticated privacy legislation.

These laws are still relatively young, but some of them are clearly entering a new phase: refinement and maintenance. These amendments help the law grow to fit emerging technologies and priorities. Taking cues from the world around them, U.S. state privacy laws are maturing into more permanent but still evolving fixtures of the legal landscape.

C. Kibby is a Westin Research Fellow for the IAPP.