This past year was another packed one for privacy teams, and it was not easy to stay on top of all the privacy litigation, enforcement trends, and new laws and regulations in the U.S.
Distilling those into focus areas and actionable steps can be an important way to make sure your privacy program has the right strategy and priorities for 2025.
Here are 10 areas to focus on for the year ahead.
1. Tracking and targeted advertising
Regulators have continued to issue guidance and bring enforcement actions against companies for how they describe tracking practices and consumer choices, including for advertising and analytics.
This year, eight additional states will join California and Colorado in requiring browser signals to be honored as opt-outs of "sales" and targeted advertising. Companies will continue to face legal demands and lawsuits for common website and app practices under wiretap laws like the California Invasion of Privacy Act, and potentially under other laws like the California Consumer Privacy Act.
These developments and trends mean tracking and targeted advertising practices should be a focus area this year.
Consider these steps:
- Validate understanding of all ways data is shared for targeted advertising — including by pixel, tracking technology, list upload, and server-to-server integration.
- Identify what data is passed, testing old assumptions that data is anonymous/deidentified.
- Align what is said about practices and choices in privacy notices, cookie banners and preference management pages.
- Get consent before data is shared — for example, when the data is sensitive or CIPA exposure is a concern — and have effective opt-out processes for sales and targeted advertising.
- Confirm the tracking technology assessment process: first, identifies and limits the data passed; second, confirms processing purposes; and third, classifies technologies used for "sales" and targeted advertising so opt-out processes work.
- Effectively govern use of tracking technologies on your company's website and apps.
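The browser-signal requirement above is typically operationalized through the Global Privacy Control specification, under which the browser attaches a `Sec-GPC: 1` header to outgoing requests. A minimal sketch of honoring that signal server-side, assuming a simplified request-header shape (the helper and partner names here are hypothetical, not from any particular law or framework):

```typescript
// Minimal sketch (not legal advice): detecting the Global Privacy
// Control (GPC) opt-out signal. The header name and value
// ("Sec-GPC: 1") come from the GPC specification; everything else
// is a simplified, hypothetical stand-in.

type RequestHeaders = Record<string, string | undefined>;

// Per the GPC spec, a request carrying `Sec-GPC: 1` signals the
// user's opt-out of sale/sharing of their personal data.
function hasGpcOptOut(headers: RequestHeaders): boolean {
  return headers["sec-gpc"] === "1";
}

// Hypothetical helper: suppress "sale"/targeted-advertising data
// flows to ad partners when the opt-out signal is present.
function partnersToNotify(
  headers: RequestHeaders,
  adPartners: string[]
): string[] {
  return hasGpcOptOut(headers) ? [] : adPartners;
}
```

Client-side scripts can check the analogous `navigator.globalPrivacyControl` property before firing advertising pixels, so the same opt-out is respected in both places.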
2. Sensitive data collection, consent and use
Last year, state laws expanded definitions of sensitive personal data and additional laws took effect requiring consent to process sensitive personal data. State and federal regulators have focused on required sensitive data consents and uses, including with enforcement actions for location and other inferentially sensitive data sharing by websites and apps.
This year, under its Consumer Data Privacy Act, Minnesota will require small businesses — that may not be in scope for other state privacy laws — to obtain opt-in consent before sensitive data can be sold. The Maryland Online Data Privacy Act will also prohibit sensitive data collection or processing unless it is "strictly necessary to provide or maintain a specific product or service requested" by a consumer, and selling sensitive data will be prohibited.
Take these steps:
- Look carefully at your company's privacy program, including policies and assessment processes, for the expanded types of sensitive data.
- Validate opt-in consents are obtained where required.
- Confirm uses are "strictly necessary" in Maryland.
- If sensitive data is sold — for example, used for targeted advertising — get required consents, post required notices — Texas has a specific notice — and stop where prohibited, as in Maryland.
3. Data protection assessments
Ten states now have requirements in effect to document data protection assessments before certain personal data processing activities start.
The California Privacy Protection Agency has proposed significant additional requirements for these assessments that may be finalized this year. These include requiring assessments in two scenarios not covered by current state laws: when automated decision-making technology is used for "extensive profiling," like observing people in public places or when using information technology systems; and when processing personal information for certain artificial intelligence or automated decision-making training purposes.
The proposed California regulations and Minnesota's law, which takes effect at the end of July, may require changed assessment contents, with Minnesota requiring details on how the processing aligns with policies on several privacy-related topics. California may require proactive submission of assessment details to the CPPA, and executive certifications to the CPPA about privacy assessment processes and compliance.
With more state regulators empowered to request these assessments, it's important to make sure a data protection assessment process keyed to these specific U.S. state law requirements is effectively operating.
Three actions to consider:
- Confirm business processes trigger assessments when required under the laws.
- Validate assessments address all required elements, including the new Minnesota requirements.
- Maintain regulator-ready assessment copies, making process changes as needed to protect attorney-client privilege.
4. AI and automated decision-making
If your company uses AI or automated decision-making technologies, 2025 may require more focus on governance, assessments and individual rights. For example, new or updated practices may be needed by early 2026 when Colorado's AI Act takes effect, and California may finalize automated decision-making technology regulations.
Colorado's law will require entities using AI in certain use cases to have an AI risk management program, conduct AI risk assessments annually on in-scope AI uses, provide specific disclosures to consumers before use, address new individual rights, provide detailed information online about the in-scope AI uses and risk management practices, and report incidents of discrimination to the attorney general.
California's proposed regulations would also impose new obligations for entities using ADMT for certain decisions, profiling and public monitoring, or certain ADMT training purposes. New assessment processes, consumer notices, and individual opt-out and access rights would also be required.
Identify practices that are in-scope for these laws, and then take these steps:
- Confirm consumers know when they're interacting with AI.
- Assess where existing AI governance practices may need to be enhanced, including to develop and maintain information needed in assessments and disclosures.
- Validate existing processes trigger assessments and reassessments when required.
- Update assessment processes where needed to address required elements.
- Design processes to address new individual rights.
- Plan for required consumer and website disclosures.
- Confirm an AI incident response plan is in place and addresses required regulator notifications.
- Align on cross-functional roles and responsibilities for each of the above.
5. Biometric data processing
If your company deals with biometric data of consumers or employees, work may be needed to verify and enhance compliance practices. Regulators continue to be focused on biometric privacy, as we saw with Texas' first enforcement action, and USD1.4 billion settlement, under its 15-year-old biometric privacy law. In July, Colorado's new biometric privacy law will take effect.
Colorado's law shares some requirements with Illinois' Biometric Information Privacy Act, but it also contains net-new requirements that other state biometric privacy laws do not.
For example, the Colorado law and regulations have specific content requirements for notices that must be provided before biometric identifiers are processed, require annual reviews to confirm that biometric identifiers can still be retained, have tighter timelines for deleting biometric identifiers and require data security breach response protocols.
It also prohibits requiring individuals to consent to processing of biometric identifiers unless it's necessary to provide a good or service, and there are restrictions on providing different prices or levels of service for people who exercise individual rights for biometric identifiers.
In the employment context, there are additional requirements for when and how consent needs to be obtained.
Finally, there is a unique new individual right for people to request information about how their biometric data was obtained, why it is collected or processed, the third parties it is disclosed to, and what biometric data is disclosed.
To tackle this area, consider:
- Confirming where consumer and employee biometric data is processed in your company, and what vendors or third parties it is shared with.
- Validating that prohibited disclosures of biometric identifiers stop by July.
- Addressing biometric notice and consent requirements, including in employment contexts.
- Checking that biometric collection practices comply with applicable restrictions.
- Determining that retention periods and deletion processes for biometric data are defined and followed.
- Confirming incident response policies address incidents for biometric data.
- Updating or drafting publicly available policies to address new requirements.
- Establishing annual review processes for retained biometric identifiers where required.
6. Minor data collection and use
This past year a court ruling allowed parts of the California Age-Appropriate Design Code to take effect, and the Maryland Age-Appropriate Design Code and Connecticut privacy law amendments for minor personal data also took effect in October. These and laws that come online this year will create new privacy compliance obligations for entities that deal with data of minors — including entities that don't knowingly or intentionally process minor personal data. For example, Maryland's law applies if it's "reasonable to expect" minors will access an online product based on criteria in the law.
In June, the New York Child Data Protection Act takes effect, and rulemaking is underway. The act will restrict when minors' personal data can be processed without "informed consent" that meets specific requirements under the law. Device signals declining consent need to be honored, and there are required contractual provisions for any third parties — vendors, processors, etc. — that minors' personal data is disclosed to. There will be new obligations to destroy personal data once entities learn a user is a minor, and to notify all third parties to which the specific user's data was disclosed.
In October, the Maryland Online Data Privacy Act and amendments to Colorado's Privacy Act pertaining to processing of the personal data of minors, under age 18, take effect. Maryland will prohibit sales of minors' personal data or uses for targeted advertising.
In Colorado, consent will be required for certain uses of minors' personal data — targeted advertising, sales, and profiling — for collecting minors' location data, which will also face new restrictions, and for designs that drive minors' engagement with online services or products. Colorado also has additional requirements for assessments when minors' personal data is processed.
Validate which minor data privacy requirements apply to your company's operations and plan to address new obligations, including for:
- Privacy default settings.
- Consent practices.
- Governance, management and limited processing of minors' personal data.
- Documented assessments.
- Contracting processes.
Test assumptions that your company is out of scope by gathering product, marketing, and business stakeholders' data and estimates on customer ages.
7. Data products and services, and data collection methods
There have been a lot of changes in laws and enforcement trends for "data brokers," and 2025 is a good time for a fresh look at whether any of these laws apply to your company. The laws vary significantly and can apply to various business practices that use and disclose personal data your company doesn't obtain directly from the data subjects. If your company has products, services or partnerships that disclose personal data it did not collect directly from the data subjects, some of these laws may apply.
For example, under new regulations approved by California's privacy agency, a company that "sells" personal data — that is, discloses it to a vendor for targeted advertising — that it didn't obtain directly from the data subject can be a data broker, even if it has a direct business relationship with the data subject.
If in scope, work may be needed this year to plan for the California Delete Act requirements and integration with the state-managed data broker deletion request system that will come online in 2026.
The new federal privacy law that was enacted in 2024 — the Protecting Americans' Data From Foreign Adversaries Act — also applies to a variety of business models and requires robust compliance processes to address. Both the U.S. Federal Trade Commission and state regulators — including in California and Texas — have been active in enforcing new and existing laws against data brokers.
Assess if your company is in scope by:
- Identifying personal data obtained from sources other than the data subject.
- Confirming whether any of that data is shared with or disclosed to third parties — for targeted advertising or other "sales," for example — or business customers.
- Determining if exclusions under these laws apply, such as based on the personal data types or sources.
If in scope, focus on these areas:
- Do required registrations for state laws.
- Address third party/business customer diligence and oversight requirements for the federal law and end data sharing that is unlawful. It may be illegal to provide data to certain third parties/business customers, including some U.S. companies.
- Start planning for California Delete Act requirements.
8. Consumer-facing UI and flows
Many state privacy laws prohibit "dark patterns" in customer-facing journeys and interfaces where personal data is collected or privacy choices are presented, and last year several regulators focused on this in investigations and guidance. These requirements and regulator expectations can call into question longstanding approaches to user interfaces and experiences.
If you haven't done a recent review of your company's customer-facing UIs, a little time focusing on this space may reduce the risk that your company will receive regulator attention.
Consider:
- A proactive review of primary customer journeys, and UIs, including for privacy and preferences.
- Revisiting previously approved practices on customer choices, consents, and opt-outs for data uses.
- Training and socializing these new expectations with responsible teams.
- Updating privacy assessment processes to confirm UIs are reviewed.
9. Documented privacy program policies and procedures
Effective privacy programs often require documented policies and procedures in a number of domains, and regulators frequently request these in enforcement actions.
At the end of July, Minnesota's law will require data controllers to have policies and procedures on a number of topics, including that: identify the privacy officer or person responsible for compliance with the law; fulfill data subject rights requests; address security and maintenance of a data inventory; address data minimization requirements; prevent personal data retention when it is no longer needed; and identify and remediate violations of the law.
A review and revision of existing policies and procedures to address these topics can help prepare for Minnesota's law and validate that your company follows privacy law requirements end-to-end.
10. Calibrate data collection practices
In October, Maryland's privacy law will prohibit collection of personal data unless it is reasonably necessary and proportionate to provide or maintain a specific product or service requested by a consumer. This may not permit common practices of collecting personal data for legitimate business purposes where the data collection and use is generally described in a privacy notice.
To help address compliance, consider:
- Updating internal privacy policies that pertain to when and why personal data can be collected.
- Refining privacy assessment processes to confirm personal data collected is necessary for the consumer-requested product or service.
- Validating existing data collected is needed and proportionate for this purpose.
- Training business stakeholders on this new requirement.
Sam Castic, AIGP, CIPP/US, CIPM, FIP, PLS, is a partner at Hintze Law.