OPINION

Top 12 data protection action items for global businesses

As global businesses navigate numerous privacy and data protection laws worldwide, GDPR and CCPA compliance can provide a common baseline for meeting requirements in other jurisdictions.


Contributors:

Lothar Determann

Partner

Baker McKenzie

Graham Doyle

Deputy Commissioner; Head of Corporate Affairs, Media and Communications

Irish Data Protection Commission

Jennifer Urban

Board Chairperson

California Privacy Protection Agency

Michael Will

President

Bavarian State Office for Data Protection Supervision

Global businesses are subject to compliance obligations under privacy and data protection laws around the world. The EU General Data Protection Regulation and the California Consumer Privacy Act are particularly comprehensive, modern and strict data protection laws that apply to most global businesses. Companies that manage to comply with these two can leverage their efforts to meet requirements under other data protection laws around the world. Based on these laws, the authors prepared a dozen key action items for a panel at the IAPP Global Summit 2026.

Secure data

Without security, there can be no privacy. Companies must implement, document and audit technical, administrative and organizational data security measures — often referred to as technical and organizational measures or TOMs — under most privacy laws around the world. These requirements are reflected in Articles 24 and 32 of the GDPR and the "American Bar Association's Cybersecurity Handbook." 

The California Privacy Protection Agency issued regulations, effective 1 Jan. 2026, with detailed cybersecurity audit requirements. Companies around the world can use these as a benchmark in their efforts to keep data secure and for compliance with the CCPA, which applies to many companies both within and outside California. Under a phased implementation based on businesses' annual gross revenue, executive management team members must submit annual certifications with prescribed attestations concerning the completion of cybersecurity audits to CalPrivacy. California Code of Regulations, Title 11, §§ 7120 et seq. and 7124, define applicability thresholds and deadlines.

Allocating the required resources now to develop accurate audit reports is essential. This provides businesses with insight into the cybersecurity risks they are facing, and prepares them for the likelihood that CalPrivacy, other authorities and private litigants will require submission of audit reports in case of data breaches. For companies to obtain acceptable cybersecurity audit reports by 2028, preparing now is key.

Contract for compliance 

Companies execute statutorily required contract terms with corporate affiliates, external service providers and other business partners to ensure adequate data protection levels for international data transfers, to legitimize disclosures that might otherwise be subject to consent or opt-out rights under the GDPR and CCPA, or because laws or regulations expressly require certain contract terms.

Both parties benefit from the execution of data processing agreements mandated by law. Each party may have divergent interests in particular clauses, but if the parties draft contracts to reflect what the law requires — and not more or less — they should be able to satisfy applicable requirements relatively quickly. With contracts' wording closely aligned with statutory requirements, companies can also prove compliance relatively quickly and easily to authorities. In practice, businesses find it much more complex, costly and time-consuming to negotiate heavily customized contracts with added commercial clauses, such as limitations of liabilities, indemnification obligations or contractual penalties.

Businesses can achieve compliance more efficiently if they negotiate and document commercial terms separately from agreements mandated by applicable laws. Authorities regularly review data processing agreements in the context of investigations. With separate agreements optimized to achieve and document adherence to data privacy laws, companies can prove compliance more effectively to authorities as well as data protection officers and privacy professionals during the contract review process.

Answer data subject requests

With respect to their personal data, individuals have rights to know, delete, limit, correct and opt out of direct marketing messages and sale or sharing of their information. 

If companies do not respond to requests within statutory deadlines, individuals complain to authorities, which can then investigate and impose fines. When responding to access requests, companies can withhold information based on protections of the rights of others and do not have to provide copies of specific documents, as reflected, for example, in case studies published by Ireland's Data Protection Commission. Companies must authenticate individuals before providing copies of personal information in response to a data access request, but must not introduce unnecessary friction into opt-out processes.

Do not sell, share

Under the CCPA, companies must not sell or share consumers' personal information for cross-context behavioral advertising if consumers opt out, including by using an opt-out preference signal such as the Global Privacy Control. They must also provide easy opt-out mechanisms, obtain opt-in consent from minors ages 13 to 15 and parental consent for children under 13, and comply with numerous related obligations. CalPrivacy and the California attorney general have focused CCPA enforcement on these obligations. Additionally, CalPrivacy has taken action against data brokers that sold personal information without first registering as a data broker.
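On the technical side, the Global Privacy Control signal is simple: a participating browser or extension sends the HTTP request header "Sec-GPC: 1". The sketch below shows one way a server might detect the signal; the function name and the plain-dict representation of headers are illustrative assumptions, not part of any framework.

```python
def gpc_opt_out(headers: dict[str, str]) -> bool:
    """Return True when a request carries the Global Privacy Control signal.

    Per the GPC proposal, a user agent expressing the signal sends the
    header "Sec-GPC: 1"; absence, or any other value, means no signal.
    This is a sketch: honoring the signal (suppressing sale/sharing,
    recording the opt-out) is the part the law actually cares about.
    """
    # Header names are case-insensitive on the wire; normalize before lookup.
    normalized = {name.lower(): value for name, value in headers.items()}
    return normalized.get("sec-gpc", "").strip() == "1"


# Example: a request from a GPC-enabled browser vs. one without.
print(gpc_opt_out({"Sec-GPC": "1"}))   # signal present
print(gpc_opt_out({}))                 # no signal
```

In a real deployment, this check would sit in middleware so that every downstream handler, including third-party tag logic, sees a single authoritative opt-out flag.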

At the federal level, Congress has enacted a statute and the U.S. Department of Justice has issued a rule prohibiting the sale of certain personal information to countries of concern. Meanwhile, the Federal Trade Commission has pursued companies for selling health and children's information in the context of online advertising. Plaintiffs' firms have also filed a barrage of class-action lawsuits in U.S. courts concerning cookies, pixels and other advertising technologies. In Europe, data protection authorities have imposed significant fines for failure to comply with consent requirements pertaining to cookies and other forms of online advertising. As a result, companies find it difficult to adopt globally uniform compliance strategies, because requirements vary significantly from jurisdiction to jurisdiction concerning opt-in, opt-out and process details.

CalPrivacy has expressly stated in its regulations that EU-style cookie banners, by themselves, do not satisfy CCPA requirements: "A notification or tool regarding cookies, such as a cookie banner or cookie controls, is not by itself an acceptable method for submitting requests to opt-out of sale/sharing because cookies concern the collection of personal information and not the sale or sharing of personal information. An acceptable method for submitting requests to opt-out of sale/sharing must address the sale and sharing of personal information."

Numerous vendors offer privacy tech compliance automation, but these tools may or may not fully comply with the relevant legal requirements, and in some cases their customers have drawn fines. Given the extremely high enforcement and litigation exposure, businesses have to either invest in jurisdiction-specific compliance mechanisms or refrain from selling and sharing personal information for online advertising purposes.

Organize accountability

Under the GDPR, companies must ensure their data protection officer is properly involved, in a timely manner, in all issues relating to the protection of personal data, including data protection impact assessments for AI development and deployment (Articles 38(1), 35(2)). Under the CCPA regulations, members of the executive management team have to personally sign attestations concerning cybersecurity audits and risk assessments (California Code Regs., Title 11, §§ 7124(c), 7157(c)). Additionally, companies should consider assigning a human steward to each AI system to monitor performance and compliance.

Prepare for incidents

Companies are required, under numerous, evolving laws worldwide, to notify individuals, authorities, investors and corporate customers of personal information data breaches, cyberattacks, serious AI incidents and other security events. 

These laws require formal notifications in official languages, which may need to occur within strict timeframes: for example, 30 days in California, 72 hours under the GDPR, six hours under India's CERT‑In Directions, and one hour under China's Cybersecurity Incident Reporting regulations. 
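Because the clocks differ so sharply, incident-response playbooks often compute all applicable deadlines at once from the moment of detection. The sketch below does exactly that, using only the windows cited above; the regime labels and the assumption that every clock starts at detection are simplifications — the actual trigger events, scope and recipients differ by law and incident type.

```python
from datetime import datetime, timedelta

# Illustrative windows taken from the deadlines cited above. Treat this
# as a planning sketch, not legal advice: real triggers vary by regime.
NOTIFICATION_WINDOWS = {
    "California (consumer notice)": timedelta(days=30),
    "GDPR Art. 33 (supervisory authority)": timedelta(hours=72),
    "India (CERT-In Directions)": timedelta(hours=6),
    "China (cybersecurity incident reporting)": timedelta(hours=1),
}


def notification_deadlines(detected_at: datetime) -> dict[str, datetime]:
    """Map each regime to the latest permissible notification time,
    counting from when the incident was detected."""
    return {regime: detected_at + window
            for regime, window in NOTIFICATION_WINDOWS.items()}


# Example: an incident detected at midnight on 1 Jan. 2026.
for regime, deadline in notification_deadlines(datetime(2026, 1, 1)).items():
    print(f"{regime}: notify by {deadline}")
```

A table like this makes the operational point vivid: by the time the 30-day California clock has barely started, the one-hour and six-hour windows have long since closed, which is why detection and escalation must be automated.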

California enacted the world's first breach notification law in 2002, and all U.S. states and many countries followed suit. Additionally, companies have reporting obligations under AI regulations (Article 73 EU AI Act), infrastructure security laws (Article 23 NIS2), and numerous sector-specific laws.

To comply with these laws, businesses must develop, maintain and regularly test processes to detect, mitigate and report incidents, which requires employee training and exercises.

Minimize data collection

The GDPR requires that personal data be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed. Similarly, the CCPA requires a business's collection, use, retention and sharing of personal information to be reasonably necessary and proportionate to achieve the purposes for which it was collected or processed, or for another disclosed purpose that is compatible with the context of collection.

Under both laws, data subjects must be informed about such purposes, collection, retention and deletion practices. Companies that collect or retain too much personal data can be sanctioned not only under privacy laws but also under consumer protection and unfair competition laws.

Delete data

Once data is no longer needed or a data subject requests deletion, companies must delete the data. 
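This rule has exactly two triggers, which makes it easy to encode in a retention engine. The sketch below is a minimal illustration of that logic; the function name and parameters are hypothetical, and it deliberately ignores legal holds and statutory retention exceptions, which any real deletion program must model.

```python
from datetime import date, timedelta


def due_for_deletion(last_needed: date, retention_days: int,
                     deletion_requested: bool, today: date) -> bool:
    """Delete once the retention period has lapsed or the data subject
    has asked. Legal holds and statutory exceptions are out of scope
    for this sketch and would override the result below."""
    if deletion_requested:
        return True
    return today > last_needed + timedelta(days=retention_days)


# Example: a record last needed on 1 Jan. 2025 with a 30-day retention
# period is due for deletion by March, or immediately upon request.
print(due_for_deletion(date(2025, 1, 1), 30, False, date(2025, 3, 1)))
print(due_for_deletion(date(2025, 1, 1), 30, True, date(2025, 1, 2)))
```

The design point is that "no longer needed" must be an explicit, recorded date per record or category, not an ad hoc judgment, or the rule cannot be enforced at scale.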

Inform data subjects

Businesses must inform individuals about their data processing practices. In California, online service operators have been required to post website privacy policies since the California Online Privacy Protection Act took effect in 2004.

Privacy policies should be easy to understand, but also contain specific disclosures. Under the GDPR companies must communicate "in a concise, transparent, intelligible and easily accessible form, using clear and plain language" and disclose "where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available."

Similarly, the CCPA requires businesses to "use plain, straightforward language and avoid technical or legal jargon" and include several lists of personal information categories using statutorily prescribed terminology. 

Global businesses likely can address applicable requirements by creating one easy-to-read general privacy policy, supplemented by jurisdiction-specific privacy notices to satisfy disclosure requirements that are mandatory in some jurisdictions but might confuse individuals elsewhere.

Automate lawfully

Under Article 22 of the GDPR and a French predecessor law from 1978, Europeans have the right not to be subject to a decision based solely on automated data processing, including profiling, which produces legal effects or similarly significantly affects them. 

CalPrivacy has adopted restrictions and requirements that businesses using automated decision-making technology to make a significant decision concerning a consumer have to implement by 1 Jan. 2027. 

As companies deploy automated decision-making systems in recruiting, employment, enrollment, financial services, healthcare and other sensitive areas, they have to obtain meaningful information and commitments from vendors to provide meaningful information to individuals, document risk assessments, arrange for opt-out and human-review processes, and ensure safety and security. 

To safeguard against unlawful decisions, companies must examine whether data used to train machines could perpetuate human biases and, where required or appropriate, retain independent auditors — an obligation specifically imposed under New York City Local Law 144 for automated employment decision tools and increasingly expected under other jurisdictions' anti-discrimination laws.

Assess impacts

Businesses are required to document data protection impact assessments and risk assessments, which regulators, and in certain situations individual plaintiffs and others, can access. Companies should conduct assessments in three phases. First, they should assess the legality and risks associated with a project holistically, with all facts available, subject to confidentiality and attorney-client privilege. Next, they need to decide whether and how to proceed with the project. Finally, they need to document an assessment that explains why the conduct assessed is legal and how the company mitigated or eliminated certain risks. Companies that conflate these three steps risk proceeding with projects that should have been stopped or adjusted after an initial review.

Monitor change

Businesses, technologies and laws change rapidly. After issuing a series of digital regulations, from the GDPR in 2016 to the AI Act in 2024, the European Commission published the Digital Omnibus Regulation Proposal aimed at simplifying the EU digital regulatory framework in November 2025. The proposal would delay the applicability of restrictions on high-risk AI systems under the EU AI Act, narrow the definition of personal data and relax restrictions on automated decision-making under Article 22 of the GDPR.

In the U.S., the Trump administration has sought to prevent or preempt state regulation of AI even as state and federal legislators continue to enact laws regulating the development and deployment of AI systems and protecting consumer privacy. Against this backdrop, companies have to implement processes to adjust their compliance programs as laws and business practices change. Most organizations will also need to address requirements beyond the 12 priority action items recommended in this article, including sector-specific requirements. Nevertheless, considering and acting on these practices will go a long way to achieving compliance and mitigating risks. Act today.

 

This article reflects personal opinions of the authors and must not be attributed to their agencies, boards, law schools, law firm, clients or others.


