Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains. 

On 19 Nov. 2025, the European Commission is due to publish a Digital Package on simplification, intended to simplify, clarify and improve the EU's digital laws. A leaked version of the Commission's proposals, published 6 Nov., shows there will be two legislative instruments: one primarily focuses on the EU General Data Protection Regulation, the ePrivacy Directive and the Data Act; the other focuses on the EU AI Act. The text published on 19 Nov. is likely to differ from the leaked version. For a preview of what the Commission has in mind for the reform of the GDPR, ePrivacy and the Data Act via the Digital Omnibus, read on.

One breach portal to rule them all

There is a proposal for a single reporting point for all incidents, following the "report once, share many" principle. This would address requirements under the GDPR, the Network and Information Security Directive, the Digital Operational Resilience Act and, eventually, the Critical Entities Resilience Directive. The reporting obligation for communications service providers under the ePrivacy Directive would be repealed on the basis that it would be superfluous.

The threshold for reporting personal data breaches to data protection authorities would be raised: only incidents posing a high risk to data subjects would need to be reported. The period for reporting to authorities would be extended from 72 to 96 hours, and a single incident reporting form would be introduced.

Taking the sting out of DSARs used as ammunition in employment tribunal proceedings

Where data subjects use data subject access requests for purposes other than protecting their data, the controller would be able to either reject the DSAR or charge an appropriate fee for handling it.

Rethink the cookie consent rule

The Commission's introductory material notes that the current approach of detailed consent overlays is not easy for users to understand and is perceived as a nuisance. At the same time, it imposes considerable costs on businesses.

The Commission proposes that consent should not be needed when cookies and similar technologies are used for aggregated audience measurement or security purposes. Whenever tracking technology is used to process personal data, the GDPR would trump the ePrivacy rules, and the controller could look to any lawful basis under the GDPR, not just consent.

In the long term, the Commission advocates moving to universal settings-based mechanisms for users to indicate their preferences, whether consent or objection. Standards bodies would be instructed to develop benchmarks for machine-readable signals. If necessary, the Commission would require manufacturers of browsers, operating systems and app stores to allow users to set preferences via their technology. After a six-month grace period, all publishers would be obliged to respect such signals. Media service providers, however, would be exempted, in recognition that advertising revenue is indispensable for independent journalism.

Consistent approach to DPIAs across the EU

The European Data Protection Board would be required to develop lists setting out when data protection impact assessments are required. These lists would be valid across the EU and would replace national lists. The EDPB would also be commissioned to develop a common EU-wide template and methodology for conducting a DPIA.

A more targeted approach to special category data with more exemptions

Data would only qualify as special category data if it "directly revealed" information about an individual's racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, health, sex life or sexual orientation, marking a step back from the broader inference-based rule set out by the Court of Justice of the European Union in its Lindenapotheke ruling.

There would be two new derogations from the prohibition on processing special category data. The first would allow residual processing of special category data for the development and operation of an AI system or model, provided attempts are made to identify and remove the data and — where this would be disproportionate — methods are used to prevent disclosure of the special category data to others as output data.

The second derogation concerns the use of biometric data to verify identity when the data is under the sole control of the user, i.e., on-device biometrics.

Processing of personal data to train AI models is a legitimate interest

A new provision would confirm that processing personal data for model training is a legitimate interest. Nonetheless, it remains necessary to demonstrate that the processing is justified and to conduct a balancing test.

No need to tell data subjects what they already know

Where a controller collects data directly from a data subject, a privacy notice may not be required if there are reasonable grounds to believe the individual already knows the controller's identity, the purpose of the processing and how to contact any data protection officer. This exemption would not apply if the data were shared with third parties or transferred to third countries, used for automated individual decision-making, or otherwise used for high-risk processing.

Minor flexing of automated individual decision-making rules

Entirely automated individual decisions with legal or similarly significant effects may be taken in certain cases, such as when necessary for a contract. An amendment would make clear that the fact that the same decision could also be made manually does not prevent reliance on this provision.

A subjective approach to the definition of personal data 

The fact that another entity may be able to identify data subjects would not make the data personal in the hands of the current holder. The relevant question would be whether it is reasonably likely that this particular entity can identify the data subject using all means likely to be used. The fact that a recipient of disclosed data may later be able to identify data subjects does not, in itself, make the data personal for the current controller, a position supported by the CJEU's Scania ruling and confirmed in the Single Resolution Board case.

Small exclusions to the Data Act's 'cloud switching' and data sharing rules introduced

For custom-made data processing services, i.e., those where a higher level of adaptation is needed for the service to be usable, a specific, lighter regime would apply. Small and medium-sized enterprises, as well as small mid-cap data processing service providers, would also get an out. However, in each case this would only apply to contracts already in place before 12 Sept. 2025, making it of limited use.

In response to political pressure from many EU-based companies, data holders would not be obliged to disclose trade secrets if there were a high risk that such information would be unlawfully transferred to third countries with weaker protections than those provided in the EU. What this means in practice, and how such a risk should be substantiated (a general risk versus objective reasons), remains unclear.

The Data Governance Act and the Free Flow of Non-Personal Data Regulation would be repealed but with relevant provisions amended and saved in either the Data Act or the Open Data Directive. The Data Act would inherit provisions on data intermediaries and mandatory government access to data. Both would be substantially amended. The requirement for data intermediaries to be licensed and legally separate from other entities would be replaced by a voluntary licensing scheme and a requirement for functional — not legal — separation of the intermediary services. Mandatory government access to data would be restricted to public emergencies — not the broader and woollier exceptional need in the current Data Governance Act. 

Last, the Commission proposes the entire repeal of the Platform to Business Regulation. 

Ruth Boardman is co-head of the International Privacy and Data Protection Group at Bird & Bird.