This resource provides a comprehensive summary of changes to UK data protection law.
Last updated: July 2025
The U.K.’s efforts to reform data protection law have finally come to fruition, with the Data (Use and Access) Act receiving Royal Assent on 19 June 2025.
This completes a journey that began on 23 October 2024 with the first reading of the Data (Use and Access) Bill in the House of Lords. Although less extensive than the previous government’s proposed legislation, the Data Protection and Digital Information (No.2) Bill, the Act still makes a significant number of changes to U.K. data protection law. Some of these modifications will make data protection compliance slightly easier for organizations — liberalizing the requirements for automated individual decision-making is a good example. There will be a small number of new obligations, too — in particular, privacy notices will have to be amended to refer to a new data subject right to complain. The U.K. Information Commissioner’s Office will also be re-constituted and given strengthened powers — including in relation to enforcement of ePrivacy breaches.
This resource is a comprehensive summary of the Act’s changes to data protection law. The ICO is planning to update and amend its guidance over the coming months; it will be important to follow this to gauge the real practical impact of some of the provisions. Many organizations will have benchmarked their privacy programs against the EU General Data Protection Regulation; by comparison with the GDPR, some changes will make life easier and others more difficult.
The Act contains a myriad of provisions which go beyond data protection — although these often overlap with privacy thematically. For example, the Act confirms long-proposed provisions that relate to digital ID verification services and provisions that aim to give customers real-time access to data processed by businesses via smart data schemes — rather in the way that open banking already operates. The Act also makes changes to the Online Safety Act. In the interests of brevity, these adjustments are not addressed in this note.
Legend
The barometer below, used throughout this resource, indicates the degree of change in U.K. data protection law, from -10 (easier) to +10 (more difficult), in comparison with the U.K. GDPR.
More special category data
The Act includes a new mechanism to allow the secretary of state to introduce more classes of special category data, per Section 74, adding new Article 11A to the U.K. GDPR. This will be a power to be exercised by secondary legislation under the affirmative resolution procedure. As there is a prohibition on processing special category data, this provision could have wide-reaching effect if it is used. One proposed amendment to the predecessor DPDI Bill before it was dropped was the addition of all children’s data as special category data. Proposals such as this could have a dramatic impact with little legislative oversight.
Data transfers
The broad structure of the GDPR is retained — namely, that you look first to adequacy decisions, then to safeguards, and to derogations last. The obligation for exporters to consider the risks posed by the transfer, in addition to implementing a safeguard, is also retained. However, within this familiar framework, there are changes to terminology and a more risk-based approach.
In a shift of language, the Act replaces the somewhat condescending notion of an adequacy decision with the more diplomatically tactful "data protection test." This is also the term the Act uses in place of a transfer impact assessment or transfer risk assessment.
The secretary of state must consider if countries or international organizations pass the data protection test. The data protection test requires the secretary of state to consider if the “standard of protection provided for data subjects with regard to general processing of personal data in the country or by the organisation is not materially lower” than the equivalent standard in the U.K. The factors to be considered are more flexible than those in the GDPR, covering respect for the rule of law and human rights; the existence and powers of an enforcement authority — unlike the EU test, independence is not stipulated; redress; onward transfer rules; relevant international obligations; and the constitution, traditions and culture of the country. In addition, the desirability of transfers of data to and from the U.K. can be considered — although this does not remove the need to satisfy the data protection test.
Where a country has not been held to satisfy the data protection test, all the usual safeguards, such as standard contractual clauses, binding corporate rules, etc., remain available. Some gaps in the current regime have been addressed. At present, mechanisms that are most suitable to public authorities — legally binding instruments and administrative arrangements — can only be used with other public authorities, not for transfers to private sector organizations; the Act widens this for transfers to persons who are exercising functions of a public nature.
Data exporters using a safeguard must also consider if the data protection test is met. Again, this involves considering if the standard of data protection is “materially lower” than that provided for by the U.K. GDPR. The test that exporters must apply is less detailed than that to be applied by the secretary of state. In addition, the Act says that transfers can take place so long as the exporter, “acting reasonably and proportionately,” considers the test is met.
The exporter is expressly allowed to consider the nature and volume of the data transferred (Article 46(1A) and (6)-(7)). Lastly, a rule-making power is introduced to allow the secretary of state to approve new clauses which, of themselves, can ensure the data protection test is met (Article 47A). If such clauses were introduced, this would entirely remove the need for exporters to undertake transfer risk assessments. All these provisions should give organizations scope to streamline transfer risk processes for low-risk data transfers — although existing guidance from the ICO already makes this a possibility in the U.K., unlike in the EU.
All the current derogations are retained. The secretary of state is given a rule-making power to specify situations when transfers will, or will not, be considered necessary on substantial public interest grounds. In addition, the secretary of state is given powers to introduce do-not-transfer lists, when transfers to a particular third country may be restricted for important reasons of public interest.
ePrivacy
The Act introduces new exemptions from the cookie consent requirement. These include processing:
- Solely for the purpose of analytics, carried out with a view to improving the website or information society service, optimizing content display or reflecting user preferences about content display.
- Closely connected with delivering an information society service — such as where processing is strictly necessary to protect information for security purposes, to prevent or detect fraud or technical faults in connection with the requested service, or to facilitate automatic authentication or maintaining a record of the selections made on a service by the subscriber or user.
It remains to be seen how useful the analytics exemption will be. Clear information must still be given about the processing and there must still be an ability to opt out. So, it seems that overlays will likely still be required, albeit they could include an opt-out check box rather than a consent check box. Any organization that also uses advertising cookies will still have to obtain consent for those cookies. A mixture of opt-in and opt-out boxes risks being off-putting and confusing, so many organizations will likely not want to make any changes. The changes for security and fraud are more helpful — the problem is avoided here because these purposes are considered to be strictly necessary for the service. The secretary of state will have a rule-making power to exempt further processing from the cookie consent rule in the future.
The soft opt-in rule for email marketing is extended to charities, where an individual provides their contact details while expressing an interest in or offering to support one of the charity’s philanthropic purposes.
The Act provides that the cookie consent rules will now also apply to a person who “instigates” the storage of, or access to, stored data. This may allow the ICO to take enforcement action against a party that has not itself directly placed a cookie. The ICO’s power to impose penalties under the Privacy and Electronic Communications Regulations — both for cookie and electronic marketing-related breaches — is currently capped at 500,000 GBP. This is addressed: the enforcement regime under the U.K. GDPR and the Data Protection Act 2018 will soon apply to ePrivacy breaches, so most breaches will attract the higher maximum penalty cap of 17.5 million GBP or 4% of worldwide turnover.
Communications service providers are subject to a parallel personal data breach reporting regime. This will be aligned with the 72-hour deadline under the U.K. GDPR, although the requirement to notify all breaches remains. Lastly, there are obligations for the Commission to encourage representative bodies to produce codes of conduct relating to the regulations.
Data subject rights and automated decision-making
The Act contains relatively minimal changes to existing data subject rights.
The legislation now provides a more explicit deadline for responding to requests. It reflects ICO guidance since 2020 that, if the controller reasonably requests further information to identify the processing covered by the request, then the “clock stops” until this information is provided.
The Act also makes clear the controller’s obligation is to provide such data as it can provide after a reasonable and proportionate search. Prior case law had established that this would be the case based on the principle of proportionality under EU law. Brexit had cast doubt on whether this should still be considered — so this will put the point beyond doubt. This echoes existing ICO guidance on the topic, so it will not come as a surprise to controllers and should provide reassurance when dealing with aggressive requests.
Subject access requests
Further changes to subject access requests are made in respect of court procedure. Under the Act, if a court is required to determine whether a data subject is entitled to information under their right of access or portability, the court can require the controller to make this information available for the court’s inspection, although the court may not require it to be disclosed to the data subject or their representatives until after it makes a decision.
Data subject rights
Data subjects are given a new right to complain to controllers through the new Section 164A of the Data Protection Act 2018. This will require controllers to facilitate the making of complaints, to adopt measures such as an electronic complaint form, and to include information about this new right in privacy notices, BCRs, etc. If the secretary of state introduces regulations, controllers may be obliged to notify the ICO of the number of complaints they have received — so this could become a soft enforcement tool in future.
Automated decision-making
Solely automated decision-making is substantially liberalized and more clarity is given to the meaning of “solely” in this context; the Act defines it as where there is no meaningful human involvement in the decision and provides factors to consider when assessing this. Broadly, the same restrictions as in the GDPR are retained where the decision will rely on processing special category data.
However, significant solely automated decisions that do not rely on special category data are no longer prohibited, provided certain safeguards are put in place. These safeguards must include abilities for data subjects to make representations, contest the decision and require human intervention. This change reflects a similar approach to the U.K.’s position prior to the GDPR; it will be welcome, as the existing prohibition on automated decision-making is often problematic and has been particularly stifling as organizations increase the amount of automation in their processes and adopt AI.
Legitimate interests
The Act makes it easier for controllers to know if the purpose for which they are processing data will be accepted as legitimate. Article 6(9) includes examples of such purposes, namely direct marketing, ensuring the security of network and information systems, and intragroup transfers of personal data, all of which are already mentioned in Recitals 47-49.
In addition, the Act formally recognizes certain interests as legitimate, listing them in Annex 1. These include disclosures to public bodies who assert they need personal data to fulfill a public interest task, disclosures for national or public security or defence purposes, emergencies, prevention or detection of crime, and safeguarding vulnerable individuals. For these limited purposes, the requirement to carry out and document a balancing test against the rights of individuals is effectively removed. It is not usually difficult for controllers to determine that an interest is legitimate. It can be more difficult to determine if processing is necessary.
Purpose limitation
The Act rewrites the GDPR provisions on purpose limitation; further processing for public interest purposes is made easier, but further processing for purposes of scientific research is somewhat restricted.
Under the GDPR, processing for a new purpose may only take place if the data subject gives consent, a derogation under Article 23 applies, such as national security, or the processing is compatible. The Act reorganizes this so that new processing may only take place if it is compatible.
However, the Act says that processing will be compatible where the data subject gives consent or a derogation applies. Annex 2 introduces a list of derogations deemed compatible with the original purpose. These include disclosures to public authorities where the authority states it needs the data for a task in the public interest, which is also recognized by Article 23 of the U.K. GDPR, disclosures for public security purposes, emergency response, safeguarding vulnerable individuals, protecting vital interests, preventing and detecting crime, assessing tax, and complying with legal obligations.
Processing that is necessary to safeguard an objective listed in Article 23(1)(c) to (j) and is authorized by an enactment or rule of law is also stated to be compatible processing. Previously, U.K. data protection law only provided for derogations from purpose limitation on public interest grounds if they were specifically listed in Schedule 2 to the DPA 2018. The Act removes this requirement for them to be specifically listed in this way, so that there is scope for multiple other laws, whether pre- or postdating the Act, to operate as exceptions from purpose limitation.
The provisions make clear that purpose limitation is relevant when one controller wishes to use personal data for a new purpose. If controller B wishes to acquire personal data from controller A, controller A would have to consider how purpose limitation may affect disclosure of the data. However, controller B would be processing the data for its primary purpose, so purpose limitation would not be relevant (Article 8A(1)). There is also an odd provision stating that processing carried out to ensure that processing complies with the fair and lawful processing principle of the U.K. GDPR, or to demonstrate such compliance, will also be regarded as compatible.
If a controller originally relies on consent as its lawful basis, the Act writes into law the view held by the ICO that new consent will be required for further processing unless a derogation applies and the controller cannot reasonably be expected to obtain consent (Article 8A(4)). Where a controller wishes to carry out further processing for purposes of scientific or historical research and the original lawful basis was consent, the controller will need to re-consent individuals.
The Information Commission
The Commissioner’s role and obligations
The Act sets out the Commissioner’s primary objectives, which are to secure an appropriate level of protection for personal data and to promote public trust and confidence in processing of personal data (Section 120A DPA 2018). The Commissioner must have regard to other wider public interest factors, such as preventing and detecting crime and the desirability of promoting innovation and competition.
The Commissioner must also consider the fact that children may be less aware of the risks of personal data processing and of their rights. The Commissioner must adopt a strategy for carrying out their functions and must annually report on the extent to which the strategy has been achieved (Section 139) and on key performance metrics in respect of regulatory action taken that year (Section 161A).
The Commissioner was already required to produce stipulated Codes of Practice — such as the Children's Code. The Act has added enabling provisions that allow the secretary of state to add to the list of codes the Commissioner must produce. Readers may recall that codes on ed-tech were suggested during the progress of the DPDI Bill, so there is a chance these might return. The Commissioner is also required to carry out and publish reports on the impact of any proposed codes of conduct.
The Commissioner’s enforcement powers
Additional enforcement powers are granted to the Commissioner.
Information notice
The Commissioner can already serve an information notice requiring controllers or processors to provide requested information. This is expanded so that the Commissioner can require specified documents to be provided. When organizations are solely required to provide information, there are opportunities to shape the narrative and tone of responses — and to avoid providing documents that may contain prejudicial content by extracting a subset of information. This provision will also assist the Commissioner’s investigators when they know that a document exists but the controller or processor has not otherwise provided it.
Assessment notice provisions
The Commissioner currently has powers to issue an assessment notice. This encompasses a range of activities necessary for assessing compliance, from entry onto site to observation of processing operations. A new power is added, whereby the Commissioner can require the controller or processor to procure and pay for an approved person to prepare a report for the Commissioner on a specified topic. The controller or processor can suggest who should prepare the report — but the ultimate decision on appointment is with the Commissioner. This may assist the Commissioner in investigations of data breaches, as it will allow the Commissioner to require the affected organization to provide a forensic report, whereas currently the organization sets the parameters and timing of any such report.
Call an individual to be interviewed
When a controller or processor is suspected of wrongdoing, the Commissioner can now serve an interview notice requiring personnel employed by, or involved in managing, the controller or processor to attend an interview. The Commissioner cannot compel specific individuals to answer questions. There are exemptions where parliamentary or legal privilege applies and in respect of self-incrimination, other than under the Data Protection Act. It is an offense to knowingly or recklessly make a false statement. The Commissioner will have a power to impose a penalty notice for failure to comply with an interview notice. The Commissioner must produce guidance on the factors to be considered when deciding whether to issue an interview notice.
Research
Researchers often want to reuse data for further research not anticipated at the date of collection. In this case, Article 14(5)(b) provides that there is no need to provide a privacy notice to individuals if it would be impossible or involve disproportionate effort, particularly for processing for research purposes. However, this exemption only applies where personal data has not been collected directly from individuals. There was no equivalent exemption for directly collected data.
This can be problematic where contact details have changed, or for large cohorts where the cost of providing new notice would make the research non-viable. A new exemption is introduced in Article 13(5) that is similar to the Article 14(5) exemption, but is limited to processing for research purposes that complies with research safeguards. Article 13(6) notes that the age of the data, number of data subjects and safeguards applied should all be considered.
Scientific research
Required safeguards for research were previously split between Article 89 and Section 19 DPA 2018. These are now consolidated in one place, in Chapter 8A of the U.K. GDPR. A new acronym, “RAS purposes,” covers processing for scientific or historical research, archiving in the public interest, and statistical purposes. The substance of the existing law is unchanged.
The new Article 4(7) provides that consent to an area of scientific research will still be specific, provided that, at the time consent was given: it was not possible to fully identify the purposes; seeking consent in this way was consistent with generally recognized ethical standards; and, so far as possible, individuals were able to consent to only part of the research. This takes the existing succinct drafting from Recital 33 but spreads it out over multiple provisions and phrases it slightly differently.
There is a definition of “scientific research” that incorporates the existing text from Recital 159, noting that research can be either a commercial or non-commercial activity. A definition of statistical purposes is also included: processing to produce statistical surveys or statistical results, where the resulting information is aggregate data rather than personal data and is not used to take measures or decisions with respect to any individual whose data was processed to produce it.
The change to purpose limitation, providing that further processing of data originally collected on the basis of consent must almost always itself be based on consent, is unhelpful; see the purpose limitation section above. If researchers base the processing for a research project on consent and later want to make use of the data for further research, this will not be possible unless they go back and obtain new consent.
International law enforcement requests
The Act acknowledges the special relationship between the U.K. and U.S. by recognizing the agreement between the two governments on requests to access data for the purposes of countering serious crime.
Under the new provisions, controllers may rely on legal obligation as the lawful basis or condition for processing personal and special category data where it is necessary to respond to such requests. This simplifies the rules for responding to law enforcement requests from U.S. authorities but does not help organizations that need to comply with other types of legal obligations across different jurisdictions.
As there has historically been concern in the privacy community regarding U.S. law enforcement’s access to personal data, it remains to be seen whether this move will impact the European Commission’s adequacy finding for the U.K. It may be of some comfort that the U.K.-U.S. agreement was signed in October 2019, prior to Brexit, and therefore would have been considered in the adequacy procedure.
Additional resources
Additional UK privacy resources