The UK's new Data (Use and Access) Act is a significant achievement, and launches real divergence from the EU. What are the most interesting data protection impacts and how do they play against the UK’s ambitions in the race for data and AI innovation?
So, it’s here at last. The landmark regulatory reform to the UK’s data protection and ePrivacy law — the Data (Use and Access) Act (‘the DUA’) — received Royal Assent on 19 June.
Like the first post-breakup relationship, this legislative incarnation can expect some steely scrutiny from the ‘ex’ (the EU), and there are tangible risks: foremost to the UK’s data protection adequacy status under the GDPR when the European Commission reviews — and hopefully renews — this at the end of 2025.
Passage of the DUA is a significant achievement, and its scope is far broader than data protection. The Act has been at least three years in the making. After careful amendment of some more contentious aspects, the DUA is poised to achieve a margin of innovation-friendly data reform for the UK, while avoiding significant backlash.
Here, we look at the most interesting data protection impacts and set them against the UK’s ambitions in the race for data and AI innovation, and the ongoing risks of divergence.
Automated decision-making changes – low-profile AI liberalisation?
Several of the DUA’s central changes to data protection law are clearly framed with innovation in mind.
These include changes to rules on automated decision-making (ADM) in Article 22 of the UK GDPR. The pre-DUA position prohibits solely automated decision-making with significant or legal effects for individuals, unless specific conditions are met and safeguards are in place.
Such restrictions on solely automated decision-making conflict with government ambitions to automate public services.
In essence the DUA removes the general prohibition, maintaining the prior, strict controls just for ADM using ‘special category’ data. The intention is to enable significant decisions to be automated in a wider variety of settings, with some risk mitigation achieved by excluding sensitive data points about race, gender identity or political belief, for example.
The UK’s ‘light touch’ approach may be seen as undermining EU data subject rights where EU personal data flows to the UK, bearing in mind that an AI system conducting ADM — even without special category data — may be a ‘high risk system’ under the EU AI Act, with significant compliance obligations attached.
Special category data
The DUA empowers the Secretary of State to add new categories of special category personal data which are subject to greater protection. This may serve as a counterweight to the reduced protection against ADM mentioned above.
The change also reflects the increasing importance and nuance of the distinction between special categories of data and mere personal data, in the context of ADM and more widely. This concerns the ability to infer special category data about someone, for example, from their name or the way they dress.
In the UK, historically the Information Commissioner’s Office (ICO) has given data controllers responsibility for assessing whether they are using such ‘proxy’ information to treat someone differently and therefore handling special category data. The DUA increases the stakes by making the boundary between special category and other data more significant.
This risks substantial divergence between the EU and UK as their courts and laws separate. Recent examples include the European court’s finding that information about medicine ordered online is special category data (concerning health) even where the intended patient is unclear, and the EU Digital Services Act’s blanket prohibition on in-scope online platforms using special category data for targeted advertising.
AI and training data
Although the ICO has been reluctant to discourage the use of personal data for training AI systems and confirms that it may be legitimate for businesses to do so, the lack of definitive guidance (notably on special category data) means many organisations in the UK remain wary of using such data for AI training purposes. New guidance is expected in a “statutory code of practice”, due shortly.
To further encourage take-up, the DUA creates a separate ‘RAS’ (research, archives and statistics) regime in a new Article 84A of the UK GDPR, replacing the previous Article 89 regime for suspending data subject rights. It explicitly applies to research activities within the private sector, even those outside traditional ‘academic’ research.
The newly constituted Information Commission must have regard to the desirability of promoting innovation when carrying out its functions, which may indicate that the ‘hands off’ regulatory approach to scraped personal data for AI training will stay.
In contrast, EU innovators are hampered by significantly diverging views of Supervisory Authorities and the risk of cross-border enforcement and possible collective claims under the EU Representative Actions Directive. The UK has a potential advantage here, and it will be interesting to see whether the changes succeed in bringing about greater confidence and clarity for the UK private sector, particularly AI development.
Recognised legitimate interests
A major change made by the DUA to the UK GDPR is the creation of ‘recognised legitimate interests’ for use by the private sector which do not require a legitimate interests test to be carried out (easing the compliance burden).
A little-noted change is the inclusion of law enforcement purposes among them. Traditionally, reliance on the law enforcement purposes in data protection law was reserved to the police. The DUA conclusively blurs the line between the public pursuit of law enforcement and the private sector, on which law enforcement agencies now rely for support and innovation.
The change will make it easier for a UK private sector organisation working for law enforcement purposes to justify its activity. On a more general level, all organisations will find it easier to justify (or harder to resist) passing information like CCTV footage to the police. However, the changes may be of concern to EU personal data exporters, particularly when the cumulative effect of changes is considered.
Repurposing data
One of the payoffs sought by the UK government in the DUA was facilitating the use of datasets (especially relating to health), which can be mined and monetised to benefit the UK. The DUA is designed to ease this process by relaxing the rules on data reuse, with an eye to enabling further research into new health treatments without having to seek fresh consent from data donors or to predict the nature of future research.
The change will potentially open some clear blue water between the UK and EU, which has found itself mired in regulatory complexity over the reuse of personal data, for example in clinical trials.
International transfers
On the vexed topic of international transfers, the jury is out. For UK exporters of personal data doing their own transfer risk assessments under Article 46, the new ‘not materially lower’ data protection test applicable to importers may not result in much change from previous arrangements.
When it comes to adequacy, the UK may make early efforts to boost trade with third countries by extending UK adequacy to them. Changes in the DUA, tailor-made for the US, now avoid the need for the Secretary of State to consider whether a recipient third country has an independent data protection authority, or indeed a coherent body of data protection law.
Instead, the third country’s ‘constitution, traditions and culture’ is one of the mandatory factors to take into consideration (cue the ‘special relationship’?). This would represent a major divergence from the EU, magnified further if the EU/US Data Privacy Framework was to fall following legal challenge.
The DUA gives the UK a means of creating and defending an independent ‘data bridge’ with the US, leaving it with political cards to play in the event of a Schrems III decision in the European court. However, if the UK strikes out too boldly, the Commission could look to revoke the UK’s then-current adequacy status, on the grounds that the UK had become a ‘data laundering hub’ for tech companies to bypass EU data protection. How this will all play out is difficult to predict.
No UK AI Act yet
The DUA does not make explicit arrangements for AI. Some of its measures will bring benefits for developers and deployers of AI systems (liberalising the reuse of data, and some loosening of ADM controls). However, there is no control of foundation models, no requirement for transparency by AI developers in their use of copyright works (despite strenuous efforts by the House of Lords on behalf of content creators) and the UK seems a long way from an AI Act. This means that for the foreseeable future, the UK represents a lower risk, lower compliance-burden destination for AI development than the EU. However, there is much to play out, with new ICO guidance — including on AI and tracking technologies — not only expected but urgently required.
Drifting closer?
Various European NGOs, including EDRi, have expressed concern about some of the changes in the DUA (see the data laundering comment above). The Commission has sought extra time to consider whether the UK’s adequacy decision, permitting frictionless international data transfers from the EU, should be renewed in December 2025.
At the same time, ironically, the Commission is considering some reforms to the GDPR which were abandoned in the UK due to adequacy status fears. In particular, the Commission’s proposals to exempt medium sized organisations as well as small ones from keeping records of processing activity reflect changes dropped from the DUA.
The EU’s plans to simplify the GDPR, focusing on reducing burdens for smaller businesses while maintaining core principles, are part of the Commission’s 2025 work programme that includes a ‘fitness check’ of digital legislation and ‘digital package’ planned for Q4/2025 (the same time the Commission decides upon the UK’s adequacy). Depending on the outcome, it is possible that the EU’s efforts could narrow some of the gaps created by the UK’s DUA.
It is far, far too soon to say whether the arrival of the DUA marks the start of a transition for UK organisations to leverage the benefits of independence from the EU and engage more fully in the data economy.
(A longer version of this article was first published in PDP Journals in June 2025)
Disclaimer
This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2025.