Happy International Data Privacy Day!
And what better reason than that to explore what 2025 is likely to have in store for data and privacy?
We are under a year on from UK and US election results that may well shape the regulatory landscape in the coming years. Just over two years on from the European Commission kick-starting the process to adopt an adequacy decision for the EU-US Data Privacy Framework. Three years on from the UK government hinting that it might think outside the box in terms of data protection regulation. Four years on from the introduction of the UK GDPR in a post-Brexit Britain. Five years on from the start of a global pandemic which forced a discussion around the tension between public health and data privacy. And over six years on from the GDPR coming into force across Europe, and by extension the world. But the passing of time does not appear to have diminished the worldwide focus on data and privacy issues.
In this post, we set out some of our predictions for data protection and privacy developments across the UK and EU in the year to come.
2025 is set to be a crucial year for data privacy regulation in the UK. October 2024 saw the UK's latest plan to reform the data privacy landscape arrive in the form of the Data (Use and Access) Bill ("DUA Bill"). Currently at the report stage in the House of Lords, the DUA Bill is not an all-singing, all-dancing reform of UK data privacy law. Labour's first bite at the cherry since the Conservatives' proposal fell away pre-election, the Bill represents an evolution of the existing landscape and, to some extent, resembles its predecessor, the Data Protection and Digital Information Bill, proposed by the previous government. Key proposals to "boost innovation" and support innovative uses of data sit alongside a more robust sanctions regime for direct marketing and a mandatory complaints procedure for data subjects.
2025 is likely to see a version of this UK data reform enacted at last. Given its introduction in the House of Lords first and the Labour government's majority in the House of Commons, a relatively smooth trajectory is expected through the remaining legislative stages. However, it is not yet clear whether further changes will be made to the Bill to dovetail with the UK's newly-published and still-evolving AI Opportunities Action Plan, as the initial draft faced criticism for failing to address a number of AI-related data issues. We expect organisations to consider their compliance programmes in preparation for the DUA Bill. However, multi-jurisdictional organisations operating across both a UK and EU footprint are likely to continue to align their practices with the EU GDPR standard for consistency across their compliance programmes. It therefore remains to be seen to what extent the Bill will reduce the compliance burden in practice, and whether 2025 will see the UK become the hotly anticipated sandbox for innovation.
As the so-called "sunset clause" in the EU's adequacy decision for the UK comes to an end in June 2025, all eyes are on whether the decision will be renewed with the UK continuing to ensure an adequate level of data protection. The European Commission has previously warned that any significant amendment to the UK version of the GDPR could put the adequacy decision in jeopardy. Whilst not a slam dunk, the latest iteration of the reform, the DUA Bill, arguably diverges least from the EU GDPR and is the version that least resembles the more radical reform that was first touted in the DCMS' Consultation Paper (Data: A new direction) back in October 2021.
It seems the government may have had one eye on the significant one-off SCC costs and annual costs of lost export revenue if the UK were to lose its adequacy status when developing this current incarnation. In fact, recommendations from the House of Lords European Affairs Committee suggest that a successful legal challenge to the adequacy decisions through the CJEU is "more likely than a Commission decision not to renew the UK's adequacy status". And with the UK reform still to be finalised, will the Commission insert another "sunset clause" as a precaution?
Will 2025 become the year of the data class action in Europe? The spectre of data class actions has haunted European business since the EU GDPR first came into force. But so far, no attempt has really got off the ground and, in the UK at least, cases such as Lloyd v Google seemed to close the door on any realistic opportunity to bring a successful data class action. However, in a significant development, 2024 saw Austria's Federal Cartel Prosecutor and the Irish Ministry of Justice grant 'qualified entity' status to noyb under the Representative Actions Directive, enabling it to bring collective actions in the European Union.
Consumer privacy advocacy organisation noyb is headed by Max Schrems, whose legal challenges famously invalidated both the EU-US Safe Harbour and the EU-US Privacy Shield. Schrems has announced that noyb plans to bring the first actions in 2025 and that "so far, collective redress is not really on the radar of many – but it has the potential to be a game changer". Given the organisation has already brought hundreds of GDPR complaints, it looks like 2025 may well turn out to be the year of the data class action.
AI is likely to remain a hot topic through 2025. Following the introduction of the landmark EU AI Act in August 2024 (see our blog here), all eyes remain on the UK and whether or not it will introduce a new AI Bill. While timings are unclear, it is rumoured that the government is likely to release a related consultation in early 2025. As the US takes the lead in the AI race with the $500 billion AI infrastructure investment announced by President Trump, it will be fascinating to see how the UK's approach to AI regulation might change over the next year, perhaps taking a stronger "pro-innovation" approach than previously, something that is also suggested in the recent AI Opportunities Action Plan published on 13 January 2025.
The AI legislation we have seen so far has deferred to existing data protection regulation rather than imposing new data obligations, and it is likely that any potential UK AI Bill will do the same and instead focus on strengthening "safety frameworks" (as mentioned in the King's Speech). As such, the use of personal data in AI will largely be governed by existing technology-agnostic data protection legislation (i.e. the UK GDPR and Data Protection Act 2018). However, regulators in the UK and EU are continuing to release guidance to clarify how data protection legislation should be applied to AI technologies. For example, the EDPB's recent opinion on personal data in AI models (see our blog here) and the ICO's consultation responses on generative AI (see press release here) emphasise the need for AI developers and deployers to start thinking about compliance from the early design phases.
One key area that remains to be clarified in 2025 is the regulators' approach to web-scraping. In the midst of delays (and suspensions, in the case of X) to social media platforms' AI training plans, 16 global data protection authorities released a joint statement in October 2024, urging platforms to protect individuals from unlawful data scraping. The EDPB also recently issued its guidance on "anonymisation, pseudonymisation and data scraping in the context of generative AI" – more on this in our next HSF Data Wrap.
2025 will mark the first year of Trump's second presidential term, and within hours of his inauguration on 20 January, Trump revoked the AI executive order signed by President Biden in 2023, which aimed to reduce the risks AI poses to consumers by imposing testing requirements on developers. It remains to be seen whether this is indicative of a wider trend towards de-regulation, as Trump has highlighted his focus on reducing challenges to US innovation and competition, as well as the impact a Trump administration may have on the EU-US Data Privacy Framework.
However, the New York Times reported last week that the Trump Administration had demanded that all three Democratic members of the Privacy and Civil Liberties Oversight Board resign or be fired. The Board was set up as an independent watchdog of surveillance practices, with oversight of the EU-US Data Privacy Framework. As such, it will be interesting to see the impact such changes could have on Europe's adequacy decision for the US. It is also interesting to note that this latest development has not gone unnoticed by Max Schrems and his privacy non-profit organisation. Could 2025 be the year for Schrems III?
In 2024, the EDPB published its opinion on the "consent or pay model" used by large online platforms in the context of behavioural and targeted advertising. This model typically gives users two options: pay to avoid ads, or receive the service for free in return for letting the company use their personal data for advertising purposes. The EDPB's headline position was that, in most cases, large online platforms cannot meet the GDPR's valid consent requirements if they only provide this kind of binary choice. Nevertheless, the EDPB did help clarify some issues, including the form of consent needed from users, the application of the principles of necessity, proportionality and fairness, and the provision of alternative services that offer genuine equivalence, such as contextual advertising.
In response, a number of big tech companies began to roll back their consent or pay models in the EU in 2024, opting to offer less-personalised ad formats without a fee. For example, Meta proposed three options: (i) no ads but with a fee (at a reduced rate, 40% cheaper than before); (ii) free service with highly personalised ads; and crucially, (iii) a new free option which uses "a minimal set of data points" to show "less personalised adverts" (including unskippable ads). The EDPB's initial comments suggest Meta's new proposals may ease the regulators' concerns, but formal assessment is still needed.
Looking ahead to 2025, it is likely that big tech companies will continue to test different alternatives to see if they are both compliant with regulation and commercially viable. The EDPB's formal review of Meta's proposals will probably influence these changes. This trend may also impact smaller companies, especially if the EDPB, as promised, develops further guidelines on consent or pay models that apply to non-large online platforms as well. In the UK, the ICO also published its own guidance on consent or pay models just last week, which appears at first glance to adopt a more permissive approach than the EDPB's – watch out for more on this guidance in our next HSF Data Wrap.
Will 2025 see a continuation of the ICO's more holistic approach to enforcement, which has deprioritised monetary fines for public sector bodies?
In November 2022, the ICO announced a two-year trial of its revised "public sector approach" ("PSA"), with the Commissioner exercising his discretion to rely on other enforcement actions in order to reduce the impact of fines on public bodies. The focus was on improving data protection standards through guidance and proactive engagement (as opposed to monetary penalties), and on avoiding the diversion of public funds away from the duties of the public body in question – which could otherwise result in a 'double punishment' for victims of a data breach, who would likely suffer from the resulting budget impacts.
In December 2024, the ICO announced it will continue with the PSA and launched a consultation to gather feedback on: (1) the types of 'public sector' organisations that should be in scope; and (2) the circumstances that will lead to a fine under the PSA (i.e. when the infringements are especially serious). With the decision to continue the PSA, along with the message of collaboration from the Commissioner and the regulator's less than successful track record on significant GDPR enforcement, the question is whether this enforcement approach will come to be mirrored more broadly for other organisations.
Could 2025 be a year in which organisations need to re-visit (again) their existing SCCs? When the "new" SCCs were first released in June 2021, the accompanying FAQs flagged a gap in the suite of documentation; namely, where the data importer is located in a third country but is directly subject to the EU GDPR under Article 3(2). A gap that has given rise to uncertainty, regulatory scrutiny and enforcement action in 2024. Whilst the European Commission was expected to launch a public consultation regarding these new SCCs at the end of 2024 (with the aim of adoption in Q2 2025), the consultation is (as at the date of these predictions) yet to be published.
So what should we expect? The additional SCCs are expected to "complement the existing clauses" for data transfers to third country importers not subject to the EU GDPR. We envisage the new terms will likely be a simplified version of the current ones, focusing more on risks specific to third-country transfers, such as conflicting laws or difficulties in obtaining legal redress outside the EU. But will they require a transfer impact assessment? And how long a transition period will organisations be given to determine which transfers must be conducted under yet another set of SCCs and to implement them?
2024 saw a number of challenges by Meta and other tech firms against key EDPB opinions. These included Meta's June 2024 lawsuit against the EDPB for the annulment of its opinion cracking down on "consent or pay" models (as mentioned above), and its challenge (Case C-97/23 P) before the CJEU in relation to the EDPB's binding decision on WhatsApp's transparency breach concerning its data-sharing plans with Facebook and Instagram, which led to the Irish DPC's €225 million fine in 2021.
Similarly, the Business Software Alliance ("BSA"), a global advocate for the software industry before governments and in the international marketplace, also voiced its concerns about the EDPB's October 2024 opinion on sub-processors (see our blog here). Given the renewed confidence of big tech, stemming from the US, could we see more challenges from the tech industry in 2025 against regulators' opinions?
In 2018, the European Commission's strategy for data highlighted its recognition of the ever-growing importance of data as an essential resource for economic growth. This sentiment is clearly reflected in 2025, which will bring in a roster of new European legislation designed to harmonise and regulate the way in which organisations access and use data, including non-personal data. In particular, the EU Data Act, which will apply in the EU from September 2025, aims to foster a competitive data market by enhancing data accessibility and usability for individuals and organisations alike. It will also impose additional obligations on organisations to ensure that users of their data-generating products can access the relevant product and related service data freely and directly.
This principle is mirrored in both the European Health Data Space ("EHDS") (adopted by the Council of the EU in January), and the European Commission's regulation for financial data access ("FiDA") which is expected to be finalised by the end of 2025. The EHDS will create an ecosystem that aims to provide individuals with digital access to their personal health data, and also facilitate the secondary use of health data for research, policy-making and regulatory activities in a trustworthy manner. FiDA will have a similar impact on the financial data landscape across the EU, by granting individuals more control over their financial data, as well as implementing enhanced transparency obligations on financial services providers. These regulations are intended to enhance, rather than undermine, the existing legislative framework for data privacy, so we expect that organisations will need to reassess their compliance programmes holistically in light of these changes.
The protection of children online continues to be at the top of regulators' agendas for 2025. Following the ICO's adoption of the Children's Code (Age-Appropriate Design Code) in 2021, the UK Online Safety Act ("OSA") will also introduce a host of safeguarding measures for children this year. In addition to the Children's Access Assessment Guidance published by Ofcom on 16 January 2025, the Children's Safety Codes of Practice mandated by the OSA are expected to be finalised in April 2025, with regulated providers having to complete their first children's access assessments by 16 April 2025. The full set of child-related safety duties are expected to apply from July 2025 (see the latest chapter of our OSA series focusing on children's safety here).
Given the increasing threats against children online, including cyberbullying and age-inappropriate content, we are also seeing other countries imposing new measures to protect children. For example, Australia passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024 on 29 November 2024 (see our blog here), which aims to prevent users under 16 from having an account on social media platforms. Building on the existing Children's Online Privacy Protection Act (COPPA) 1998, the US Congress is also considering the Kids Online Safety Act ("KOSA"), which passed through the Senate in a 91-3 vote in August 2024. If passed, KOSA would establish a "duty of care" for social media platforms to protect children from harm.
Key contacts

Mackenzie Zhang
Trainee Solicitor, London
Disclaimer
The articles published on this website, current at the dates of publication set out above, are for reference purposes only. They do not constitute legal advice and should not be relied upon as such. Specific legal advice about your specific circumstances should always be sought separately before taking any action.