
On 28 June 2021, the European Commission (the "Commission") published two adequacy decisions permitting transfers of personal data from the EU to the UK under the EU GDPR and the Law Enforcement Directive. Both adequacy decisions are due to expire on 27 June 2025; however, on 18 March 2025 the Commission proposed extending that deadline by six months. If the European Data Protection Board approves the extension, personal data will continue to flow freely between the EU and the UK until 27 December 2025.

The Commission acknowledged that the UK is currently legislating to amend its data protection regime, notably through the Data (Use and Access) Bill. Once that legislative process has concluded, the Commission has stated that it will "assess whether the UK continues to provide an adequate level of protection for personal data. If that assessment is positive, the Commission will propose to renew the UK adequacy decisions."

On 11 February 2025, the European Commission (the "Commission") released its 2025 Work Programme (the "WP"), outlining its initiatives for AI and privacy for the year, aimed at enhancing competitiveness, strengthening security, and improving economic resilience within the EU. Notably, the Commission announced its decision to withdraw certain pending legislative proposals, including the proposed ePrivacy Regulation (intended to replace the current ePrivacy Directive) and the AI Liability Directive (designed to complement the new Product Liability Directive), due to a lack of consensus on their adoption.

The AI Liability Directive (the "AILD") was intended to complement the EU Artificial Intelligence Act by addressing non-contractual civil liability for damage caused by AI systems. However, the AILD faced significant opposition from critics who questioned its necessity, particularly given the concurrent revision of the EU Product Liability Directive, which already expanded liability to cover software. Technology companies also argued that the AILD would further complicate the regulatory landscape, increase liability exposure, and hinder innovation. In light of these concerns, and broader calls for regulatory simplification in the digital sector, the Commission decided to withdraw the AILD, but has reserved the right to assess whether to introduce a new proposal or explore alternative regulatory approaches to AI liability.

The proposed ePrivacy Regulation, which had been under negotiation for over a decade, will also be withdrawn. The Commission cited two primary reasons for this decision. Firstly, the co-legislators (the Council of Member States and the European Parliament) were unable to reach an agreement. Secondly, the proposed Regulation was deemed outdated in light of recent technological and legislative developments.

Despite these withdrawals, the Commission has announced several new initiatives as part of the WP. It plans to introduce a Digital Networks Act (the "DNA") to regulate the future of the European telecommunications market. Additionally, in the first quarter of 2025, the Commission will publish the "AI Continent Action Plan," which aims to consolidate Europe’s position in the global AI race, with a particular focus on computing capacities and AI funding. These developments indicate that while certain legislative proposals have been abandoned, the EU remains committed to shaping its regulatory framework for AI and digital markets.

The Court of Justice of the European Union (the "CJEU") has confirmed that, where a subsidiary company has breached the GDPR, any resulting fines may be calculated by reference to the total worldwide annual turnover of the group to which it belongs, rather than the turnover of that subsidiary alone.

In case C-383/23 ILVA A/S, the CJEU clarified that, when identifying the relevant "undertaking" for the purposes of calculating the maximum fine threshold for a GDPR violation, competent authorities should have regard to the EU competition rules set out in Articles 101 and 102 of the Treaty on the Functioning of the European Union. As such, calculations by reference to total worldwide annual turnover should encompass the turnover of the entire economic unit (i.e. the group undertaking) rather than solely that of the relevant subsidiary.

Notably, this is the first time the CJEU has directly ruled on this issue. The ruling therefore significantly increases certainty for entities assessing their potential exposure under the GDPR.

However, the CJEU also stated that the competent authority must evaluate “each individual case in an effective, proportionate and dissuasive manner.” This calculation is therefore pertinent only for determining the maximum amount of any possible fine, rather than determining its specific value.
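By way of illustration only, the sketch below shows how reading "undertaking" as the whole economic unit changes the Article 83(5) GDPR ceiling, which caps fines at the higher of €20 million or 4% of total worldwide annual turnover for the preceding financial year. The turnover figures are hypothetical and are not drawn from the case.

```python
# Illustrative sketch only: how reading "undertaking" as the whole economic
# unit changes the Article 83(5) GDPR maximum fine ceiling.
# All turnover figures below are hypothetical.

def max_gdpr_fine(worldwide_annual_turnover_eur: float) -> float:
    """Article 83(5) cap: the higher of EUR 20 million or 4% of total
    worldwide annual turnover of the preceding financial year."""
    return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)

subsidiary_turnover = 300_000_000    # hypothetical subsidiary-only turnover
group_turnover = 5_000_000_000       # hypothetical group-wide turnover

# Subsidiary-only reading: max(EUR 20m, EUR 12m) = EUR 20m
print(f"Subsidiary-based ceiling: EUR {max_gdpr_fine(subsidiary_turnover):,.0f}")

# Whole-economic-unit reading per ILVA: max(EUR 20m, EUR 200m) = EUR 200m
print(f"Group-based ceiling:      EUR {max_gdpr_fine(group_turnover):,.0f}")
```

As the CJEU noted, this ceiling only bounds the fine; the actual amount must still be set case by case on an effective, proportionate and dissuasive basis.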

 

On 11 February 2025, the European Data Protection Board (the "EDPB") adopted a Statement on Age Assurance (the "Statement") to guide the GDPR-compliant implementation of age assurance across the EU. Age assurance refers to the methods used to determine an individual's age or age range with varying degrees of certainty.

The European regulatory framework has, over time, increasingly shifted its focus towards protecting children in the digital environment. Against this backdrop, the EDPB's Statement serves as a reminder to online service providers ("Service Providers") and involved third parties that safeguarding children through age assurance should not come at the expense of their privacy rights. Though not legally binding, the Statement is expected to shape a consistent approach to age assurance data processing and guide GDPR enforcement. It also urges Service Providers to re-evaluate their age verification practices.

The Statement outlines ten high-level principles for the compliant processing of data in relation to age assurance:

  1. Full and effective enjoyment of rights and freedoms: Service Providers should respect fundamental rights and freedoms, with the best interests of children as the primary consideration.
  2. Risk-based assessment of proportionality: Service Providers should conduct a risk-based assessment to demonstrate proportionality and necessity of methods used.
  3. Prevention of data protection risks: Effective measures and safeguards should be implemented to prevent data protection risks, and the processing of personal data must not be used for unrelated purposes.
  4. Purpose limitation and data minimisation: Service Providers should ensure that data use is minimised and purpose-driven, processing only "age-related attributes that are strictly necessary for their specified, explicit, and legitimate purposes" (a minimal illustration of this principle appears after this list).
  5. Effectiveness of age assurance: The methods employed should achieve their intended purpose and be evaluated for accessibility, reliability, and robustness.
  6. Lawfulness, fairness, and transparency: The processing of personal data for age assurance should be lawful, fair, and transparent, and it should be presented in a clear and understandable manner for children.
  7. Automated decision-making: Automated decision-making should comply with the GDPR. The Statement cautions against its use in relation to children, as the GDPR generally prohibits sole reliance on such methods except where necessary to protect a child's welfare.
  8. Data protection by design and default: Service Providers should incorporate safeguards to meet GDPR requirements, including privacy-preserving methods.
  9. Security of age assurance: Appropriate technical and organisational measures should be established to ensure security, taking into account the nature, sensitivity, and volume of personal data involved.
  10. Accountability: Service Providers should establish governance frameworks to ensure accountability for their use of age assurance methods and demonstrate compliance with GDPR. 
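To make the data minimisation principle concrete, the following is a minimal sketch (our illustration, not taken from the Statement) of an age assurance check that derives only the attribute actually needed, an over/under-threshold flag, and never stores the underlying date of birth.

```python
# Minimal illustration of data minimisation in age assurance: derive only
# the "over 18: yes/no" attribute and discard the raw date of birth.
# This is a hypothetical sketch, not a method prescribed by the EDPB.
from datetime import date

def is_over_threshold(date_of_birth: date, threshold_years: int = 18) -> bool:
    """Return only a boolean age signal; the date of birth is not retained."""
    today = date.today()
    # Subtract one year if the birthday has not yet occurred this year.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= threshold_years

# The service stores only the minimal attribute, never the raw DOB.
user_record = {"user_id": "abc123", "over_18": is_over_threshold(date(2010, 5, 1))}
print(user_record)
```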

Following its completion of all stages in the House of Lords, the Data (Use and Access) Bill (the "Bill") moved to the House of Commons, where the First Reading was conducted on 6 February 2025, followed by a Second Reading on 12 February 2025.

As a recap, during the Bill's progression through the House of Lords, several key amendments were introduced, including provisions on data processing for scientific research, safeguarding children, allowing charities to use the "soft opt-in" for direct marketing, and the use of web crawlers. Following the amendments, on 10 February 2025, the Information Commissioner reaffirmed his support for the Bill, emphasising that it takes a balanced approach to reforming automated decision-making by facilitating innovation while protecting special category data.

During the Second Reading, MPs deliberated on the Bill's aim to support economic and innovative growth without compromising data protection. Some raised concerns about the proposed GDPR relaxations and their potential impact on the UK's EU data adequacy status, which remains under review and has been extended by six months (see above for further details). Ministers, however, maintained that the reforms remain aligned with EU standards. Despite these deliberations, no significant departures from the Bill's current framework were proposed.

The Bill is now awaiting its Report Stage, after which it will proceed to a Third Reading. It is expected to pass through the Commons by late spring and return to the Lords for a final amendment review. Upon Parliamentary approval, it will be submitted for Royal Assent and is widely anticipated to be enacted due to broad cross-party support.

Organisations should start preparing for a potential new data protection regime by reviewing their privacy policies and operational procedures. The Bill's provisions may offer opportunities to revise data protection strategies, particularly where new exceptions or relaxations could be leveraged.

The Government has also indicated plans to introduce transitional periods and guidance from the proposed new Information Commission to aid businesses in adapting to the new legislation.

We will continue to monitor and provide updates on the Bill's progress.

The Interactive Advertising Bureau Europe (the "IAB") and several member organisations have responded to the European Data Protection Board's (the "EDPB") public consultation on draft Guidelines 1/2025 on pseudonymisation. The coalition welcomes the acknowledgment of pseudonymisation as a valuable privacy-enhancing technology that supports compliance with the GDPR.

However, they raise concerns about what they regard as "excessively high standards" for defining pseudonymised data, which they believe overlap with anonymisation and conflict with established GDPR requirements and CJEU rulings. The coalition suggests that the stringent stance with regard to legitimate interests as a basis for the lawful processing of personal data may discourage businesses from investing in pseudonymisation efforts and could stifle innovation in areas such as privacy-enhancing practices and AI development.
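For context, a common pseudonymisation technique, shown below as a minimal sketch of our own rather than anything in the draft Guidelines, is to replace a direct identifier with a keyed hash, keeping the secret key separately. Because anyone holding the key can link pseudonyms back to individuals, the data remains personal data under the GDPR, which is precisely what distinguishes pseudonymisation from anonymisation.

```python
# Hypothetical sketch of keyed-hash pseudonymisation (not from the Guidelines).
# The HMAC key is the "additional information" kept separately under access
# controls; without it, pseudonyms cannot readily be linked back to people.
import hmac
import hashlib

SECRET_KEY = b"stored-separately-under-strict-access-controls"  # hypothetical key

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym from a direct identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

analytics_record = {
    "user": pseudonymise("jane.doe@example.com"),  # pseudonym replaces the email
    "event": "page_view",
}
print(analytics_record)
```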

The coalition also points out that the guidelines overlook the risk-based and proportionality principles set out in Article 32 GDPR by requiring controllers to consider hypothetical re-identification methods that are not realistically applicable. They recommend that the EDPB refrain from imposing extra requirements that could create unnecessary compliance challenges and instead focus on providing clear, practical guidance that encourages innovation and promotes the effective use of pseudonymisation techniques (for example, by providing a longer list of processing activities that may be pursued on the basis of legitimate interests).

Over three years ago, Luxembourg's privacy regulator, the Luxembourg National Commission for Data Protection (the "CNPD"), issued a decision against Amazon for processing personal data in breach of the GDPR, a decision Amazon disputed as being "without merit". A complaint had been filed on behalf of over 10,000 Amazon customers, who claimed that, whilst the company discloses what data it collects and how it processes it, it does not explicitly seek consent for that processing. As a result of these violations, Amazon was handed a €746 million fine - the second largest GDPR fine in history.

Amazon launched its appeal against the CNPD's decision on 29 October 2021, with the hearing taking place on 9 January 2024. On 18 March 2025, the Luxembourg Administrative Tribunal dismissed Amazon's appeal and upheld the CNPD's original decision, meaning the fine still stands and Amazon is required to implement the corrective measures ordered by the CNPD. However, the effects of the decision remain suspended during the appeal period and, if applicable, during any further appeal procedure before the Administrative Court.

On 20 March 2025, Advocate General ("AG") Campos Sánchez-Bordona of the Court of Justice of the European Union (the "CJEU") opined that individuals can apply to the courts for an injunction under the GDPR to stop the unlawful processing of their personal data.

The AG's opinion follows the Bundesgerichtshof's (Federal Court of Justice, Germany) request for a preliminary ruling in the ongoing case IP v Quirin Privatbank AG (Case C-655/23), in which the claimant seeks both damages and injunctive relief for the unlawful disclosure of his personal data to a third party. The CJEU is expected to clarify whether the GDPR permits individual injunction claims.

The AG drew on key GDPR provisions in support of his opinion, including Articles 5(1)(a), 6(1), and 79(1), which require fairness in processing, a valid legal basis for data processing, and the right to effective judicial remedy.

The AG's opinion is advisory, and it is the CJEU's final judgment that will set the precedent. However, the opinion serves as an important warning to data controllers that non-compliance with the GDPR could lead not only to fines but also to civil litigation and court orders. It also signals a notable development in enhancing data subjects' rights by providing an additional means of enforcement.

We will provide an update once the judgment is delivered. In the meantime, organisations should ensure continued GDPR compliance, as the potential for injunctive relief raises the risks for data controllers who breach the GDPR.

The Information Commissioner's Office (the "ICO") has published its Monetary Penalty Notice against Advanced Computer Software Group Ltd ("Advanced"), issuing a £3.07 million fine against the company. The ICO found that Advanced's failure to implement sufficient security measures led to a ransomware attack in August 2022, which put the personal information of 79,404 individuals at risk. This included personal information held by NHS entities and affected various NHS services, including NHS 111. With respect to the specific infringement, the ICO found that Advanced had infringed Article 32(1) UK GDPR by failing to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk posed by processing in the following areas:

  • Vulnerability management, encompassing vulnerability scanning and patch management; and
  • Risk management (relating to multi-factor authentication).

The ICO had initially proposed to fine Advanced £6.09 million for these failures under the UK GDPR regime, as covered in our previous Data Note published here. However, the fine was reduced following Advanced's proactive engagement with the ICO and various regulatory and governmental bodies. The ICO and Advanced have now agreed a voluntary settlement, under which Advanced has acknowledged the ICO's decision to impose a reduced fine and agreed to pay the final penalty without appealing.

The fine is particularly significant as it marks the first instance under the UK GDPR regime where a fine has been levied against a data processor, rather than a data controller. Duc Tran and Kamilia Khairul Anuar discuss the potential implications for data processors of the ICO's originally proposed fine in an article published in the Personal Data Protection Journal, which can be found here.

Last month, it was confirmed that investment bankers Joseph Pacini and Carsten Geyer had settled their data protection claims against Dow Jones (publisher of the Wall Street Journal). In 2017 and 2018, the Wall Street Journal published articles on the two bankers which included details of alleged misconduct relating to the activities of XIO Group, an investment firm. The bankers claimed that the Wall Street Journal had breached Articles 5(1)(a) and (b) of the UK GDPR and that they were entitled to compensation under Article 82 UK GDPR and/or section 168 of the Data Protection Act 2018.

In 2024, the Defendant sought a strike-out order on the basis that the claim was a "purely tactical" attempt to disguise a defamation lawsuit as a data protection claim. However, the High Court refused the application and the claim proceeded to a preliminary issues trial.

The main issue considered related to the definition of personal data (though the court also considered a second issue in relation to Article 10 UK GDPR, finding that the Wall Street Journal's news articles did not contain any criminal offence data).

A key takeaway from this case was the court's approach to determining the meaning of personal data within the news articles, given the unique overlap between data protection and defamation laws. The High Court held that defamation law concepts should be taken into account, including: (1) the "single meaning rule" (i.e. that the news articles should be considered as a whole, interpreting each element in the full context of the articles); and (2) the "repetition rule" (i.e. that a party that repeats a defamatory statement should be treated as if it had made the original statement).

Following a joint investigation by the UK's Information Commissioner's Office ("ICO") and the Office of the Privacy Commissioner of Canada ("OPC") into the 23andMe data breach affecting more than 6 million customers (see our previous data wrap post for details on the breach here), the ICO has made clear its intention to fine the genomics company. Since the start of the investigation, 23andMe has also been subject to a class-action lawsuit for its failure to provide timely notice of the breach to customers, which was settled in September 2024 for US$30 million, as well as 50 other lawsuits.

On 24 March 2025, the ICO's Deputy Commissioner, Stephen Bonner, released a statement confirming that the ICO had issued a notice of intent to fine 23andMe £4.59 million, along with its provisional findings on the breach. The statement took into consideration 23andMe's voluntary Chapter 11 bankruptcy filing in the US on 23 March 2025 (see 23andMe's press release here), noting that "as a matter of UK law, the protections and restrictions of the UK GDPR continue to apply and 23andMe remains under an obligation to protect the personal information of its customers" regardless of the sale process triggered by the bankruptcy filing.

Following the bankruptcy filing and the resignation of 23andMe's CEO Anne Wojcicki (reportedly to pursue the company as an independent bidder), customers have been advised by California attorney-general Rob Bonta to "delete your data and destroy any samples of genetic material held by the company" (with news outlets such as the Guardian providing guidance to individuals on how to do so). In the meantime, 23andMe notes in its latest press release that "there are no changes to how we store, manage, or protect customer data. Any buyer of 23andMe will be required to comply with applicable law with respect to the treatment of customer data."

On 25 March 2025, the ICO and the CMA published a joint article providing guidance to developers and deployers of AI foundation models, focusing on the safeguards required to protect data privacy and consumer rights. In particular, the article focused on the differences between open and closed-access foundation models, clarifying that the ICO and CMA do not have a preference between the two release approaches from a regulatory perspective.

The article provided guidance on how developers and deployers might mitigate data protection and consumer risk by implementing certain practical measures, noting that such mitigations will differ based on the release approach. For open-access foundation models (which can be treated somewhat similarly to open-source software), the ICO/CMA suggested the use of technical and organisational measures or the implementation of licensing terms to ensure greater controls around the downstream use of foundation models which use personal data. On the other hand, for closed-access foundation models (which are only accessible to selected recipients, such as for internal business use or paid-for external use), the ICO/CMA suggested putting in place technical controls to prevent misuse, given the narrower scope of downstream use. Finally, the article included a reminder that, as with other technologies, the ICO and CMA will continue to ensure that competition, consumer and data protection law work together to protect principles such as user choice and control, a level playing field for data access, and supply chain accountability.

On 3 March 2025, the ICO announced it was launching investigations into three social media platforms over concerns about their handling of children's personal data. The regulator stated that it is investigating whether there have been any infringements of data protection legislation. TikTok is under scrutiny for using recommender systems which leverage the data generated by children's online activity to suggest content that may be age-inappropriate or harmful. Meanwhile, Reddit and Imgur are being examined over the adequacy of their age assurance measures (see also our data wrap entry about the EDPB's views on age assurance above).

Also in March, the ICO reported on its progress in protecting children's online safety. In August 2024, it initiated a review of 34 social media and video sharing platforms, including BeReal, Twitch and X (formerly Twitter). Specifically, it looked at their use of default privacy and geolocation settings, targeted advertising, recommender systems and the handling of information of children under 13. Since its last update, the ICO has seen various improvements. For example, platforms have turned off personalised adverts for all under-18s, prevented children from sharing geolocation information and limited the extent to which children can change high-privacy default settings.

These initiatives indicate the ICO's increasing focus on protecting young social media users in the UK from the dangers presented by their online activity, particularly with regard to access to their personal information and exposure to inappropriate or harmful content.

Key contacts

Miriam Everett, Partner, Global Head of Data Protection and Privacy, London
Sara Lee, Associate, London
Georgie Green, Associate, London
Isabel Rigby, Associate, London
Laksh Kawatra, Trainee Solicitor, London
Simge Aslan, Trainee Solicitor, London