
On 23 January 2025, the ICO published its long-awaited guidance on "consent or pay" models following its call for views last year.

In summary, a consent or pay model implemented by a provider of online products or services offers users a choice:

  1. consent to the use of their personal data for personalised advertising in exchange for free access to a product or service;
  2. pay a fee to access the service (without the need to consent to personalised advertising); or
  3. decide not to use the product or service.

The new ICO guidance comes after the Opinion of the EDPB on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms, which was adopted on 17 April 2024. Although both allow for the possibility of compliant models, the ICO guidance appears to take a generally more positive view on the possibility of data protection compliant consent or pay models. However, it is important to note that the EDPB opinion was provided specifically in the context of 'large online platforms' and so its application is narrower. We await further guidance from the EDPB with broader scope, which is expected to be published later this year.

The ICO guidance makes an important distinction between consent or pay models and the "take it or leave it" approach. It confirms that a "take it or leave it" approach would be non-compliant in most cases as it does not provide users with a genuine free choice.

Under the guidance, consent or pay models are not prohibited and it is possible for such models to be UK GDPR and PECR compliant. The guidance does make clear that organisations must complete a data protection impact assessment ("DPIA"), or review and update any relevant existing DPIAs, and that assessments should then be kept under review. It is the organisation's responsibility to be able to demonstrate that there is valid consent.

When conducting an assessment as to whether consent can be deemed to be freely given, organisations must consider four factors:

  1. power imbalance – is there a power imbalance between your organisation and the user? It is worth noting here that, whilst the ICO's guidance at first blush appears more lenient towards consent or pay models, this factor will likely present a challenge for large organisations with market power looking to adopt these kinds of models;
  2. appropriate fee – can you demonstrate the fee is at an appropriate level?;
  3. equivalence – is the core product or service offered under both options 'equivalent'?; and
  4. privacy by design – have you implemented 'privacy by design'?

The Data (Use and Access) ("DUA") Bill was introduced to Parliament on 24 October 2024 and completed its Third Reading in the House of Lords on 5 February 2025. The Bill was subject to a number of amendments and significant debate in the House of Lords. Key amendments include:

  • Data processing for scientific research: As a recap, the Bill had been introduced with a proposal to broaden the scope of the processing of data for scientific research under the UK GDPR. The scope of "scientific research" was broadened to include any research which "can reasonably be described as scientific". Since its passage through the House of Lords, this section of the Bill was amended to include a public interest test, such that references to the processing of personal data for the purposes of scientific research are limited to processing "for the purposes of any research that can reasonably be described as scientific and that is conducted in the public interest, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity." 
  • Duties to protect children: The amendments to the Bill include further duties in respect of children’s data, including an additional duty on the ICO to have regard to the fact that children "merit specific protection with regard to their personal data". An amendment was also introduced to ensure that when processing personal data in the course of providing services likely to be accessed by children, the controller must take account of "higher protection matters" when assessing appropriate technical and organisational measures for data protection. This includes considering how children can best be protected and supported when using the services, and the fact that children will have different needs at different ages and different stages of development.
  • Direct marketing "soft opt-in": The government has amended the Bill to extend the direct marketing "soft opt-in" to the charity sector. The "soft opt-in" is currently only available to commercial organisations, and allows them to send direct marketing emails to existing customers provided that customers are given the option to opt out. The amendment allows charities to utilise this rule where people support their cause, eg via donations, or express an interest in their charitable purposes.
  • Web crawlers: A Lords amendment was introduced which would require all operators of web crawlers to be transparent about their identity and purpose, to allow creators to understand if their content had been scraped. The proposal would also give the ICO enforcement powers, and give copyright holders a right of private action in respect of this.

On 17 January 2025, the EDPB published new draft guidelines on pseudonymisation ("Guidelines"). The pseudonymisation of personal data is a common method used to reduce data protection risk and is defined under Article 4(5) GDPR as "the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person". For example, data controllers might require additional information in a lookup table or a cryptographic key to link pseudonymised data back to a data subject (sometimes referred to as "pseudonymisation secrets"). The new Guidelines clarify that:

  1. Pseudonymised data is still personal data (even where the additional information required to link the data back to the data subject is held securely by the data controller). This reflects the position in Recital 26 of the GDPR, and the ICO's 2022 draft guidance on Anonymisation & Pseudonymisation (see our blog here), which is due for publication in Spring 2025 according to the ICO's announcement here.
  2. Pseudonymisation helps organisations comply with their GDPR obligations (though pseudonymisation alone is not sufficient for full compliance). For example, it may make it easier to rely on legitimate interests as a legal basis (under Article 6 GDPR), be used as a "supplementary measure" for third country data transfers, and help controllers comply with other GDPR obligations (e.g. those regarding implementation of the data protection principles under Article 5; data protection by design and default under Article 25; and security under Article 32). The Guidelines also note that pseudonymisation helps to reduce the risk of "function creep" as it makes it harder to use the data for purposes other than those for which it was collected, and that the assignment of "pseudonyms" can also enhance the accuracy of data by reducing the risk of incorrectly attributing data to the wrong data subjects.

While the GDPR does not impose a general obligation to use pseudonymisation, the Guidelines encourage pseudonymisation and note that it may be advisable in some cases in order to comply with GDPR principles, such as data minimisation (and in some specific situations it may be mandatory under Union or Member State law). Additionally, the Guidelines provide practical examples of how pseudonymisation may be applied.

The Guidelines also state that controllers may choose to restrict pseudonymised data to a defined set of internal and/or external recipients (i.e. a "pseudonymisation domain") depending on the context, and that controllers should ensure that appropriate measures are put in place to keep the hidden additional information required for pseudonymisation separate from the domain. Where the domain consists of a defined set of recipients, the Guidelines also suggest that the "responsibilities of all parties involved should be defined by an arrangement, preferably in contractual form".
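As a simple illustration of the lookup-table approach described above, the sketch below shows personal records having a direct identifier replaced with a random pseudonym, with the lookup table (the "additional information", or pseudonymisation secret) returned separately so it can be stored under its own technical and organisational measures. This is a hypothetical sketch for illustration only; the function name and record structure are our own, not from the Guidelines.

```python
import secrets


def pseudonymise(records, key_field):
    """Replace the direct identifier in each record with a random pseudonym.

    Returns (pseudonymised_records, lookup_table). The lookup table maps
    pseudonyms back to the original identifiers and corresponds to the
    "additional information" that must be kept separately under GDPR
    Article 4(5) (e.g. outside the pseudonymisation domain).
    """
    lookup = {}  # pseudonym -> original identifier; store separately, restrict access
    out = []
    for rec in records:
        pseudonym = secrets.token_hex(8)  # random, so not derivable from the data itself
        lookup[pseudonym] = rec[key_field]
        new_rec = dict(rec)
        new_rec[key_field] = pseudonym
        out.append(new_rec)
    return out, lookup


# Usage: the controller can share `data` within the pseudonymisation domain,
# while `table` is held apart under separate safeguards.
data, table = pseudonymise([{"name": "Alice", "diagnosis": "X"}], "name")
```

Because the pseudonyms here are random rather than derived from the data, re-identification is only possible with access to the lookup table, which is the property the Guidelines' separation requirement is designed to protect.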

The Guidelines are subject to public consultation until 28 February 2025. Interested stakeholders may comment using the EDPB's online form here.

On 24 January 2025, the UK's Information Commissioner published a letter sent to the Prime Minister in response to a request for proposals to boost business confidence, improve the investment climate, and foster sustainable economic growth.

Notably, the letter proposes:

  • regulatory certainty on AI through the creation of a single set of rules for AI development and use, supporting the government in legislating for such rules to become a statutory Code of Practice, with a view to supporting businesses with clear guidelines and encouraging responsible innovation;
  • support for SMEs through an enhancement of the ICO's digital content and business advice services, the launch of a Data Essentials training program, and use of generative AI to provide tailored advice to these businesses;
  • innovation through regulatory sandboxes, giving businesses a time-limited derogation from specific regulatory requirements to test their new ideas, under strict governance by the ICO;
  • privacy-preserving online advertising through a review of the consent requirements under PECR to facilitate privacy-friendly forms of online advertising; and
  • streamlined international data transfers under new guidance to be published by the ICO, with a focus on making safe data transfers "quicker and easier" and working through international fora, including the G7 and Global Privacy Assembly, to build international agreement on increasing mechanisms for "trusted free flows of data".

These proposals highlight the ICO's focus on boosting the development of AI in the UK and enabling smoother data transfers across international borders. Perhaps most significantly, the reference to creating "a single set of rules" for AI suggests that the UK may continue to avoid strict AI legislation and focus instead on a lighter-touch regulatory approach.    

On 29 January 2025, the General Court of the European Union (the "Court") ruled against Ireland's Data Protection Commission (the "DPC") in its challenge against the European Data Protection Board (the "EDPB"). This landmark decision clarified the EDPB's authority over national supervisory authority investigations.

The challenge stems from complaints against Facebook and Instagram in 2022. The DPC, in its role as lead supervisory authority, submitted draft decisions to the other concerned EU supervisory authorities, who raised objections about the investigation's scope. Unable to come to an agreement, the DPC referred the matter to the EDPB, which found the investigations to be too narrow in scope and ordered the DPC to carry out fresh investigations.

In its challenge to the Court, the DPC argued that the EDPB could not mandate changes to the scope of the DPC's investigations and emphasised the importance of maintaining discretion. The Court rejected this, affirming the EDPB's power to instruct regulators to conduct further investigations and mandate the production of new draft decisions. However, the Court also noted that it is within the Court's power to review the substantive legality of EDPB decisions. The DPC challenged only the EDPB's ability to make such a decision, rather than the substance.

In a decisive move, Italy's data protection authority, the Garante, has ordered the suspension of data processing activities by Hangzhou DeepSeek Artificial Intelligence and Beijing DeepSeek Artificial Intelligence, the companies behind the DeepSeek chatbot service ("DeepSeek"). The company recently gained attention for its R1 generative AI model, which rivals leading models at a fraction of the cost and computing power. Following its global launch, DeepSeek was downloaded by millions of people in just a few days. The suspension follows a series of regulatory steps aimed at protecting the personal data of Italian users.

The entire regulatory action unfolded over just three days, highlighting the urgency of the Garante's response. The scrutiny began with a request for information on 28 January 2025, seeking details on the types of personal data collected, their sources, purposes, legal basis for processing, and storage locations. DeepSeek responded on 29 January 2025, claiming that it did not operate in Italy and had removed its app from local stores. By 30 January 2025, the Garante issued an urgent order limiting DeepSeek's data processing activities for Italian users with immediate effect, and opened an investigation.

The Garante stated that the GDPR applies to DeepSeek pursuant to Article 3(2)(a) because, despite being removed from local app stores, DeepSeek remained accessible via its website and continued to process personal data of users in Italy. The Garante identified several compliance issues, including an inadequate privacy policy (Articles 12, 13 and 14), lack of a lawful processing basis (Article 6), lack of information to enable the exercise of rights (Chapter III), lack of safeguards for data storage (Article 32), absence of an EU representative (Article 27) and non-cooperation with the authority by providing insufficient information (Article 31). Based on these findings, the Garante suspended DeepSeek and initiated its investigation.

This is not the first time the Garante has confronted a major AI provider. In December 2024, the Garante fined OpenAI €15 million for GDPR violations related to its ChatGPT service. The investigation, initiated in March 2023, uncovered multiple breaches, including failure to notify about a data breach, processing personal data without a valid legal basis, and lack of age verification mechanisms.

These regulatory actions underscore the importance of GDPR compliance for AI service providers operating in the EU. DeepSeek now faces the challenge of addressing these compliance issues or risking significant penalties.

Additionally, government agencies in the United States, South Korea, Australia and Taiwan are seeking or enacting bans on DeepSeek for their employees due to data protection or security concerns. It remains to be seen how the company, and other AI service providers, will navigate the increasingly complex regulatory landscape.

Key contacts

Miriam Everett – Partner, Global Head of Data Protection and Privacy, London
Claire Wiseman – Knowledge Lawyer, London
Sara Lee – Associate, London
Alice Bourne – Associate, London
Georgie Green – Associate, London
Mackenzie Zhang – Trainee Solicitor, London
Florine Stoner – Trainee Solicitor, London