
On 22 May 2024, the surprise call for a General Election triggered a short "wash-up period" allowing the Government to enact legislation that was essential or subject to minimal debate before Parliament was dissolved on 24 May 2024. The short list of priority legislation that was hurried through during this period did not include the UK's "reform" of its data protection framework through the Data Protection and Digital Information Bill ("DPDI Bill"). As such, the DPDI Bill was not enacted and now cannot be carried over to the next Parliament. In the words of the Liberal Democrat peer Lord Clement-Jones, the DPDI Bill is now "as dead as a dodo".

The reform was originally premised on the UK "forging its own path" in a post-Brexit age. Whilst privacy activists criticised some of the proposed changes to the UK GDPR for allegedly "watering down" data subject rights and safeguards, others have suggested the proposals were actually much less of a departure from the current EU GDPR framework than had initially been anticipated.

The status quo therefore remains for now - with data protection in the UK continuing to be governed by a combination of the current UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications (EC Directive) Regulations 2003. This is arguably no bad thing from the point of view of the UK maintaining its EU adequacy status in the face of any proposed data protection reform - an issue that had threatened to tie the Government's hands throughout the reform process. It is not yet clear whether the Labour party would resurrect the reform. However, whilst Labour did not broadly oppose the DPDI Bill, reform of the UK GDPR seems unlikely to be an immediate priority. That said, it is possible that elements of the DPDI Bill will return in some form, although it is not clear which elements, or whether this would be in the form of a new revised data bill or part of a broader digital package.

For further information please refer to our full piece "General Election: Data protection reform falls, but what does this mean for digital regulation (including AI)?"

Staying with the "General election" theme, in response to the fall of the DPDI Bill, Lord Clement-Jones has suggested that Labour plan a "digital bill in the autumn on entirely different lines" including artificial intelligence.

Whilst Labour's official AI policy is currently relatively light, a growth paper on AI is expected to provide further clarity in the summer. That said, the party considers the UK Government's existing light-touch, pro-innovation, sector-led "wait and see" approach to be inadequate. Back in November 2023, Shadow Secretary of State for Science, Innovation and Technology, Peter Kyle, highlighted that the Bletchley AI Summit was an opportunity for the UK to "lead the global debate on how we regulate this powerful new technology for good. Instead the Prime Minister has been left behind by US and EU who are moving ahead with real safeguards on the technology". He has also voiced concerns specifically around disinformation and deepfakes.

According to Global Counsel's Top in Tech podcast, there is an expectation that Labour wants to prioritise "recalibrating" UK – EU relations, which could influence a broad range of policy areas such as digital regulation and make divergence unappetising (not necessarily a bad thing given the challenge for multi-national organisations looking to comply with the current fragmented international landscape around AI regulation). It therefore seems possible that a Labour Government could look to take a more interventionist, prescriptive approach to AI, perhaps aligning the UK's approach more with that of the EU (where the comprehensive, risk-based EU AI Act is due to come into force imminently). For further information see our blog here. As a last minute addition to this Data Wrap, in its manifesto published on 13 June 2024, Labour described existing regulators as "ill-equipped to deal with the dramatic development of new technologies," and suggested it would introduce "binding regulation on the handful of companies developing the most powerful AI models." That said, campaign realities mean that manifestos often veer away from detailed policies in complex areas like industry regulation, suggesting it could be months before a Labour Government's digital vision truly comes into focus.

For further information please refer to our full piece "General Election: Data protection reform falls, but what does this mean for digital regulation (including AI)?"

On 17 April 2024, the European Data Protection Board (EDPB) issued an opinion on the validity of consent to process personal data for behavioural advertising in "consent or pay" models. The headline coming out of the Opinion is that "in most cases, it will not be possible for large online platforms to comply with the requirements for valid consent if they confront users only with a binary choice". The EDPB would seemingly be charting a course which aligns with the Bundeskartellamt decision (i.e. that a consent or pay model is permissible) but substantively narrows the circumstances in which it would be feasible, thereby addressing the privacy concerns.

The detail of the Opinion though puts more meat on the bones and raises a range of issues and interests for market participants to consider, including:

  • Applicability of the Opinion: The Bundeskartellamt decision was in the context of large online platforms, a term which is not defined for the purposes of GDPR. The Opinion does not set out a test of what would constitute a large online platform for the purposes of the Opinion, but it does provide a range of aspects which may determine whether an organisation is one.
  • Consent must still be GDPR consent: The Opinion is clear that a consent or pay model can operate provided that the controller can demonstrate that such consent meets the requirements of GDPR: freely given, informed, unambiguous and specific. The Opinion emphasises that the specifics of a given consent or pay model (and in particular the organisation implementing such model) are critical and that case-by-case consideration will be required. There is then no 'one size fits all' approach for large online platforms.
  • Other GDPR principles: Even if a controller considers that its consent or pay model meets the GDPR consent requirements, the Opinion emphasises that all GDPR principles will need to be complied with, such as principles of necessity, proportionality and fairness.
  • Real alternative(s): While the Opinion does not unambiguously state that any one approach by a large online platform would render a consent or pay model workable from a consent compliance perspective, it does strongly suggest that providing an alternative service with genuine equivalence to that offered under the consent or pay model options would be beneficial.

Ultimately the Opinion is non-binding and it will be for supervisory authorities to determine whether a particular consent or pay model is appropriate in the context in which it is being used. The Opinion does however offer some further clarity on the considerations which supervisory authorities will have if and when reviewing such models.

For further detail please refer to our full piece "Consent or Pay: Boom or Bust?".

May saw the American Privacy Rights Act ("APRA") continue its legislative journey in the US. As mentioned in our March Data Wrap, on 7 April 2024 the U.S. Senate Committee on Commerce, Science and Transportation Chair, Maria Cantwell, and Chair of the House Committee on Energy and Commerce, Cathy McMorris Rodgers, first published a draft of the APRA. The aim of the Act is to create a comprehensive federal level piece of privacy legislation for the US, to enable consistency and harmonisation across the existing patchwork of state laws, and to give individuals greater control of their own personal data. The APRA introduces rights for individuals residing in the US and places obligations on organisations to safeguard the privacy of those individuals. The Act is expected to pre-empt many provisions of state-level privacy legislation (with some exceptions).

The first draft was updated relatively swiftly on 21 May 2024 to include several amendments, the most substantial change merging the Children and Teens' Online Privacy Protection Act (so-called COPPA 2.0) into the Act - albeit with some omissions when compared with the COPPA 2.0 proposal that was previously before the Senate. Other topics that were amended in the new version include changes to the handling of advertisements, data minimisation, obligations placed on smaller businesses and algorithmic impact assessments, as well as greater obligations on data brokers. The updated draft proceeded to the U.S. House Committee on Energy and Commerce Subcommittee on Data, Innovation and Commerce and was approved on 23 May. It will now advance to the full committee for consideration.

The Act introduces some concepts that are similar to those set out in the EU GDPR, which has become a global benchmark for data protection regulation since 2018. It is worth noting that these provisions contain more than just a copy of the GDPR wording; many seek to incorporate best practice formed over six years of enforcement of the GDPR (and similar legislation worldwide), as well as concepts from other pieces of EU digital regulation such as the EU Digital Services Act. Arguably, some of these provisions specifically aim to fill gaps around matters such as dark patterns and data brokers, which in Europe are only specifically addressed in the European Data Protection Board's guidelines.

Given that talks around its predecessor, the Data Privacy and Protection Act, had stalled since early 2023, the APRA is a long time coming when compared with the comprehensive national privacy frameworks in other international jurisdictions. It is still early days and it remains to be seen how the Act progresses through the legislative process, with close scrutiny expected around areas such as federal pre-emption rights, private right of action, protection of children and data brokers. That said, the US initiatives at both state and federal level do indicate an intention to create more holistic privacy and data protection legislation.

For further information please refer to our more detailed blog here.

Amidst growing interest from European regulators, May saw the Dutch Data Protection Authority ("AP") publish guidelines stating that "data scraping will almost always be a violation of the General Data Protection Regulation".

The AP stated that data scraping is the automated collection and storage of information from web pages, and the guidelines are intended to apply to scraping by both private individuals and organisations. Whilst the AP differentiated between data scraping and web-crawling using a search engine, the guidelines are expected to apply to both methods of data collection - organisations must comply with the GDPR and, in particular, require a legal basis under Article 6 for processing data in this way. The AP's guidance seems to rely significantly on the view that the only likely legal basis for data scraping is "legitimate interests", and that this could only cover interests that are protected by law (so "purely commercial interests" would not suffice). Reliance on "legitimate interests" in this context is currently the subject of debate worldwide, and it is an area that the EDPB is still assessing. However, the EDPB's 23 May Report of the work undertaken by the ChatGPT Taskforce suggests "legitimate interests" might be possible and provides some useful considerations. In practice, it is also worth checking that a suitable data protection impact assessment, in particular, has been conducted to assess GDPR compliance.

The AP also debunked the "wide-spread misunderstanding" that scraping is permitted simply because information is publicly available on the open web. The regulator suggested that data scraping "almost always" involves personal data, meaning that, among other things, organisations must comply with Article 5(1)(a), which requires personal data to be processed lawfully, fairly and in a transparent manner in relation to the data subject. Individuals also ought to be informed of the "legitimate interest" basis relied on to process their personal data and be given the right to object (although query how workable this is in practice, or consider whether it is possible to rely on an exemption, such as where it "proves impossible or would involve a disproportionate effort" to do so). This recent set of guidelines follows several data protection regulators issuing a joint statement on data scraping in August 2023 (see our August Data Wrap) and the ICO issuing a related consultation at the start of the year. Despite its relatively sceptical approach, it remains to be seen whether other EU regulators align with the AP's guidance. Meanwhile, other jurisdictions – such as India – have taken an alternative path, with the scraping of publicly available personal data exempt from data protection compliance to differing degrees.

Given the increasing threat of cyberattacks, the ICO called for organisations to "boost their cyber security and protect the personal data they hold" in a blog post on 10 May 2024. The statement went on to consider the UK regulator's own trend data, which shows that over 3,000 incidents were reported to the ICO in 2023, with the finance, retail, and education sectors being most affected. In parallel, the ICO also published a report titled "Learning from the mistakes of others" which included practical advice to help organisations understand common security failures and take steps to improve their own security, preventing security breaches before they happen.

Keeping on the cyberattack theme, governments worldwide are also grappling with the conundrum of how far they should seek to legislate to ban, or at least dissuade, the payment of ransoms, which ultimately fuel widespread criminal enterprise as part of a cyberattack. On 28 May 2024, Stephen McPartland MP published the McPartland Review of Cyber Security and Economic Growth following a "call for views" on the Government's website and a series of 26 evidence sessions with industry participants including business organisations, academics, law firms, IT providers and insurers. This included specific recommendations that the Government "tighten rules" on ransom payments, by increasing reporting obligations and potentially seeking market driven "rewards" for organisations which resist extortion attempts, such as lower insurance premiums. The reported proposals, which may be included in a future public consultation, go further and include a scheme whereby victims would need to seek a licence to make any ransom payment, as well as a complete ban on ransom payments for organisations involved with critical national infrastructure. It remains to be seen whether any of these proposals will be taken forward under a potential new Government, following the recent call for a General Election.

For further information please refer to our piece "To pay or not to pay - UK poised to consult on new regulatory restrictions on ransomware payments?"

Advocate General Rantos ("AG") has provided his opinion following a referral for a preliminary ruling from the Austrian Supreme Court relating to Maximilian Schrems v Meta Platforms Ireland Ltd (Case C-446/21). In a rare break from Max Schrems' challenges regarding international data transfers, the case related to Meta's alleged use of data about Schrems' sexual orientation and, in particular, the opinion took a close look at the "data minimisation" and "purpose limitation" principles. In short, the opinion stated that a public statement by the user of a social network about their sexual orientation makes those data "manifestly public", but does not permit their processing for the purposes of targeted advertising.

The first issue referred for consideration was whether a statement made by a person about his or her sexual orientation as part of a panel discussion permits the processing of other data concerning that topic for the purposes of offering him or her targeted advertising. The AG found that the EU GDPR precludes the processing of personal data for the purpose of targeted advertising without restriction as to time or type of data. If the opinion is followed, the referring national court must assess, based among other things on the principle of proportionality, "the extent to which the data retention period and the amount of data processed are justified having regard to the legitimate aim of processing those data for the purposes of personalised advertising". This reconfirms the importance of proportionality and necessity when processing personal data and, if followed, could require much more specific drafting of privacy notices going forward.

Under the second question considered, the AG found that the fact Schrems made a statement concerning his own sexual orientation during a panel discussion that was open to the public may constitute an act by which the data subject "manifestly made public" those data for the purpose of the exemption under Article 9(2)(e) of the EU GDPR. Whilst data concerning sexual orientation fall under the special category data provisions (the processing of which is prohibited), the prohibition does not apply when the data are manifestly made public by the data subject. That said, this exemption does not in itself permit the processing of those data (or other data concerning the sexual orientation of that person) for the purposes of personalised advertising. To lawfully process the data, the relevant principles and provisions of the GDPR still need to be followed (e.g. the need for a relevant lawful basis under Article 6). This outcome reinforces the strict application of the GDPR's purpose limitation principle. Although not legally binding, AG opinions are generally followed by the European Court of Justice. Going forward it will also be interesting to see what other scenarios render special category data "manifestly made public".

As global regulators explore their own paths to AI regulation, the UK has taken a pro-innovation, "wait and see" sector-led approach. Responding to the end-April 2024 deadline set in the Government's response to the AI Regulation White Paper consultation (the "Response"), published in February 2024, the ICO – in keeping with other regulators – has doubled down on this approach in its strategic approach paper. While the ICO acknowledged that many AI risks (e.g., deepfakes) sit outside data protection, it emphasised that the focus must be on resourcing and empowering existing UK regulators within their remits.

The ICO mapped its statutory data protection principles against the five principles from the Response, highlighting their similarity and its active experience and contributions. Despite AI posing novel security risks (e.g., model inversion attacks), the ICO has been addressing these through the Network and Information Systems Regulations and institutional collaborations. Transparency and explainability extend beyond providing information about data processing in AI systems, requiring organisations to explain their "logic" when they make solely automated decisions with legal or similarly significant effects. Fairness in the data protection context is broader because it focuses not only on the distribution of outcomes amongst individuals but also on the contextual factors affecting the power dynamics between them. Accountability requirements already extend to AI systems, but the ICO is evaluating revisions for generative AI and procurement. Contestability and redress can be indirectly operationalised through information rights (e.g., access).

Under its agile approach, the ICO has published specific guidance in these areas: AI and Data Protection, Explaining Decisions Made with AI, Accountability, and its AI risk toolkit. It has used a wide array of other tools for guidance, such as the Regulatory Sandbox, Innovation Advice and Innovation Hub, and for implementation, through Information, Assessment, Enforcement and Penalty Notices. The ICO has collaborated extensively with other regulators, international partners and standard-setting bodies. This is typified by the Digital Regulation Cooperation Forum's recent release of the AI and Digital Hub, which provides free advice on complex questions crossing multiple regulators' remits.

As AI is one of its three focus areas for 2024/25, the ICO is expected to wrap up its consultation series on generative AI and update its AI and Data Protection guidance to reflect any changes following passage of the Data Protection and Digital Information Bill. It will also launch its Enterprise Data Strategy and support algorithmic transparency recording standards to "role-model responsible use of AI".

Additional policy intervention on data protection and AI can be expected globally too in the upcoming year, with the German Data Protection Authority having already issued guidance on AI applications and Large Language Models.

 

Key contacts

Miriam Everett
Partner, Global Head of Data Protection and Privacy, London

Claire Wiseman
Professional Support Lawyer, London

Alasdair McMaster
Senior Associate, London

Ankit Kapoor
Graduate Solicitor (India), London

Saara Leino
Associate (Finland) (External Secondee), London

Tommaso Bacchelli
Trainee Solicitor, London