The UK's latest plan to reform the data protection and privacy landscape arrived at the end of October in the form of the new "Data (Use and Access) Bill" (or "DUA" (Lipa) as it will affectionately be called). The Bill, formerly known as the "Digital Information and Smart Data Bill" (in the King's Speech) and the "Data Protection and Digital Information Bill" (DPDI) before that, is Labour's first bite at the cherry since the Conservatives' proposal fell away pre-election.
The Bill is not an all-singing, all-dancing reform of UK data protection law (not that anyone was expecting that). Whilst some proposals impose greater burdens on organisations (such as the more robust sanctions regime for direct marketing and cookies), others require relatively minor adjustments to their compliance programmes. As for the potential for the Bill to ease the compliance burden, it remains to be seen whether the UK will, in practice, become the hotly anticipated sandbox for innovation.
However, at first glance you would be forgiven for a feeling of déjà vu. The Bill bears a striking resemblance in parts to the Conservatives' incarnation before it, retaining perhaps more of the DPDI than some may have expected. The UK's so-called "reform" appears to be more of an evolution than a revolution.
That said, a number of the DPDI proposals have fallen by the wayside, particularly those provisions around accountability and governance that were originally intended to streamline organisations' compliance with the UK GDPR. It seems likely that the Government had one eye on the European Commission's upcoming review of the UK's adequacy status next summer when deciding which proposals to retain or drop.
Will it be a case of third time lucky for this version of the data reform bill? It seems likely that UK data reform will finally take place. Already, the momentum from the Bill's First Reading looks to be sustained for the Second Reading in the House of Lords on 19 November 2024, and a relatively smooth trajectory is expected.
For our full analysis of the Bill and its implications for business, please refer to our piece here.
As the dust settles on the US 2024 Presidential Election, what does a Trump presidency mean for US policy on data protection and privacy?
Privacy has not been an issue that has wildly divided Republicans and Democrats to date. With key topics such as abortion and immigration taking the limelight, it is unsurprising that neither candidate's campaign touched specifically on this area. So, what does Trump's previous tenure from 2017 to 2021, and his subsequent stance to date, tell us about the likely road ahead?
Will we see further challenge to the EU-US Data Privacy Framework? The EU-US Privacy Shield, announced in 2016 to replace the Safe Harbour Framework struck down in 2015, was itself invalidated by the CJEU in 2020, during Trump's first term. Whilst the latest mechanism allowing transatlantic transfers of personal data from the EU (the EU-US Data Privacy Framework) recently received a positive first review from the European Commission, will ongoing concerns in Europe over US surveillance practices destabilise the framework? Trump's stance on surveillance is a little unpredictable, but could his "America First" rhetoric strain the framework? And let's not forget we have also had a so-called "Schrems III" challenge waiting in the wings since the framework's inception.
Will Trump follow through with his de-regulation promises? Setting the stage for a change to federal AI policy, on 6 November Trump reiterated plans to swiftly dismantle Biden's AI Executive Order. Could de-regulation at the federal level give rise to an increase in state-level regulatory activity, especially in states with Democratic attorneys general, leading to an even more fragmented US legislative patchwork around AI? Could the move exacerbate international fragmentation further? De-regulation also runs the risk of fostering uncertainty and reducing confidence in the US AI market.
…and what does this mean for federal privacy regulation? Let's not forget that Biden's AI Executive Order also called on Congress to "pass bipartisan data privacy legislation to protect all Americans" and prioritise children. Whilst the order didn't set out a legislative process for doing so, if this mandate fell away, what would that mean for federal privacy efforts? Will we continue to see FTC enforcement as the US tool of choice to centrally regulate privacy, pending a comprehensive federal law? Will we see Democratic state attorneys general escalating their privacy enforcement efforts as well? Only time will tell.
Our very own Global Head of Data and Privacy, Miriam Everett, presented a keynote speech on the "Regulation of Data and AI – The Evolving Global Landscape" at City & Financial Global's "Data, AI and Future of Financial Services" summit on 22 October 2024.
Following a presentation from UK Information Commissioner, John Edwards (who commented that "data protection law is not an obstacle, it is enabling"), Miriam outlined how the "global AI regulatory landscape appears to be evolving in an entirely different and even more challenging direction" than the existing data protection landscape. For more of Miriam's insights on the topic please refer to her follow up piece in Computer Weekly "AI Governance: Mapping the road ahead".
Miriam also chaired a lively panel on the UK's international data transfer strategy, at which panellists discussed their frustrations with the current compliance burden for multi-jurisdictional organisations: in particular, conducting transfer impact assessments on third countries (which can be the "trickiest part"), the complexity of addressing compliance in upwards of 200 countries (with potentially diverging regulatory requirements), and the need to factor in other data-related regulatory requirements given that personal and non-personal data are typically stored together (for example, data localisation measures in respect of KYC financial services requirements). However, a representative from the Department for Science, Innovation and Technology did confirm that the UK is trying to be "as pragmatic as possible" in its approach to international data transfers (as already shown by its use of the International Data Transfer Agreement and the EU Addendum to assist interoperability).
A recording of the summit is available on request from City & Financial Global.
On 7 October 2024, the European Data Protection Board ("EDPB") adopted Opinion 22/2024 (the "Opinion"), addressing key questions principally around controllers' duties under Article 28 of the EU GDPR when relying on processors and sub-processors (see here).
The Opinion provides insights into the EDPB's relatively high expectations for controllers in managing processors and sub-processors along the full length of the supply chain – with some requirements absolute and others commensurate to the risk of the data processing activity down the supply chain. For example, controllers must at all times have readily available information on the identity (i.e. name, address, contact person) of all processors, sub-processors etc. as well as a description of the processing (including a clear "delineation of responsibilities"), so that they can best fulfil their obligations under Article 28. This requirement applies regardless of the risk associated with the processing activity.
The EDPB reiterates its Guidelines 07/2020 (on the concepts of controllers and processors in the GDPR) (the "2020 Guidelines") in a number of places as the rationale for some of the statements in its Opinion. Whilst some may regard the Opinion as simply reflecting current "good" or "best" market practice, other stakeholders have criticised it, stating that "imposing such an end-to-end requirement would overburden businesses" and would be "unrealistic" – particularly for long, complex data processing supply chains, for example where cloud service providers or entities located in third countries are involved.
We may see both controllers and processors revisit their existing data processing / sub-processing arrangements in light of the transparency and oversight expectations highlighted by the Opinion. Although the ICO is not bound by EDPB opinions, it remains to be seen whether the UK data protection authority will consider this Opinion when interpreting (and enforcing) the same provisions under the UK GDPR.
The Opinion was prompted by a request from the Danish Supervisory Authority and aims to ensure consistent application of the EU GDPR by the national supervisory authorities across the EEA.
For further information on the Opinion and practical guidance please see our piece here.
On 4 October 2024, the CJEU released its judgment in KNLTB v AP (C-621/22) ("October Judgment"), which clarified that a commercial interest of a data controller may be relied on as a legitimate interest under Article 6(1)(f) EU GDPR, subject to certain conditions around necessity and data minimisation, data subject rights and the balancing test. This is contrary to the strict position of the Dutch Data Protection Authority ("AP" or "Dutch DPA") in its related guidance from earlier in the year (see our previous April/May Data Wrap here).
In making its judgment, the CJEU referred to the three-pronged test for legitimate interests as set out in Meta Platforms (C-252/21) and SCHUFA Holding (C-26/22 and C-64/22), which states that for a legitimate interest to arise:
- there must be a pursuit of a legitimate interest by the data controller or a third party;
- the processing of personal data for the purposes of the legitimate interest pursued must be necessary; and
- the interests or fundamental rights and freedoms of the data subject concerned must not outweigh the legitimate interest of the controller.
The CJEU also held that the meaning of "legitimate interest" in Article 6(1)(f), EU GDPR did not need to be "limited to interests enshrined in and determined by law" (though it does need to be lawful), and that Recital 47 of the GDPR expressly stated that "the processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest".
Ultimately, it will be for the referring court to consider the facts of the case and determine whether the above conditions are met. As such, the assessment as to whether a commercial interest can amount to a legitimate interest should be conducted on a "case-by-case" basis. For further detail please refer to our piece here.
Shortly after the KNLTB v AP decision (see above), on 8 October 2024, the European Data Protection Board ("EDPB") launched a consultation on draft Guidelines on legitimate interests, building on the previous WP29 Opinion ("Draft Guidelines"). The Draft Guidelines broadly mirror the principles in the KNLTB v AP decision, including the three-pronged test for legitimate interests and the principle that a commercial interest should not be categorically excluded from the legitimate interest ground.
However, the EDPB also stressed that Article 6(1)(f) of the EU GDPR should not be considered a legal basis "by default" and should not be considered as an "open door" to legitimise all data processing activities. On the contrary, a legitimate interest should be interpreted restrictively and used as a "last resort". The EDPB also encouraged controllers to "perform a careful assessment of the planned processing and follow a specific methodology".
The Draft Guidelines provide further clarity around how this assessment should be carried out in practice as well as worked examples.
Interestingly, the Draft Guidelines also set out a number of elements that data controllers should consider when assessing whether they can rely on legitimate interests where there is an automated decision-making ("ADM") aspect to the processing (e.g. level of detail of the profile, impact of the profiling, possible future combinations etc). Although the Draft Guidelines do not comment on whether data controllers may rely on legitimate interests in relation to web-scraping for use in AI systems (including as training data), the EDPB has not ruled it out. This aligns with its 23 May Report of work undertaken by the ChatGPT Taskforce which suggests "legitimate interests" might be possible as a lawful basis and provides some useful considerations. For now, we will have to wait until the EDPB's 'Guidelines on generative AI – data scraping' are published later this year (as hinted in the EDPB's Work Programme 2024-2025).
The consultation on the Draft Guidelines closes on 20 November 2024, following which the EDPB will publish the final version of the guidelines.
Also on 4 October 2024, the CJEU handed down a key judgment in the case of Maximilian Schrems v Meta Platforms Ireland Ltd (Case C-446/21). Privacy activist Maximilian Schrems made complaints to the Austrian courts in 2020 that personal data regarding his sexual orientation (which constitutes special category data under the EU GDPR) had been misused by Meta's Facebook to target him with personalised adverts without his consent.
The CJEU's decision was broadly in line with the previous AG Opinion, after the Austrian Supreme Court referred the case to the CJEU (see our comments in our previous April Data Wrap here). As per the AG Opinion, the CJEU held that:
- the data minimisation principle (under Article 5(1)(c), EU GDPR) precludes the processing of personal data for the purposes of targeted advertising without any restrictions as to time or type of data. This principle was stated to apply both to personal data obtained from a data subject and to that obtained from third parties and collected outside the platform, where the data is aggregated, analysed and processed for the purpose of targeted advertising. In this case, the CJEU found Meta's collection of personal data relating to users' activities both on and off the platform was "particularly extensive"; and
- the purpose limitation principle (under Article 5(1)(b), EU GDPR) still applies where special category information has been "manifestly made public" under the Article 9(2)(e) exemption in the EU GDPR (including where a data subject makes a statement about their sexual orientation in a public forum). The CJEU held that making such information public means that the processing of the data for the purposes of personalised advertising is not automatically permitted and that Meta would still need to show that it has lawfully processed such data based on one of the lawful bases under Article 6.
In light of the above principles, the Austrian courts will now need to assess, based on the facts, whether the data retention periods imposed, and amount of data processed, by Meta were proportionate to the legitimate aim of processing such data for the purposes of personalised advertising.
On 7 October 2024, the European Data Protection Board ("EDPB") adopted the final version of Guidelines 2/2023, which aim to provide a technical analysis of the scope of Article 5(3) of the ePrivacy Directive ("ePD") ("Guidelines"). Article 5(3), also known as the cookie rule, only allows an organisation "to store information or to gain access to information stored in the terminal equipment of a subscriber or user" subject to subscriber or user consent (unless at least one of two limited exemptions apply).
The Guidelines seek to clarify what is covered by the quoted wording above, beyond the well-established tracking technology of cookies. In doing so, the EDPB has taken a relatively broad interpretation of the scope, perhaps to take account of technological developments since the inception of the ePD. In particular, the scope covers:
- any "information" on the user or subscriber's terminal equipment – which is defined broadly to include more than just 'personal data'. It is also irrelevant from where the information is sourced (e.g. whether it is placed on a device by the user, embedded by the hardware manufacturer, or collated from sensors on the device);
- "terminal equipment" which refers to any device that is "connectable" to a public communications network (whether or not the device is in fact connected), such as a computer, phone, laptop, tablet, IoT device etc; and
- "storage or gaining access" in respect of an individual's device, which is broadly interpreted too, such that storage and access do not need to occur within the same communication or be performed by the same party. By way of example, the Guidelines clarify that this can include use of a software development kit (SDK) (e.g. if an organisation sends, or instructs an operator to send, information about a user's device or stores information on that user's device).
The Guidelines clarify the application of Article 5(3) to new emerging tracking technologies for gathering information, such as customised URLs (which tell an operator where visitors come from), pixel tracking (where almost invisible images are triggered when a user visits a webpage or reads an email), IP-based tracking, IoT reporting, and unique identifiers. In this way the Guidelines confirm to organisations that the cookie rule extends beyond just cookies.
Whilst EDPB guidelines are no longer binding in the UK, the Information Commissioner’s Office has suggested they may still offer useful guidance for UK organisations. It remains to be seen whether the ICO will be influenced by this broad interpretation in the Guidelines in respect of the Privacy and Electronic Communications Regulations 2003 (which originally implemented the ePD in the UK).
On 22 October 2024, the Irish Data Protection Commission ("DPC") adopted a final decision and fined LinkedIn Ireland €310 million for breaches of the EU GDPR related to the processing of personal data for behavioural analysis and targeted advertising of users who had created LinkedIn profiles (see here). This decision followed an inquiry initiated by a complaint from the French non-profit organisation, La Quadrature Du Net.
The DPC’s investigation revealed that LinkedIn did not have a lawful basis for processing personal data relating to its members for the purpose of behavioural analysis and targeted advertising. Specifically, LinkedIn: (1) failed to obtain valid consent that was freely given, sufficiently informed, specific, or unambiguous; (2) could not rely on legitimate interests as LinkedIn's interests were overridden by the interests and fundamental rights and freedoms of data subjects; and (3) did not meet the requirements for contractual necessity.
Accordingly, LinkedIn contravened Article 6, EU GDPR and Article 5(1)(a), EU GDPR in so far as it requires the processing of personal data to be lawful. In addition, LinkedIn was found to be in breach of transparency requirements (Articles 13(1)(c) and 14(1)(c), EU GDPR) for not adequately informing users about its data processing practices, and the principle of fairness (Article 5(1)(a), EU GDPR).
As a result, the DPC issued a reprimand, ordering LinkedIn to "bring its data processing into compliance with the EU GDPR", and imposed administrative fines totalling €310 million.
Unlawful data scraping and the training of AI systems using personal data are top priorities for global data protection authorities. Each month our Data Wrap features another big tech organisation needing to pause or re-calibrate the processing of personal data from its platform to train its AI systems (for example, see our October Data Wrap here in respect of X, Meta and LinkedIn).
On 28 October 2024, DPAs from 16 jurisdictions, including Australia, Canada, China, Spain, and the UK, issued a joint statement on data scraping ("Joint Statement") (see here and refer to the ICO's press release here). The Joint Statement urges social media companies to protect individuals from unlawful data scraping.
Key expectations set out in the Joint Statement include:
- Compliance in LLM Development: Social media companies and operators of websites that host publicly accessible personal data have an obligation to protect such data on their platforms from data scraping that breaches data protection and privacy laws. In particular, compliance is essential when using personal information to develop AI Large Language Models ("LLMs").
- Safeguarding Measures: Organisations must deploy and regularly update a combination of safeguarding measures to keep pace with advances in scraping techniques and technologies.
- Lawful Data Scraping: Any permissible data scraping for commercial or socially beneficial purposes must be conducted lawfully and under strict contractual terms. Whilst those contractual terms cannot, in and of themselves, render such scraping lawful, they can be an important safeguard.
- Mass data scraping incidents: Mass scraping that harvests personal information can constitute reportable data breaches in many jurisdictions.
This Joint Statement builds on the initial joint statement published on 24 August 2023 (see here) and follows a year of engagement with industry stakeholders, including the parent companies of YouTube, TikTok, Instagram, Threads, Facebook, LinkedIn, Weibo, and X (formerly known as Twitter).
This constructive dialogue between authorities and industry representatives has enabled both sides to further understand and examine practical issues related to data scraping. Industry cooperation and openness in these discussions also enabled the DPAs to develop and share their expectations without the need for formal enforcement action.
Key contacts
Mackenzie Zhang
Trainee Solicitor, London
Disclaimer
The articles published on this website, current at the dates of publication set out above, are for reference purposes only. They do not constitute legal advice and should not be relied upon as such. Specific legal advice about your specific circumstances should always be sought separately before taking any action.