
On 11 September 2024, the EU Commission confirmed it will launch a public consultation on further new EU SCCs. The consultation is planned for Q4 2024, with the aim of adoption in Q2 2025.

The new SCCs will cover the situation where the data importer is located in a third country but is directly subject to the EU GDPR under Article 3(2) (Territorial scope). This is a gap in the current suite of SCCs published by the Commission and some will argue the consultation is long overdue.

Originally the (then "new") SCCs were published by the European Commission in June 2021. In the accompanying FAQs which were published in March 2022, the Commission asked the question "Can these SCCs be used for data transfers to controllers or processors whose processing operations are directly subject to the GDPR?" In answer, the Commission stated that "No. They do not work…as they would duplicate and, in part, deviate from the obligations that already follow directly from the GDPR." To address the gap at the time, the FAQs went on to say that the Commission was "in the process of developing an additional set of SCCs for this scenario".

In parallel, the EDPB issued Guidelines 05/2021, which clarified the interplay between Article 3 (which addresses the extraterritorial scope of the EU GDPR) and Chapter V of the EU GDPR (which addresses international transfers of personal data from the EEA to third countries). The EDPB recognised that data transfers to third countries still trigger Chapter V of the EU GDPR, even if the data importer in the third country is already subject to the GDPR under Article 3(2). In those guidelines, the EDPB also called on the Commission to develop a new set of SCCs to address scenarios where the data importer is subject to the EU GDPR.

The Commission has not yet issued a draft of these further new SCCs and we will provide additional details when it does. In the meantime, it seems likely that the new SCCs will be a cut-down version of the current ones, likely focusing on third-country transfer risks (e.g. conflicting local law around government access to data). It also seems likely that the EDPB will produce related guidelines and that organisations will have a transition period of at least one year within which to implement the new SCCs. The further new SCCs will also complement the existing set of SCCs, which can be used for data transfers to third-country importers that are not subject to the EU GDPR. It remains to be seen whether a transfer impact assessment will still be required for this further type of SCCs as well. Further clarity will also be needed about the extent to which data importers need to be directly subject to the GDPR. As we know, the extra-territorial application of the GDPR operates on a per-processing-activity basis, rather than applying blanket-style to the legal entity as a whole. It is currently unclear how this would map onto the need for the new SCCs.

For organisations located in third countries that import personal data from the EEA, it will be important to review existing SCCs to determine which transfers must be conducted under the further new SCCs. This will depend on whether the data importer is directly subject to the EU GDPR (or not), to ensure that the correct set of SCCs is applied.

In a recent opinion, Advocate General (AG) Richard de la Tour of the European Court of Justice (ECJ) provided important guidance for companies on balancing transparency requirements and EU GDPR compliance, with the protection of trade secrets around automated decision-making processes.

The opinion relates to Case C-203/22, where an Austrian individual was denied a mobile phone contract following a fully automated credit assessment. The assessment concluded that the individual lacked the financial capacity to pay the €10 monthly fee. To understand the rationale behind the decision, the individual requested meaningful information about the logic involved in the automated processing. However, the mobile phone company refused, citing the protection of trade secrets under Directive 2016/943 in respect of their AI algorithms. This led to the case being referred to the ECJ for a preliminary ruling.

The AG’s opinion addressed two main points: (i) how much information needs to be disclosed to comply with the EU GDPR’s transparency requirements; and (ii) how to balance this with the protection afforded to companies through trade secrets. In summary, the AG opined that trade secrets cannot be used to avoid transparency obligations under the EU GDPR. However, companies are not required to disclose complex algorithms or formulas. Please refer to our blog here for further details of the practical advice in the opinion on these two points.

Although AG opinions are not binding on the ECJ, they are often adopted by the court. Therefore, this opinion offers valuable insight for those companies using automated decision-making and navigating the complex interplay with EU GDPR compliance.

On 12 September 2024, a Privacy Reform Bill (the "Bill") was introduced in the Australian Parliament, marking the next step in the ongoing reform of the Australian Privacy Act. This Bill comes almost a year after the Government announced that it ‘agreed’ or ‘agreed in-principle’ with 106 of the 116 recommended reforms in the Attorney-General's 2022 Privacy Act Review Report.

The Bill includes amendments to address most of the 25 ‘agreed’ proposals, including important topics relating to automated decision-making, overseas disclosure of personal information, data security and breaches, children’s privacy, civil penalties and enforcement powers, and a new statutory tort for serious invasions of privacy.

New offences will also be added to the Commonwealth Criminal Code in relation to doxing.

While the Bill contains some important reforms, it nonetheless leaves many of the ‘agreed in-principle’ proposals from the Review Report unaddressed. The Attorney-General has stated that his department intends to prepare draft legislation for these other important topics in the coming months, for consultation with stakeholders.

For a detailed look at the Bill and its implications please click here.

The AI training plans of big tech companies continued to make headlines this month, with X and LinkedIn suspending their use of user data to train their respective AI models, while Meta re-started its plans in light of talks with the ICO.

In early September, X (formerly Twitter) agreed to extend its suspension "indefinitely" in relation to its use of EU resident user data (e.g. personal data contained in public posts) to train its AI model, Grok. This followed the Irish Data Protection Commission's ("DPC") urgent enforcement action last month (please refer to our entry in the August Data Wrap) and the conclusion of a subsequent hearing before the Irish High Court on 4 September 2024. As mentioned in our previous Data Wrap post, this was the first time that the DPC had used its urgent enforcement process to push through a suspension of this kind. The DPC explained that it had engaged extensively with X prior to issuing the urgent enforcement action, so it is currently unclear whether the action was used as a last resort, or if the DPC had decided to take a particularly strict approach (which appears to be unprecedented and diverges from the approach adopted by the ICO in respect of Meta in the UK).

The DPC also submitted a formal request for an opinion from the European Data Protection Board on the use of personal data to train AI models, focusing in particular on "the extent to which personal data is processed at various stages of the training and operation of an AI model, including both first party and third party data and the related question of what particular considerations arise, in relation to the assessment of the legal basis being relied upon by the data controller to ground that processing". Once published, the opinion will provide helpful guidance for organisations around the extent to which they can rely on "legitimate interests" as a lawful basis for processing personal data in the context of data scraping – an area which currently remains subject to debate worldwide.

In contrast to X's "indefinite" suspension, on 13 September 2024, Meta announced it would re-start its plans to train its AI systems using public content shared by adults on Facebook and Instagram in the UK over the coming months (see Meta's press release here).

It is worth noting that Meta has "engaged positively" with the ICO and has considered the ICO's guidance in respect of the "legitimate interests" Meta can rely on as a lawful basis for its processing of first party data to train general AI models for Meta's features and experiences. In particular, Meta appears to have implemented various measures in response to concerns, including providing in-app notifications in respect of its training plans and making the user form to object to the use of data for AI training "even simpler, more prominent and easier to find". The ICO acknowledged these changes in Meta's approach in its own press release on the issue (see here), noting that it would continue to monitor the situation and clarifying that "the ICO has not provided regulatory approval for the processing and it is for Meta to ensure and demonstrate ongoing compliance".

It remains to be seen whether the ICO's dialogue-focused approach or the DPC's more enforcement-focused approach will prevail in subsequent investigations around the use of personal data to train AI systems. However, it appears that the UK may be taking a divergent approach to the EU with respect to this issue.

In particular, it is worth keeping an eye on LinkedIn's suspension of its own training plans, which was announced on 20 September 2024 (see ICO statement here). The UK regulator has confirmed that LinkedIn's suspension is pending further engagement with the ICO. It will be interesting to see whether this scenario also follows the Meta example, and whether LinkedIn will be able to re-start processing data relating to its UK users for AI training following such engagement and after implementing the regulator's guidance.

September saw the former European Central Bank chief and Italian prime minister Mario Draghi publish his report on the future of European competitiveness (the "Report"). The Report, coupled with European Commission President Ursula von der Leyen's statement on the Report, suggests a shift from the previous digital agenda around regulating "Big Tech" towards delivering on Europe's digital ambition.

In particular, the Report and the statement focus on "technology sovereignty… Boosting European cohesion and regions…Ensuring Europe can assert its interests and lead in the world". The key areas the Report highlights to boost sustainable growth include closing the innovation gap with the US and China, as well as enhancing security and reducing reliance on non-European external suppliers for critical raw materials and technology imports in particular.

To address challenges to competition, the Report presents a new industrial strategy for Europe. It identifies the underlying reasons for the EU's declining position in crucial strategic sectors and offers a range of proposals to regain competitive strength, including specific recommendations in a wide range of sectors. For further detail on each of the recommendations please refer to our competition blog here.

The Report indicates that the EU’s struggle with technology innovation is evident in its difficulty to establish globally successful platforms. Despite some Member States promoting 'sovereign cloud' solutions, the Report refers to the EU continuing to lose ground in the cloud services market to US-based companies. It also notes the EU's strong foothold in high-performance computing (HPC) and suggests leveraging this to boost AI adoption and private investment.

However, slow progress in AI development threatens the competitiveness of EU companies, and regulatory hurdles such as the EU GDPR and the AI Act (and inconsistency / overlap between the two) could further stifle innovation. To remain competitive, the Report emphasises that the EU should aim to lead in AI development across key sectors, regain control over data and sensitive cloud services, and build a robust financial and talent foundation to support innovation in computing and AI. The Report suggests several regulatory initiatives aimed at achieving sovereignty objectives, particularly for critical technologies.

For further information on these recommendations please refer to our blog here.

It will be interesting to see the development of a new regulatory landscape to support the technology sovereignty objective under the Commission's new mandate, as well as whether any new approach does, in fact, deliver on "Europe's digital ambitions".

Key contacts

Miriam Everett
Partner, Global Head of Data Protection and Privacy, London

Claire Wiseman
Professional Support Lawyer, London

Sara Lee
Associate, London

Mackenzie
Trainee Solicitor, London

Kamilia Khairul Anuar
Trainee Solicitor, London