In the wake of the UK government's recently published action plan to "turbocharge" AI, questions are being asked about how important data protection is to the responsible use of AI by consumers.
Responding to the plan, the UK's data protection regulator said that "AI has huge potential to transform businesses and public services, which is why it is a priority area". However, the regulator also noted that "data protection is essential to realising this opportunity and ensuring that the public can have trust in AI".
While 2018 saw the EU GDPR put data protection regulation on the map, the Information Commissioner's Office response suggests data protection is not just a regulatory compliance exercise, but a significant driver of consumer behaviour and the key to unlocking the potential of AI.
In fact, in December 2024, in response to its own consultation series on data protection in generative AI, the ICO found "a serious lack of transparency, especially in relation to training data within the industry is negatively impacting the public's trust in AI".
This finding goes beyond the confines of stakeholder responses to the ICO's consultation series. Our own research, conducted to coincide with International Data Privacy Day, recently revealed that 65% of consumers were uncomfortable with the way in which their personal data might be used to train AI systems.1 Although 6% of those surveyed expressed some comfort if organisations are transparent about how they train AI systems, a significant majority (93%) want more done to better inform people about how their data is being used.
Combined, these results suggest consumers are increasingly mindful of how their personal data is being used. The data also highlights just how pivotal a role data transparency, one of the core tenets of data protection regulation, plays in consumer perception and brand trust – including in the context of AI. It is a crucial enabler of consumer engagement and credibility in the digital environment. This data protection principle is of particular relevance to emerging technology given the 'black box' nature of some AI models. The results also establish a call to action echoed by the ICO, which recently told generative AI developers "it's time to tell people how you're using their information." That said, fulfilling data transparency requirements is a balancing act and is not without its challenges.
The practice of so-called "data scraping" is also firmly on the radar of domestic and international regulators alike. 2024 saw the glare of regulatory scrutiny levelled at the use of personal data to train AI models, with investigations focussing on the developers of large language models in particular. This led to various supervisory authorities setting out how best to navigate the data protection landscape as a key component of the existing regulatory patchwork governing the use of AI, urging online services to protect individuals from unlawful data scraping. In an effort to provide further regulatory certainty in the UK in the absence of AI-specific legislation, only last week the ICO announced plans to produce "a single set of rules for those developing or using AI products", which it suggests should form a statutory code of practice on AI.
With over half (56%) of consumers wanting stricter data protection regulations, it is clear that data protection compliance holds significant value for a more privacy-aware customer base. It elevates compliance from a challenging, evolving international regulatory issue to a brand-strengthening and investor-enticing asset in its own right. Compliance is now a strategic imperative that can significantly influence operational practice and financial performance, particularly for the innovative business models of technology companies. It is an enabler of responsible, sustainable innovation and data-driven growth, and a differentiator against market competitors. In the words of the ICO, "there is no excuse for generative AI developers not to embed data protection by design into products from the start".
1. The research was undertaken to coincide with International Data Privacy Day and explores the views of approximately 5,000 consumers aged 18 and over across the UK, Australia, Singapore, Germany, France, Spain, Italy and Indonesia.
The articles published on this website, current at the dates of publication set out above, are for reference purposes only. They do not constitute legal advice and should not be relied upon as such. Specific legal advice about your circumstances should always be sought separately before taking any action.