A recent audit by international data protection authorities has found a high prevalence of deceptive design patterns on websites and applications that negatively impact users' online privacy decisions.
The Sweep
This year's Global Privacy Enforcement Network ("GPEN") Sweep (the "Sweep") focused on the use of deceptive design patterns or 'Dark Patterns' ("DDPs") in popular websites and applications. The Sweep took place from 29 January to 2 February 2024 and a report highlighting the key findings was published on 9 July 2024. The main objective of the Sweep was for participants to replicate the consumer experience of engaging with various websites and mobile applications in order to assess whether users could: (i) make informed and free privacy choices; (ii) easily and clearly obtain privacy information; and (iii) log out of and delete an account.
What are DDPs?
DDPs are design choices that mislead or deceive users of online services; regulators often describe them as "harmful online choice architecture". From a data protection perspective, DDPs can manipulate users into making decisions that may not be in their best interests, for example by influencing them to consent to particular processing activities. In practice, these designs often lead users to provide more personal information than necessary, make privacy-protective options harder to choose (thereby making more of the users' data available), or obstruct access to privacy-related information when it is needed.
The Sweep identified multiple indicators of DDPs, including:
- Complex and confusing language – websites or apps might use overly technical or excessively long privacy policies to discourage users from exploring them or to make them harder to understand;
- Interface interference – specific design elements can be used to influence users' perception and understanding of their privacy options or to push them to choose a specific option that might not be in their best interest;
- Nagging – repeated prompts, especially through pop-ups, for users to take specific actions may undermine their privacy interests by influencing them to choose the option in the website's interest;
- Obstruction – the insertion of unnecessary, additional steps between users and their privacy-related goals may discourage them from fully enacting their privacy decisions; and
- Forced action – requiring or tricking users into providing more personal information to access a service than is necessary.
For further information, a non-exhaustive list of DDPs can be found here.
Why have they been subject to recent regulatory scrutiny?
There is currently a patchwork of legislation that governs DDPs (including consumer, advertising, online safety, AI and competition legislation). However, DDPs have increasingly been subject to scrutiny due to their manipulative nature and the potential harm they can cause to users, particularly in an increasingly digital society where privacy and consent are of paramount importance. Key concerns linked to DDPs include:
- User Manipulation: DDPs can be seen as "tricks", used by website and application operators to manipulate users into making decisions they might not otherwise make, often in the interest of the website or application operator. This could include signing up for recurring subscriptions, making unwanted purchases, or (as mentioned above) giving away more personal information than the individual intended. This manipulation can lead to a number of negative consequences, such as user frustration, a loss of trust in the platform and potentially financial loss or privacy breaches for the user.
- Privacy Concerns: Many DDPs involve the collection and use of users' personal data. For example, a DDP might result in a user unknowingly consenting to tracking cookies, third-party data sharing, or other privacy-invasive practices. This can infringe users' privacy rights and breach data protection regulations, such as the EU or UK General Data Protection Regulation ("GDPR"). These regulatory regimes require clear, informed consent for data collection and use; consent obtained using DDPs might be deemed invalid. Indeed, there has already been legal action regarding whether consent given through DDPs is validly given and complies with GDPR requirements.
- Lack of Transparency: DDPs can make it difficult for users to understand what they are agreeing to, often employing confusing language, hidden information, or misleading visuals. This lack of transparency can undermine informed consent, a key principle in both privacy and consumer protection law. DDPs can also make it hard for users to find out how to opt out or change their preferences later, further eroding user control.
- Ethical Considerations: Beyond the commercial and legal considerations, from an ethical standpoint DDPs can be viewed as deceptive and unfair, exploiting cognitive biases and disproportionately affecting certain populations, such as children, the elderly, or those with less digital literacy. Under the GDPR, the processing of personal data relating to children under the age of 16, or the inability to erase personal data once processed, may also raise compliance considerations. This raises concerns about digital equity and the ethical responsibilities of organisations.
DDP action examples
Given these concerns, DDPs are attracting increasing regulatory attention, with authorities calling for stricter rules to regulate and prevent their use, and with growing research and advocacy aimed at raising awareness of DDPs and their impact. Some examples of regulatory scrutiny include:
- 'X' blue check marks: The European Commission issued preliminary findings against X, under the Digital Services Act, accusing it of using DDPs and restricting access to required data. The European Commission identified X's blue checkmark system as a DDP, stating that it deceives users into believing the accounts are verified or authenticated. The Commission also found shortcomings in X's advertising repository, which is intended to maintain records of all paid advertisements, stating that it is not functional, lacks necessary information, and is technically deficient. X was also found to have failed in its obligation to supply data to third-party researchers. These findings, the first under the Digital Services Act, are considered a "milestone" by the European Commission. Further information can be found here.
- LinkedIn 'friend spam': LinkedIn was ordered to pay $13 million in a 2015 settlement due to its 'Add connections' feature, which used a 'friend spam' DDP to harvest contacts from email accounts and send messages appearing to come from the user. The lawsuit resulted in eligible LinkedIn members receiving about $10 each in compensation and caused significant reputational damage to the company. As part of the settlement, LinkedIn agreed to implement new functionalities to allow users to stop these emails. Further information can be found here.
- Noom's auto-renewing traps: Noom, a weight-loss program, agreed to a $62 million settlement following allegations of deceptive business practices. The company was accused of luring customers with "risk-free" trial periods, only to trap them in expensive, auto-renewing contracts that were difficult to cancel. Noom's practices included offering low-cost or free trials, activating an auto-renewal program without explicit customer consent, and charging non-refundable membership fees, sometimes as much as $199. The settlement includes a $56 million cash payment and $6 million in subscription credits. Further information can be found here.
GPEN Sweep and next steps
The Sweep involved 26 privacy enforcement authorities from five continents (amongst others, the UK ICO, the French CNIL and the Italian GPDP) and examined over 1,000 websites and applications. The Sweep was coordinated in conjunction with the International Consumer Protection and Enforcement Network, given the relevance of DDPs to both privacy and consumer protection. The Sweep discovered DDPs in 97% of the websites and applications examined, indicating that users frequently encounter at least one DDP when trying to make privacy-protective decisions or access privacy-related information.
The most prevalent DDP was the use of complex and confusing language in privacy notices, with 89% of the reviewed notices being excessively long (over 3,000 words) or containing technical language that made them hard to understand.
The Sweep aims to promote compliance with privacy and data protection legislation and cooperation between global privacy enforcement authorities. In its findings, the Sweep recommended that organisations improve platform design to enable users to better understand and control their personal data use. Good privacy design patterns include defaulting to the most privacy-protective settings, using neutral language, reducing navigation clicks for privacy choices, and providing just-in-time consent options. Overall, the Sweep recommended that organisations design their platforms to provide users with the ability to make informed privacy decisions, implement privacy-friendly design practices, and build consumer trust.
The Sweep follows a similar study conducted by the ICO and CMA last year, which resulted in a joint paper published in August 2023 in which the regulators called on businesses to "stop using harmful website designs that can trick consumers into giving up more of their personal data than they would like." Stephen Almond, Executive Director of Regulatory Risk at the ICO, stated: "Businesses should take note that if they deliberately and persistently choose to design their websites in an unfair and dishonest way, the ICO will not hesitate to take necessary enforcement action". In March 2022, the EDPB also published Guidelines on dark patterns in social media platform interfaces.
Whilst the Sweep is not a formal investigation, the concerns it has raised may be used to inform future enforcement action and to support targeted education and outreach to organisations.