Introduction
Over a year has passed since the Online Safety Act ("OSA") received Royal Assent in October 2023, and the online safety regulator, Ofcom, continues to press ahead with its phased plan for implementation.
2025 will be a big year for the OSA, with the Illegal Harms codes of practice due to become enforceable from March, age assurance duties already applying to some services, and further codes of practice and guidance expected on topics such as the protection of children. Service providers will be watching developments carefully: the consequences of non-compliance with the OSA could be very significant - the OSA provides for fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater), and corporate officers could face criminal liability for certain breaches.
One place service providers can look to ensure they are well prepared for the changes ahead is the European Union, where the EU's Digital Services Act ("DSA") has been enforceable for some time. The European Commission (the "Commission") has already commenced a number of investigations into alleged breaches of the DSA, and some service providers have challenged Commission decisions. While there are undoubtedly key differences between the OSA and the DSA, including the approach to categorising service providers and the types of content covered, there is also considerable overlap. The two Acts share a common goal - making the internet safer - and they seek to achieve it through similar means: by imposing obligations on service providers to mitigate and manage the risks of harm arising from illegal content and activity on their services, including through risk assessments and mechanisms for the reporting and removal of illegal content. Service providers can therefore look to proceedings under the DSA for a sense of the kinds of disputes we may see once the OSA starts to take effect.
Enforcement proceedings
Proceedings commenced by the Commission under the DSA to date include:
- Proceedings against X, initiated in December 2023, concerning compliance with obligations including risk assessment and mitigation measures to combat the dissemination of illegal content, the mechanisms for service users to flag illegal content, and the reliability of so-called "verified" accounts (marked on the platform with a blue tick). In July 2024, the Commission announced preliminary findings of non-compliance against X, indicating that "verified" accounts operated in a way that misled users: because anyone could subscribe for "verified" status, the blue tick interfered with users' ability to make free and informed decisions about the authenticity of the accounts and content they were interacting with. The Commission also considered that X had failed to comply with other DSA requirements concerning advertising transparency and researchers' access to data.
- Two separate sets of formal proceedings against Meta. The first, initiated in April 2024, concerns Facebook and Instagram policies and practices relating to allegations of deceptive advertising and disinformation, the treatment of political content on the services, the non-availability of an effective third-party real-time civic discourse and election-monitoring tool, and the mechanisms for flagging illegal content. The second, opened in May 2024, relates to the protection of children, including allegations that the platforms' systems and algorithms could stimulate behavioural addictions in children, as well as concerns regarding the effectiveness of the age-assurance methods put in place to prevent children from accessing inappropriate content. Both sets of proceedings against Meta remain at a preliminary stage.
We can expect to see similar enforcement proceedings arise under the OSA. For example, the OSA also obliges service providers to use age verification tools as part of the duties relating to the protection of children, an area of particular priority for Ofcom. Ofcom has repeatedly and publicly stressed that it will not hesitate to take swift enforcement action once the duties in the OSA become enforceable.
However, while both Acts require significant changes to the design and operation of service providers' systems and processes, the scope of the risks that service providers must address differs between the two.
The DSA has broader coverage than the OSA in that it extends beyond illegal content to 'systemic risks', including negative effects on fundamental rights (such as freedom of speech, privacy and non-discrimination), civic discourse and electoral processes. The OSA, by contrast, focuses on protecting against illegal content and activity, but it also requires companies to prevent children from accessing content that is harmful but not necessarily illegal - a novel requirement with no equivalent in the DSA. The DSA obliges service providers to put in place proportionate measures to ensure a high level of privacy, safety and security for minors, but the OSA goes a step further by placing a duty on service providers to use proportionate systems and processes designed to prevent children from encountering certain types of harmful content.
Similarly, the DSA generally does not hold online platforms accountable for illegal content posted by their users, provided they act promptly to remove it upon becoming aware of its presence. By contrast, the OSA places a positive statutory duty on companies to take proactive steps to protect their users from harm - a significant obligation that goes beyond the DSA's requirements. Since the OSA places more onerous obligations on service providers in certain areas, this may well result in more enforcement proceedings arising under the OSA.
Challenges by service providers
Both Acts impose tiered obligations on service providers but adopt different approaches to categorisation. The OSA will categorise service providers based on size and functionality, whereas the DSA categorises service providers based on the size of their operation alone. The DSA accordingly imposes its most stringent obligations on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which have at least 45 million average monthly active users or recipients in the EU, as compared with the 34 million user threshold for Category 1 services in the secondary legislation currently awaiting Parliamentary approval under the OSA. Although there have been attempts at international regulatory co-ordination, this difference in approach means that service providers will not necessarily fall into equivalent categories under the two regimes.
We have already seen a number of challenges mounted by service providers against the designation of their services as VLOPs under the DSA. For example, both Zalando, an online fashion retailer, and Amazon are challenging their designations as VLOPs on the basis that their businesses fall outside the scope of the definition of a VLOP. Both emphasise that their business models are predominantly retail in nature - in particular, selling direct to the customer rather than acting as an intermediary for third-party traders. Both challenges also allege discriminatory or unequal treatment in their designation as VLOPs as compared with other platforms or retailers.
Category 1 providers, who are subject to the most onerous obligations under the OSA, will be designated on the basis of particular features of their systems, not just size. This represents a more complex and nuanced approach to categorisation than the DSA's bright-line rule based on monthly user numbers. We could therefore see early-stage challenges to the secondary legislation on categorisation if organisations consider it unjustified or disproportionate, even before individual categorisation decisions are made. Services with a fundamental objection to the methodology or approach set out in that secondary legislation should consider their position at that initial stage.
Formal designation of in-scope services will also be a key point at which businesses should consider whether they are content with their categorisation or have grounds to challenge it, noting that significantly more stringent requirements will apply to services in a higher category.
Any such challenges by service providers under the OSA will need to be brought on judicial review principles in the Upper Tribunal under section 167 OSA. Examples of the types of arguments that could be brought include:
- procedural unfairness in the designation process;
- unreasonableness, e.g. if the service provider can demonstrate that Ofcom's decision to designate it as a Category 1 provider was so unreasonable that no reasonable authority could have come to it, that relevant considerations were not taken into account, or that sufficient inquiries were not undertaken;
- illegality, e.g. if Ofcom lacked the legal power under the OSA or any subsequent secondary legislation to make the decision it did, such as by imposing a categorisation on potentially out-of-scope service providers.
Appeals against other Ofcom decisions under the OSA, such as certain enforcement decisions, must also be brought on judicial review grounds (section 168 OSA).
In its legal proceedings contesting its designation as a VLOP, Amazon argues that the designation violates its "fundamental rights". Service providers affected by the OSA might similarly seek to challenge Ofcom decisions on the basis that they disproportionately interfere with their rights under the Human Rights Act 1998, for example the Article 1, Protocol 1 right to peaceful enjoyment of possessions (i.e. property rights) or the Article 10 right to freedom of expression. Indeed, certain human rights principles are specifically imported into the OSA, such as proportionality and the importance of protecting users' freedom of expression and privacy, which may strengthen challenges of this nature.
Conclusion
While the OSA and the DSA are distinct pieces of legislation, the types of proceedings we have seen under the DSA provide valuable insights into the potential challenges that might be brought by affected businesses in the UK under the OSA.
In addition, the OSA goes further than the DSA in some respects and is therefore likely to give rise to novel legal issues in any enforcement activity, for example, relating to the requirements around preventing children from accessing content that is harmful but not necessarily illegal. Given the significant public interest in this area and the resultant pressure on Ofcom to be seen to enforce the OSA aggressively, businesses should expect a tough approach from Ofcom as soon as the relevant duties become enforceable. Ofcom has made it clear that it wants service providers to be taking action now in preparation, so services should not expect a significant grace period.
As in the EU, given the significant impact on in-scope services and the additional regulatory burdens imposed on them, we expect to see service providers challenging Ofcom's decisions under the OSA, with powerful grounds of challenge, such as human rights and fairness, available to them.
As the digital landscape continues to evolve, it will be crucial for businesses to stay abreast of developments, to comply so as to avoid potential enforcement action, and to consider all their options promptly if they are adversely affected by particular decisions.