After a series of delays, on 13 May 2021 the UK Government unveiled its proposed legal framework introducing tough new measures requiring social media companies, technology companies and others to protect online users.

The long-awaited Online Safety Bill (the "Bill") is a comprehensive package which seeks to combat illegal and harmful practices online, whilst also "ushering in a new era of accountability and protections for [freedom of expression and] democratic debate." The Government has made it clear that "what is unacceptable offline will also be unacceptable online". The proposed Bill kicked off the new parliamentary session, signalling that these issues are a high priority for the UK Government.

We are currently reviewing the detail of the Bill but have set out below some of the key takeaways in the meantime.

Key points to note:

  • As expected, the Bill takes a robust approach to combating online harms and illegal content, incorporating legal "duties of care" backed by fines of up to £18 million or 10% of annual worldwide revenue (whichever is higher) for non-compliance, alongside a range of other enforcement powers granted to Ofcom.
  • The Bill is broad in scope - applying both to (i) internet services that allow users to share user generated content and (ii) providers of search engines - and has extra-territorial reach.
  • The Bill incorporates a tiered, risk-based approach to the obligations placed on online platforms and services, with larger / higher risk organisations being subject to greater regulation. It also sets out certain exempt services falling outside the scope of the Bill, including:
    • email, SMS and MMS services (if these are the only user generated content enabled by the service); and
    • "limited functionality services" which place certain limitations on a user's ability to engage with service provider content.
  • As well as taking action in respect of illegal content, the Bill addresses lawful but harmful content too. Whilst it includes a definition of "harmful content" in an effort to provide clarity, early critics have suggested the definition is currently too vague for organisations to apply without further guidance.
  • Much of the detail of the Bill (including the extent of the duties applicable to regulated service providers) will be set out in secondary legislation and codes of practice prepared by Ofcom, none of which has yet been published. The true impact of the new legislation (including the related compliance requirements and cost implications for organisations within scope) is therefore not clear at this early stage. The Government also intends to refine its estimate of the likely cost impact through further engagement with businesses as part of the pre-legislative scrutiny process.
  • The amount of time it has taken the Government to publish the Bill reflects the unenviable challenge of seeking to "regulate the online environment": striking a balance between protecting online users and safeguarding freedom of speech and democratic debate online, whilst also supporting innovation and the digital economy. Only time will tell whether the finalised framework strikes this balance, or the extent to which it constrains innovation and investment for those within its scope.

Background and market context

Harmful and illegal content is seen as too readily available online. Voluntary industry action through existing self-regulation has proven unsuccessful in combating this issue and has struggled to keep pace with the changing online environment.

In April 2019, the UK Government published the Online Harms White Paper (the "White Paper") and launched a consultation on a new framework for online user safety. The White Paper set out the Government's plans for a new statutory duty of care requiring organisations to take more responsibility for the safety of their users. A large number of stakeholders responded during the consultation period. In light of the responses, the Government published its interim response to the consultation (the "Interim Response") in early 2020, setting out how the proposed initiative would be taken forward and indicating some change in direction from the White Paper. The Government's full response to the consultation (the "Full Response") was published on 15 December 2020 and proposed further detail around the regulatory framework, much of which has been taken forward in the Bill. See our previous HSF blogs here and here.

Publication of the Bill in the UK follows hot on the heels of the proposed European Digital Services Act ("DSA") published by the European Commission in December 2020, which proposes new rules to increase the responsibilities of providers offering online intermediary services into the EU and to reinforce oversight of online platforms' content policies. The DSA principally seeks to protect consumers and their fundamental rights online, establish a transparency and accountability framework for online platforms, and foster innovation, growth and competitiveness within the single market. For further information please refer to our blog here.

Broad scope: Who does it impact?

The Bill imposes duties of care on providers of internet services that allow users to share user-generated content (user-to-user services) and providers of search engines (search services). These duties include undertaking illegal content risk assessments and protecting the online safety of children and vulnerable users, as well as duties regarding freedom of expression.

The legislation takes a tiered, risk-based approach to online platforms and services in scope. In particular it envisages three categories of services that are subject to greater regulation (to varying degrees), with additional duties over and above those that apply more generally to all regulated service providers under the Bill.

These three categories comprise:

  • Category 1 services - user-to-user services meeting certain threshold conditions, expected to cover the "largest, most popular social media sites" and "high reach", high risk services such as Facebook;
  • Category 2A services - search services meeting certain threshold conditions; and
  • Category 2B services - user-to-user services meeting certain threshold conditions for this category.

Category 2 services are expected to cover smaller businesses, and the majority of companies are expected to fall into these categories.

The threshold conditions for each of these categories will be set out in regulations made by the Secretary of State. Ofcom will create a "register of categories" setting out services meeting the threshold conditions for each of the categories.

Regulated service providers with qualifying worldwide revenue at or above a specified threshold will also have an obligation to notify Ofcom and pay an annual fee. The qualifying threshold and level of the annual fee will be determined by Ofcom and are currently unknown.

User-generated fraud, such as romance scams and fake investment opportunities posted by users on user-to-user services, is also captured under the Bill.

Extra-territorial reach

The Bill applies to regulated services that have "links with the UK" - i.e. those services that have a significant number of users in the UK, target the UK market or can be used in the UK by individuals in circumstances where there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK. As such, it could cover both UK-based regulated service providers as well as those based outside of the UK.

It remains to be seen how the Bill will be monitored and enforced in respect of regulated service providers located overseas (and in particular whether the Bill will seek to replicate the enforcement model under the DSA, whereby extra-territorial service providers are required to appoint a legal representative within the EU to ensure effective oversight and compliance with the DSA).

Codes of practice: The devil is in the detail

The Full Response anticipated Ofcom publishing codes of practice which sit alongside the legislation and focus on the systems, processes and governance that in-scope organisations need to put in place to fulfil the new legal duties. Accompanying the Full Response were two interim codes of practice outlining the actions organisations are encouraged to take to tackle terrorism and child sexual exploitation and abuse online, with further codes of practice to be published in due course for other areas, for example in relation to the safeguarding of democracy, journalistic content, and user reporting and redress. It will be key for Ofcom to engage with a range of industry stakeholders in preparing these codes of practice, to ensure that they are proportionate and appropriate.

Lawful but harmful content: Still nebulous?

In addition to taking action in respect of illegal content, the Bill addresses lawful but harmful content as well. The Bill distinguishes between content that is "harmful to children" and that which is "harmful to adults" in terms of the duties and obligations that apply to regulated service providers. As suggested in the Full Response, the proposed legislation includes a definition of harmful content in both scenarios.

For content that is harmful to children in relation to a regulated service, this means regulated content which: (i) the Secretary of State has designated in regulations as either "primary priority" content or "priority" content which is harmful to children; or (ii) meets the conditions set out in the Bill. These conditions include where a provider has reasonable grounds to believe that the nature of the content risks directly or indirectly having a significant adverse physical or psychological impact on a child of ordinary sensibilities.

The Bill states that where harmful content would reasonably be assumed to particularly affect people with certain characteristics or belonging to a certain group, (e.g. those with disabilities), the provider should assume that the child accessing that content possesses those characteristics or belongs to that group. The Bill also provides that content may be harmful to children due to the way in which it is disseminated, even if the nature of the content is not itself harmful (e.g. repeatedly sending apparently innocuous content to a user could be bullying or intimidating).

A similar formulation is adopted for the corresponding definition for content that is "harmful to adults".

By comparison, it is worth noting that the proposed DSA is stated to apply only to illegal content. However, in reality EU and Member State law is not always unequivocal as to whether a particular act is compliant, and laws often allow for a degree of discretion on the part of the enforcement body. As a result, although lawful but harmful content is not explicitly covered by the DSA, it is possible that in practice some content which may not technically be illegal, but which may nevertheless be considered extremely harmful or dangerous, could be caught by the notice and action mechanism under the DSA. It will be interesting to see how the concept of lawful but harmful content develops as the UK and EU legal frameworks progress through their respective legislative processes.

The balancing act: freedom of expression, democratic content and journalistic content

In response to previous criticism around its potential impact on freedom of expression online, the Bill also aims to ensure that those in the United Kingdom can express themselves freely and participate in debate online. Organisations in scope will therefore be required to put in place safeguards to protect freedom of expression as part of their duties. The codes of practice are expected to go some way towards setting out these safeguards. Category 1 service providers will also be responsible for conducting and publishing updated assessments of the impact of their policies and procedures on freedom of expression, and the steps they have taken to mitigate any adverse effects.

It remains to be seen (pending the outcome of the relevant secondary legislation) how the potentially competing (or even conflicting) objectives of safeguarding freedom of expression and protecting against lawful but harmful content will be achieved in practice. This is likely to be an area of particular complexity for regulated service providers under the Bill.

In addition, Category 1 service providers will be subject to additional obligations in relation to "democratically important" content. This includes content that promotes or opposes government policy or a political party ahead of a vote in Parliament, an election or a referendum, or during campaigning. Organisations in scope cannot discriminate against any particular political viewpoint: protections for this content would need to apply uniformly, and policies setting out these protections must be clear and accessible. Regulated service providers will also need to assess the political context of content during their content moderation processes, with "democratically important" content being given a high degree of protection.

Category 1 providers will also have a statutory duty to protect UK users' access to journalistic content shared on their platforms (i.e. articles on news publishers' websites as well as any user comments on those articles). Content moderation processes will have to account for journalistic content and must not arbitrarily remove this content.

Sting in the tail: Enforcement, substantial fines and potential liability for senior managers

The Bill provides Ofcom with a suite of new enforcement powers which include the power to fine companies failing to fulfil their duty of care up to £18 million or 10% of their annual worldwide revenue, whichever is higher. Any such penalty must be an amount that Ofcom considers appropriate and proportionate to the failure in respect of which it is imposed.
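
For illustration only (this calculation does not appear in the Bill in this form), the cap operates as a simple "greater of" test. The short Python sketch below, using a hypothetical helper name and illustrative revenue figures, shows how the maximum penalty would scale with annual worldwide revenue:

```python
# Illustrative sketch only: the Bill caps fines at the greater of
# GBP 18 million and 10% of annual worldwide revenue. The helper name
# and the revenue figures below are hypothetical.

def maximum_penalty_gbp(annual_worldwide_revenue_gbp: float) -> float:
    """Return the cap on a fine; the actual penalty is whatever Ofcom
    considers appropriate and proportionate, up to this cap."""
    return max(18_000_000.0, 0.10 * annual_worldwide_revenue_gbp)

# A provider with GBP 500m worldwide revenue faces a cap of GBP 50m,
# while one with GBP 100m revenue faces the GBP 18m floor.
print(maximum_penalty_gbp(500_000_000))  # 50000000.0
print(maximum_penalty_gbp(100_000_000))  # 18000000.0
```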

Whilst significant in size, this penalty cap aligns with a growing theme across recently proposed UK and EU legislation to regulate elements of the digital economy, starting with the EU General Data Protection Regulation and most notably set out in the proposed DSA (where failure to comply would result in fines of up to 6% of the annual income or turnover of the relevant provider or platform). That said, the UK Government's robust approach appears to be reflected in the 10% figure, which is at the higher end of the scale of legislative sanctions in this area.

Alongside these fines, Ofcom will have the power to block access to sites. The Bill also contains deferred powers for Ofcom to take criminal action against named senior managers of a regulated service provider that fails to comply with an "information notice" from Ofcom (whereby Ofcom may request certain information from regulated service providers in order to exercise any of its functions or powers under the Bill). The Government has suggested that these deferred powers may be introduced at a later date if "tech firms don’t step up their efforts to improve safety" and a related review is expected to take place at least two years after the new regulatory regime is fully operational.

Exclusions and exemptions: What is out of scope?

The Bill sets out certain services that are exempt from its scope. These include email, SMS and MMS services (if these are the only user-generated content enabled by the service), one-to-one live aural communication services (i.e. where users communicate with each other in real time only through speech or sounds, if this is the only user-generated content enabled by the service), user-to-user or search services used internally within a business (such as business intranets), and certain services provided by public authorities. The Bill also exempts "limited functionality services", which only allow users to post comments or reviews on content published by the service provider, or to use like/dislike buttons, vote on or rate such content. This is likely to mean that aggregator websites (for example, in the hospitality sector) will not be covered by the scope of the Bill.

In an effort to keep the legal framework responsive to technological change, the Secretary of State can amend the list of exempt services further through subsequent regulations.

Next steps

On 13 May 2021 the Bill was published in draft form and will be subject to pre-legislative scrutiny by a joint committee of Members of Parliament in this parliamentary session, before being formally introduced to Parliament.

Key contacts

Hayley Brady, Partner, Head of Media and Digital, UK, London
James Balfour, Senior Associate, London
Claire Wiseman, Professional Support Lawyer, London
Ananya Bajpai, Associate (India), London