As part of Ofcom's phased approach to implementation of the Online Safety Act 2023 (OSA) and its accompanying roadmap, the duties relating to illegal content and associated harms are being prioritised as part of Phase 1.

In this second chapter of our 'Your questions answered' series relating to the OSA, we take a look at:

  • how the OSA defines illegal content;
  • the key OSA duties relating to illegal content; and
  • how these duties vary depending on the size and risk-profile of the regulated provider.

For the purposes of this chapter and the remaining chapters in this series, we have used the term 'regulated provider' to refer collectively to all online service providers within the scope of the OSA. See Chapter 1 for more information on which online services are caught by the OSA.

The information set out in this Chapter 2 is primarily based on the OSA together with the draft Illegal Content Codes of Practice and accompanying guidance published on 9 November 2023 as part of Ofcom's consultation on 'Protecting people from illegal harms online' (referred to in the remainder of this chapter as the 'illegal harms consultation').

At the time of writing, Ofcom was expecting to finalise the Illegal Content Codes of Practice during Q4 of 2024 with a view to the codes coming into force by the end of Q1 2025. It remains to be seen whether these timelines will slip given the change of government over the summer.

What is 'illegal content'?

The term 'illegal content' is defined under section 59 of the OSA.

Content (including words, images, speech or sounds) will constitute 'illegal content' if the use, possession, viewing, accessing, publication or dissemination of such content constitutes a 'relevant offence'.

'Relevant offences' are broken down into 'priority offences' and 'other' non-priority offences (note that the inclusion and categorisation of these offences within the OSA has been decided by Parliament rather than Ofcom).

Priority offences

At a high-level, the priority offences fall into the following categories:

  • terrorism (including offences relating to 'proscribed organisations', encouraging terrorism, disseminating terrorist materials etc.)
  • threats, abuse, harassment and hate
  • image-based sexual offences (including possession of extreme pornography and non-consensual disclosure of intimate images)
  • child sexual exploitation and abuse (CSEA) (including creation, possession and distribution of CSEA content as well as 'arranging' or 'encouraging' offences which could take place between adults and children)
  • sexual exploitation of adults (including inciting prostitution or controlling a prostitute for gain)
  • unlawful immigration and human trafficking (including offences relating to facilitation of unlawful immigration)
  • fraud and financial offences (including fraud by false representation and offences related to articles for use in fraud)
  • assisting or encouraging suicide
  • buying and selling of controlled articles (including unlawful supply of drugs and weapons as well as offences relating to 3D printing instructions for guns and gun parts)
  • false information offences (including foreign state-backed information / disinformation operations and other malign activities)

Non-priority ('other') offences

Non-priority offences essentially cover all other offences (whether under the OSA or other legislation) not covered by the priority offences listed above, where the victim or intended victim of the offence is an individual (or individuals).

The following offences are explicitly excluded from non-priority offences (and therefore from 'relevant offences' for the purposes of the 'illegal content' definition):

  • offences relating to intellectual property, product safety / quality and the performance of a service by a person not qualified to perform it; and
  • consumer protection from unfair trading offences under Chapter 1 of Part 4 of the Digital Markets, Competition and Consumers Act 2024.

Note that the priority and non-priority offences referred to above also include related 'inchoate' offences (i.e. offences resulting from a person assisting, encouraging, attempting or conspiring to commit one of the primary offences referred to above).

Who decides whether content constitutes a 'relevant offence' ('illegal content judgments' and the 'reasonable grounds' test)?

The short answer is that (for the purposes of compliance with the OSA) regulated providers are responsible for making these decisions. However, both the OSA and Ofcom provide further guidance on how regulated providers should approach such decisions (referred to as 'illegal content judgments' in the Ofcom guidance).

In summary, where a regulated provider finds itself having to make a judgment (for the purposes of compliance with the OSA) as to whether content is illegal content, the content should be considered illegal content if the service has 'reasonable grounds to infer', based on 'all relevant information that is reasonably available' to the service, that the content constitutes one or more relevant offences as referred to above.

The OSA further clarifies that such 'reasonable grounds' will exist where the service:

  1. has reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied; and
  2. does not have reasonable grounds to infer that a defence to the offence may be successfully relied upon.

Detailed draft guidance on how regulated providers should approach illegal content judgments can be found in Ofcom's Illegal Content Judgments Guidance.



What are the key OSA duties relating to illegal content?

The duties under the OSA relating to illegal content can be broken down into six key categories:

  1. Risk assessment duties
  2. Safety duties
  3. Reporting and complaints duties
  4. Freedom of expression and privacy duties
  5. Record keeping and review duties
  6. Other specific duties (including duties specific to Category 1 and 2A providers)

Note that the OSA addresses duties for user-to-user (u2u) services separately from search services, however the duties are broadly similar, so we have combined them for the purposes of the below summary (but have included the relevant u2u and search section references for further detail).

Risk assessment duties

These duties require regulated providers to carry out an initial illegal content risk assessment and to keep that risk assessment up to date (including by updating it each time the provider makes significant changes to its service's design or operation).

The OSA includes a list of matters to be taken into account as part of such risk assessments, including:

  • the algorithms used by the service and how easily, quickly and widely content can be disseminated on the service;
  • the level of risk of harm to individuals presented by illegal content of different kinds;
  • the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content; and
  • how the design and operation of the service may reduce or increase the risks identified.

(The full list of relevant matters can be found at s9(5) (u2u services) and s26(5) (search services) of the OSA).

As part of the illegal harms consultation referred to above, Ofcom has also published draft guidance to providers on how to assess the risk of online harms, including a four-step risk assessment methodology.

Safety duties

The safety duties are the core duties at the centre of the overall OSA framework, and require:

  • implementation of proportionate measures to:
    • prevent users from encountering priority illegal content (or in the case of search services, minimise the risk of individuals encountering such content);
    • mitigate the risk of services being used to commit or facilitate priority offences (u2u services only);
    • mitigate the risk of harm to individuals arising from the presence of illegal content;
    • minimise the length of time for which priority illegal content is present on the service (u2u services only); and
    • swiftly take down any illegal content upon being alerted to its presence (u2u services only); and
  • inclusion of clear and accessible provisions in the terms of service (or in the case of search services, in a publicly available statement) as to how individuals are protected from illegal content (which must include information about any proactive technology used as part of the measures referred to above).

Ofcom is required under OSA s41 to prepare Codes of Practice setting out measures recommended for the purpose of compliance with the safety duties referred to above, and has published draft Codes of Practice for u2u services and search services as well as accompanying guidance relating to the illegal content duties as part of its illegal harms consultation.

The draft Codes of Practice go into much more granular detail as to the recommended measures to be implemented. For example, recommendations include:

  • appointment of named officers responsible for compliance with the illegal content safety duties;
  • implementation of content moderation systems; and
  • use of specific technologies such as 'hash matching' to detect certain types of illegal content.

The Codes of Practice are not binding, but regulated providers will be treated as compliant with the relevant duties under the OSA where they have implemented the measures recommended under the Codes of Practice (and will have to keep records where they have deviated from the Codes of Practice, including information as to how the regulated provider considers that any alternative measures implemented meet the relevant duties).

Reporting and complaints duties

These duties require all regulated providers to:

  • implement mechanisms for user-reporting of illegal content;
  • operate an easy-to-access, easy-to-use (including by children) and transparent complaints procedure which:
    • allows for various types of complaints (including complaints regarding the presence of illegal content and non-compliance by the regulated provider with its duties under the OSA, as well as complaints by content publishers (or in the case of search services, by 'interested persons') regarding take-down decisions made by the regulated provider); and
    • provides for appropriate action to be taken in response to complaints; and
  • include in the terms of service (or in the case of search services make publicly available) easily accessible provisions specifying the policies and processes governing complaints handling and resolution.

Freedom of expression and privacy duties

These duties require regulated providers to have particular regard to:

  • the importance of protecting users' right to freedom of expression within the law; and
  • the importance of protecting users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of the regulated provider's service (including, but not limited to, any such provision or rule concerning the processing of personal data),

when deciding on and implementing safety measures and policies.

There is some high-level guidance on considerations of freedom of expression and privacy contained within Ofcom's Illegal Content Judgments Guidance, which regulated providers should consider when balancing their duties under the OSA with their obligations under data protection regulations.

Record keeping and review duties

These duties require regulated providers to make and keep written records of:

  • in an easily understandable form, all aspects of their risk assessments (including details of how each assessment was carried out and its findings);
  • measures taken or in use to comply with OSA duties and which are recommended under a code of practice and are applicable to the regulated provider and its service;
  • where alternative measures to those set out in a code of practice have been taken:
    • the applicable measures in the relevant code of practice that have not been taken;
    • the alternative measures that have been taken;
    • how the alternative measures amount to compliance with the relevant duties; and
    • how the regulated provider has complied with its freedom of expression and privacy duties (see above).

In addition, regulated providers are required to review their compliance with the OSA duties regularly and in any event as soon as reasonably practicable after making any significant change to any aspect of the design or operation of their service.

Other specific duties

In addition to the above, the OSA contains a number of further duties, including duties which apply only to specific categories of regulated provider:

  • for Category 1 regulated providers of u2u services (per OSA s7(5)):
    • enhanced duties in relation to illegal content risk assessments, freedom of expression and privacy and record-keeping; and
    • additional duties relating to:
      • adult user empowerment in respect of content which is legal but may be harmful to adults;
      • protection of content of democratic importance, news publisher content and journalistic content; and
  • for Category 2A regulated providers of search services (per OSA s24(5)), enhanced duties in relation to illegal content risk assessments and record-keeping;
  • for Category 1 and 2A regulated providers, duties relating to prevention of users encountering fraudulent advertising (per OSA ss38 and 39); and
  • for all regulated providers, a duty to operate their services using systems and processes which secure (so far as is possible) that the provider reports all detected and unreported CSEA content present on the service to the National Crime Agency (s66).

See Chapter 4 (What are categorised services?) for more information on Category 1 and Category 2A services.

Note that in addition to the above, the OSA also contains specific duties relating to services that are likely to be accessed by children – these are dealt with in Chapter 3 (What are the key duties under the Online Safety Act relating to protection of children?).

How do the illegal content duties vary depending on the size and risk-profile of the regulated provider?

The concept of proportionality (taking into account the size and capacity of the regulated provider) is found throughout the OSA (in particular when it comes to the safety duties referred to above – see for example s10(10) and s27(10)).

In addition, Ofcom has made clear in its draft guidance that it will adapt its expectations in relation to compliance with the OSA and Ofcom's Codes of Practice depending on the type of service it is dealing with and will not expect the same of a small low-risk service as it does of the largest or riskiest services.

In light of this approach, Ofcom has created certain sub-categories of services according to their size and risk, which it refers to throughout the illegal harms consultation and its draft illegal content Codes of Practice, as follows:

  • all services;
  • services assessed as being at medium or high risk in respect of specific kinds of illegal content (eg, child sexual abuse material (CSAM));
  • multi-risk services (being services which have assessed themselves as being medium or high risk in respect of at least two kinds of priority offences)*; and
  • large services (being services which have an average user base greater than seven million users per month in the UK)*.

*The definitions of multi-risk and large services are contained in section A.11 (Definitions and interpretation) of Ofcom's draft illegal content Codes of Practice for u2u services (and A.8 of the draft illegal content Codes of Practice for search services).

Note that Ofcom's proposed definition of large services is intended to mirror the approach taken to 'very large online platforms' under the EU's Digital Services Act (each threshold representing roughly 10% of the relevant population), in the hope of reducing the burden of regulatory compliance across different regimes.

Many of the more onerous measures proposed in Ofcom's illegal content Codes of Practice (for example, use of automated 'hash matching' to detect CSAM, board-level annual risk assessments and the implementation of performance targets for determining complaints) are proposed only to apply to large or multi-risk services.

Even with this proportionate approach, there will still be challenges (especially for smaller regulated providers), and Ofcom has explicitly recognised this in the illegal harms consultation. Nevertheless, Ofcom considers that the cumulative impact of the proposals set out in the consultation is proportionate.

Key contacts

Claire Wiseman, Professional Support Lawyer, London
James Balfour, Senior Associate, London
