On 14 December 2021, the joint parliamentary committee on the draft Online Safety Bill (the "Joint Committee") published its 193-page report (the "Report") on the draft Online Safety Bill (the "Bill"), which was published by the Government on 13 May 2021. For more on the publication of the Bill, please see our previous blog post here. The Report is the culmination of the Joint Committee's inquiry, which began on 22 July 2021, and seeks to strike a balance between the competing views of the various stakeholders (including academics, NGOs and tech companies) from whom the Joint Committee heard evidence during the inquiry.
While the Joint Committee is supportive of the objectives of the Government's draft Bill, it has called for a large-scale rework in the next iteration to ensure that online spaces are "no longer the land of the lawless".
Key recommendations include:
- restructuring the Bill to make it clearer and easier to understand the key safety objectives;
- removing the requirement for service providers to remove content which is legal but harmful to adults from their platforms;
- requiring Ofcom to produce a mandatory code of practice on safety by design, which will require online service providers to mitigate the risks caused by the design of their platforms;
- introducing new criminal offences such as cyberflashing;
- introducing criminal liability for tech executives where there are systemic failures; and
- bringing paid-for advertising within the scope of the Bill.
Structural changes to the Bill
The Joint Committee's Report suggests at the outset that the Bill be restructured to set out a clear statement of its core safety objectives, with all the subsequent requirements in the Bill flowing from these objectives. The Joint Committee recommends that the core objectives should be that Ofcom aims to improve online safety for UK citizens by ensuring that service providers:
- comply with UK law and do not endanger public health or national security;
- provide a higher level of protection for children than for adults;
- identify and mitigate the risk of reasonably foreseeable harm arising from the operation and design of their platforms;
- recognise and respond to the disproportionate level of harms experienced by people on the basis of protected characteristics;
- apply the overarching principle that systems should be safe by design whilst complying with the Bill;
- safeguard freedom of expression and privacy; and
- operate with transparency and accountability in respect of online safety.
Legal but harmful content concept removed
One of the most substantive changes to the Bill set out by the Joint Committee is the removal of Section 11 of the Bill, which relates to the requirement for the largest online user-generated content platforms to specify in their terms of service how content that is harmful to adults will be dealt with.
The Joint Committee instead suggests the introduction of a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities, with such regulated activities being clearly defined under the Bill.
The Report states that such definitions should be linked to specific areas of established law or should already be recognised as legitimate grounds for interference with freedom of expression. The examples the Report gives include: abuse, harassment or stirring up of violence or hatred based on the protected characteristics in the Equality Act 2010; and knowingly spreading false information which endangers public health.
The Report also recommends that any additions to the list of regulated activities should be done via statutory instrument from the Secretary of State, which will allow the list to evolve with the changing landscape of online harms but with a certain degree of scrutiny coming from Parliament.
Focus on safety by design
The Joint Committee's Report highlights that throughout its inquiry it heard details of design features specific to online services which both created and exacerbated risks of harm. Several of the Joint Committee's findings are therefore aimed at "holding online services responsible for the risks created by their design and operation of their systems", and include:
- requiring service providers to "have in place systems and processes to identify reasonably foreseeable risks of harm arising from the design of their platforms and take proportionate steps to mitigate those risks of harm"; and
- requiring Ofcom to produce a mandatory safety by design code of practice which should set out the minimum standards providers will be required to meet, for example in relation to algorithms and anonymous or pseudonymous accounts, and steps to take to mitigate risks created by the design and operation of their systems.
New criminal offences
Following the Law Commission's report on reforming the communications offences (the existing offences pre-date the age of social media and are seen as outdated and ill-suited to addressing the harms currently prevalent online), the Joint Committee has advocated for the following new offences, first set out in the Law Commission's report, to be added to the statute book "at the first opportunity":
- sending a communication likely to cause harm amounting to significant psychological distress;
- sending harmful, threatening and knowingly false communications;
- sending communications to promote self-harm;
- cyberflashing (i.e. the sending of unsolicited sexual images); and
- trying to induce seizures in people with photosensitive epilepsy.
As the Bill will require online service providers to protect both adults and children from illegal content, any expansion of the communications offences would place a responsibility on online service providers to mitigate the risks of harm caused by these new offences.
Criminal liability for tech execs
The Joint Committee has recommended that each company within the scope of the Bill be required to appoint a board-level executive to act as the company's "Safety Controller". This individual would be liable for a new offence proposed in the Report: the failure of a regulated service provider to comply with its obligations where there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users. The Joint Committee has highlighted that pursuing criminal liability should be a last resort for Ofcom.
Fraudulent advertising
A further addition to the Bill suggested by the Joint Committee is the inclusion of paid-for advertising within the scope of the Bill. The Joint Committee clarified that Ofcom's role would be to act against service providers who consistently allow paid-for advertisements which create a risk of harm to be placed on their platform, rather than regulating advertisers themselves, targeting individual cases of illegal advertising or pursuing the criminals behind illegal adverts.
Greater freedom for news publishers
The Report proposes that the news publisher content exemption in the Bill is strengthened to include a requirement that recognised news publisher content "should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence".
Commentary
The headline recommendation from the Joint Committee is likely to be the removal of Section 11 of the Bill. The requirement for online service providers to remove content which was "legal but harmful" from their platforms was a controversial aspect of the Bill. Critics highlighted that this could lead to online censorship and the stifling of freedom of expression, as online providers might take an overly cautious approach to taking down content in an attempt to avoid falling foul of their new regulatory obligations. The Joint Committee's revised proposal mitigates this by offering a more flexible approach, which is likely to be welcomed by those in the market.
The introduction of criminal liability for systemic failures is also likely to be contentious. From a practical perspective, online service providers may find it difficult to identify a board-level executive willing to take on such a responsibility.
Safety by design features prominently in the Joint Committee's Report, with explicit references made to combatting the harms which can be both caused and proliferated by platforms' algorithms and the prevalence of anonymous accounts. The requirement for online service providers to carry out risk assessments on the design of their platforms, as well as to comply with a mandatory safety by design code of practice published by Ofcom, will create an increased regulatory burden for in-scope online service providers. The Joint Committee has suggested that Ofcom begin work on the codes of practice "immediately so they are ready for enforcement as soon as the Bill becomes law".
As these codes of practice will be mandatory and legally binding, in-scope online providers will want to ensure their voices are heard as the codes are developed. Under the Bill as currently drafted, Ofcom will be required to consult those "who represent the interests of UK users of regulated services" before preparing codes, and regulated providers should therefore begin considering how they will engage with Ofcom to ensure the codes of practice are developed with a pragmatic approach.
Online service providers may also wish to begin considering what technical changes may be needed on their platforms to mitigate the risks of harm arising from the factors highlighted in the Report. For example, the mitigations suggested in the Report in relation to harm caused by anonymous or pseudonymous accounts include limiting the speed with which new accounts can be created and achieve full functionality, and requiring the largest and highest-risk platforms to offer users the choice of verified or unverified status, together with options on how they interact with accounts in either category. Online service providers should begin to consider whether such steps can be practically implemented; if not, it will be important to make any concerns known when Ofcom eventually begins its consultations.
There is likely to be much debate on the inclusion of several of the Joint Committee's recommendations, and online service providers will be watching closely to see how the Bill, and as a consequence their future obligations, evolve over the coming months. Though the Bill will inevitably evolve further, it may still be prudent for service providers to begin preparing for the introduction of the new regime, given the potential for significant operational impact. This may include carrying out risk assessments of their platforms to ascertain the extent of illegal content and content arising from the regulated activities set out in the Report, and the steps which can be taken to mitigate the likelihood of harm to users of their platforms.
Next steps
The Government must respond to the Joint Committee's Report by 14 February 2022. Following this response, an updated draft of the Bill is expected to be laid before Parliament, where the Bill will begin its journey through the legislative process, with final approval anticipated at some point in 2022.