In a recent flurry of activity under the Online Safety Act 2023 (the "OSA"), Ofcom has made it clear that the new regime has firmly arrived:
- On 16 January 2025, it published guidance on highly effective age assurance and children's access assessments.
- It has announced the opening of an 'age assurance enforcement programme', with robust language about launching investigations into services that do not engage with, or ultimately comply with, the new age assurance provisions.
- Ofcom expects to publish its Protection of Children Codes and other children's risk assessment guidance in April 2025 (and services that are likely to be accessed by children will need to conduct a children’s risk assessment by July 2025).
- In summer 2025, Ofcom plans to publish a register of categorised services, listing those services that will be subject to a range of additional requirements.
As the list of codes and duties that in-scope businesses must navigate and comply with grows, those required to implement Ofcom's measures may be concerned about the proportionality of the new obligations. In this blog we look at how such questions may engage the protection of fundamental human rights.
Online safety and fundamental rights
At the heart of the OSA is a tension between online protection (particularly for children) and respecting the rights of users, including their privacy, their right to express themselves online and their right to navigate the online world freely. This tension has played out from the inception of the Online Safety Bill and was the focus of debates during its passage through Parliament. With such strongly held views on all sides, striking a path that satisfies the various stakeholders was always going to be difficult.
The OSA itself explicitly recognises this balance. For example, Schedule 4 of the OSA requires that measures described in a Code of Practice prepared by Ofcom be designed in light of the following principles relating to fundamental rights:
(a) the importance of protecting the right of users and (in the case of search services or combined services) interested persons to freedom of expression within the law, and
(b) the importance of protecting the privacy of users.
In addition, regulated services themselves are subject to duties to have particular regard to the importance of protecting freedom of expression rights and protecting users from a breach of privacy when deciding on and implementing safety measures and policies.
These principles reflect two fundamental human rights protected by the European Convention on Human Rights - Article 10 (right to freedom of expression) and Article 8 (right to respect for an individual's private and family life, home and correspondence) respectively. These rights are enshrined in UK law via the Human Rights Act 1998. Measures relating to online safety could also engage other rights such as Article 9 (freedom of thought, conscience and religion) and Article 11 (freedom of assembly and association).
All these rights are qualified, meaning they may be interfered with or restricted where prescribed by law and where the interference is necessary in pursuit of a legitimate aim. Any such interference must be justified as a proportionate means of achieving that aim. Assessing the proportionality of a measure requires looking at:
(1) whether the aim of the measure is important enough to justify the limitation of the right;
(2) whether the measure is rationally connected to the aim;
(3) whether a less intrusive measure could have been used without unacceptably compromising the achievement of the aim; and
(4) whether the measure’s contribution to the aim outweighs the effects on the rights of those to whom it applies, i.e. has a “fair balance” been struck?
While Ofcom has acknowledged its obligations concerning fundamental rights in its key publications, the assessment of whether a fair balance has been struck in respect of such rights, or whether a less intrusive method could have been used, is often a question of detail (and evidence) to be considered on a case-by-case basis, rather than one satisfied by overarching assertions. Organisations grappling with the practicalities of implementing the OSA will therefore be well placed to comment on the implications of new requirements and to assess the proportionality of Ofcom's various proposals and measures.
Raising concerns relating to fundamental rights
Those who have concerns about the implications of Ofcom's proposed measures from a rights/proportionality perspective may wish to raise those issues, focusing in particular on the four questions set out above.
These themes will be relevant at various junctures, such as when responding to a consultation, completing a required risk assessment, or designing and deciding on measures to implement to reduce risks. In addition, these arguments may form the basis for challenging an Ofcom decision, or for defending an organisation’s approach to compliance in the event of an Ofcom investigation or enforcement process.
Building on the points above, issues to consider raising could include:
- Does the likely practical impact of implementing proposed measures unduly restrict users' rights to freedom of expression and association? For example, in its consultation on protecting children from harms online in May 2024, Ofcom acknowledged that the requirements for highly effective age assurance for certain services may restrict the right to freedom of expression and association because they may cause certain services to exit the UK market altogether, or (more likely) make it more difficult for adult users to use a service. However, Ofcom concluded that the measure was "likely to go no further than needed". Is that conclusion justified, or are there alternative less restrictive measures that could be used?
- Will the measures have unintended or unacknowledged effects which might entail a disproportionate impact on users? Is there any evidence which Ofcom has not properly taken into account, or which it has considered inaccurately, leading it to an incorrect conclusion on proportionality? For example, in relation to the highly effective age assurance measure, Ofcom has provided illustrative cost estimates of age checks where services want to use a third-party provider to assist – are these estimates fair and accurate?
- What is the practical impact of Ofcom's approach on small businesses, particularly the costs and burden of implementation of various measures and risk assessments for smaller services? Ofcom has stated that "setting lower expectations for smaller services would be inconsistent with the Act and could lead to ineffective age assurance on smaller services, exposing children to significant harm." However, does the current approach have a knock-on effect on the service that can be provided by such businesses so as to restrict the rights of users?
- What do Ofcom's proposed measures mean for users' right to privacy? For example, any requirement involving robust age assurance will necessarily involve an interference with the right to respect for private life (Article 8) and raise data privacy concerns. However, in its Statement on Age Assurance and Children's Access, Ofcom has concluded that, provided service providers adopt a highly effective age assurance process in accordance with its recommended criteria, they may choose a process which minimises the amount of personal data collected, and that this "will ensure that adult users are not subjected to overly intrusive processes when accessing legal content". Is this a realistic assessment?
In-scope services concerned with these sorts of questions should consider as early as possible their options for raising these issues with Ofcom, or even for bringing a challenge, given the short timescales for doing so and the complexity around when those time limits begin to run.
Ultimately, striking an appropriate balance between fundamental rights in the online safety space is an ongoing and pertinent issue for all. Ofcom will not be the only regulator or entity grappling with the questions raised in this context, such as the appropriate level of content moderation and the nature of effective protection. Whether it is able to achieve this balance between often conflicting rights remains to be seen, and will no doubt be subject to further debate, as well as being tested by affected organisations.