Will the Bill apply to my business?
The Bill will apply to 'user-to-user services' (services where users may upload and share user-generated content, such as social media platforms) and 'search services' (such as search engines, which enable users to search multiple websites and databases) where those services have 'links to the United Kingdom'. There are a number of exceptions to the scope of the proposed regime, such as email services, SMS, one-to-one live aural communications, internal business services, and public bodies. Some businesses may fall within scope for some of their services but not for others; it will be a question of degree for each business.
What duties will apply?
The Bill contains wide-ranging duties in relation to illegal content, as well as content that is deemed harmful but not illegal. It will increase liability for all businesses within scope. Currently, social media platforms and tech companies are liable only for content that they host. The Bill introduces secondary liability for failure to remove material of which the company is put on notice, and for failure to have sufficient systems and processes in place to identify and prevent the hosting of harmful or illegal content. There will likely be significant commercial implications for most, if not all, companies falling within scope.
Different duties will apply depending on whether the company is a user-to-user service or a search service. These include duties to:
- carry out risk assessments in relation to illegal and harmful content;
- comply with safety duties (including taking proportionate steps to reduce and manage the risk of harm to individuals);
- have regard to the importance of protecting users’ rights to freedom of expression and protecting users from unwarranted infringements of privacy when deciding on and implementing safety policies;
- comply with reporting and redress duties (putting in place appropriate reporting systems and complaints processes that allow users and affected persons to report content easily); and
- comply with record-keeping and review duties.
Companies will be required to assess whether children are likely to access their services or platforms, and if so, additional duties will apply. These include the duty to carry out a children's risk assessment, and safety duties to mitigate and manage the risk of harm to children in different age groups.
Is my business a Category 1 service?
Further duties will apply to 'Category 1' businesses. This category targets a small group of 'high-risk, high-reach services' and imposes additional controls on content that is lawful but still harmful (e.g. abuse that falls below the threshold of a criminal offence; encouragement of self-harm; mis/disinformation). Whether a company is Category 1 will be determined by Ofcom. The threshold conditions remain unspecified, but are likely to be based on the size of a service provider's audience and the functionalities that it offers. The focus will be on the largest and most popular social media sites. Category 1 companies will be required to conduct adult risk assessments, and to uphold additional safety duties in relation to content that is harmful to adults. They will also be required to consider content of 'democratic importance' and 'journalistic content' when making decisions about safety policies and what action to take.
How will the regime be funded?
The funding details remain unclear in the Bill. All regulated service providers with worldwide revenue above a threshold will be required to pay Ofcom a fee for the implementation and enforcement of the regime. Ofcom will have broad powers to determine what that fee should be. It remains to be seen where the threshold will be set, and correspondingly what the impact will be on small and medium-sized businesses.
As the independent regulator, Ofcom will be required to produce codes of practice and guidance to assist businesses in complying with their obligations. The Bill proposes that Ofcom's new powers to ensure compliance with the regime will include the power to issue 'use of technology' notices and information notices to companies where it has reasonable grounds to believe that they are non-compliant. It will be able to conduct investigations and, where providers fail to meet their duties, impose fines of up to £18 million or 10% of a company's annual global revenue, whichever is higher. Ofcom will also be able to seek court orders for a number of business disruption measures against regulated companies.
It remains unclear how the Bill will operate in practice. The reception of the draft Bill to date has been mixed, with criticism of its lack of detail and the cost of compliance for regulated companies. Much of the detail of the regime will be provided in secondary legislation and in guidance from Ofcom. Some of the duties in the draft Bill are vague and uncertain, such as the test for content 'harmful' to children, and there is increasing concern about the impact of the Bill on freedom of expression online. The Bill has also come under criticism for failing to do enough to protect users from online scams, and to restrict children's access to online pornography.
In any event, the potential duties proposed in the Bill and the powers to be granted to Ofcom are extraordinarily wide-ranging. Some of the duties, in particular the Category 1 duties around content of democratic importance and journalistic content, are controversial, could lead to significant debate during the legislative process, and may be subject to widespread modification as the Bill progresses. Parliament will need to decide whether it is willing to rely on the tech industry to take decisions and weigh up competing factors in relation to freedom of expression and online safety.
The draft Bill will be subject to pre-legislative scrutiny by a joint committee of MPs before a final version is introduced to Parliament. The final legislation is unlikely to enter into force until at least 2023.
There may be opportunities for industry to make submissions to the joint committee in relation to the draft Bill. At Fieldfisher, we are here to advise and assist you in all efforts to engage with the early legislative process. Please get in touch with the Regulatory team for further information and advice.
We can help with:
- understanding how best to engage with the legislative and policy-making process, and helping to get your voice heard;
- advice on whether your business is likely to fall within scope of the new regime;
- analysing whether your business may fall within Category 1, and the likely commercial implications;
- advice on whether your business is likely to meet Ofcom's test for 'services likely to be accessed by children';
- initial views on online safety risk assessments, and thinking about your future compliance under the regime; and
- any other strategic or legal considerations in light of the Bill.
Notes
1. Clause 3, the draft Online Safety Bill. Whether a service has links to the UK is determined by reference to: whether it has a significant number of UK users; whether the UK is a target market for the service; or, provided that the service is capable of being used in the UK by individuals, whether there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK arising from content presented on the service or encountered in search results.
2. Illegal content is defined as content that amounts to a terrorist offence, a CSEA offence, an offence specified in regulations made by the Secretary of State ('priority illegal content'), or any other offence in which the victim or intended victim is an individual or individuals. It does not include offences relating to the infringement of intellectual property rights, the safety or quality of goods, or the performance of a service by a person not qualified to perform it. Harmful content is defined more broadly in relation to children and adults, and includes content that is not illegal.
3. See, for example, clause 18, which limits the application of some duties for search services. Clause 18(1) states that duties extend only to the design and operation of the service in the UK; clause 18(2) specifies that duties for search services do not extend to content on the website of a recognised news publisher.
4. The assessment must determine whether it is possible for children to access the service (or any part of it) and, if so, whether (i) there are, in fact, a significant number of children who are users of the service (or any part of it) or (ii) the service (or any part of it) is of a kind likely to attract a significant number of users who are children.
5. Government press release on 12 May 2021.
6. Content that is harmful to adults is defined as content where the service provider 'has reasonable grounds to believe … that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities'.
7. Clause 52, the draft Online Safety Bill.
8. Clause 85, the draft Online Safety Bill.