Draft UK Online Safety Bill published

The UK Government published the draft Online Safety Bill on 12 May 2021. The Bill is in pre-legislative form only and is potentially subject to significant change. This blog sets out the key points in the Bill as currently drafted, how it might impact your business, and how Fieldfisher may be able to assist.

Will the Bill apply to my business?

The Bill will apply to 'user-to-user services' (services where users can upload and share user-generated content, such as social media platforms) and 'search services' (such as search engines, which enable users to search multiple websites and databases), where those services have 'links to the United Kingdom'.[1] There are a number of exceptions to the scope of the proposed regime, such as email services, SMS, one-to-one live aural communications, internal business services, and public bodies. Some businesses may fall within scope for some of their services but not for others; the question will be one of degree for each business.

What duties will apply?

The Bill contains wide-ranging duties in relation to illegal content, as well as content that is deemed harmful but not illegal.[2] It will increase liability for all businesses within scope. Currently, social media platforms and tech companies are liable only for content that they host. The Bill introduces secondary liability for failure to remove material of which a company has been put on notice, and for failure to have sufficient systems and processes in place to identify and prevent the hosting of harmful or illegal content. There will likely be significant commercial implications for most, if not all, companies falling within scope.

Different duties will apply depending on whether the company is a user-to-user service or a search service.[3]  These include duties to:
 
  1. carry out risk assessments in relation to illegal and harmful content;
  2. comply with safety duties (including taking proportionate steps to reduce and manage the risk of harm to individuals);
  3. have regard to the importance of protecting users’ rights to freedom of expression and protecting users from unwarranted infringements of privacy when deciding on and implementing safety policies;
  4. comply with reporting and redress duties (to have appropriate reporting systems and complaints processes in place that allow users and affected persons to report content easily); and
  5. comply with record-keeping and review duties.

Is my business's service likely to be accessed by children?

Companies will be required to assess whether children are likely to access their services or platforms,[4] and if so, additional duties will apply. These include the duty to carry out a children's risk assessment, and safety duties to mitigate and manage the risk of harm to children in different age groups.  

Is my business a Category 1 service?

Further duties will apply to 'Category 1' services. This designation targets a small group of 'high-risk, high-reach' services and imposes additional controls on content that is lawful but still harmful (e.g. abuse that falls below the threshold of a criminal offence; encouragement of self-harm; mis/disinformation). Whether a company falls within Category 1 will be determined by Ofcom. The threshold conditions remain unspecified, but are likely to be based on the size of a service provider's audience and the functionalities it offers. The focus will be on the largest and most popular social media sites.[5] Category 1 companies will be required to conduct adult risk assessments and to uphold additional safety duties in relation to content that is harmful to adults.[6] They will also be required to consider content of 'democratic importance' and 'journalistic content' when making decisions about safety policies and what action to take.

How will the regime be funded?

The Bill leaves the funding details unclear. All regulated service providers with worldwide revenue above a threshold will be required to pay Ofcom a fee for the implementation and enforcement of the regime. Ofcom will have broad powers to determine what that fee should be.[7] It remains to be seen where the threshold will be set and, correspondingly, what the impact on small and medium-sized businesses will be.

Ofcom's role

As the independent regulator, Ofcom will be required to produce codes of practice and guidance to assist businesses in complying with their obligations. The Bill proposes that Ofcom's new powers to ensure compliance with the regime will include the power to issue 'use of technology' notices and information notices to companies where it has reasonable grounds to believe they are non-compliant. It will be able to conduct investigations and, where providers fail to meet their duties, impose fines of up to £18 million or 10% of a company's annual global revenue, whichever is higher.[8] Ofcom will also be able to seek court orders for a number of business disruption measures against regulated companies.

Comment

It remains unclear how the Bill will operate in practice. The reception of the draft Bill to date has been mixed, with criticism of its lack of detail and of the cost of compliance for regulated companies. Much of the detail of the regime will be provided in secondary legislation and in Ofcom guidance. Some of the duties in the draft Bill are vague and uncertain, such as the test for content 'harmful' to children, and there is growing concern about the Bill's impact on freedom of expression online. The Bill has also been criticised for failing to do enough to protect users from online scams and to restrict children's access to online pornography.

In any event, the duties proposed in the Bill and the powers to be granted to Ofcom are extraordinarily wide-ranging. Some of the duties, in particular the Category 1 duties around content of democratic importance and journalistic content, are controversial; they could lead to significant debate during the legislative process and may be substantially modified as the Bill progresses. Parliament will need to decide whether it is willing to rely on the tech industry to take decisions and weigh competing factors in relation to freedom of expression and online safety.

Next steps

The draft Bill will be subject to pre-legislative scrutiny by a joint committee of MPs and peers before a final version is introduced to Parliament. The final legislation is unlikely to enter into force until at least 2023.

There may be opportunities for industry to make submissions to the joint committee in relation to the draft Bill. At Fieldfisher, we can advise and assist you in engaging with the legislative process at this early stage. Please get in touch with the Regulatory team for further information and advice.

We can help with:
  • understanding how best to engage with the legislative and policy-making process and getting your voice heard;
  • advising on whether your business is likely to fall within the scope of the new regime;
  • analysing whether your business may fall within Category 1, and the likely commercial implications;
  • advising on whether your business is likely to meet Ofcom's test for 'services likely to be accessed by children';
  • providing initial views on online safety risk assessments, and planning your future compliance under the regime; and
  • any other strategic or legal considerations in light of the Bill.
 

[1] Clause 3, draft Online Safety Bill. Whether a service has 'links to the United Kingdom' is determined by reference to whether it has a significant number of UK users; whether the UK is a target market for the service; or, provided that the service is capable of being used in the UK by individuals, whether there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK arising from content presented on the service or encountered in search results.


[2] Illegal content is defined as content that amounts to a terrorist offence, a child sexual exploitation and abuse (CSEA) offence, an offence specified in regulations made by the Secretary of State ('priority illegal content'), or any other offence in which the victim or intended victim is an individual or individuals. It does not include offences relating to the infringement of intellectual property rights, the safety or quality of goods, or the performance of a service by a person not qualified to perform it. Harmful content is defined more broadly in relation to children and adults, and includes content that is not illegal.

 

[3] See, for example, clause 18, which limits the application of some duties for search services. Clause 18(1) states that duties extend only to the design and operation of the service in the UK; clause 18(2) specifies that duties for search services do not extend to content on the website of a recognised news publisher.


[4] The assessment must determine whether it is possible for children to access the service (or any part of it) and, if so, whether (i) there are, in fact, a significant number of children who are users of the service (or any part of it); or (ii) the service (or any part of it) is of a kind likely to attract a significant number of users who are children.

       

[5] Government press release, 12 May 2021.


[6] Content that is harmful to adults is defined as content that the service provider 'has reasonable grounds to believe … that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities'.


[7] Clause 52, draft Online Safety Bill.


[8] Clause 85, draft Online Safety Bill.

