Online Safety Bill introduced to Parliament: Five things you can do now to prepare your business

On 17 March 2022, the long-awaited Online Safety Bill was introduced to the House of Commons. This is the first step of the Bill's passage through Parliament, and represents what advocates hope will be a "new era of accountability online".

The Bill sets out a new legal and regulatory framework for identifying and removing illegal and harmful content on the internet. It is an ambitious (and contentious) attempt to rein in the "wild west" of self-regulation by Big Tech companies. Culture Secretary Nadine Dorries has commented that the Bill means major tech firms will no longer be left to "mark their own homework".

The Bill has been significantly strengthened and refined since it was published in draft in May 2021, reflecting the outcome of extensive pre-legislative Parliamentary scrutiny. Nevertheless, it retains at its core the same statutory duty of care for content-sharing platforms and search services to keep users, particularly children, safe online.

Companies will be expected to take reasonable and proportionate action (bearing in mind their size, resources and risk level) to tackle online harms in connection with their services. What this means in practice will be a matter of substantial debate and will be heavily influenced by guidance to be published by Ofcom, the regulator under the Bill.

For businesses that fall within scope, here are five things you can start doing now to help ensure compliance by the time the Bill becomes law:

1. Conduct an online safety risk assessment

All businesses in scope will need to take a proactive approach to tackling illegal and harmful content on their platforms. The first step for most companies will be to assess their user base and the risk of harm to those users on the service. The Bill sets out a list of matters to be covered, and different services will be subject to different requirements. You should familiarise yourself with these and bring together relevant experts from across your business to begin developing a risk assessment.

Part of this assessment will also include considering whether your service is likely to be accessed by children. If it is, then you will be required to protect under-18s from harmful and inappropriate content that does not cross a criminal threshold. Priority categories for “legal but harmful” content will be set out in secondary legislation in due course.

2. Review your complaints procedure and terms of service

The Bill requires businesses to have a transparent and easy-to-use complaints procedure that allows specified types of complaints to be made. In particular, your complaints procedure must:

  • allow complaints to be made in relation to specified types of content and the duties that apply to the service;
  • provide for appropriate action to be taken when a complaint is upheld (examples of appropriate action might include the removal of flagged illegal content or reinstatement of unfairly removed content);
  • be easy to access and use for all users, including children; and
  • be transparent (for example, each step of the complaints procedure should be set out clearly, including the types of complaints that can be made and what a user can expect to happen from the point at which they make the complaint).

The terms of service should also set out the policies and procedures that govern your handling of complaints. Start thinking about whether your current complaints procedures meet these requirements and what steps you may need to take to update them (both in your terms of service and in your policies and procedures).

3. Set up user reporting (if you haven’t already)

Service providers will need to implement systems and processes that allow users and affected persons to report specified types of content and activity. "Affected persons" include those who might be affected by content, or who may need to assist other users with making a complaint. Now is a good opportunity to examine your current reporting processes and procedures – can your users easily find and use the mechanisms in place to report content or behaviour that breaks the rules?

4. Prepare to report to the National Crime Agency

Service providers will need to implement systems and processes to ensure that detected but unreported child sexual exploitation and abuse ("CSEA") content is reported to the National Crime Agency ("NCA"). Reports must be sent to the NCA in a manner and within timeframes to be set out in regulations in due course. In the meantime, start to consider the systems that you currently have in place to detect illegal content on your platform, whether that is terrorist content or CSEA content, and the reporting mechanisms that you would implement.

5. Consider whether you might be a Category 1 service

The largest services with the highest-risk functionalities will be designated as "Category 1" services. User-to-user services that meet the Category 1 threshold conditions, to be specified by the Secretary of State in consultation with Ofcom, will be subject to additional legal requirements, including to:

  • set clear and accessible provisions in terms of service explaining how content that is legal but harmful to adults will be treated, and apply those provisions consistently;
  • carry out an assessment of the impact that safety policies and procedures will have on users’ legal rights to freedom of expression and privacy;
  • specify in a public statement the steps taken to protect users’ legal rights to freedom of expression and privacy;
  • implement systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account when making decisions about how to treat such content;
  • implement systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about how to treat such content;
  • implement a dedicated and expedited complaints procedure that ensures that the decisions of the service provider to take action against a user because of a particular piece of journalistic content can be challenged;
  • offer optional user verification and user empowerment tools on their sites; and
  • implement proportionate systems and processes to prevent the risk of users encountering fraudulent adverts.

Consider now whether your business may fall within Category 1, or whether you have a direct business relationship with a Category 1 service. In either case, you will need to be aware of the additional burdens on Category 1 services.


With thanks to trainees Mikhail Popov and Genevieve Liston-Oakden for their assistance.
