To date, this approach has largely been self-motivated, with only limited regulation: namely, the take-down requirements for unlawful content under the E-Commerce Regulations.
However, this is all set to change with the introduction of the UK's Online Safety Bill: a flagship policy for the UK government and a milestone change in the regulation of online platforms and services, ushering in a new era of accountability online. While headlines have focused on the social media giants, the bill will also have a significant impact across the gaming industry.
A draft bill was released publicly in May 2021, with both UKIE and TIGA engaging in the process on behalf of the games industry, and the bill has been significantly strengthened and refined as a result of such scrutiny and feedback. Nevertheless, it retains at its core the same statutory duties of care aimed at keeping users, particularly children, safe online.
Will my game or platform fall within scope?
As currently (May 2022) drafted, the bill will apply to online user-to-user services and search services where those services have links to the UK.
While there is some complexity in the detail, broadly speaking you will fall in scope if your online game or platform:
Allows players or users to generate, upload and share content with others or enables content to be found through search;
Has a significant number of UK users or the UK is a target market; and/or
Can be used in the UK and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK.
One particularly challenging (and perhaps unique) question for the gaming industry lies in the definition of the provider of a user-to-user service.
The bill targets the providers that have control over who may use the user-to-user functionality of the game. The government intends for this to focus on the entity that "directly controls users' access to functionality that enables users to interact or share user-generated content, rather than any other entity that may embed that service or control other aspects of it."
In the context of an online game, each company involved in the development, publishing and distribution of the game will need to assess whether it has control over this functionality. This may be easier said than done and will depend on a number of factors, including the contractual model deployed, the role of each party in the provision, access and operation of the game, and the relationship with the end player.
It remains to be seen how liability will be shared between developers, publishers and distributors.
We expect to see contracts in the future containing detailed requirements, indemnities and liability clauses to take into account these online safety requirements and distribute contractual responsibility.
What will I be required to do?
The bill imposes extensive duties on regulated services with the goal of establishing a duty of care to protect users against two types of content: (1) illegal content and (2) harmful content.
Illegal content is defined as content that amounts to a terrorist offence, a child sexual abuse offence, or other priority illegal content (to be identified by government in regulations).
Harmful content is divided into content that is harmful to children, and content that is harmful to adults, and is broader and more subjective than 'illegal content'.
A number of general duties will be imposed on all regulated services, including online games. These include:
The duty to carry out risk assessments for illegal content;
Safety duties to take proportionate steps to reduce and manage the risk of harm to individual users;
Duties to protect the rights to freedom of expression and privacy; and
Reporting and redress duties, including to have appropriate reporting systems and complaints processes in place.
While many online games will fall outside the scope of Category 1 (the designation reserved for the largest and highest-risk services, which attract additional duties), many are likely to be accessed by children, meaning the additional child safety duties will be highly relevant to the sector, regardless of age certification measures.
What happens if I don't comply?
The potential financial implications for non-compliance are significant. Regulated games companies could be liable for fines from Ofcom of up to £18 million or 10% of annual global revenue, whichever is greater.
Games businesses will also face the inevitable burden and cost of ongoing compliance, which is likely to be a challenge particularly for smaller games companies.
While many will want to take a proportionate, risk-based approach to compliance, the nature of games and the sudden surges in player numbers that can follow viral success mean those businesses will need to keep their risk assessments under constant review.
The bill also creates a number of criminal offences, including failing to comply with an Ofcom information notice. It also reserves the power for Ofcom to pursue criminal action against senior managers of companies who fail to comply with the bill.
Experience of new regulation suggests that, while there is likely to be an initial period of pragmatism from Ofcom, it will not be long before it starts to address non-compliance in key sectors. With the hyper growth of gaming and interactive entertainment, this sector is likely to be firmly in Ofcom's crosshairs.
What should I be doing now?
While the final text of the bill will be subject to some change as it is finalised in Parliament, substantial revisions are unlikely, so now is the time to start preparing.
Here are five things companies in this area can do now to help ensure compliance by the time the bill becomes law:
Determine whether your game or platform falls in scope
Assess whether you are likely to fall within scope of the bill. For example, ask yourself:
Does your game or platform enable users to generate and share user content?
Does it contain chat functionality (whether voice or written chat) or allow other communications between players?
Do you have players in the UK or does your game or platform otherwise target the UK market?
Carry out a risk assessment

All gaming businesses in scope will need to take a proactive approach to tackling illegal and harmful content on their games.
The first step for most companies will be to assess your user base and the risk of harm to those users on the service.
The bill sets out a list of matters to be covered in the risk assessment. You should familiarise yourself with these factors and bring together relevant experts from across your business to begin developing a risk assessment.
Part of this assessment will include considering whether the game is likely to be accessed by children (and this will be a relatively low bar). If it is, you will be required to protect under-18s from "harmful" content, even if that content is not criminal. Categories for "harmful" content will be set out in secondary legislation in due course.
Review your complaints procedure
The bill requires games businesses to have a transparent, easy-to-use complaints procedure that allows specified types of complaints to be made.
In particular, your complaints procedure must:
Allow complaints to be made about specified types of content and about how the provider has complied with its duties in relation to the game;
Provide for appropriate action to be taken when a complaint is upheld (examples of appropriate action might include the removal of flagged illegal content or reinstatement of unfairly removed content);
Be easy to access and use for all users, including children;
Be transparent (for example, each step of the complaints procedure should be set out clearly, including the types of complaints that can be made and what a user can expect to happen from the point at which they make the complaint). The terms of service should also set out the policies and procedures that govern handling of complaints.
Set up user reporting
Games companies will need to implement systems and processes that allow users and affected persons to report specified types of content and activity. Affected persons include those who might be affected by content, or who may need to assist other users with making a complaint.
Now is a good opportunity to examine your current reporting processes and procedures – can your users easily find and use the mechanisms to report content or behaviour that breaks the rules?
Check Ofcom's interim codes of practice
Service providers will need to implement systems and processes to ensure that detected but unreported child sexual exploitation and abuse content is reported to the National Crime Agency (NCA).
Reports must be sent to the NCA in a manner and within timeframes to be set out in regulations in due course.
In the meantime, Ofcom has published helpful interim codes for illegal content. These are useful tools and set out what Ofcom is likely to expect from regulated games in future under the new regime. You should consider the systems, processes and tools that you currently have in place to detect illegal content, whether that is terrorist content or child sexual abuse material, and the reporting mechanisms available.
At the time of writing, the Online Safety Bill has entered the Select Committee stage in Parliament. While there is no certain timeline for entry into force, given the extent of pre-legislative scrutiny we expect a comparatively short period before the bill becomes law and enforcement begins.
A version of this article was first published in gamesindustry.biz.
John Brunning is a games and technology partner at Fieldfisher. He works across the games industry, acting for publishers and distributors, developers, social media platforms, ad techs and back end tech providers. Frankie Everitt is a public and regulatory lawyer at Fieldfisher. She has advised on high profile litigation and in non-contentious matters for technology companies. She has followed the development of the Online Safety Bill over the past two years, and advised clients on its implications.