How Ofcom's approach to Video Sharing Platforms will inform its regulation of the Online Safety Act | Fieldfisher



United Kingdom

In December, Ofcom published its annual report on the Video-Sharing Platform ("VSP") regulations (the "Report"). In this article we look at Ofcom's approach to regulating VSPs over the past three years and how it will handle the transition from the VSP regime to the Online Safety Act ("OSA") in the coming 12 months.

The Report identifies some of the areas Ofcom may investigate and act on once the OSA comes into force, and the compliance steps that platforms will need to take to mitigate risk to users.

Transitioning regimes: from the VSP regulations to the OSA

Since 2020, the VSP regulations have encouraged platforms to conduct risk assessments relating to content hosted on their services. The OSA repeals the VSP regulations and, following a transition period, replaces them with duties under the OSA. The Secretary of State for DSIT (the Department for Science, Innovation and Technology) must give six months' notice of any repeal, and the transition period will end when that notice period expires. During the transition period, VSPs (or VSP elements of platforms that can be separated from the non-VSP parts of a platform) will continue to operate under the VSP regulations alongside certain elements of the OSA, such as the requirement to comply with Ofcom's information demands.

The OSA imposes broader and stronger obligations than the VSP regulations. Platforms currently subject to the VSP regulations will now be legally required to conduct risk assessments for illegal content, children's access and children's risk, and to take appropriate measures to mitigate and manage these risks. More services, and most forms of user-generated content on those services, will fall within the scope of the OSA, and harms are more broadly defined under the OSA than under the VSP regulations. Some platforms will also be subject to additional duties on transparency, user empowerment and the prevention of fraudulent advertising. The OSA therefore represents a step-change in how services must respond to online safety regulation.

Ofcom's findings: Children's use of online platforms and new obligations for VSPs

In the 2023 Report, Ofcom found that over 20% of children aged 8-17 have an online profile with an age set at 18 or over, and one-third of children aged 8-15 have an account with an age set at 16 or over.

Under the OSA, in-scope services that are likely to be used by children must carry out a children's risk assessment and apply appropriate measures to mitigate the risks of harm to children. The Report figures above suggest that many children are currently accessing services ostensibly intended for adults or older children. Platforms that are adult services aimed at users aged 18+, or services with another age limit such as 16+, will need to consider carefully whether they are likely to be used by children and, if their content is not appropriate for all ages, adopt an age assurance mechanism (or mechanisms) suited to their intended audience.

Services will also be required to conduct a children's access assessment to identify whether it is possible for children to access the service (or part of it), and whether significant numbers of children are using the service or the service is likely to attract significant numbers of children. A service may only conclude that children cannot access it, for the purposes of the OSA, if age verification or age estimation means that children cannot normally access the service. Businesses will therefore need to consider carefully which age assurance mechanism(s) they implement.

Ofcom's approach to investigations: review of age gates

In preparing the Report, Ofcom tested the age gates of different services. It noted that many VSPs rely on users declaring their ages during account sign-up. Ofcom found that services took different approaches to the default date of birth shown to users when declaring their age: some used a 'neutral' date designed not to encourage falsification (selecting the default or a nearby date would not permit access), while others used a blank field or displayed by default a date of birth that would grant access.

More stringent tools, such as facial age estimation and ID verification, are available and used by some platforms. Ofcom noted that platforms should consider which options are most appropriate given the nature of the content on their service.

The exact approach a service takes will need to balance the risk of children accessing it against compliance with data protection law. The Information Commissioner's Office ("ICO") has recently revised its opinion on the use of age assurance technologies to take account of the OSA, and collaborative research commissioned by Ofcom and the ICO has explored how to measure the effectiveness of age assurance methods.

The Report also noted Ofcom's expectation that services proactively identify and remove underage accounts, with age assurance at account setup offering the greatest mitigation of underage users accessing inappropriate material or services. Ofcom emphasised its present focus on how services prevent and remove underage users and that this will continue under the OSA. VSPs were reported to use a combination of methods such as behavioural and language analysis, and case-by-case reviews by moderators.

Where services currently rely on lighter-touch methods at account setup, such as self-declaration, Ofcom clearly does not consider this a sufficient means of detecting and removing underage accounts.

This is indicative of Ofcom's holistic approach to regulating child safety. Where one element of access control is weak, Ofcom considers that weakness to undermine a service's risk mitigation strategy as a whole. For example, strict measures to verify a user's age at account setup would be fundamentally undermined by allowing users to access content without an account.

What can providers do to prepare?

Ofcom notes in the Report that it anticipates its research under the VSP regime will inform its approach to regulating online safety under the OSA. Ofcom plans to consult on its draft codes of practice for the protection of children in Spring 2024, which will give services an opportunity to identify Ofcom's focus areas as well as to contribute their own expertise.

Given the OSA's significant penalties for non-compliance, together with Ofcom's enhanced investigative powers, it is essential for businesses to understand their obligations under the OSA. Companies will need to consider child safety, and the issue of age gating, across all facets of their offering, not just the content they host.

For advice on how the OSA may impact your business, or for guidance on compliance, contact John Brunning, Lorna Cropper and Frankie Everitt.

With thanks to Jonathan Comfort, trainee, for his assistance in drafting this article.