Age assurance: a modern coming-of-age approach to ensure the safety of children online and an age-appropriate experience

Requirements for online services are maturing and driving greater certainty for age assurance in the UK, Europe and the US

Are children accessing your online services? Age verification regulations around the world are changing, enforcement is increasing and age verification technology is evolving at speed. Here's what you need to know.

In our previous article on the recent loot box transparency guidance, we highlighted how the principles relating to age assurance may prove more substantive than some of the others. This is partly because loot boxes are a relatively problematic topic, but equally because a number of other initiatives, by way of regulatory guidance and legislation, are driving the implementation of more robust age assurance techniques online and across the gaming ecosystem. Here, we examine those other influencing factors.

In recent months we have seen the enactment of the Online Safety Act (OSA) in the UK; the Irish Data Protection Commission (DPC)'s decision in the matter of TikTok Technology Limited; the release of the UK Government's guidance on the use of "loot boxes"; and an FTC consultation on parental consent under COPPA (the Children's Online Privacy Protection Act 1998) in the US. One thing all these matters have in common is the consideration of the age of the user of online services, in particular whether the user is a child: under 13 years of age, or under 18 with respect to UK and EU law and guidance. Whilst maximum fines of €20 million / £17 million or 4% of global annual turnover of an undertaking under the EU's GDPR (General Data Protection Regulation) caught the headlines when that Regulation was finalised, the OSA far exceeds that at £18 million or 10% of global annual turnover for offences such as regulated services failing to prevent children from accessing content that may be harmful to them, or social media providers failing to block the accounts of children under 13 in accordance with their own terms and conditions.

This news is the latest in a series of updates from legislators and regulators that necessitate more effective age assurance for children online and better controls to prevent them from accessing potentially inappropriate content. Here we consider what these developments mean for the regulation of age assurance in the UK, Europe and the US, and what this means in practice for businesses' global compliance programmes.

What are the current requirements?

The requirements for age assurance mechanisms online, and the available technology, are still maturing. For some time, online services have been required to identify when they are dealing with children in a variety of contexts. For example, under COPPA, website operators are required to obtain verifiable parental consent before collecting personal information from children under the age of 13. Similarly, in 2018, the GDPR introduced the concept of a "digital age of consent" under Article 8. This requires data controllers to "make reasonable efforts to verify" that consent is being given or authorised by the holder of parental responsibility where they are processing the personal data of a child below the applicable age of consent (13 to 16, depending on Member State law).

With respect to COPPA, age assurance has traditionally been assessed according to a risk-based approach, in proportion with the intended purposes (for example, sending an email to the parent has often been considered sufficient where the data was used for internal purposes only and not shared with third parties). Equally, under the GDPR, the obligation is explicitly qualified by the requirement to take into consideration "available technology", including the "state of the art".

What progress has there been?

In recent years we have seen change gathering momentum in three main ways:

  1. Regulation – the distribution of guidance from regulators and statutory codes focused on the protection of children online;
  2. Enforcement – the stricter enforcement of existing provisions; and
  3. Technology – the development and testing of new tools to facilitate compliance.

Increased Regulation of Children's Data and Online Safety

The UK Information Commissioner's Office's (ICO) Age Appropriate Design Code (AADC), also known as the Children's Code, was "the first of its kind": a statutory code setting out how information society services likely to be accessed by a child (an individual under 18) need to protect children's data in addition to the requirements of the GDPR. The UK's Children's Code was closely followed by the Irish Data Protection Commission's (DPC) Fundamentals, and the passing of California's Age-Appropriate Design Code Act, which has now been blocked by a preliminary injunction that is itself subject to further legal challenge.

These developments mark a notable shift towards a more technical approach to age assurance. For example, Standard 3 (Age Appropriate Application) of the UK AADC, supported by the Information Commissioner's opinion on age assurance, asks providers of online services to consider whether it may be appropriate to engage a third-party age verification service, or to make use of artificial intelligence tools to estimate a user's age. The age assurance method or methods selected by an online service provider need to be appropriate and proportionate to the data being processed, whilst also complying with data minimisation, as the sketch below illustrates.
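
By way of illustration only, a provider's documented assessment might map the riskiness of its processing to a proportionate assurance method. The following minimal Python sketch rests on our own assumptions: the risk tiers and method names are hypothetical, and are not drawn from the Code or the Commissioner's opinion.

```python
# Hypothetical mapping of processing risk to an age assurance method,
# reflecting the proportionality principle in Standard 3 of the AADC.
# The tiers and method names below are illustrative assumptions only.
ASSURANCE_BY_RISK = {
    "low": "self_declaration",     # e.g. a neutrally worded age gate
    "medium": "age_estimation",    # e.g. facial age estimation; no ID retained
    "high": "hard_verification",   # e.g. third-party check against an ID document
}

def select_assurance_method(risk_tier: str) -> str:
    """Return the least intrusive method matching the assessed risk,
    consistent with data minimisation."""
    if risk_tier not in ASSURANCE_BY_RISK:
        raise ValueError(f"unknown risk tier: {risk_tier!r}")
    return ASSURANCE_BY_RISK[risk_tier]
```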

Simultaneously, legislative initiatives like the EU's Digital Services Act (DSA) and the UK's OSA have imposed stricter obligations on businesses to restrict children from accessing certain kinds of content online. The former (which came into force in August 2023 for the largest platforms) prohibits targeting advertisements at children under 18 on the basis of profiling (further necessitating authentication of the age of users of online services). The OSA, in its final Parliamentary form, imposes broad obligations on online platforms to prevent children, likewise defined as persons under the age of 18, from accessing content which would be inappropriate for their age, such as pornography or content encouraging, promoting or providing instructions about suicide, eating disorders and self-harm. To prevent such access, relevant online service providers, termed "regulated services" in the Act, need to adopt appropriate age assurance mechanisms.

The OSA will be regulated by Ofcom, which, in its Roadmap to Regulation, highlighted how it will build on and align with the ICO's work in this area, stating:

"The ICO has done a great deal of work to protect children online, including producing guidance on age assurance and how firms should assess whether their services are likely to be accessed by children. We will use this work to inform our own approach to guidance and Codes of Practice, and to ensure we have a common understanding of the interactions between online safety and privacy. We will consult the ICO before publishing relevant online safety guidance".

Finally, sector-specific initiatives are concurrently making this a priority. As mentioned in our opening paragraph, we recently saw this with the guidance on loot boxes from UK Interactive Entertainment (UKIE), the videogame industry's trade body in the UK, which, in addition to setting out various transparency and information requirements particular to those mechanisms, also mandated the development of new technological controls to effectively restrict anyone under the age of 18 from acquiring a paid loot box without the consent or knowledge of a parent or guardian.

Stricter Enforcement

Regulators on both sides of the Atlantic are prioritising the safety of children online, and there have been a number of recent enforcement actions involving children.

In the gaming sector, an industry which is extremely popular with children and adults alike, we have seen highly publicised FTC settlements involving Epic and Microsoft in connection with COPPA violations, arising from failures to appropriately limit the sharing of children's data before confirming parental consent.

Similarly in Europe, we have seen substantial fines levied against Meta (in Ireland) in relation to Instagram (September 2022) and, most recently, TikTok (September 2023) for allegedly processing children's data inappropriately and not configuring settings to the highest privacy standard by default. The ICO fined TikTok (UK) £12.7 million (April 2023), estimating that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020. Whilst all three of these matters relate to the processing of children's data before the emergence of the respective statutory codes and regulatory guidance, both regulators have looked to the GDPR's articles and recitals, in particular recital 38, in addition to the European Data Protection Board's (EDPB) guidelines, with respect to the additional protection children need to be given. The Instagram and the two TikTok decisions are under appeal, and it will be interesting to see how the courts apply the law.

With respect to the Irish DPC's draft TikTok decision, the Italian Supervisory Authority objected to the Irish DPC's finding that TikTok had complied with the GDPR with regard to the technical and organisational measures in respect of age verification under Article 25(1). Following its review, however, the EDPB's binding decision directed the Irish DPC to find that it was not able to conclude that:

"the technical and organisational measures in respect of age verification processes themselves undertaken by TTL during the Relevant Period [31 July 2020 – 31 December 2020] in respect of the age verification processes … infringed the GDPR".

The EDPB provides much commentary on age verification, both in general and in the specific case, in its binding decision (BD) at paragraphs 166–246, which are embedded into the Irish DPC decision. Two particular points leap out: first, what other companies in the market are doing; and second, the state of the art and how that links to the appropriate technical and organisational measures selected in order to comply with Article 25(1) GDPR. With respect to market behaviour, the EDPB highlights how "a particular controller's compliance with Article 25, GDPR is assessed on a case-by-case basis", meaning that looking at what others are doing in the market, by way of comparison, might not be an appropriate approach (paragraph 216, EDPB BD).

Given the speed of technological change in recent times, the EDPB reminds controllers that the "state of the art" is a concept that is "regularly changing over time". To account for this, the EDPB emphasises that:

"a controller therefore has to periodically review whether the measures applied are still appropriate at the current moment, taking in all factors under Article 25(1), GDPR" (paragraph 246, EDBP BD).

Better Tools

In recent years, we have seen the development of more sophisticated tools able to support the more stringent requirements to authenticate the age of a user. There has been a shift from "age verification" tools (which depend on comparing a particular individual against a trusted database of information) to "age estimation" technologies (which instead predict a person's age bracket based on biometric markers). The sketch below illustrates the distinction.
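
As a purely illustrative sketch (the function names, data structures and the placeholder "model" below are our own assumptions, not any vendor's actual API), the two approaches differ in what data they need and what they retain:

```python
import datetime

# --- Age verification: compare a claim against a trusted record. ---
# TRUSTED_RECORDS stands in for a government or credit-reference
# database; in practice the lookup is typically done by a third party.
TRUSTED_RECORDS = {"user-123": datetime.date(2001, 4, 12)}

def verify_is_adult(user_id: str, today: datetime.date) -> bool:
    """Return True only if a trusted record proves the user is 18+."""
    dob = TRUSTED_RECORDS.get(user_id)
    if dob is None:
        return False  # no trusted record: verification fails closed
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

# --- Age estimation: predict an age bracket from a biometric sample. ---
def estimate_age_bracket(facial_image: bytes) -> str:
    """Return an age bracket without retaining the image.

    A real system would run a trained model here; this placeholder
    merely illustrates the shape of the interface.
    """
    bracket = "18_plus" if len(facial_image) > 1024 else "13_17"  # dummy stand-in for a model
    del facial_image  # drop the local reference: only the bracket, not the biometric, is kept
    return bracket
```

Note that the estimation interface returns only a bracket: the sample can be discarded as soon as the prediction is made, which is the privacy advantage discussed below.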

We have seen some of the bigger names in this arena (Yoti and SuperAwesome) recently apply to the US Federal Trade Commission (FTC), alongside the Entertainment Software Ratings Board (ESRB), to have their “Privacy-Protective Facial Age Estimation” technology recognised as an acceptable parental consent mechanism under COPPA. As explained by the Age Verification Providers Association (AVPA), these kinds of technology have the advantage of not needing "to retain any information about an individual, as the result is immediate and the facial image can be instantly deleted". Indeed, promotional material claims that such technology does not require "enough data for that data to be unique to the individual", which could potentially mean it avoids being classified as "personal data" altogether. Yoti also participated in the ICO's sandbox programme, which enabled it to engage with the regulator and benefit from the regulator's expertise and advice on mitigating risks and implementing ‘data protection by design’.

Whether any one age estimation, assurance or verification system will be endorsed by regulators remains to be seen. However, with industry standards like ISO/IEC 27566 on age assurance systems and bodies like the Age Verification Providers Association continuing to progress this area, together with regulators becoming more familiar with how to measure these technologies, we can expect that they will only become more accurate and commonplace.

What does this mean for business?

Overall, the expansion of regulatory requirements on age gating, the associated increase in regulatory scrutiny to enforce the adoption of these new systems and tools, and civil society campaigns mean that businesses across the board will want to re-evaluate how appropriate their current arrangements are.

Whilst it may not be appropriate in all cases for a business to implement a third-party age estimation service, businesses do need to assess, document and be able to evidence that the age assurance mechanism they have selected is suitable to their data processing and purposes, and that it demonstrably works in practice for them. The EDPB in its binding decision on TikTok expressed "serious doubts as to whether TTL has demonstrated, as required by the accountability principle, measurable effectiveness of the implemented ex ante measures" (paragraph 234).

Increased technical sophistication and closer regulatory scrutiny most certainly place the focus on businesses to consider the most appropriate, state-of-the-art measures for their child users and, where necessary, to confirm that their users are adults, at least from a data protection perspective. Consideration also needs to be given to Ofcom's age verification guidance for the UK market, when available, as well as requirements from the EU and US. Other countries worldwide are concentrating on this too, as Ofcom's recent meeting of the Global Online Safety Regulators Network (GOSRN) shows. With the priority of children's data and online safety shared at a global level, and given the level of fines available to regulators, businesses need to regularly assess their age gating methods and consider whether they comply with the GDPR as well as more recent legislation aimed at tackling online safety in the UK and EU, taking into account relevant guidance.

Interestingly, research suggests that some parents are not seeking to rigidly enforce age assurance methods, but rather allow their children under the age of 13 to circumvent age gating, and even facilitate it. The research also showed how children are:

"gaining access to their parents’ settings or parental control apps, creating new accounts online when parents followed them on social media, and changing their IP address using a VPN to avoid controls on Wi-Fi settings".

These issues highlight how children's safety online is a multi-stakeholder problem, and not one that technology and online service providers will be able to resolve on their own. Whilst regulators will advocate for online service providers to liaise with child users to understand how they want to be able to use their services, research shows that parents and educators will, of course, also have their part to play.

Age assurance requirements online are most definitely maturing, and together with other measures they can provide a safe online environment for children. It is inevitable that we are going to see a lot more activity in this area in the forthcoming months. Given the need to regularly monitor the "state of the art", as well as to be able to demonstrate that the mechanism you adopt works in practice for your business, it is an area that we all need to proactively manage.

With sincere thanks to my co-author and colleague James Russell, Legal Advisor, Fieldfisher, California.

