The protection of children's data needs strengthening – will you need to reinforce your data protection practices? | Fieldfisher

The ICO has now released the final version of its Age Appropriate Design Code. The objective of the Code is "to protect children within the digital world" and to provide an online environment that offers children a similar level of protection to that which they receive in the offline world. The full Code is available on the ICO's website.

To achieve this aim, information society services, i.e., those companies and organisations offering online products and services such as apps, websites, games and connected toys or devices (with or without a screen), which process personal data and are likely to be accessed by children in the UK, will need to comply with the 15 standards that the Code specifies.
The text is the result of extensive stakeholder engagement: an initial collection of views and evidence from "designers, app developers, academics and civil society", research with parents and children, and a public consultation that prompted a series of stakeholder round tables and a raft of follow-up meetings between the ICO and respondents.

What does the Code mean by "Child"?

It is notable that the definition of "child" comes from the UNCRC (United Nations Convention on the Rights of the Child): for the purposes of the Code, a child is a "person under 18". That is some five years above the age (13) at which a child can give consent to the processing of their data under the DPA 2018.

Do we need to follow the Code?

What does "likely" mean in practice?  The possibility of children accessing your service "needs to be more probable than not", i.e., in reality, are children using your service?  To determine this, the ICO suggests you consider whether the nature and content of your service appeal to children, and how your service is accessed as a result.  Are there any preventative measures in place to stop children accessing it?  If you do not want children to access your service, you will need to take active steps to prevent them from doing so.  There is no specific requirement to use an age verification service to confirm the age of your users; the Code provides a non-exhaustive list of alternative ways in which age can be established.  Naturally, in an age of accountability, if you decide that your service is not likely to be accessed by children, it is advisable to document your reasons for that decision, with evidence.

What about services based outside the UK?

Services based in the UK will need to comply with the Code.  The Code also extends to services based outside the UK which have a "branch, office or other 'establishment' in the UK and process personal data in the context of the activities of that establishment".  The Code does not apply to establishments elsewhere in the EEA if there is no UK establishment.  However, at the end of the Brexit transition period, the Code will apply to non-UK (including EEA) establishments that offer services to the UK or monitor the behaviour of users in the UK.  This is one consequence of the UK GDPR for EEA-based services.

Are we expected to adapt our services to children?

Having determined whether the Code applies to you, the next step is to consider each of the 15 standards which, whilst individual in themselves, need to be viewed as a whole.  Not surprisingly for a Code designed to protect children, the first standard is to ensure that the best interests of the child are to the fore when online services likely to be accessed by children are being designed and developed.

"Best interests" is another concept borrowed from the UNCRC, which incorporates a medley of aspects that support a child's development, from safety, health and wellbeing to family relationships, psychological and emotional growth, identity, privacy, freedom of expression and a voice.  The best interests of a child are of paramount importance in the Code, and it will be necessary to balance these against competing interests.  The Code categorically states, "it is unlikely … that the commercial interests of an organisation will outweigh a child's right to privacy".  Overall, though, the Code's message is that it does not prevent the pursuit of commercial or other interests where children are concerned.  Nevertheless, a child's best interests will more than likely override any competing interests.

We have already done a DPIA! Is that enough?

Online providers may well already be implementing some of the Code's other standards.  For example, you may already have a robust procedure for data protection impact assessments, and data protection by design and by default may be an embedded business-as-usual practice across your organisation.  It may, however, be necessary to revisit an existing DPIA to make certain that you have assessed and mitigated risks to the rights and freedoms of children in a way that takes "into account differing ages, capacities and development needs", to ensure that the DPIA is compliant with the Code.

Under the Code, privacy settings will need to be high by default, unless there is a compelling reason for them not to be.  Other familiar areas of data protection covered include standards on data minimisation, geolocation and profiling.
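For service designers, the "high by default" requirement can be pictured as a settings object whose initial values are always the most protective ones. The sketch below is purely illustrative (the class and field names are our own, not taken from the Code): a child user's settings start high-privacy, with geolocation and profiling off, and only a deliberate change can lower them.

```python
from dataclasses import dataclass

# Illustrative sketch only (names are ours, not the Code's): the most
# protective value is the default for every setting, so a child's data is
# protected unless a setting is deliberately and justifiably changed.
@dataclass
class ChildPrivacySettings:
    profile_visible_publicly: bool = False  # "high privacy" by default
    geolocation_enabled: bool = False       # off by default
    profiling_enabled: bool = False         # off by default (Standard 12)

# A freshly created account gets the protective defaults automatically.
settings = ChildPrivacySettings()
```

The design point is that the safe state requires no action from the child; any less protective state has to be an explicit, documented choice.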

A catalyst for the inception of the Code was children being exposed to inappropriate content considered detrimental to their health.  The Code suggests that online providers look at existing benchmarks used, for example, by the film and marketing industries, or at official Government sources such as the Chief Medical Officer, to help determine what is appropriate.  Standard 12 is concerned with profiling, the settings for which need to be off by default, unless there is a persuasive argument for them not to be and it is in a child's best interests for them to be on by default.

The profiling standard explains that where content is automatically recommended to children using their data, such as their browsing history, then the online provider will have a responsibility for the recommendations made, and one that is greater than if the child searched for the content themselves.  The content suggested should in no way be detrimental or recognised as harmful to a child.  This is another aspect of the Code where providers will be held to account.  

Standards that are perhaps more specific to the Code include:
  • Nudge techniques – these should not be used to encourage children to provide unnecessary personal data, or to lower or turn off their default high privacy settings;
  • Parental controls – a topic that generated much conversation at the round tables held to discuss the draft Code. Icons can be used to alert a child to the fact that their parents are actively monitoring or tracking them; and
  • Data sharing – children's information should not be shared unless the online provider can demonstrate a compelling reason to do so, taking account of the best interests of the child.

Can children have cookies?

The use of non-essential cookies may well prove a challenge, since prior consent will need to be obtained; for children under 13, that means parental consent.  However, given the extensive change under way in the adtech industry, the consultation that has already taken place on the Code with the tech industry, and the further engagement planned between now and when the Code takes effect, this issue should not prove insurmountable, and a child's online user experience can remain seamless.

When should we start thinking about this?

In short, now is a good time to have this on your radar. Whilst the Code is unlikely to take effect before Autumn 2021, due to the legislative process and its 12 month implementation period, it is important to consider whether the Code applies to you and, if so, to determine what you need to do to comply.

When considering how to comply with the Code's standards, give particular attention to those that may prove more challenging.  For example, how will you ensure the design of your service offers an "age appropriate application", i.e., that the different needs of children at a particular age and stage of development are central to its design?  The Code adopts five age categories for children: 0 – 5; 6 – 9; 10 – 12; 13 – 15; and 16 – 17, and online providers will need to ensure that the measures they implement are appropriate to the age range of their users.  Protections established for young adults aged 16 – 17 may be markedly different to those for young children in the 0 – 5 age group.
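In practice, tailoring measures to those five ranges starts with classifying a user's age into the right band. The helper below is an illustrative sketch only (the function and labels are ours, not the Code's); it simply maps an age to one of the Code's five categories, with anyone 18 or over falling outside the Code's definition of a child.

```python
# Illustrative helper (names are ours, not the Code's): map a user's age
# to the five age ranges the Code adopts, so protections can be tailored
# to developmental stage.
AGE_BANDS = [
    (0, 5, "0-5"),
    (6, 9, "6-9"),
    (10, 12, "10-12"),
    (13, 15, "13-15"),
    (16, 17, "16-17"),
]

def code_age_band(age: int) -> str:
    """Return the Code's age-band label, or 'adult' for users 18 and over."""
    for low, high, label in AGE_BANDS:
        if low <= age <= high:
            return label
    return "adult"  # 18+ falls outside the Code's definition of a child

print(code_age_band(11))  # prints "10-12"
```

A real service would of course attach different default settings, content rules and messaging to each band, rather than just a label.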

Invariably, if your services are directed at, or likely to be accessed by, children, you will need to make some amendments to your data protection practices.  Whilst the Code itself is a chunky piece of guidance, it is worth noting that the glossary, followed by a number of annexes, starts at page 89.  Equally, it is good to see that the ICO is "preparing a significant package of support for organisations" and that its next phase of work "will include significant engagement with organisations".  There is a strong desire to ensure that children are given appropriate protections online, and this can be achieved in a number of ways: the Code is not "prescriptive" and allows a level of flexibility in how the standards are met.  Given the length of the implementation period, and the expectation that GDPR enforcement will be fast moving in 18 months' time, it is advisable to begin considering how the Code will impact your business and to determine, where applicable, how you will demonstrate that your service uses children's data fairly and in a compliant manner.