The Microsoft / FTC $20M Settlement – how to stay ahead of the game in children's privacy

Paul Lanois
05/07/2023

On 5 June 2023, the US Federal Trade Commission ("FTC") announced that Microsoft will be required to pay $20M to settle charges for alleged violations of the Children's Online Privacy Protection Act ("COPPA") and the associated COPPA Rule.

The FTC found that Microsoft had violated COPPA by collecting personal information from children who signed up to its Xbox gaming system without first notifying their parents, and by illegally retaining the children’s personal information.

The news follows a similar (and much larger) settlement between the FTC and another games company, Epic Games, Inc. (maker of "Fortnite"), which we looked at back in January 2023, as well as a string of similar settlements in just the last few weeks with Amazon (over its use of Alexa voice recordings) and Edmodo (an EdTech provider), both also found to have violated children's privacy requirements. This case further highlights the transatlantic trend of regulators focusing on the protection of children and children's data generally, and on videogame companies in particular.

What was the Microsoft settlement about?

In its District Court complaint, the FTC outlines its allegations.

Under §312.4(a) of the COPPA Rule, "covered operators" are required to provide notice and obtain verifiable parental consent before collecting, using, or disclosing personal information from children. "Covered operators" includes any operator of a commercial website or online service directed to children and any operator of a commercial website or online service that has actual knowledge or should have known that it is collecting or maintaining personal information from children. The FTC found that, between 2017 and 2021, 218,000 US users had created a Microsoft account through their Xbox console, entering a birthdate indicating that they were younger than 13 years old.
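For product teams, the practical trigger here is the age gate: a self-declared birthdate indicating a user under 13 is what brings the COPPA Rule's notice and consent obligations into play before further personal information is collected. Below is a minimal sketch of that check in Python; the function name and structure are purely illustrative and not drawn from Microsoft's actual implementation.

```python
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13  # COPPA's obligations attach to children under 13


def is_under_13(birthdate: date, today: Optional[date] = None) -> bool:
    """Return True if a self-declared birthdate indicates a child under 13."""
    today = today or date.today()
    # Age in whole years, adjusted down if the birthday has not yet
    # occurred in the current calendar year.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < COPPA_AGE_THRESHOLD


# Example: a birthdate entered during sign-up
print(is_under_13(date(2012, 9, 15), today=date(2021, 6, 1)))  # True (age 8)
```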

According to the FTC, the company was required to:

  1. Provide clear, understandable, and complete notice of its information practices, including specific disclosures, directly to parents;
  2. Post a prominent and clearly labeled link to an online notice of its information practices with regard to children including specific disclosures;
  3. Obtain verifiable parental consent before collecting, using, and/or disclosing personal information from children; and
  4. Not retain personal information collected from children online for longer than reasonably necessary to fulfill the purpose for which the information was collected.

Regarding the first two requirements, the FTC found that Microsoft's notice during the Xbox account creation process failed to sufficiently describe its collection and use practices with regard to personal information collected from children, and instead only directed parents to the company's "Privacy Statement". According to the FTC, this document provided insufficient detail, only discussing Microsoft's practices with regard to children "generally". In particular, the notice failed to disclose to parents that Microsoft intended to collect, for example, images that might contain a child's likeness (since children could upload a picture of themselves as their "gamerpic" profile image).

Regarding the third obligation (parental consent), the FTC found that, during the account creation process, Microsoft only prompted self-identified under-13s to involve a parent after it had already collected various important pieces of personal information (including a telephone number and approval to share user data with advertisers).
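The ordering is the crux for sign-up flows: once the age gate flags a child, collection of anything beyond what is needed to obtain consent should be deferred until verifiable parental consent has actually been recorded. The sketch below illustrates that gating logic using assumed, simplified data structures; COPPA does not prescribe a particular consent mechanism, and this is not Microsoft's implementation.

```python
from dataclasses import dataclass, field


@dataclass
class SignupSession:
    """State accumulated during account creation (hypothetical model)."""
    birthdate_indicates_child: bool          # result of the age gate
    parental_consent_granted: bool = False   # set once a parent verifiably consents
    collected: dict = field(default_factory=dict)


def collect_optional_data(session: SignupSession, phone: str, ad_consent: bool) -> None:
    """Refuse to collect phone numbers / advertising preferences from a
    self-identified child account until parental consent is on record."""
    if session.birthdate_indicates_child and not session.parental_consent_granted:
        raise PermissionError(
            "Verifiable parental consent required before collecting further personal information"
        )
    session.collected.update({"phone": phone, "ad_consent": ad_consent})


# A self-identified under-13 session: the optional collection step is blocked
child = SignupSession(birthdate_indicates_child=True)
try:
    collect_optional_data(child, phone="+1-555-0100", ad_consent=True)
except PermissionError as exc:
    print(exc)  # consent has to come first
```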

Finally, the FTC found that between 2015 and October 2020, Microsoft had indefinitely retained personal information collected from around 10 million individuals (including children) where the account creation process had been initiated but never completed, an issue Microsoft has since described as a "technical glitch" that was inconsistent with its 14-day retention policy in such cases.
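The retention finding is a reminder that a written deletion policy needs an enforced clean-up path for abandoned sign-ups. Here is a minimal sketch along those lines, assuming the 14-day limit Microsoft described and a hypothetical record format.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_LIMIT = timedelta(days=14)  # the retention period Microsoft described


def purge_incomplete_signups(records: list, now: Optional[datetime] = None) -> list:
    """Drop incomplete account-creation records older than the retention limit.

    Each record is assumed to look like:
        {"id": ..., "completed": bool, "created_at": datetime}
    (a hypothetical schema, for illustration only).
    """
    now = now or datetime.now(timezone.utc)
    return [
        r for r in records
        if r["completed"] or (now - r["created_at"]) <= RETENTION_LIMIT
    ]


# Example: the stale incomplete record is removed, the completed one is kept
now = datetime(2023, 6, 5, tzinfo=timezone.utc)
records = [
    {"id": 1, "completed": False, "created_at": now - timedelta(days=30)},
    {"id": 2, "completed": True,  "created_at": now - timedelta(days=30)},
]
print([r["id"] for r in purge_incomplete_signups(records, now=now)])  # [2]
```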

How has Microsoft responded?

As we noted with the Epic case, the key value of decisions like this often lies in the practical guidance they offer businesses in avoiding similar pitfalls.

As one would expect, Microsoft has, since the settlement, published a list of updates made in response to the criticisms. These include:

  • Updates to the account creation process (such as requiring verified parental consent before requesting phone numbers / email addresses);
  • Introducing parental "re-consent" (prompting parents to re-verify child accounts and grant permission for their child to continue gameplay and activity on Xbox);
  • Fixing the data retention "glitch" and deleting the relevant data; and
  • Updates to the Microsoft Privacy Statement (which is now also linked in "each area of the service where personal information is collected").

What does this mean for businesses?

Whilst little of this is likely to be revolutionary for similar businesses (the need to avoid technical glitches and to delete accidentally retained data is unlikely to surprise most product teams, we imagine), this case nevertheless has some important takeaways:

Firstly, the increased scrutiny of privacy notices and transparency requirements shows no sign of abating. In the EU and UK, businesses have for some time had to grapple with the WhatsApp decision, and the recent UK ICO fine against TikTok appears only to have made these requirements more demanding. However, even for US businesses that are subject to the scrutiny of the FTC, the Microsoft settlement emphasizes that a consistent compliance strategy can be difficult to design (and, more importantly, to achieve in practice).

Secondly, the previous dearth of enforcement action in the videogame space appears to be at an end. With the recent focus on children's data in particular, regulators and legislators alike seem to be waking up to the risks that videogames may pose and are dedicating more of their resources to pursuing action in this area. Combined with wider technology regulation initiatives focused on children (e.g. the UK and California Age Appropriate Design Codes, US state laws imposing social media protections for minors, the EU's Digital Services Act and the UK's prospective Online Safety Bill, among many others), videogame companies have never had more incentive to sit up and review their existing privacy practices.

Finally, the case highlights how privacy enforcement can often arise incidentally to investigations into other matters. Whilst the public record does not allow us to confirm for certain that the children's data investigation grew out of the wider antitrust investigation into the company's merger with Activision Blizzard, it is safe to assume that the latter may have increased the degree of scrutiny. Various recent cases in the EU tell similar stories, with data subject access request issues or other minor infringements leading to full-blown investigations of a company's wider privacy practices.

The message is clear: videogame companies can no longer rely on lax scrutiny when it comes to enforcement of children's rights. Rather, those who want to avoid nasty surprises will need to stay ahead of the game. 

With thanks to James Russell, legal advisor in our Silicon Valley Office, for their support in preparing this article.
