CNIL issues 50 million euro fine against Google in the first major GDPR infringement case | Fieldfisher

On 25 and 28 May 2018, the French Data Protection Authority (the "CNIL") received group complaints from the associations None Of Your Business and La Quadrature du Net. La Quadrature du Net was mandated by 10 000 people to refer the matter to the CNIL. The associations claimed that Google did not have a valid legal basis to process the personal data of its users, particularly for ads personalization purposes. The complaints focused specifically on Android's set-up process where users need to create a Google account in order to use their device.

On 21 January 2019, the CNIL's Restricted Committee – which is responsible for imposing sanctions – found two types of GDPR infringements: a lack of transparency and information regarding the processing operations carried out by the tech giant, and a lack of legal basis for the processing of personal data for advertising purposes. The CNIL considered that these substantial breaches undermined the legitimate aspirations of individuals who wish to maintain control over their own personal data.

This article summarizes the CNIL's decision on these two fundamental issues and draws some practical observations that may concern other Internet players.

For the purposes of this article, relevant portions of the CNIL's decision have been translated. Please be advised that these are neither official nor certified translations.

  1. Violation of the transparency principle and the obligation to inform users

On the merits, the CNIL found that the information Google had provided to its users did not comply with the principles of accessibility, clarity and intelligibility set out in Article 12 of the GDPR. The Restricted Committee also stressed that some of the mandatory information listed in Article 13 of the GDPR had not been provided to the data subjects.

  • On accessibility: the importance of structuring the information provided

Article 12 of the GDPR states that information must be provided in an "easily accessible form". The CNIL stated that the obligation of accessibility depends in part on the ergonomic choices made by the controller.

In this case, the French DPA noted that the information was "scattered" across several documents, making it difficult for data subjects to easily access the entirety of the information. The CNIL further considered that these documents "contain buttons and links that must be activated to obtain additional information", which leads to a "fragmentation of information" and forces users to multiply the "number of clicks" necessary to access the various documents. On this point, the CNIL notably found that information about ads personalization processing was only retrievable after "five actions of the data subjects". All in all, this made the information difficult to find, even for privacy practitioners.

It is worth noting that the CNIL assessed the "overall layout of information" that was put in place by Google, suggesting that, more than ever, the transparency principle must be embedded into the user experience; in particular, data controllers must pay attention to the first level of information that is provided to data subjects (see below).

  • On clarity and intelligibility
  • Higher scrutiny for massive and intrusive processing

The CNIL considered that the obligation of clarity and intelligibility must be assessed in light of the nature of each processing operation and taking into account its concrete impact on data subjects.

In this case, the CNIL regarded the processing of personal data carried out by Google as “massive and intrusive in nature”. This qualification arose from a number of factors detailed with great precision by the French DPA:

  • The significant number of services offered by the company;
  • The wide variety of sources the data originated from (e.g., Gmail account, YouTube, Google Analytics, etc.);
  • The very nature of some of the data obtained individually (e.g., geolocation data, browsing history or other data likely to reveal with a “significant degree of accuracy many of the most intimate aspects of data subjects' lives”);
  • The combination of the said data.

Consequently, the particularly massive and intrusive nature of Google's processing triggered heightened scrutiny. The CNIL considered in particular that the principle of clarity and intelligibility must be assessed in light of the particular characteristics of the said processing operations. In this case, users were not able to sufficiently understand the specific consequences of the processing. The descriptions of the purposes of the processing and of the collected data were deemed too generic, too vague and incomplete. Such descriptions did not allow users to measure the extent of the processing and the degree of intrusion into their private life. The CNIL appears to apply a balancing test here: the more invasive the processing, the clearer and more comprehensible the information must be.

  • Assessment of the clarity and intelligibility of the legal basis

The CNIL also considered that the lack of clarity and intelligibility extended to the legal basis for the ads personalization processing. While Google claimed to rely solely on consent as the legal basis, the CNIL found that the company also relied on its legitimate interests in its privacy policy, in particular to carry out marketing activities. The Restricted Committee found that the distinction between ads personalization processing and marketing processing was rather unclear and did not allow users to understand which processing operations relied on consent and which on legitimate interests. The CNIL highlighted the importance of defining a clear and distinct legal basis for each processing operation and, more broadly, of clearly defining the nature of the processing operations envisaged and their respective legal bases.

  • On the information to be provided to data subjects

Citing Article 13 of the GDPR, the CNIL recalled that data subjects must receive fair processing information, confirming that it regards transparency as a key component of the European data protection framework. Some interesting developments are worth highlighting.

  • No distinction between the transparency obligations set out in Article 13(1) and Article 13(2)

Data protection practitioners have long agreed that it is unclear why the information to be provided to data subjects under Articles 13(1) and 13(2) is set out in two different provisions. As a reminder, in its detailed guidance on the right to be informed, the UK's Information Commissioner's Office (ICO) implies that both sets of information should be given to data subjects in all cases.

In this instance, the CNIL heavily sanctioned Google for failure to specify the period for which the personal data will be stored. The French DPA especially stated that "this information is one of the mandatory information to be provided to the persons concerned pursuant to Article 13(2)(a) of the Regulation".

Thus, the position of the French DPA supports the view that there is no practical distinction between Articles 13(1) and 13(2) and, consequently, that a data controller has no discretion to pick and choose the type of information to be provided to data subjects.

  • Clarification on the amount of information to be provided: "just the right amount"

In its defense, Google argued that the right to be informed must be assessed in light of all the information tools made available to users at the time of the creation of their accounts and thereafter. Here, and unsurprisingly, the CNIL recalled that compliance with Article 13 of the GDPR must be fully achieved at the time of the creation of the account, or at the time when personal data are obtained.

However, the CNIL pointed out that the “provision of comprehensive information in the context of the very first layer of information would be counterproductive and would not comply with the transparency requirement”. Thus, the Restricted Committee seemed to distinguish between a “first layer of information” (i.e., provided at the time of the creation of the account) where data subjects should be enabled to grasp the “number and scope of the data processing operations undertaken”, and “further layers of information” (i.e., after the account has been created) where more comprehensive information should be provided.

As a result, it appears that practitioners must strike a balance to determine just the right amount of information to be provided to data subjects: saying too much too soon may turn out to be counterproductive and saying too little too late may be deemed an infringement of the obligation of transparency and information. This promises to be a complex balancing exercise for internet players who offer a plethora of interrelated online services.

  2. Violation of the obligation to have a legal basis for ads personalization processing: consent not validly obtained

On the merits, the CNIL found that Google violated its obligation to have a legal basis for the processing as set out in Article 6 of the GDPR or, more precisely, that valid consent had not been obtained, as it was neither sufficiently informed, nor specific, nor unambiguous.

  • Not sufficiently informed

The Restricted Committee's decision confirmed that consent and transparency go hand in hand. In line with the EDPB's guidelines on consent, the CNIL recalled that for consent to be informed, data subjects must be clearly told what they are consenting to. In this case, due to the fragmented nature of the information and the lack of clarity on the exact nature of the processing, data subjects could not form a fair and informed perception of the nature and amount of data collected. Thus, consent was not sufficiently informed.

  • Neither specific nor unambiguous

The CNIL also contended that Google's chosen user experience led to blanket consent. Indeed, even though users could modify some options associated with their account by clicking on the 'more options' button, the account personalization settings were pre-checked by default, which reflected, unless otherwise specified, users' consent to ads personalization processing. The fact that a positive action by users was necessary to opt out of such settings meant that consent was not given by means of a clear affirmative action and thus was not unambiguous. Furthermore, where users did not click the 'more options' button, they had to agree to Google's terms of service and to the processing of their personal data as detailed therein. In so doing, users accepted all data processing as a whole. This blanket acceptance resulted in a non-specific and thus invalid consent.

On this last point, the Restricted Committee offered an interesting observation. The CNIL contemplated that, to some extent, a more generalized consent could legally be obtained for different but related purposes. For such generalized consent to be allowed, data subjects must be informed in advance about the different purposes of the processing and given the possibility to give consent for each purpose separately. Only then can data subjects be offered the choice to accept or refuse all data processing operations as a whole. The Restricted Committee specified that this should be the case without them having to take any particular action to access the information, such as clicking on a 'more options' button.

Once more, the CNIL's analysis underscores the importance of designing a clear notice mechanism. More generally, it shows the importance of raising awareness of privacy issues among the developers who design the user experience.

  3. What now?

Unsurprisingly, in the days that followed the CNIL's decision, Google announced that it would appeal the decision before France's Supreme Administrative Court ("Conseil d'État"). From a procedural standpoint, this case is far from over, and privacy practitioners will have their eyes riveted on the Court's ruling.

In the meantime, the CNIL has given us some significant takeaways to chew on. First, transparency and lawfulness are essential components of any data processing activity. If you get them wrong, your entire processing activity may be flawed. Second, while the CNIL did consider that Google had not obtained valid consent, it did not analyse in detail whether Google could rely on its legitimate interests for some of its (less intrusive) processing activities. This may come as a disappointment for many companies in the ad tech sector who were hoping that the CNIL's decision would provide clarity on the possible legal grounds on which they can (or should) rely to run their business. For better or for worse, this leaves the door open to future litigation on the legitimate interest ground.

Lastly, this decision does finally answer one question: who will the EU regulators go after first? It comes as no surprise that the first massive fine under the GDPR was imposed on Google. This is likely only the beginning of a series of DPA actions against US tech giants across Europe. However, companies in other business sectors that are less in the spotlight should not underestimate the risk of sanctions against their own business if they fail to comply with the GDPR. This is only the first sanction, and there will be many more to come…

With special thanks to Paola Heudebert for her valuable contribution to this article.
