Security: lessons from GDPR fines | Fieldfisher
Post-GDPR, cyber and data security remain a major practical concern (alongside data subject rights, among other issues), and security compliance failures remain the number one route to a regulatory fine (alongside marketing rules violations, among other compliance failures). About 31% of national/local enforcement actions initiated were based on controller data breach notifications, according to a 26 Feb 2019 report by the European Data Protection Board (EDPB). This blog summarises key practical lessons from selected regulatory fines imposed to date under the GDPR for security issues.

We have a wealth of regulatory enforcement action decisions pre-GDPR, but enforcement post-GDPR remains relatively scarce and we have not yet seen any mega fines apart from the CNIL’s €50m fine against Google in France (not security-related). Nevertheless, several EU regulators have already taken enforcement action under the GDPR for security failures.

Most such enforcement cites Art.32 (security measures: applicable to both controllers and processors), Art.5(1)(f) (integrity and confidentiality: a core principle for controllers, subject to the higher-tier 4%/€20m fine), or both. As is well known, not notifying personal data breaches (PDBs) when required can be fined as a separate breach, and we now have cases of enforcement action for notification failures, as well as a processor being fined for security failures.

Fines relating to security obligations under GDPR

This blog draws on the following fining decisions, some of which also included supervisory authority (SA) orders to rectify the issues concerned:

  • France - property company SERGIC fined €400,000 – files from one folder, with documents supporting rental applications, were accessible online by changing the URL.
  • Germany (Baden-Wuerttemberg) – a social media platform (SMP) was fined €20,000 by LfDI after a hacker published 1.87 million usernames/passwords and over 800,000 e-mail addresses.
  • Hungary - political party DKP fined 11m HUF (€34,000), 4% of 2017 revenue, not for the security breach but for not notifying the SA or data subjects - misconfiguration allowed access by a hacker to a test database; hacker disclosed the vulnerability and published database online: 6,000 users’ full names, usernames, email addresses and passwords hashed using the weak MD5 algorithm.
  • Italy
    • Rousseau, a platform used by a political party for its websites, incorporating e-voting functions, fined €50,000 for (post-25 May 2018) still not properly implementing security measures ordered after a 2017 breach.
    • (Not a fine) Garante ordered a DPIA for the Revenue Agency’s proposed e-invoicing system, particularly regarding security issues.
  • Lithuania – payment service provider MisterTango UAB was fined €61,500 based on turnover, including for non-notification, by Lithuania’s SA, which coordinated with Latvia’s SA - screenshots of 9,000 payments involving 12 banks from different countries were exposed online for at least 2 days; it was unclear whether this was due to a third-party hack or (as the company claimed) technical error; reportedly being appealed.
  • Malta – the Lands Authority was fined €5,000 as 10GB of personal data, e.g. correspondence and sensitive data (including identity cards), was open to the Internet, and even indexed by and searchable via Google.
  • Norway – Bergen municipality fined €170,000 or NOK 1.6m - file with usernames/passwords of over 35,000 pupils and employees was openly accessible to logged-in pupils, employees or school administrators; system stored pupils’ date of birth, national identity number, address.
  • Poland – football association (DZPN) fined €13,000 for inadvertent online publication (for nearly 3 years) of 585 referees’ ID numbers and home addresses, not just names and towns, all searchable online; notified SA and data subjects but took limited remedial action (which it outsourced) and didn’t verify its effectiveness - the SA found, after a referee’s complaint, that the data remained accessible months later via URL and search, though it was then remedied.
  • Portugal - Centro Hospitalar Barreiro Montijo (BM) fined €400,000 for allowing over-broad access to patient data (it had 985 registered doctor profiles but only 296 doctors) - €100,000 for Art.32, €150,000 for Art.5(1)(f), and €150,000 for Art.5(1)(c) (data minimisation).

Main points

  • As was the case pre-GDPR, enforcement action can be initiated on a single data subject complaint (SERGIC), media report (Lands Authority, BM) or SA inspection (Lithuania) - not just through controller notification of personal data breaches (PDBs). SAs can conduct technical tests online (SERGIC) and at datacentres (Rousseau).
  • Known security issues that arose pre-25 May 2018 but continued beyond that date can be fined under the GDPR (SERGIC, Bergen, Rousseau).
  • Although there have been no mega fines yet for security failures, we have now seen the first fines for failure to comply with the GDPR PDB notification requirements.
  • Helpfully, German regulator LfDI has stated that it’s not a competition to impose the highest fines; the goal is to improve privacy and security (SMP).
  • Failing to restrict access to personal data appropriately, thereby enabling unauthorised access (SERGIC, Lands Authority), or failing to redact personal data made publicly accessible online (DZPN), often leads to fines, even where there’s no evidence of damage to individuals (SERGIC, DZPN): a mere breach of confidentiality can constitute non-material damage (DZPN).
  • A fine can be levied for not checking that a breach was remedied properly, as the breach was continuing (DZPN).
  • A processor has been fined for security failings for the first time (Rousseau).
  • Reportedly an employee who designed flawed technology behind the breach “left” (Lands Authority). (Cf. in Singapore, after hackers accessed health data, both SingHealth and its tech provider IHiS were fined, but also IHiS fired two employees, demoted one and imposed a “significant financial penalty” on 5 senior managers including the CEO).
  • Aggravating factors, potentially increasing fines, include:
    • As mentioned in Art.83(2), nature of data e.g. special category data (political opinions - DKP, Rousseau; ID/health cards, account statement etc. - SERGIC) or data of vulnerable groups (children - Bergen); large volumes of personal data/numbers of affected individuals (Bergen, Rousseau etc.); duration of breach (Rousseau; DZPN – not properly remedied).
    • Ignoring warnings and/or delaying months to fix known security issues (Bergen – both internal and by the SA), especially if quick risk-reduction steps were possible (SERGIC – postponed for summer peak operations), or not addressing them at all (Barreiro Montijo; DKP – controller believed breach of test database with real personal data need not be reported; DZPN – “fix” ineffective).
    • Processing personal data for core professional activities (higher standards apply than for ancillary, incidental or small-scale activities – DZPN).
    • Not following processes to escalate reported breaches (Bergen).
    • Delegating breach remediation and not verifying it properly (DZPN).
    • Not documenting incidents, impact and remediation under Art.33(5) – including non-notifiable ones (DKP).
    • Not notifying the breach to the SA (MisterTango) or data subjects (DKP).
  • Infringement of new GDPR obligations cited (which presumably were aggravating factors) included Art.5(2) accountability (Bergen – integrity and availability; MisterTango – deletion policy/evidence – stored for 210 days, cf. policy 10 mins), and data protection by design and by default (Art.25) (Bergen, Revenue Agency) e.g. use of obsolete technology (Rousseau).
  • Mitigating factors include:
    • Self-reporting the PDB promptly and fully (SMP), although this may carry limited mitigating weight since notification is required by law (DZPN).
    • Constructive cooperation and transparency (SMP; DZPN) and willingness to follow SA guidance to implement security improvements (SMP, Rousseau).
    • Implementing fixes/enhanced security measures quickly (Lands Authority – took down within hours; SMP – weeks), including in conjunction with the SA (SMP), or at least ultimately remediating (DZPN).
    • “the overall financial burden” of security improvements (SMP); this indicates a laudable approach: funds may be better spent on security improvements than on regulatory fines.
  • Non-notification can of course be separately fined (MisterTango) especially if the SA learned about the breach otherwise than from the controller (DKP). This is a new development post-GDPR.
  • High risk, requiring data subject notification, can arise not just because data were special category (e.g. political opinions), but also if (combined) breached data could lead to identity theft (DKP; DZPN – just risk, not “high” risk, cited), or if passwords were hashed using outdated weak algorithms like MD5 and users’ choice of passwords were not sufficiently constrained e.g. minimum length, particularly as users tend to use the same usernames/passwords for different online services (DKP).
  • Data minimisation (Art.5(1)(c)) and storage limitation (Art.5(1)(e)) breaches are often cited and may increase fines because the security breach affected data that shouldn’t have been collected or should have been deleted (SERGIC – ID/health cards, account statements, marital documents etc.; MisterTango - information on loans, pensions, payment card numbers/balance, etc.; DZPN; Revenue Agency).
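
On the MD5 point above, a brief illustration of why regulators treat unsalted MD5 as weak: identical passwords always yield identical hashes, so a leaked hash list can be reversed with precomputed (“rainbow”) tables, and cracking one account cracks every account sharing that password. A minimal sketch in Python (standard library only; the example password is of course illustrative):

```python
import hashlib

# Two users who chose the same weak password get identical unsalted
# MD5 hashes, so the hash itself leaks that they share a password,
# and a single precomputed table entry reverses both.
h1 = hashlib.md5(b"password123").hexdigest()
h2 = hashlib.md5(b"password123").hexdigest()
assert h1 == h2  # identical inputs -> identical, recognisable hashes

# MD5 is also designed to be fast, so attackers can test huge numbers
# of candidate passwords per second - the opposite of what a
# password-storage function should allow.
```

Per-user salts and a deliberately slow key-derivation function (as in the DKP, Bergen and Rousseau findings below) defeat both problems.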


Many of the learnings are not new points and probably involve common-sense measures, particularly to those working in security, but these fines underline their importance. Fines are often imposed not for breaches themselves, but for security failings discovered by the regulator when investigating breaches.

  • Organisational measures should include:
    • Document access policies (map access rights to users’ functions, and set policies for creating system users - BM), and restrict access rights appropriately (at BM, IT staff could access medical information, and doctors could access all patient files regardless of specialty).
    • Create and implement appropriate policies, processes and training - including checking information to be published online (DZPN), and training developers/IT staff on storing passwords securely, ensuring personal data is not openly accessible/insufficiently restricted, etc.
    • Document implementation for accountability (e.g. MisterTango).
  • In terms of technical measures:
    • Conduct vulnerability tests before making services public, before changes, and regularly (Rousseau, Lands Authority).
    • Apply data protection by design and by default (including security by design and by default), e.g. migrate away from obsolete technologies that are difficult to “patch” (i.e. update with security vulnerability fixes) (Rousseau).
    • Consider encrypting at least partially (Revenue Agency).
    • On account creation, check users’ passwords for security (password quality indicators, minimum length, no dictionary words that enable brute-force attacks, etc.) (DKP, Rousseau), and implement e.g. reCAPTCHA to prevent automated account creation (Rousseau).
    • Passwords shouldn’t be stored unsecured (SMP – a “clear” violation of Art.32; Bergen). Not only should they be hashed (scrambled) with a secure algorithm (DKP) but also salted (i.e. with a unique value added per user - much harder to break) (Bergen, Rousseau).
    • Indeed, arguably organisations should check proposed passwords against the most common (weak) breached passwords – some technologies enable this.
    • Implement organisational access restrictions properly (at BM, profiles for doctors who had left were not deleted; only 18 accounts were inactive!).
    • Don’t allow credentials sharing (username/password), particularly if they grant extensive access; implement unique credentials (Rousseau).
    • Use secure protocols and digital certificates to protect data in transmission (e.g. SFTP instead of FTP – Revenue Agency) and protect against spoofing of the websites (Rousseau).
    • If both service provider and DPO recommend certain enhanced security measures (two-factor authentication, for children’s data – Bergen), consider them carefully!
    • For data of vulnerable groups like children, 2FA login may be advisable (Bergen).
    • Implement proper auditing (e.g. logging database accesses/operations) for integrity and ex post control, particularly if some employees have extensive rights, which will also help in relation to insider threats.
    • The UK NCSC’s summary for Boards is useful. Mnemonics: 5Ps – phishing, privileged users, patching, providers/processors/partners and passwords. These involve both technical and organisational aspects. To these could perhaps be added another P – “publicly available inadvertently”!
  • Other tips:
    • Compensation claims, not just fines, are on the horizon, e.g. litigation regarding 2018 British Airways and Ticketmaster breaches. The Irish DPC’s actions regarding several Facebook security infringements (arising from bugs), including WhatsApp, bear close monitoring.
    • Don’t postpone checks/measures on the security of acquired businesses/subsidiaries. Some pre-GDPR fines have involved acquisitions.
    • Implement security-related due diligence, Art.28 terms and ongoing checks even for intra-group processors (e.g. pre-GDPR Equifax).
    • Consider insurance, but beware scope/exclusions (e.g. insurer Zurich argued a war exclusion against Mondelez’s claims for NotPetya losses).
    • Monitor developments regarding certifications/codes of conduct approved for GDPR purposes, which are getting more attention, as adherence may evidence compliance and help reduce or even avoid fines.
    • Regarding suspected personal data breaches (PDBs):
      • Involve lawyers with proven security/PDB expertise to advise (e.g. DKP wrongly believed that breach of a test database was not notifiable), and to help preserve legal privilege – in practice, organise external advisers/support in advance as part of your incident response plan (which should include your internal incident response team), as scrambling to do so post-PDB will cost more money and time (e.g. procurement processes).
      • Also ensure your risk assessment methodology (for deciding whether/who to notify) is ready in advance.
      • Testing is critical - not just pen testing and application security, but also rehearsing and testing incident response plans in advance; over 50% of organisations still don’t. And check/test remedial actions’ effectiveness! (A non-GDPR example: the UK Financial Conduct Authority fined Tesco Bank £16.4m; one failing was that it didn’t monitor a rule coded to block the fraud to check that it worked - and it hadn’t.)
      • Record security measures/incidents information (notifiable or not) for accountability etc.
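
The password-handling measures above (per-user salts, a modern slow hashing algorithm, minimum length, rejecting commonly breached passwords) can be sketched as follows. This is a minimal illustration, not regulator-prescribed code: the function names and the tiny deny-list are our own assumptions, and in practice the deny-list should be a large breached-password corpus.

```python
import hashlib
import hmac
import os

# Illustrative deny-list only; real deployments should check against a
# large corpus of known-breached passwords.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "password123"}

def validate_password(password: str, min_length: int = 12) -> None:
    """Reject passwords that are too short or commonly breached."""
    if len(password) < min_length:
        raise ValueError("password too short")
    if password.lower() in COMMON_PASSWORDS:
        raise ValueError("password appears in common-password list")

def hash_password(password: str) -> tuple:
    """Hash with a unique random salt per user and a slow KDF (scrypt)."""
    salt = os.urandom(16)  # unique salt defeats rainbow tables
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)  # deliberately slow/memory-hard
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

On account creation one would call `validate_password` then store only the salt and digest from `hash_password`; the scrypt cost parameters shown are conventional baseline values and should be tuned to the hardware in use.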

The key general lessons, from our experience of assisting clients with security incidents as well as the above, are: prepare, prepare, prepare – and test, test, test!