
EDPB shines a light on dark patterns

The EDPB has released draft guidelines on prohibited dark patterns aimed at social media networks (the "Guidelines"). They contain a lot of detail and some repetition, but also some useful insights into the regulators' thoughts in an area of increased scrutiny.

Scope

The Guidelines are addressed to social media networks.  However, many of the points discussed would apply to other websites/apps and, indeed, some responses to the consultation have queried why they were so limited.

For those new to the term, "dark patterns" are interfaces/experiences that aim to influence users into making unintended decisions regarding their personal data.  
 
Structure

The Guidelines are structured around the main parts of a user's journey where they are likely to encounter dark
patterns.

Sign up:

The EDPB warns against continuously prompting users to provide more data than is necessary.  If a platform states that it needs a user’s phone number for security, it is limited to using that number for security purposes only[1].  Platforms should not use emotional steering to obtain a more detailed profile of users, nor harass users who skip these stages.

Transparency:

The EDPB reiterates the importance of providing comprehensive information in a concise way.  Platforms are warned against using vague phrases such as “your data might be used to improve our services” or internal classifications that most users wouldn’t understand.  They should also avoid creating a “privacy maze” by requiring users to click on a multitude of pages.

Settings:

Users should be provided with a simple dashboard where they can change their settings at any time, not with multiple confusing menu items.  Humour should not be used in a way that misrepresents risks and could invalidate consent.  (No more cookie puns!)

Closing account:

Platforms should not hide the link to delete an account, nor engage in emotional steering when the user clicks on it.  Pausing/deactivating an account can be offered as an alternative, but that option should not be pre-checked, and the platform will need to explain what will happen to the user's data.
 
Wider considerations

Shortly after the Guidelines were published, the text of the Digital Services Act was agreed, which includes a qualified ban on the use of dark patterns.  The final text is expected to prohibit designs that impair users' ability to make free, autonomous decisions, with the detail to follow in future guidelines from the Commission.

Across the pond, US regulators and litigants are increasingly focussing on dark patterns: four state Attorneys General have sued Google over its location data practices, and several of the new state privacy laws include express prohibitions.
 
Takeaways

On a strict reading of GDPR Article 25 (data protection by design and by default), designs displaying dark patterns should never have arisen.  As mentioned at the outset, all publishers (not just social media companies) should keep an eye on these points, given that they too are subject to Article 25.

Marketing departments always seek ways to increase user engagement, so one can understand how some practices have strayed over the line given the lack of specific guidance.  It will be much harder to make that case in future.

We await the final version, but the draft Guidelines, DSA and recent US actions certainly shift the risk landscape for deceptive designs.
 
An expanded version of this blog is available here.

 
[1] The US FTC takes this seriously – see the recent $150m settlement with Twitter for this very practice!
