Dark patterns in social media platform interfaces: How to recognise and avoid them (Guidelines 3/2022, for consultation)
In an effort to strengthen the data protection afforded to users, the European Data Protection Board (EDPB) has published guidelines on dark patterns in social media platform interfaces and their potential infringements of the EU General Data Protection Regulation (GDPR). The guidelines offer examples and best practices for addressing dark patterns, and explain how the principles of transparency, accountability and data protection by design, as well as other GDPR provisions, can inform dark pattern assessments. The EDPB has also drawn up a checklist for identifying particular dark patterns. The deadline for public comments on the guidance is May 2.
Dark patterns are defined under the guidelines as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”.
These dark patterns influence users’ behaviour and obstruct their ability to protect their personal data, and may thereby infringe Article 5 of the GDPR and its principles of transparency, data minimisation and accountability.
The guidelines capture dark patterns under the following categories:
Overloading: where users are faced with a massive quantity of information, requests or options, with the aim of prompting them to allow more of their personal data to be processed. This in turn can be split into:
Continuous prompting: repeatedly pushing users to provide more data than is realistically necessary for the processing, or to give consent, in the hope that they eventually give in to the requests.
Privacy Maze: information relating to data processing, or the tools needed to exercise data subject rights, is difficult to find because it is spread across multiple pages or documents without a comprehensive overview.
Too Many Options: users are given so many options that they are unable to make a choice or may overlook some information.
Skipping: where the interface or user experience is designed in a way that causes users to forget, or fail to consider, some or all data protection aspects. This in turn can be split into:
Deceptive Snugness: whereby, by default, the most invasive data processing features/options are enabled.
Look Over There: data protection actions or information are placed in competition with another element that distracts users from them.
Stirring: where the user’s choice is affected by appealing to their emotions or through the use of visual nudges. This in turn can be split into:
Emotional Steering: presenting information in a manner that appeals to users’ emotions, either positively to create a ‘safe’ feeling or negatively to create guilt or fear; this has a particularly strong impact on children.
Hidden in Plain Sight: visual nudges steer users towards less restrictive options.
Hindering: where users are blocked or hindered from becoming informed about, or exercising control over, their personal data. This in turn can be split into:
Dead End: where a link to information or to a control tool does not work or is not available at all.
Longer than Necessary: where activating a data protection control requires an excessive number of steps compared with activating data-invasive options.
Misleading Information: a discrepancy between the information provided and the actions available causes users to do something they did not intend.
Fickle: where the interface is purposefully designed to be inconsistent and unclear, making it difficult to access the different data protection control tools. This in turn can be split into:
Lacking Hierarchy: where the interface is inconsistent, leaving users unable to find the controls and information they need.
Decontextualising: where information relating to data processing is placed in a completely unrelated section.
Left in the Dark: where the interface is designed to hide information or tools, leaving users unaware of how their data is processed. This can be split into:
Language Discontinuity: data protection information not provided in the official language(s) of the user’s location.
Conflicting Information: pieces of information conflict with each other.
Ambiguous Wording or Information: vague or ambiguous wording or information is provided to users.
These dark patterns can be avoided by using coherent wording and providing definitions with examples, creating shortcuts to data protection tools and information, and implementing notification systems that raise users’ awareness of the realities and risks of personal data processing. Cross-device consistency also helps avoid dark patterns, as does a data protection directory that lets users orient themselves easily.
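For interface developers, the "data protection by default" practice underlying several of these recommendations can be expressed very simply in code. The following TypeScript sketch is purely illustrative: the processing purpose names, data structures and function are assumptions for this example, not taken from the guidelines or any real platform API. It shows optional processing purposes starting disabled, so that only explicit opt-ins count, avoiding the Deceptive Snugness pattern described above.

// Illustrative sketch only: purpose names and structure are assumptions,
// not drawn from the EDPB guidelines or any real platform API.
type ProcessingPurpose = "personalisedAds" | "locationSharing" | "faceRecognition";

// Least-invasive defaults: every optional purpose starts disabled,
// so nothing data-invasive is pre-enabled for the user.
const defaultConsent: Record<ProcessingPurpose, boolean> = {
  personalisedAds: false,
  locationSharing: false,
  faceRecognition: false,
};

// Only purposes the user has explicitly switched on count as consented.
function consentedPurposes(
  choices: Partial<Record<ProcessingPurpose, boolean>>
): ProcessingPurpose[] {
  return (Object.keys(defaultConsent) as ProcessingPurpose[]).filter(
    (purpose) => choices[purpose] === true
  );
}

// Example: a user who enabled only location sharing.
console.log(consentedPurposes({ locationSharing: true })); // ["locationSharing"]

The point of the sketch is simply that absence of a choice is never treated as consent; any production implementation would of course also need to record and document the consent itself.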