The European Data Protection Board (EDSA) published the 74-page version 2.0 of its guidelines on deceptive design in social media (Guidelines 03/2022 on Deceptive design patterns in social media platform interfaces: how to recognise and avoid them) on February 14, 2023. The previous version 1.0 had been opened for public consultation on March 14, 2022.
The guidelines are based on the GDPR and present in great detail a number of practices that are deceptive from the perspective of the EDSA or its members. They serve as a guide to data protection compliant design and as a checklist for review, with design guidance for each practice. Appendix 2 provides additional “best practice” recommendations to facilitate compliance with the GDPR.
Overall, the EDSA’s recommendations can be summarized as follows: the operator of a social media site should behave decently and put itself in the user’s shoes, namely those of a user with a short attention span and no critical attitude. The operator violates this standard not only by taking the user for a fool or harassing them, but also, for example, by allowing an account to be deactivated but not deleted, a common but annoying practice. The following, for example, would be inadmissible:
Example 56: In the process of deleting their account, users are provided with two options to choose from: To delete their account or to pause it. By default, the pausing option is selected.
Categories of prohibited practices
The following categories, which relate to content (e.g. wording) or to the design of social media pages (the user interface), are discussed in the guidelines:
“Overloading”:
Overloading means users are confronted with an avalanche/large quantity of requests, information, options or possibilities in order to prompt them to share more data or unintentionally allow personal data processing against the expectations of the data subject. The following three deceptive design pattern types fall into this category: Continuous prompting, Privacy Maze and Too Many Options.
Example: A user is prompted to enter his phone number every time he logs in.
“Skipping”:
Skipping means designing the interface or user journey in a way that users forget or do not think about all or some of the data protection aspects. The following two deceptive design pattern types fall into this category: Deceptive Snugness and Look over there.
One example, according to the EDSA, would be to give the user choices but to emphasize one of them, namely the less privacy-friendly one, e.g. by a corresponding graphical design (graying out the other options). This would violate the principle of “privacy by default”:
Example 9 shows a Deceptive Snugness pattern, as it is not the option offering the highest level of data protection that is selected, and therefore activated, by default. In addition, the default effect of this pattern nudges users to keep the pre-selection, i.e. to neither take time to consider the other options at this stage nor to go back to change the setting at a later stage.
Under Swiss law, such a design certainly does not violate the principle of privacy by default, because although a choice is suggested to the user, the user still confirms it. Only if the confirmation is rendered essentially meaningless, i.e. the deception goes that far, could one speak of a violation.
“Stirring”:
Stirring affects the choice users would make by appealing to their emotions or using visual nudges. The following two deceptive design pattern types fall into this category: Emotional Steering and Hidden in plain sight.
An example, according to the EDSA, would be when users are asked to disclose more data than required:
“Tell us about your amazing self! We can’t wait, so come on right now and let us know!”
“Obstructing”:
Obstructing means hindering or blocking users in their process of becoming informed or managing their data by making the action hard or impossible to achieve. The following three deceptive design pattern types fall into this category: Dead end, Longer than necessary and Misleading action.
“Fickle”:
Fickle means the design of the interface is inconsistent and not clear, making it hard for the user to navigate the different data protection control tools and to understand the purpose of the processing. The following four deceptive design pattern types fall into this category: Lacking hierarchy, Decontextualising, Inconsistent Interface and Language Discontinuity.
“Left in the dark”:
Left in the dark means an interface is designed in a way to hide information or data protection control tools or to leave users unsure of how their data is processed and what kind of control they might have over it regarding the exercise of their rights. The following two deceptive design pattern types fall into this category: Conflicting information and Ambiguous wording or information.
A summary and an overview of these categories can be found in the appendix of the guidelines.
These practices may violate Art. 5 GDPR (processing principles), and in the case of consents, possibly also Art. 4 No. 11 and Art. 7 GDPR. More specifically, they may undermine the following concerns, which the EDSA attributes to the GDPR:
- Autonomy – Data subjects should be granted the highest degree of autonomy possible to determine the use made of their personal data, as well as autonomy over the scope and conditions of that use or processing.
- Interaction – Data subjects must be able to communicate and exercise their rights in respect of the personal data processed by the controller.
- Expectation – Processing should correspond with data subjects’ reasonable expectations.
- Consumer choice – The controllers should not “lock in” their users in an unfair manner. Whenever a service processing personal data is proprietary, it may create a lock-in to the service, which may not be fair, if it impairs the data subjects’ possibility to exercise their right of data portability in accordance with Article 20 GDPR.
- Power balance – Power balance should be a key objective of the controller-data subject relationship. Power imbalances should be avoided. When this is not possible, they should be recognized and accounted for with suitable countermeasures.
- No deception – Data processing information and options should be provided in an objective and neutral way, avoiding any deceptive or manipulative language or design.
- Truthful – The controllers must make available information about how they process personal data, should act as they declare they will and not mislead data subjects.
The EDSA discusses these practices in Section 3 of the Guidelines, each with a description, an analysis of the applicable provisions of the GDPR, and examples.
Revocation of consent
The EDSA points out, among other things, that when designing consent declarations, for example, attention should be paid to the medium in which the consent is requested, i.e. the subject of the consent and the required information about it should also be clearly recognizable on a cell phone. In many other respects, the EDSA follows its well-known strict practice, for example with regard to the revocability of consent:
As an example, consent cannot be considered valid under the GDPR when consent is obtained through only one mouse-click, swipe or keystroke, but the withdrawal takes more steps, is more difficult to achieve or takes more time.
And:
Example 33: A social media provider does not provide a direct opt-out from a targeted advertisement processing even though the consent (opt-in) only requires one click.
The Guidelines contain various other references to consent and its documentation.
Requirements for privacy statements
Requirements for privacy statements are also addressed, for example with the following, undoubtedly correct statement:
However, more information does not necessarily mean better information. Too much irrelevant or confusing information can obscure important content points or reduce the likelihood of finding them. Hence, the right balance between content and comprehensible presentation is crucial in this area. If this balance is not met, deceptive design patterns can occur.
However, this is easier said than done, because privacy statements are not written for the great mass of users, but for the few exceptions who will meticulously search them for loopholes in the event of a dispute. Anyone wanting to be safe here will often write privacy statements quite extensively, which threatens to inflate them. With extensive privacy statements, however, a good graphical presentation helps, for example pop-up texts, a clear distinction between basic statements and examples, or good structuring (the EDSA also points this out). Privacy icons can also make a contribution here.
On the language requirements for privacy statements, the EDSA says:
Users will face this deceptive design pattern [“language discontinuity”] when data protection information is not provided in the official languages of the country where they live, whereas the service is provided in that language
This may be understood to mean that data protection information must be presented in the language in which the offer can be used. However, the following is also inadmissible:
Each time users call up certain pages, such as the help page, these automatically switch to the language of the country users are in, even if they have previously selected a different language.
If a service can be used both on a website and in an app, the required information must basically be provided directly in the app:
It is important to note that even stronger effects than those caused by too many layers can occur when not only several devices, but also several apps provided by the same social media platform, such as special messenger apps, are used. Users who use that kind of secondary app would face greater obstacles and efforts if they have to call up the browser version or the primary app to obtain data protection related information. In such a situation, which is not only cross-device but cross-application, the relevant information must always be directly accessible no matter how users use the platform.
In other respects, too, the EDSA remains strict and reiterates its opinion that formulations in privacy statements such as “we might use your data for…” or “our services” are too generic. The requirements for the content of the information also remain high:
Example 46: The social media platform does not explicitly state that users in the EU have the right to lodge a complaint with a supervisory authority, but only mentions that in some – without mentioning which – countries, there are data protection authorities which the social media provider cooperates with regarding complaints.
With regard to the references to data subject rights, the EDSA goes even further:
Example 49: The paragraph under the subtitle “right to access” in the privacy policy explains that users have the right to obtain information under Article 15 (1) GDPR. However, it only mentions users’ possibility to receive a copy of their personal data. There is no direct link visible to exercise the copy component of the right of access under Article 15 (3) GDPR. Rather, the first three words in “You can have a copy of your personal data” are slightly underlined. When hovering over these words with the users’ mouse, a small box is displayed with a link to the settings.
This is a case of “hidden in plain sight”. Such a design should probably rather be mentioned among the best practices.
Security breach notification
The EDSA also addresses this issue. A security breach exists, for example, if an app can access more data than intended as a result of a programming error. In the case of high risks, security breaches must also be communicated to the affected parties, and here, too, deceptive designs must be avoided, for example linking the information about the type and scope of the breach with “unspecific and irrelevant information and the implications and precautionary measures the controller has taken or suggests to take”: “This partly irrelevant information can be misleading and users affected by the breach might not fully understand the implications of the breach or underestimate the (potential) effects.”
The following is also inadmissible:
Example 20: The controller only refers to actions of a third party, claiming that the data breach was originated by a third party (e.g. a processor) and that therefore no security breach occurred. The controller also highlights some good practices that have nothing to do with the actual breach.
The controller declares the severity of the data breach in relation to itself or to a processor, rather than in relation to the data subject.
Or:
Example 21: Through a data breach on a social media platform, several sets of health data were accidentally accessible to unauthorized users. The social media provider only informs users that “special categories of personal data” were accidentally made public.
Or:
Example 22: The controller only provides vague details when identifying the categories of personal data affected, e. g. the controller refers to documents submitted by users without specifying what categories of personal data these documents include and how sensitive they were.
Or:
Example 23: When reporting the breach, the controller does not sufficiently specify the category of the affected data subjects, e. g., the controller only mentions that concerned data subjects were students, but the controller does not specify whether the data subjects are minors or groups of vulnerable data subjects.
Or:
Example 24: A controller declares that personal data was made public through other sources when it notifies the breach to the Supervisory Authority and to the data subject. Therefore, the data subject considers that there was no security breach.