#AmCham #DataProtectionMonth Expert Views # 2: Dark Patterns on Social Media Platforms

by MPR Partners | Maravela, Popescu & Asociații, May 10, 2022

Article by: Cristina Crețu, Senior Privacy & Technology Consultant - MPR Partners | Maravela, Popescu & Asociații

“Dark patterns” are a hot topic for debate these days among European authorities and the European legislator.

On March 21, 2022, the European Data Protection Board (the “EDPB”) submitted for public consultation its “Guidelines on Dark patterns in social media platform interfaces: How to recognise and avoid them” (the “Guidelines”).

On April 23, 2022, the European Parliament and Council of the European Union reached a provisional political agreement on the Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC.

The Digital Services Act will prohibit the manipulation of users’ choices through “dark patterns” by online platforms and marketplaces.

But what are “dark patterns”?

The Guidelines provide a definition of “dark patterns” applicable in the context of the said Guidelines. Thus, “dark patterns” are seen “as interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”.

According to the Guidelines, a series of “dark patterns” can be identified throughout the life cycle of a social media account, such as:

  1. overloading (continuous prompting, privacy maze and too many options), when users are flooded with large amounts of information, requests and options meant to push them to share more data or to allow more processing activities than they expected;
  2. skipping (deceptive snugness and look over there), when the interface or the user experience is designed in such a way that the user disregards or does not take into account aspects related to data protection (a simplified sketch of the “deceptive snugness” pattern follows this list);
  3. stirring (emotional steering and hidden in plain sight), when the user is enticed to decide in a certain way through emotions or visual prodding;
  4. hindering (dead end, longer than necessary and misleading information), when users are obstructed or blocked from becoming informed or from being able to manage their personal data;
  5. fickle (lacking hierarchy and decontextualising), when the interface is designed in an inconsistent or unclear way, making it difficult for the user to navigate through data protection control tools or to understand the purpose of the processing;
  6. left in the dark (language discontinuity, conflicting information and ambiguous wording or information), when by way of design the interface hides information or data protection control tools or leaves the user unsure of how the personal data are processed and what controls the user has when it comes to exercising his/her rights.
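To make the “deceptive snugness” pattern more concrete, the following is a hypothetical, simplified sketch (not taken from the Guidelines; the names SharingSettings, snugDefaults and createAccount are invented for illustration). It contrasts pre-selected, data-intrusive defaults with privacy-friendly defaults:

```typescript
// Hypothetical, simplified sketch (not taken from the Guidelines) of the
// "deceptive snugness" pattern: the most data-intrusive options are
// pre-selected, so a user who clicks through sign-up shares more than intended.

interface SharingSettings {
  profileVisibility: "public" | "contacts" | "private";
  shareLocation: boolean;
  allowAdPersonalisation: boolean;
}

// Dark-pattern defaults: the broadest processing is pre-ticked.
const snugDefaults: SharingSettings = {
  profileVisibility: "public",
  shareLocation: true,
  allowAdPersonalisation: true,
};

// Privacy-friendly defaults (in line with data protection by default,
// Article 25(2) GDPR): optional processing stays off until the user opts in.
const privacyFriendlyDefaults: SharingSettings = {
  profileVisibility: "private",
  shareLocation: false,
  allowAdPersonalisation: false,
};

// Any setting the user does not actively change keeps its default value,
// which is why the choice of defaults matters so much.
function createAccount(
  defaults: SharingSettings,
  userChoices: Partial<SharingSettings>
): SharingSettings {
  return { ...defaults, ...userChoices };
}

// A user who only fills in the mandatory fields and clicks "Next":
console.log(createAccount(snugDefaults, {}));            // everything shared
console.log(createAccount(privacyFriendlyDefaults, {})); // nothing shared by default
```

With the snug defaults, the user’s inaction is silently converted into the broadest possible processing, which is exactly the outcome the Guidelines warn against.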

At the same time, although the Digital Services Act refers to “dark patterns” and bans them in certain circumstances, it seems to provide no definition or categorization of the concept. Whether the final text of the Digital Services Act will include such a definition and categorization, or whether a delegated act will provide one, remains to be seen.

“Dark patterns” and the transparency obligation

The Guidelines detail the “dark patterns” that can occur throughout the life cycle of a social media account, from the moment the account is opened until the user decides to leave the social media platform.

Some of the aspects raised in the Guidelines in connection with the observance of the transparency obligation might be considered useful by all companies processing personal data, not only by social media platforms.

As seen from the recent enforcement actions taken by several Data Protection Authorities, ensuring compliance with the transparency obligation is no easy feat, since many companies fail, or find it difficult, to provide information to data subjects in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”.

From providing conflicting information and using ambiguous wording to lacking hierarchy, entangling the user in a privacy maze or leaving the user at a dead end, all of these “dark patterns” can be encountered outside the world of social media platforms.

Using “dark patterns” such as left in the dark (conflicting information, ambiguous wording or information, language discontinuity), fickle (lacking hierarchy), overloading (privacy maze) and hindering (dead end) will mean that the social media platforms do not meet the requirements of Article 12 of the General Data Protection Regulation (“GDPR”). Consequently, in the EDPB’s view, there will be “no valid information within the meaning of Articles 13 and 14 GDPR” in such a case.

In the EDPB’s view, the use of motivating text, images and colours, as well as appealing advertising, is in principle permissible.

However, when such text or images are used alongside information on the positive outcomes of processing personal data, the user will be under the impression that sharing his/her data is beneficial and will feel comfortable sharing a wide range of personal data with the social media platform. Coupled with the fact that the social media platform fails to provide clear information on, for example, how the user can control the public visibility of his/her data, this amounts to conflicting information, leaving the user in the dark with regard to the public visibility of his/her data and the controls available in that respect.

As seen in practice, providing conflicting information is not limited to social media platforms. As an example, in 2021 a Spanish bank was fined by the Spanish Data Protection Authority, amongst other things, for presenting its data protection notice as a benefit for the customer, thus implying that non-acceptance would result in the loss of customer benefits.

The same is valid with regard to the use of ambiguous wording or information in data protection notices. Any use of vague and ambiguous terms will, in all cases, leave users unsure of how their personal data are processed and of the control they have over such data.

In terms of the language used by social media platforms, the EDPB is of the view that it should be relevant for the residents of the Member State in which the online services are offered. Otherwise, a user presented with a data protection notice in a language different from the one in which the service is provided will be affected by the language discontinuity pattern used by the social media platform. Moreover, in the EDPB’s view, it is equally important to take into account the language expressly selected by the user and not replace it with the language of the country of residence.
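As a purely illustrative sketch of that principle (the function pickNoticeLanguage and the list of available languages are invented assumptions, not part of the Guidelines), a notice-rendering routine could prefer the language the user has expressly selected and fall back to the language of the Member State where the service is offered only when no selection has been made:

```typescript
// Hypothetical sketch: selecting the language of a data protection notice.
// The user's express selection takes precedence; the language of the Member
// State where the service is offered is only a fallback.

const availableNoticeLanguages = ["en", "ro", "fr", "de"];

function pickNoticeLanguage(
  userSelectedLanguage: string | undefined,
  memberStateLanguage: string
): string {
  // Respect the language the user expressly selected, if a notice exists in it.
  if (userSelectedLanguage && availableNoticeLanguages.includes(userSelectedLanguage)) {
    return userSelectedLanguage;
  }
  // Otherwise fall back to the language of the Member State where the service
  // is offered, so the notice matches the language of the service itself.
  if (availableNoticeLanguages.includes(memberStateLanguage)) {
    return memberStateLanguage;
  }
  // Last resort: a default language, which may still amount to
  // "language discontinuity" for some users.
  return "en";
}

// A user who expressly set the interface to Romanian should not be switched
// to another language based on IP geolocation or account country.
console.log(pickNoticeLanguage("ro", "ro"));      // "ro"
console.log(pickNoticeLanguage(undefined, "fr")); // "fr"
```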

Another aspect identified by the EDPB is the lack of hierarchy in the information provided in data protection notices. Presenting the information in several places and in different ways across the data protection notice will, in all cases, leave the user confused as to the way in which his/her personal data are processed. Ensuring consistency throughout the entire data protection notice and presenting the information in the section where the user expects to find it will help present a clear picture of what happens with his/her data and how it is processed.

Although a layered approach to data protection notices is important, such notices should not be turned into a privacy maze that is impossible for the user to navigate in his/her search for relevant information.

The same is valid where only general information on the availability of further information is provided, without mentioning where such information can be found, or where the user is directed towards pages that contain unrelated information. The user is then faced with a dead end and left without a full understanding of how his/her personal data are processed.

Although not perfect, the Guidelines can prove to be a useful tool in understanding and identifying “dark patterns”. However, it would be helpful if the Guidelines were amended following the public consultation so as to incorporate more examples that help properly inform users under the transparency obligation.
