Dark Patterns: is this grey area the next privacy battleground?


Overview

On 14 March 2022, Europe’s data protection regulators (the EDPB) issued draft guidelines on dark patterns in social media platform interfaces, as they attempt to control a growing sector that encourages us to share ever more data in an increasing range of situations, from playing games or keeping fit to searching for a job. The draft guidelines aim to ensure that users have and exercise real control over their data on social media platforms, but all businesses should take note: many will be using dark patterns without even realising it, and the rules being applied here are not unique to social media.



Globally, as regulators focus more on dark patterns, privacy and consumer laws mature, and businesses use more techniques to influence individuals, standards are rising and practices will need to change accordingly.

What are dark patterns?

Dark patterns are methods of presenting information and choices in a way that influences a person to make ‘unintended, unwilling and potentially harmful decisions.’ For example, they might make a person more likely to share their data or less likely to delete an online account. This might be achieved through font size or differently coloured icons, distraction or prompting, or it might be more subtle – for example, chatty, reassuring language.
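By way of illustration only (the interface and property names below are invented for this sketch and are not taken from the EDPB guidelines or any regulator), the same consent choice can be presented neutrally or tilted purely through styling defaults:

    // Illustrative sketch only: hypothetical names, not drawn from the guidelines.
    // It models how styling alone can tilt a consent choice.
    interface ChoicePresentation {
      label: string;
      fontSizePx: number;        // larger text draws the eye
      backgroundColor: string;   // bright vs muted colour
      preselected: boolean;      // a pre-ticked option is a classic nudge
    }

    interface ConsentDialog {
      accept: ChoicePresentation;
      reject: ChoicePresentation;
    }

    // A simple check: are both options presented with equal visual weight?
    function isDecisionNeutral(d: ConsentDialog): boolean {
      return (
        d.accept.fontSizePx === d.reject.fontSizePx &&
        d.accept.backgroundColor === d.reject.backgroundColor &&
        !d.accept.preselected &&
        !d.reject.preselected
      );
    }

    // Tilted: the accept option is bigger, brighter and already ticked.
    const nudged: ConsentDialog = {
      accept: { label: "Accept all", fontSizePx: 18, backgroundColor: "#2e7d32", preselected: true },
      reject: { label: "Reject", fontSizePx: 11, backgroundColor: "#e0e0e0", preselected: false },
    };

    // Neutral: both options look the same and neither is preselected.
    const neutral: ConsentDialog = {
      accept: { label: "Accept all", fontSizePx: 14, backgroundColor: "#e0e0e0", preselected: false },
      reject: { label: "Reject", fontSizePx: 14, backgroundColor: "#e0e0e0", preselected: false },
    };

    console.log(isDecisionNeutral(nudged));  // false
    console.log(isDecisionNeutral(neutral)); // true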

Is this just a social media issue?

No. The issue is certainly of particular concern in social media, which is always high on the regulatory hit list and whose platforms include a host of contextual factors potentially impacting users’ ability to make and implement choices. And while the EDPB’s draft guidelines are directed at social media, their approach should be considered by all online businesses in assessing whether they are presenting customers with free choice, especially where they are dealing with users subject to similar push factors such as power imbalance or pressure to make decisions quickly (as might happen in online sales). EU regulators have already raised the issue of dark patterns in other sectors, warning all businesses against using them to obtain consent to cookies (see, for example, guidance from France, Luxembourg and Germany), and California’s approach to dark patterns is not sector specific.

Does GDPR ban dark patterns?

GDPR’s detailed requirements are underpinned by more general principles including fairness, transparency, purpose limitation, data minimisation and privacy by design and default. In earlier guidelines, in particular those on transparency and on privacy by design and default, the EDPB has set high expectations for how these principles should be implemented. Initial regulatory focus was primarily on compliance with GDPR’s more prescriptive requirements, but the shift is now towards implementing these principles more fully as hard requirements for the design of software and services, including user interfaces.

What about outside of Europe?

In fact, in terms of hard law in this area, the EU was beaten to the line by the US. This is not surprising, as data privacy in the US evolved from consumer protection principles (as opposed to the human rights origins of EU legislation). In California, for example, the CCPA was amended in 2021 to prohibit the use of specific dark patterns, such as requiring consumers trying to opt out to click through or listen to reasons why they should not submit a request. The CCPA even provides an optional, decision-neutral (i.e. anti-dark pattern) icon for businesses to use. From 1 January 2023, California law will ban consent being obtained through dark patterns altogether. At the US federal level, the FTC has already taken action against the use of dark patterns in relation to wider consumer issues (subscriptions) and has included deceptive and manipulative conduct on the internet in its core priority areas for the next decade. Although its focus is primarily on goods and services, it has also expressed concern about the use of dark patterns to manipulate people into giving up their personal data.

There is less noise from APAC regulators on the issue, though related concepts are embedded in laws such as Singapore’s prohibition on obtaining consent through deceptive or misleading practices and South Korea’s requirements regarding font size. In addition, concerns over big data and other digital issues have pushed data protection authorities to forge closer relationships with consumer protection authorities (most recently, the Philippines’ data protection and competition authorities announced a formal agreement to work together on consumer and data protection in digital matters).

Practices under the spotlight

The EDPB has provided 60 examples of practices that may constitute dark patterns in the social media context. Ultimately, however, the question is whether these lead to unintended, unwilling and potentially harmful decisions, so it is important to assess all information and choices against this test.

Some of the examples are obvious: providing misleading information to obtain data, providing dead links to privacy settings or leaving users floundering for detail. Humour is also a danger zone – providing a real recipe for cookies in a pop-up asking for cookie consent is not to the EDPB’s taste. Other examples will need to be navigated carefully if businesses are also to meet legitimate business needs.

One challenging issue is when legitimate branding and styling starts to have undue influence. Motivating texts, images and colours can be permissible, but they must not make users opt unquestioningly for more invasive choices. Think about one-sided messages delivered as users try to rush through a sign-up process: “Hey, a lone wolf, are you? But sharing and connecting with others help make the world a better place! Share your geolocation! Let the places and people around you inspire you!”

The EDPB takes a hard line on the practice of making users pause and consider only negative consequences when they try to delete data or close their account, leaving little scope for businesses even to set out a balance of pros and cons – in its view, any irrelevant steps added to the exercise of a right might contravene provisions of the GDPR. Similarly, selecting ‘pause account’ rather than ‘delete account’ as the default option may be an unacceptable nudge if it shifts users away from their original intention. Is this right? Some users may prefer an opportunity to pause and reflect; it will be interesting to see whether this can be justified as an interim step if coupled with good information about the process.
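As a purely hypothetical sketch of the less contentious approach (the names below are invented for illustration, not prescribed by the EDPB), an account-closure prompt could avoid the nudge simply by not preselecting either option and keeping the explanatory text even-handed:

    // Hypothetical sketch: an account-closure prompt that avoids preselecting
    // 'pause' over 'delete' and presents both consequences, leaving the user's
    // original intention undisturbed. All names are illustrative only.
    type ClosureChoice = "delete" | "pause";

    interface ClosurePrompt {
      options: ClosureChoice[];
      defaultChoice: ClosureChoice | null;  // null: the user must actively choose
      explanation: string;                  // covers both sides, not only the downsides of leaving
    }

    const balancedPrompt: ClosurePrompt = {
      options: ["delete", "pause"],
      defaultChoice: null,
      explanation:
        "Deleting removes your account and data permanently; pausing keeps them until you return.",
    };

    console.log(balancedPrompt.defaultChoice === null); // true: no nudge built into the default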

So what now?

In the EU, the draft guidelines will leave social media businesses plenty of practices to assess, and there will be many discussions between privacy and business units on the appropriate balance to be struck. Outside the social media sector, other businesses should be using this time to examine and adapt their own practices to ensure that they can defend their position when the spotlight moves to them. We can probably also expect more from data protection and consumer authorities around the world on dark patterns as they focus more on digital issues.

As part of Rulefinder Data Privacy, we continue to closely monitor regulatory developments across the world through our daily monitoring and alerts and our Privacy Developments Tracker.

This summary was published as part of aosphere's Rulefinder Data Privacy. Nothing in this summary is intended to provide legal or other professional advice: aosphere does not accept responsibility for loss which may arise from reliance on this summary.

What is Rulefinder Data Privacy?

Rulefinder Data Privacy is a user-friendly database of global data privacy law and regulation, sourced from leading privacy counsel across the globe and curated by aosphere’s team of senior data privacy professionals. Learn more here.

 

Contact Information
Claire Farley
Head of Product Development at aosphere
claire.farley@aosphere.com
