In recent months, there has been increased chatter about “dark patterns” in user interfaces, and it’s only getting louder. When we think of dark patterns, we often think of features that make it more difficult to cancel subscriptions, or that (mis)lead us to sign up for a product or service despite our best intentions. However, dark patterns also impact data privacy in a number of ways.
In a 2018 report titled Deceived by Design, the Norwegian Consumer Council analyzed a number of user settings on three major platforms in order to determine the extent to which UX design is deployed against the best interests of users. The report defined “dark patterns” generally as “…features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question … or in short, nudges that may be against the user’s own interest. This encompasses aspects of design such as the placement and colour of interfaces, how text is worded, and more direct interventions such as putting pressure on users by stating that the product or service they are looking at is about to be sold out.”
The use of dark patterns – and their impact on consumer rights – has finally caught the attention of regulators. In fact, the most recent iteration of the CCPA Regulations includes language specifically prohibiting the use of dark patterns where consumers’ ability to exercise their right to opt out of the sale of their personal information under CCPA is concerned. In particular, businesses may not use a method that is designed with the purpose, or has the substantial effect, of subverting or impairing a consumer’s choice to opt out, nor may they use confusing language, such as double negatives (e.g., “Don’t Not Sell My Personal Information”), when offering consumers the choice to opt out.

Among the materials relied upon by the Attorney General is Shining a Light on Dark Patterns by Jamie Luguri and Lior Jacob Strahilevitz, one of many studies examining the use of dark patterns and how the practice violates federal and state laws restricting unfair and deceptive trade practices. In a 2020 Statement, Federal Trade Commissioner Rohit Chopra described dark patterns as “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent,” echoing the many white papers focusing on dark patterns in UX design. As noted in the Commissioner’s statement, dark patterns include “visual misdirection, confusing language, hidden alternatives, or fake urgency” – all with the overarching goal of steering people toward or away from certain choices.

Notably, the FTC will host a virtual workshop called “Bringing Dark Patterns to Light” on April 29 to explore the ways in which user interfaces can have the effect, intentionally or unintentionally, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice. The workshop will also consider whether additional rules, standards, or enforcement are needed to protect consumers.
With respect to privacy specifically, dark patterns are pervasive in opt-in and opt-out mechanisms, as well as in general privacy and individual rights settings. Taking the example of opt-in and opt-out mechanisms, dark patterns not only stand in the way of consumers exercising their privacy rights; they may also result in consumers handing over more personal information than they would like, or than is needed. One need only look at most cookie banners intended to comply with EU law to observe dark patterns in full swing. We routinely see cookie banners with two buttons: one in a toned-down or light color offering the option to decline cookies (or manage cookie preferences), overshadowed by a second, BOLDER AND BRIGHTER “accept all” button intended to prod users into consenting to certain cookies. The first, toned-down button reflects the choice that individuals should be given under EU law, but the second, louder button detracts from it; given the high level of user “consent fatigue,” users will naturally gravitate toward the second button simply to make the banner go away. Companies using this type of crafty interface may believe that, by presenting a cookie banner with opt-in choices, they meet the requirements of EU law – and from afar, that may appear to be the case. Upon closer review, however, where the use of dark patterns has the opposite practical effect, that cookie banner no longer meets the standard for consent under GDPR.

The same goes for CCPA and the reviled “Do Not Sell My Personal Information” link. On websites and mobile apps, these links are often hard to see (or find), or, once clicked, lead the user on a wild goose chase to exercise the right to opt out of sales. Here again, if the practical effect is to negate the purpose of these links and diminish individuals’ ability to exercise their rights, they are not in compliance with CCPA.
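To make the contrast concrete, the symmetric alternative to the two-button banner described above can be sketched in a few lines of TypeScript. This is a hypothetical illustration only – the `ConsentButton` interface, the `buildConsentChoices` function, and the `consent-btn` class name are assumptions for the sketch, not part of any real consent-management library. The point is simply that both choices share one visual treatment, so neither nudges the user.

```typescript
// Hypothetical sketch of a consent banner that presents "accept" and
// "decline" with equal visual prominence. Names and styles are
// illustrative assumptions, not drawn from a real library.

interface ConsentButton {
  label: string;
  action: "accept-all" | "reject-all";
  cssClass: string; // shared class: same size, color, and font weight
}

function buildConsentChoices(): ConsentButton[] {
  const sharedClass = "consent-btn"; // one class applied to both buttons
  return [
    { label: "Accept all cookies", action: "accept-all", cssClass: sharedClass },
    { label: "Reject all cookies", action: "reject-all", cssClass: sharedClass },
  ];
}

const choices = buildConsentChoices();
// Every button uses the identical styling class, so the banner offers a
// symmetric choice rather than steering the user toward "accept".
const symmetric = choices.every((c) => c.cssClass === choices[0].cssClass);
console.log(symmetric);
```

A dark-pattern banner, by contrast, would assign the two buttons different classes (bright versus muted), which is precisely the asymmetry regulators have flagged.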
These are just a couple of examples of how UX design can interfere with an organization’s compliance with applicable data protection laws and standards. Dark patterns come in different shapes and forms: some UX designs are intended to manipulate, deceive, or coerce, while others merely have that practical effect. As consumers become increasingly privacy-minded (or frustrated!), however, they are much quicker to identify practices that either coerce them into giving away more personal information than is necessary or prevent them from exercising their rights. Importantly, privacy compliance is a comprehensive undertaking: legal requirements must align with technical implementation. This, in fact, is one of the tenets of privacy-by-design. Businesses should therefore ensure that UX design does not detract from consumer rights or chip away at their compliance efforts, and should take note of regulators’ recent interest in dark patterns. We certainly expect increased legislation and/or enforcement in this area in the years to come.