The psychology of human-centered design has a shadow side, and that is where dark patterns emerge. The term, coined by Harry Brignull in 2010, describes design tricks that have been infiltrating our daily lives in one form or another for more than a decade. But what are they, and why do they matter? Simply put, dark patterns are website or app design elements that attempt, often successfully, to get users to hand over something of value: money, information, or time. Users don't hand these things over knowingly; the design makes it hard to discern what is happening or to carry out the action they actually intended. Dark patterns range from mere nuisance to outright thievery.
In modern life we interact with devices and technologies daily, so identifying dark patterns has become a seek-and-find game with real consequences. The problem is that when we are bombarded and desensitized, we may miss dark pattern tactics, or give in to them out of sheer frustration. Either way, the designers behind the dark patterns win by wearing down the user base.
Let’s look at a sampling of dark pattern types and some examples to be on the lookout for.
Bait and Switch
The user has a clear goal in mind, but on the way to accomplishing it, something different happens instead. Reddit is a great example of a bait and switch, as is any tool with sponsored or promoted content. You go in looking for a feed on a certain topic, like AssholeDesign (which features some entertaining dark pattern examples), and then find yourself clicking on promoted posts. Promoted posts are ads, not the content you set out to discover. That's right: even AssholeDesign couldn't keep the dark patterns out of its feed. Ironic?
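To make the mechanics concrete, here is a minimal sketch, with invented data and function names rather than Reddit's actual code, of how a feed can splice promoted posts in among the organic posts a user came for:

```typescript
// Hypothetical sketch of bait and switch in a feed: promoted posts are
// spliced into the organic stream so they occupy the same slots, and
// attract the same clicks, as real content.
interface Post {
  title: string;
  promoted: boolean;
}

function buildFeed(organic: Post[], promotedPosts: Post[]): Post[] {
  const ads = [...promotedPosts]; // copy so the ad queue can be consumed
  const feed: Post[] = [];
  organic.forEach((post, index) => {
    feed.push(post);
    // After every second organic post, slot in an ad dressed like the rest.
    if (index % 2 === 1 && ads.length > 0) {
      feed.push(ads.shift()!);
    }
  });
  return feed;
}

const feed = buildFeed(
  [
    { title: "Why is this close button two pixels wide?", promoted: false },
    { title: "This cancel flow has nine steps", promoted: false },
    { title: "Receipt demands a five-star review", promoted: false },
  ],
  [{ title: "Try our new budgeting app today!", promoted: true }],
);
console.log(feed.map((post) => post.title));
// The ad lands mid-feed, in the slot where the next real post was expected.
```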
Confirmshaming: The Intentional Guilt Trip
Confirmshaming is when a business intentionally guilts users who try to opt out of something. Say you are browsing a gardening site to read up on cultivating cacti when a pop-up startles you, asking for your email address in exchange for the latest literature on cacti cultivation. You can say no, but the decline option reads, "No thank you, I already know everything about cacti cultivation." You are free to decline, yet the wording is designed to make you feel guilty for doing so. After all, do you really know everything about cacti cultivation?
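As a rough sketch, with invented copy and markup rather than any real site's code, notice that the entire dark pattern can live in a single string, the label on the decline button:

```typescript
// Hypothetical sketch of a confirmshaming pop-up. Functionally the decline
// button is a plain "no", but its label is written to make clicking it
// feel like an admission.
function showEmailCaptureModal(): void {
  const modal = document.createElement("div");

  const pitch = document.createElement("p");
  pitch.textContent = "Get the latest literature on cacti cultivation!";

  const accept = document.createElement("button");
  accept.textContent = "Sign me up"; // short, positive, visually prominent

  const decline = document.createElement("button");
  // The guilt trip: a "no" phrased as a boast the user must make.
  decline.textContent =
    "No thank you, I already know everything about cacti cultivation.";
  decline.addEventListener("click", () => modal.remove());

  modal.append(pitch, accept, decline);
  document.body.append(modal);
}
```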
Disguised Ads
Disguised ads are ubiquitous because, chameleon-like, they blend in with the content around them. For example, Yelp allows businesses to pay for sponsored ads, which are disguised to look like reviews.
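A minimal sketch, with invented data rather than Yelp's implementation, shows how little separates the ad from the organic content when both are pushed through the same template:

```typescript
// Hypothetical sketch of a disguised ad: sponsored entries are rendered
// through the same template as organic ones, so the only visible
// difference is a faint badge.
interface FeedEntry {
  text: string;
  sponsored: boolean;
}

const entries: FeedEntry[] = [
  { text: "Great tacos, friendly staff. Five stars.", sponsored: false },
  { text: "Voted best tacos in town! Book a table today.", sponsored: true },
];

for (const entry of entries) {
  // Identical markup either way; being sponsored only adds a small suffix.
  const badge = entry.sponsored ? "  [Sponsored]" : "";
  console.log(entry.text + badge);
}
```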
Forced Continuity: The ‘Free Trial’ That Comes With Free Headaches
Forced continuity is when a trial period ends and the user discovers they are not only being charged without warning but also that cancellation is an arduous process. Perhaps the business hopes the user will forget when the trial ends, or that the burden of the cancellation process will keep them holding on.
An example? SiriusXM radio. Notification that the free trial is ending is limited, if it happens at all, and when users try to stop the membership (which they are led to believe they can do online), they are told they must call a number instead. Enter the scripted upselling pleas and promises if only they will continue.
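The mechanics are simple enough to sketch. In the hypothetical snippet below, with invented fields and prices rather than SiriusXM's actual system, the renewal flow simply omits any reminder step:

```typescript
// Hypothetical sketch of forced continuity: the trial silently converts to
// a paid plan, and no reminder is ever scheduled.
interface Subscription {
  plan: string;
  monthlyPrice: number;
  trialEndsAt: Date;
  autoRenew: boolean; // defaults to true at sign-up
}

function onTrialEnd(sub: Subscription): string {
  // Note what is missing here: no email, no notification, no confirmation.
  // The first signal the user receives is the charge on their statement.
  return sub.autoRenew
    ? `charge $${sub.monthlyPrice.toFixed(2)} for ${sub.plan}`
    : "downgrade to free tier";
}

console.log(
  onTrialEnd({
    plan: "premium streaming",
    monthlyPrice: 16.99,
    trialEndsAt: new Date(),
    autoRenew: true,
  }),
); // "charge $16.99 for premium streaming"
```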
Friend Spam
In friend spamming, a service asks for access to the user's email or social media contacts under the pretense of a desirable outcome (e.g., finding contacts they already have on the platform). Instead of merely finding existing contacts, however, the service spams them, and the real gem is that the spam claims to come from the user. One of the most notorious examples is the LinkedIn debacle, in which the company was forced to pay $13 million over unwanted emails back in 2015.
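A minimal sketch, with invented names and messages (LinkedIn's actual system was far more elaborate), of how one consent screen can power two very different features:

```typescript
// Hypothetical sketch of friend spam: contacts are requested for one
// purpose (finding existing friends) and reused for another (invitations
// that appear to come from the user).
function findExistingFriends(contacts: string[], members: Set<string>): string[] {
  // The advertised feature: match the address book against current members.
  return contacts.filter((email) => members.has(email));
}

function afterConsent(userName: string, contacts: string[], members: Set<string>): void {
  findExistingFriends(contacts, members); // what the user was promised...
  for (const email of contacts) {
    if (!members.has(email)) {
      // ...and what they were not: invitations signed with their own name.
      console.log(`Sending to ${email}: "${userName} has invited you to connect!"`);
    }
  }
}

afterConsent(
  "Alex",
  ["friend@example.com", "stranger@example.org"],
  new Set(["friend@example.com"]),
);
// Only stranger@example.org gets the unsolicited invite "from" Alex.
```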
Misdirection
Like a magic trick, misdirection focuses the user's attention on one thing to distract from something else that is going on. A relatable example comes up when purchasing airfare. During checkout, the customer reaches a screen to purchase a seat, pre-selected by the airline, for an additional fee. The step can be skipped, but there is no opt-in; you must hunt for the opt-out.
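A minimal sketch, with an invented seat and fee rather than any airline's real checkout, of a step that pre-selects a paid option and offers no symmetrical way to decline it:

```typescript
// Hypothetical sketch of misdirection: the paid seat is the default state,
// and there is no opt-in step, only a buried opt-out.
interface SeatChoice {
  seat: string;
  fee: number;
}

function defaultSeatStep(): SeatChoice {
  // Pre-selected by the airline before the traveler touches anything.
  return { seat: "14C (extra legroom)", fee: 24.0 };
}

// The opt-out exists, but nothing on the page invites the traveler to use
// it; they must notice the fee and hunt for this path themselves.
function declineSeatSelection(): SeatChoice {
  return { seat: "assigned at check-in", fee: 0 };
}

console.log(defaultSeatStep()); // { seat: '14C (extra legroom)', fee: 24 }
```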
Another example is the illusionary paywall. You run a search and find a link that appears to hold the answer, but when you click it, a pop-up gives the illusion that the content is subscription-only. In reality, you could often just scroll down a bit and read the answer.
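This one is worth sketching because it shows how cosmetic the barrier is. In the hypothetical snippet below (implementations vary), the full article is already in the page, and the "paywall" is just an overlay stacked on top of it:

```typescript
// Hypothetical sketch of an illusionary paywall: nothing is locked on the
// server. The text is fully present in the DOM; an overlay merely covers it.
function applyFakePaywall(article: HTMLElement): void {
  const overlay = document.createElement("div");
  overlay.textContent = "Subscribe to keep reading";

  Object.assign(overlay.style, {
    position: "absolute",
    top: "0",
    left: "0",
    right: "0",
    height: "60%", // only the top of the article is covered...
    background: "rgba(255, 255, 255, 0.95)",
  });
  // ...so scrolling a bit further down reveals the uncovered remainder,
  // and deleting this element in the browser's dev tools reveals it all.

  article.style.position = "relative";
  article.append(overlay);
}
```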
Roach Motel
You can check out any time you like, but you can never leave … well, not easily. The classic roach motel is the premium subscription: simple to sign up for, difficult to cancel. Alternatively, you must opt out, or you automagically get that premium service.
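A minimal sketch of that opt-out variant, with invented field names, shows how inaction becomes consent:

```typescript
// Hypothetical sketch of the automatic-premium variant: the add-on is
// pre-enabled, so doing nothing becomes a subscription.
interface SignupChoices {
  email: string;
  premiumUpgrade: boolean; // must be actively switched off to avoid the charge
}

function defaultSignup(email: string): SignupChoices {
  return {
    email,
    premiumUpgrade: true, // pre-checked on the form the user skims past
  };
}

console.log(defaultSignup("user@example.com"));
// { email: 'user@example.com', premiumUpgrade: true }
```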
Sneak Into Basket
Surprise! You're busy filling your cart, and then at checkout you notice additional, albeit optional, charges, such as a multi-year protection plan. Often there is a button to remove them, but if you aren't paying attention, the money has vanished. Good luck getting a refund on an optional service.
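A minimal sketch, with invented items and prices, of a checkout flow that pads the cart on the shopper's behalf:

```typescript
// Hypothetical sketch of "sneak into basket": the store, not the shopper,
// appends the protection plan, and it stays unless explicitly removed.
interface LineItem {
  name: string;
  price: number;
  addedByStore?: boolean;
}

function buildCheckoutCart(chosen: LineItem[]): LineItem[] {
  return [
    ...chosen,
    // Injected at checkout time, pre-selected rather than offered.
    { name: "3-year protection plan", price: 29.99, addedByStore: true },
  ];
}

const cart = buildCheckoutCart([{ name: "Headphones", price: 89.0 }]);
const total = cart.reduce((sum, item) => sum + item.price, 0);
console.log(total.toFixed(2)); // "118.99", more than the shopper picked out
```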
What’s Behind the Curtain? Hidden Costs!
As in the last example, the user reaches the final step of the checkout process only to discover that unexpected charges have appeared, such as the handling and delivery fees tacked on when sending flowers. Unlike snuck-in items, these generally have no opt-out at all.
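The trick is pure arithmetic, as in this sketch with invented prices: the advertised price and the payable price are simply different numbers, and the gap appears only at the final step:

```typescript
// Hypothetical sketch of hidden costs: mandatory fees surface only at the
// last step, with no checkbox to remove them.
const advertisedBouquet = 39.99;
const handlingFee = 12.99; // appears at the final step
const deliveryFee = 9.99;  // likewise, and neither fee has an opt-out

const advertised = advertisedBouquet;
const payable = advertisedBouquet + handlingFee + deliveryFee;

console.log(advertised.toFixed(2)); // "39.99", the price that drew the buyer in
console.log(payable.toFixed(2));    // "62.97", the price actually charged
```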
Trick Questions
Trick questions elicit a response, and sometimes even permissions, that a user never intended to give. A common example is signing up for a newsletter or creating an account and encountering those infamous opt-in/opt-out checkboxes. While filling in the form, the user gives an unintended answer because the designer deliberately flipped the wording from what was expected: skimmed quickly, the question appears to ask one thing, but read carefully, it asks another thing entirely.
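A minimal sketch, with invented wording, of two consent checkboxes whose polarity silently flips between questions:

```typescript
// Hypothetical sketch of a trick question: two checkboxes look parallel,
// but the second reverses its meaning, so a skimming user who leaves both
// unticked has still opted in to the second.
interface ConsentQuestion {
  label: string;
  checkedMeansOptIn: boolean; // the polarity a skimming reader misses
}

const questions: ConsentQuestion[] = [
  { label: "Tick this box to receive our newsletter.", checkedMeansOptIn: true },
  // Polarity flipped: here, leaving the box empty is consent.
  { label: "Tick this box if you do not want partner offers.", checkedMeansOptIn: false },
];

// Simulate a user who skims the form and leaves everything unchecked:
for (const q of questions) {
  const checked = false;
  const optedIn = q.checkedMeansOptIn ? checked : !checked;
  console.log(`"${q.label}" -> opted in: ${optedIn}`);
}
// First question: opted in: false. Second question: opted in: true.
```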
Conclusion
A dark pattern by any other name, such as market manipulation, is the same quagmire to navigate, even for educated users. Each of the examples above is a candidate for a future deep dive, but the big picture matters most: understanding how dark patterns pervade our modern environment helps us, as users, remain vigilant when interacting with our devices and technologies.