Dark Patterns Deep Dive
by Kelly LeBlanc
Posted On December 1, 2022
In November 2021, Kelly LeBlanc wrote “The Magic of Dark Patterns: Can We Evade Their Trickery?” for NewsBreaks: newsbreaks.infotoday.com/NewsBreaks/The-Magic-of-Dark-Patterns-Can-We-Evade-Their-Trickery-150051.asp. Throughout 2022, she covered various dark patterns in depth for Information Today. The following are excerpts from her article series.

To subscribe to Information Today at a personal subscription rate of less than $8 per issue, visit infotoday.stores.yahoo.net/intodsub.html (for the print subscription) or infotoday.stores.yahoo.net/naintom9ispe.html (for the digital subscription). Email editor Brandi Scardilli (bscardilli@infotoday.com) with any questions.


DARK PATTERNS DEFINED

Dark patterns are deceptive web design elements that manipulate the user into making a decision that is contrary to their intended behavior. This may be something the user didn’t want to do or wasn’t aware they were doing, or it could be an action the user found difficult to complete.

THE ROACH MOTEL
(Deep dive from the April issue)

The roach motel is a user experience (UX) design that makes getting into a situation simple and effortless but makes getting out of it extremely difficult.

COME ON IN

The roach motel lets users check in when it’s convenient for them, which is essentially anytime they choose. However, a graceful checkout is inhibited. Instead, there is frustration and unnecessary effort on the part of the users when they try to leave. In short, users are easily onboarded to their selected experience, but cannot offboard without problems. Premium subscriptions are a great example of the roach motel: It’s simple to sign up, but it is often difficult to cancel.

Think about running through a maze. The entrance is clearly marked, and you get in without incident, but when you try to exit—and usually there’s just one exit—you’re met with dead ends or roadblocks. The great thing about real mazes is that the journey to the exit is the intent all along, even if it is difficult. With roach motel-derived mazes, the exit is disguised to keep everyone inside. A less than a-maze-ing time is had by all who enter the roach motel.

COMMON ROACH MOTEL AVENUES

The following list represents situations in which roach motels find their stronghold:

  • Canceling a membership or subscription
  • Downgrading a membership or subscription
  • Deactivating an account
  • Deleting an account
  • Unsubscribing from a mailing list

WHY DO BUSINESSES DEPLOY THIS TACTIC?

Generally speaking, the goal of the roach motel is to ensure that a business holds on to existing users or subscribers. By making it difficult for users to sever the dysfunctional relationship, the business retains a larger user base, even if it isn’t the most ethical way of doing so.

MISDIRECTION AND FORCED CONTINUITY
(Deep dive from the July/August issue)

Misdirection’s sleight of hand and forced continuity’s difficulty in canceling trial periods are commonly experienced dark patterns. 

MISDIRECTION

If dark patterns conjure thoughts of magic and intrigue, welcome Ms. Direction, the magician’s assistant. Misdirection creates an experience that purposely focuses the user’s attention on one situation to distract from another situation that is taking place at the same time. In short, the company the user is interacting with doesn’t want them to notice any number of things, and while it’s not always malicious—and it technically complies with legal standards—it is anything but transparent. In the long run, this lack of transparency will negatively affect the company, because those misguided users will lose trust in it and will potentially stop using its services.

The illusionary paywall is a common misdirection. The user is conducting a search and eventually finds what appears to be a link to the sought-after information. However, after clicking on the link, the page loads with a pop-up box or window that gives the illusion that the page itself (the information the user was seeking) is subscription-based. The pop-up often asks the user to enter some type of personal information in order to gain access to the page. This is misdirection because, in cases like this, the user could often just scroll down a bit or close the box and read the information they were seeking.
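To make the mechanics concrete, the following is a minimal sketch, in TypeScript, of how an illusionary paywall might be wired up. It is an illustration only, not any particular site's code; the element IDs and copy are hypothetical. The key point is that the article text is already in the page, and the overlay never actually checks a subscription.

// Minimal sketch of an illusionary paywall. The full article is already in
// the DOM; this overlay only looks like a hard subscription gate.
// All element IDs and copy are hypothetical.
function showIllusionaryPaywall(): void {
  const overlay = document.createElement("div");
  overlay.id = "paywall-overlay";
  // Cover the viewport and obscure what is underneath, removing nothing.
  overlay.setAttribute(
    "style",
    "position:fixed;inset:0;background:rgba(255,255,255,0.95);" +
      "display:flex;align-items:center;justify-content:center;z-index:9999;"
  );
  overlay.innerHTML = `
    <div>
      <h2>Subscribe to keep reading</h2>
      <p>Enter your email address to unlock this article.</p>
      <input type="email" placeholder="you@example.com">
      <button id="paywall-submit">Unlock</button>
      <a href="#" id="paywall-dismiss" style="font-size:10px;color:#ccc;">continue without subscribing</a>
    </div>`;
  document.body.appendChild(overlay);

  // Dismissing the overlay reveals the article that was there all along;
  // no subscription check ever happens.
  overlay.querySelector("#paywall-dismiss")?.addEventListener("click", (e) => {
    e.preventDefault();
    overlay.remove();
  });
}

showIllusionaryPaywall();

Notice that the only thing standing between the user and the content is a deliberately de-emphasized dismiss link (or a bit of scrolling).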

FORCED CONTINUITY

Forced continuity is when a user provides payment information for a service, and after the free trial period ends, they continue to be charged without any notification. When the user then tries to cancel, it becomes a herculean task. Often, the statement that the subscription renews automatically is hidden or buried. The business uses the forced continuity dark pattern hoping that the user will forget the trial ended or decide that the cancellation process is too arduous, so they remain a customer. As with other dark patterns, transparency is lacking. While litigation does occur when users are tricked by forced continuity, companies continue to employ this technique. They walk a fine line between being just above the law and committing indictable infractions.
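A minimal sketch, in TypeScript, of how forced continuity typically plays out behind the scenes can make the pattern clearer. The names and fields below are hypothetical; the point is that the trial converts to a paid plan automatically, and any advance warning is an afterthought that defaults to off.

// Minimal sketch of forced continuity in a billing flow. When the free trial
// lapses, the plan silently converts to paid. All names are hypothetical.
interface Subscription {
  userId: string;
  trialEndsAt: Date;
  status: "trial" | "active" | "canceled";
  sendRenewalNotice: boolean; // buried preference, defaults to false
}

function processTrialExpiry(
  sub: Subscription,
  now: Date,
  charge: (userId: string) => void
): Subscription {
  if (sub.status !== "trial" || now < sub.trialEndsAt) {
    return sub; // trial still running; nothing to do
  }
  if (sub.sendRenewalNotice) {
    console.log(`Renewal notice sent to ${sub.userId}`); // rarely reached
  }
  charge(sub.userId); // billed without further confirmation
  return { ...sub, status: "active" };
}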

SiriusXM radio is guilty of forced continuity. Notification that a free trial is coming to an end is limited, if it’s sent at all, and when users try to stop the membership (which they are led to believe they can accomplish online), they are informed that they must call a customer service number. On that call, they receive the typical scripted pleas highlighting continuation promises and even attempts at upselling.

ALL THINGS CONSIDERED

With misdirection and forced continuity, just as with other dark patterns, the moral compass is broken. Designing ethically, with transparency as the objective, would earn customer respect and loyalty, empower customers, and benefit the business providing the service in the long run, but it’s probably not realistic.

CONFIRSHAMING
(Deep dive from the October issue)

Confirshaming preys on human emotion. With a footing in human psychology, it’s also known as confirmshaming, confirm shaming, or clickshaming. Essentially, it is a dark pattern tactic used by businesses to guilt users into performing a task or behavior that they wouldn’t have normally chosen.

Unlike the other dark patterns in this article series, confirshaming is easily spotted. When a user encounters a pop-up asking for an email address in exchange for saving a certain percentage on their next purchase and declines, the decline text says something such as, “No thanks, I hate saving money.” That is confirshaming. Similarly, when a site is promoting its newsletter and the user declines, a message might pop up such as, “No thanks, I already know everything about [insert the topic here].”
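Here is a minimal sketch, in TypeScript, of what a confirshaming prompt looks like in practice. The copy and element IDs are hypothetical; declining still works, but the decline button is worded as a self-own.

// Minimal sketch of a confirshaming opt-out. Functionally, the decline
// button is a plain "no"; only the wording is designed to guilt the user.
// All IDs and copy are hypothetical.
interface SignupPromptCopy {
  offer: string;
  accept: string;
  decline: string; // the "shame" line
}

const promptCopy: SignupPromptCopy = {
  offer: "Join our mailing list and save 15% on your next order.",
  accept: "Yes, sign me up!",
  decline: "No thanks, I hate saving money.",
};

function renderSignupPrompt(copy: SignupPromptCopy): HTMLElement {
  const box = document.createElement("div");
  box.innerHTML = `
    <p>${copy.offer}</p>
    <button id="accept">${copy.accept}</button>
    <button id="decline">${copy.decline}</button>`;
  // Declining simply closes the prompt; no penalty beyond the guilt trip.
  box.querySelector("#decline")?.addEventListener("click", () => box.remove());
  return box;
}

document.body.appendChild(renderSignupPrompt(promptCopy));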

The difference between confirshaming and other dark patterns is that while other dark patterns are more deception-based, confirshaming is in your face and not hiding in the dark. It’s still guilt-induced manipulation of the user, but legal action would be a hard-fought battle, since the filing of a consumer protection lawsuit because of it would be a stretch.

WHEN WE SEE CONFIRSHAMING

Businesses may insert confirshaming into the user experience in the following instances:

  • Users subscribe to a newsletter
  • Users try to unsubscribe from a newsletter
  • Users must accept certain conditions to continue
  • Businesses want to retain users on their site
  • Businesses want the user to retain a service or product (even when it’s free)

While confirshaming can be comically over the top at times—and if a user doesn’t want to do something, they won’t—the reality is that it’s sticking around because it works. “Shining a Light on Dark Patterns,” a 2021 article by Jamie Luguri and Lior Jacob Strahilevitz, is a solid read for those wanting to increase their understanding of the psychological complexities of dark patterns, especially confirshaming.

BAIT AND SWITCH AND DISGUISED ADS
(Deep dive from the November/December issue)

BAIT AND SWITCH: AN OVERVIEW

If the term “bait and switch” conjures images of shady car dealerships luring in customers with a too-good-to-be-true offer, there may be a reason for that. At one time, dealerships distributed paper fliers to potential customers promising great deals on certain vehicle makes and models. When the customers arrived at the car lot, they were informed that those vehicles had sold out. They were then told that other vehicles were available. In the technology era, bait and switch follows the same principles: The user has a clear goal in mind, but on the way to accomplishing it, a different experience happens.

BAIT AND SWITCH: EXAMPLES

Microsoft provided a classic example when it was trying to cajole people into upgrading to Windows 10. Users would receive pop-ups asking them to update to the new OS. The user could “X” out and continue with what they were doing. No dishonesty there. However, Microsoft eventually changed the function of the “X” at the top right to mean yes, go forward with the Windows 10 update. As one might imagine, users felt duped and misled.
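As a rough illustration of how small that change was in code terms, here is a hedged TypeScript sketch; it is not Microsoft's implementation, and all names and copy are hypothetical. The dialog looks identical either way; only the meaning of its close button changes.

// Minimal sketch of the "X means yes" switch. Originally the close button
// dismisses the prompt; later it is rewired to schedule the upgrade.
// All names and copy are hypothetical, not Microsoft's code.
type CloseBehavior = "dismiss" | "acceptUpgrade";

function makeUpgradeDialog(closeBehavior: CloseBehavior): HTMLElement {
  const dialog = document.createElement("div");
  dialog.innerHTML = `
    <p>Your upgrade is ready.</p>
    <button id="close-x">X</button>`;
  dialog.querySelector("#close-x")?.addEventListener("click", () => {
    if (closeBehavior === "dismiss") {
      dialog.remove(); // original behavior: X simply closes the prompt
    } else {
      console.log("Upgrade scheduled."); // later behavior: X now means consent
      dialog.remove();
    }
  });
  return dialog;
}

document.body.appendChild(makeUpgradeDialog("acceptUpgrade"));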

When shopping online, sometimes the chosen or intended product for purchase is shown as being out of stock so that the company can promote alternatives. These alternative products were what the ecommerce business wanted the customer to purchase all along.

Similar to other dark patterns, the bait and switch causes the user or customer to lose trust in—and potentially cut ties with—the business. The reality is that we cannot always sever the relationship because we might need the service (such as with the Microsoft example). The real downfall here is that users or customers may minimize their loss because there is still a perceived deal or gain.

DISGUISED ADS: AN OVERVIEW

Disguised ads are exactly what they sound like. These ads blend in with the content the user is looking for and function as clickbait. The ubiquity of these often-camouflaged ads causes the user to become desensitized to them. Although they are commonly ignored or just elicit an eyeroll, the ads work.

When executed correctly, disguised ads are almost indistinguishable from legitimate content. There are telltale signs of disguised ads—although not always used together—such as an “X” in the corner to close out; faint text stating “Ad,” “Sponsored,” or “Promoted”; or a border around the disguised ad that isn’t seen in the regular content.
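A minimal TypeScript sketch, with hypothetical field names and styling, shows how little separates a disguised ad from organic content: the promoted item reuses the same card template, and only a faint label gives it away.

// Minimal sketch of a disguised ad in a feed. Promoted items share the exact
// card template used for organic posts; only a faint label differs.
// Field names and styling are hypothetical.
interface FeedItem {
  title: string;
  body: string;
  sponsored: boolean;
}

function renderFeedItem(item: FeedItem): HTMLElement {
  const card = document.createElement("article");
  card.className = "feed-card"; // identical styling for ads and real content
  const label = item.sponsored
    ? `<span style="font-size:9px;color:#bbb;">Sponsored</span>`
    : "";
  card.innerHTML = `${label}<h3>${item.title}</h3><p>${item.body}</p>`;
  return card;
}

const feed: FeedItem[] = [
  { title: "Community update", body: "An organic post from the feed.", sponsored: false },
  { title: "You won't believe this gadget", body: "A promoted post blended into the feed.", sponsored: true },
];

feed.forEach((item) => document.body.appendChild(renderFeedItem(item)));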

DISGUISED ADS: EXAMPLES

Reddit uses disguised ads in the form of promoted posts. When a user is scrolling through content, promoted posts appear in line with the feed. Promoted posts are ads and not the content the user set out to discover. Even the AssholeDesign subreddit, which points out dark patterns, couldn’t keep them out of its own feed. Ironic?

Facebook Marketplace uses sponsored listings to personalize the user experience. This personalization is not optional and doesn’t necessarily reflect what the user is searching for, making these listings akin to disguised ads.

Disguised ads are a nuisance and frustrating to wade through, but they are less problematic than other dark patterns because the user can more easily identify them—and then ignore them, avoid them, or close the webpage.

WHERE DO WE GO FROM HERE?

The dark patterns we looked at in this article are not flashy; they try to slide into the user experience unnoticed. You also won’t see a bait and switch or disguised ads peppering the news with litigation scandals because the businesses employing these techniques typically aren’t acting illegally. This means that user education is the most practical approach to identifying and circumventing these two dark patterns.


Kelly LeBlanc is a knowledge management specialist at FireOak Strategies, where she specializes in OA, open data, data management, geographic information systems (GISs), and data/information governance issues. Prior to joining FireOak, LeBlanc was with the Digital Initiatives Unit at the University of Alberta, where she worked with GISs, metadata, and spatial and research data. She served in various municipal planning and development capacities working with GISs, municipal law, planning/zoning regulations, and resource management. LeBlanc holds an M.L.I.S. from the University of Alberta and a master of letters from the University of Glasgow.



Related Articles

The Magic of Dark Patterns: Can We Evade Their Trickery? (11/16/2021)


