Read to the end of this article.
According to the Wikipedia page “One Weird Trick Advertisements,” created in November 2015, such forms of clickbait have been around since the late 2000s. In 2011, the Federal Trade Commission (FTC) filed lawsuits against companies using pushy and misleading language in online advertisements; at least two of those cases resulted in settlements favoring the FTC. Now more than ever, especially during the 2016 election season, information professionals need to pay attention to misinformation on the web.
Google Combats Harmful Ads
On Jan. 21, 2016, Google reported on its blog that it had disabled more than 780 million ads that violated its policies in 2015, including ads intended to mislead people, lead to “phishing” sites, or infect machines with malware. Google listed the categories of ads it blocked or suspended, including pharmaceuticals (“more than 12.5 million ads that violated our healthcare and medicines policy, such as ads for pharmaceuticals that weren’t approved for use or that made misleading claims to be as effective as prescription drugs”), weight loss (“more than 30,000 sites for misleading claims”), counterfeiters (“more than 10,000 sites and 18,000 accounts for attempting to sell counterfeit” designer watches and other goods), unwanted software (“more than 10,000 sites”), and trick-to-click (more than 17 million ads that “mislead or trick people into interacting with them—like ads designed to look like system warnings from your computer”). There is no doubt that some nontrivial number of those 780 million ads included phrases similar to “try this weird old trick,” “one weird trick,” or “click here now.”
Vulnerabilities to Pushy Language
This issue of misleading advertising might make us consider why people are so vulnerable to pushy language. The “weird trick” ads, like many effective ads, use declaratives. A typical example is “Diabetics: Do This 1 Thing Before You Eat Sugar.”
People readily respond to commanding language. They may even psychologically crave it. In a world of chaos, a directive creates a mission and an instant sense of order. The political ramifications of this are usually tragic: a boss commands people’s compliance, and the people do what they’re told without necessarily reflecting on whether it’s the right or wrong move. To avoid invoking Godwin’s Law, consider instead the case of Francisco Franco’s White Terror during the Spanish Civil War. Tens of thousands of people were killed or injured as Franco channeled fear of the political left into action he framed as an existential necessity. It should be obvious that the public must always be on guard when exposed to such directives, but given the current political climate, now seems an especially good time to remember this.
“Increasingly unequal societies have spawned anger, an unsurprising development. The anger is diffuse, in search of somebody to articulate it, preferably in short declarative sentences,” The New York Times’ Roger Cohen writes about recent politics in the U.K. and U.S. Short, declarative sentences signal to an audience that something is about to be accomplished and that problems (whether perceived, real, mischaracterized, or ginned up) are about to be solved.
Logical Fallacies and Cognitive Biases
Information professionals can help with this issue in all its various forms and guises by pointing out two important factors. One, there is an informal logical fallacy often at work in such rhetoric, whether in online advertising, courtroom dramas, PTA meetings, or the Iowa Caucus. Two, people usually suffer from cognitive biases that turn declarations and directives into actions—emergencies are implied, and people go into fight or flight mode.
In the case of the informal logical fallacy, it often boils down to an appeal to authority—“because an authority thinks something, it must therefore be true”—assuming that only people with experience and/or power would direct a group of people with bold declarations. For example, because Person X has directed me to do something—to click on this “one weird trick” ad or to vote for him or her, etc.—X must be an authority figure. Because X is an authority figure, I’d better believe it or do it, because X has the power, and I want to avoid trouble with the powerful.
In the case of cognitive biases, consider projection bias (“one thinks that others have the same priority, attitude or belief that one harbours even if that is unlikely”): a highly emotional declaration or directive can signal that something is wrong and urgent. When someone perceived to be an authority figure directs people into action, they move without careful consideration of the consequences, because they have been drawn into a rhetorical framework of emergency. An emergency is no time to debate the merits of the authority figure’s argument. The thinking is circular, and this immediate response to compelling language doesn’t even begin to critique the supposed authority figure; there’s no time. People may unconsciously mirror the tone of the emotional appeals they hear: “We’re in a crisis here!”
Information professionals’ services to others often include creating a research plan, gaining strategic information on markets or competitors, or finding law citations for clients. But an implicit duty in their job description, perhaps one of the “other duties as assigned,” is to help people think through hard problems. Every step of how to find information, how to evaluate it, and how to use it hangs on mental tools such as the ability to synthesize, criticize, and evaluate. Info pros must learn to recognize tropes and symbols—what a button is and when to click it; what an archive is and how it is organized—as well as watch for their own and their clients’ tendency toward compulsion or hasty action.
One weird trick for getting people to do what you want them to do is to tell them to do it. For example, “Read to the end of this article.” Awareness of such clickbait, trick-to-click, or psychosocial “malware” is the first step in becoming a more careful decision maker.
Now, click here to share this article on Twitter.