Disinformation: Why Does It Work, and What (If Anything) Can We Do About It?
Posted On December 5, 2023
This NewsBreak originally appeared in the November/December 2023 issue of Information Today.

“Falsehood flies, and truth comes limping after it. …”

—Jonathan Swift (1710)

“You can fool all of the people some of the time, and you can fool some of the people all of the time, but you can’t fool all of the people all of the time.”

—Abraham Lincoln (1857; attributed)

We use the words information, misinformation, and disinformation often, but it’s not always clear what they mean or how they differ. They’ll be used liberally in this article, so I’ll define them as follows:

  • Disinformation: inaccurate or misleading information communicated with the conscious intention to deceive, often to motivate action based on false belief.
  • Misinformation: information that is demonstrably untrue but is communicated without malice and with no intention to deceive, either in the mistaken belief that it is true or without assessing its accuracy.
  • Information: understandable communication that either accurately represents reality or hasn’t been definitively assessed.


It seems as if we live in an age of crisis. We are beset by existing crises and potential crises. What’s the most concerning issue of all to you? Is it climate change or inflation? The war in Ukraine or crime here at home? Immigration or social justice? The list could go on … and on. But my choice would be misinformation and disinformation, because they underlie all of the other crises, issues, and controversies that contemporary society is dealing with. If we could figure out why misinformation and disinformation are so pervasive and so influential, and if we could mitigate or remove them, we’d be a lot closer to resolving the many important issues our society has to confront.


Alas, the reasons why misinformation and disinformation thrive remain a deep mystery to many of us. Not long ago, someone posted the following comment to an email list I subscribe to:

As a scientist myself, I cannot understand the … significant percentage of the population who reject the results of scientific investigation. This is especially remarkable given that nearly everyone takes for granted the cell phones and other devices that are a direct result of scientific research. (emphasis added)

The author of that comment reveals a widely held presumption not only about scientific research, but also about judgment and decision making in daily life: that human beings follow a rational process in their thinking. According to this presumption, our thought processes work like this:

  • First, we gather as much information as we reasonably are able to find, representing diverse perspectives.
  • Then, we use logic to evaluate it, absorb it, and apply it.
  • Finally, we make a decision and take action based on the logical outcome of our thought process.

But if we step back and take a moment for honest reflection, we know that’s not at all how we operate in many situations.


Fortunately, the field of cognitive psychology has developed some conceptual tools to help us understand how people really process information and make decisions about what’s true, what to believe, and how to act on what they know. These concepts are perhaps best expressed in the book Thinking, Fast and Slow by the Nobel laureate Daniel Kahneman. To explain our mental processes, Kahneman presents the concepts of System 1 and System 2. These are intended as metaphors, not as concrete functions of the brain.

System 1 represents our mind as automatic: always on and requiring little effort on our part. Our mind seeks explanations for the information our senses present to us. It works fast and uses heuristics (or, colloquially, rules of thumb) to arrive at these explanations. Because it operates in this way, it can jump to conclusions. It enables us to function in everyday life, but it can also make mistakes.

System 2 is our capability for thoughtful, rational judgment and decision making. We presume it is operating all of the time, but it isn’t. It requires conscious effort and absorbs our attention. Moreover, it can never fully escape the influence of System 1: Even when we try to make rational judgments using System 2, System 1 is still operating in the background and influencing us.

So, System 1 affects us in ways we’re not conscious of. Fortunately, cognitive scientists have developed a number of insights into the biases it’s prone to and how they influence us.

One category of biases they have documented is called availability biases. Since System 1 operates quickly and with little conscious effort, information that’s readily available is the information that gets used. And what sort of information is most readily available? Vivid, arresting, and even shocking information that leaves a lasting impression. Also, recent information that hasn’t had time to fade from our memory. Information that’s clear and easily understood is another readily available type, as is information that’s been repeated to us over and over again. Finally, because we human beings aren’t born with an innate understanding of probability and mathematics, and because stories meet some of the criteria of availability, anecdotes are more available to System 1 than statistics.

Confirmation biases are a second category of biases. As the label suggests, System 1 readily processes information that confirms what we already believe over contradictory information. In addition, System 1 operates on the WYSIATI principle: What You See Is All There Is. In other words, it doesn’t go looking for more information; instead, it confines itself to the information that is readily at hand. This is a direct contradiction of the principles of rational judgment and of scientific method, which encourage the consideration of alternatives.

System 1 also tends to lead to the escalation of commitment—in a word, stubbornness. Faced with evidence that events are not unfolding as expected, we too often strengthen our commitments, dig in our heels, and refuse to abandon our previous position. System 1 is also susceptible to framing effects, which is to say that the way a decision is presented influences how we perceive it and what judgment we make.

Beyond System 1 and System 2, which describe the mental processes of individuals, there are social factors to be considered. Our circumstances—who we spend time with, where we work and socialize, what groups we participate in—influence the information and opinions we are exposed to. Conforming to the opinions and agreeing with the information presented in our social settings reinforce and enhance our social status, while nonconformance reduces social status and can even lead to our being ostracized. In these ways, our social environment provides strong support for our System 1 processes and biases.


It’s not too much to say that disinformation relies on our System 1 flaws for its success. It can also attempt to fool System 2. Disinformation is designed to appeal to availability biases. Repetition, an essential component of any big lie, ensures that it is familiar and recent—indeed, ever-present. Often, disinformation is crafted to be vivid, shocking, and emotionally engaging; in other words, it’s sticky (a term applied to other digital information as well). Like advertising and other digital content, disinformation is often reliant on anecdotes, although it may throw in statistics as well (more about that at the end of this section).

The spread of digital disinformation is enabled by social media, search engines, and similar online services that feed System 1 confirmation biases. These services track our behavior, measure which information we engage with, and use algorithms to feed us more of the same—with or without our conscious agreement—thereby reinforcing our preexisting beliefs. Another way that digital social media amplifies disinformation is by incentivizing us to convert it to misinformation—false information we disseminate innocently, but thoughtlessly. Social media makes it easy to like, retweet, and otherwise forward disinformation, and it awards us credit within our social circles for doing so, as long as what we disseminate is in line with the prevailing opinions in the group.
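The feedback loop described above—track engagement, then rank similar content higher—can be caricatured in a few lines of code. This is a hypothetical sketch, not any real platform’s algorithm; the item topics, the simulated user’s fixed preference, and the ranking rule are all illustrative assumptions:

```python
from collections import Counter

def rank_feed(items, engagement_counts):
    """Rank items by how often the user has engaged with each item's topic.

    A deliberately minimal caricature of engagement-based ranking: the more
    a user has clicked on a topic, the higher that topic ranks, so the feed
    narrows toward whatever the user already engages with.
    """
    return sorted(items, key=lambda item: engagement_counts[item["topic"]],
                  reverse=True)

def simulate(rounds=5):
    # A small, fixed pool of content (topics are illustrative).
    items = [{"id": i, "topic": t}
             for i, t in enumerate(["politics", "science", "sports",
                                    "politics", "science"])]
    engagement = Counter()
    # The simulated user clicks only items that confirm a prior belief
    # (here, a fixed preference for "politics" stories).
    for _ in range(rounds):
        feed = rank_feed(items, engagement)
        for item in feed[:3]:                 # the user sees only the top of the feed
            if item["topic"] == "politics":   # and engages only with confirming content
                engagement[item["topic"]] += 1
    return engagement
```

After a few rounds, the top of the feed is entirely “politics” and the other topics receive no engagement at all: the ranking system never needed to know the user’s beliefs, only what the user clicked.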

The purveyors of disinformation recruit System 2 to abet their aims as well, in a couple of ways. Disinformation is often accompanied by exhortations to “do your own research.” This may sound like an appeal to the logical, systematic process of System 2, but it is often accompanied by attacks on resources that the purveyor believes to be corrupted, such as government and academic websites, together with references to sources that reinforce the disinformation being promoted. In effect, the exhortation is to “do your own research, but only trust the sources that I recommend to you.” Furthermore, these recommended sites may contain statistics and seemingly sophisticated analysis that are misused or distorted—yet another attempt to seduce System 2 into accepting the disinformation.


Within the past several years, there have been many new and expanded efforts to counteract disinformation and misinformation. Those involved in these efforts generally agree that we need better methods both to refute disinformation and to educate the public in the skills and attitudes necessary to evaluate information (i.e., information literacy).

Journalists have expanded their efforts to assess and publicize their analyses of claims by politicians and other public figures. Researchers have documented the penetration of disinformation and the spread of misinformation in public discourse, and they have explored a diverse array of new approaches to increase the speed and scale of fact-checking, including crowdsourcing and AI.

Meanwhile, educators and librarians have overhauled their approaches to information literacy. In many cases, programs that were formerly focused on the minutiae of scholarly and scientific information have been superseded by curricula that are applicable to the needs of the general population in our complex, participative democracy. The range of audiences addressed has similarly expanded from a former emphasis on students in secondary and higher education to encompass both younger students and the general adult population.

Legal action constitutes another dimension of the counterattack against disinformation. In some cases, disinformation can be shown to be defamatory, and there have been several highly publicized, successful lawsuits against purveyors of defamatory disinformation. This tactic applies in only a limited subset of cases, however.

To date, broader efforts to use federal administrative and legislative tools against disinformation have been blocked. A proposed disinformation board in the Department of Homeland Security was torpedoed by political objections fueled in part by First Amendment concerns. Similarly, legislative proposals to attack disinformation by removing the legal protections for social media and digital information distributors that host it have run into political roadblocks.

Thus, fact-checking and information literacy education remain the only broadly based tools to counteract disinformation and misinformation. Their conundrum is that to be effective, they need to break through the heuristics and biases of System 1 and activate System 2. It’s not at all clear that innovators and practitioners of either tool have yet succeeded at doing this.

And then there are the more personal efforts to address the problem. Both the literature and interpersonal conversations are replete with variations on the theme of “How can I engage my [spouse, relative, friend, neighbor] who believes in (what I consider to be) disinformation and follows crackpot theories in a civil, respectful, and productive way?” The answers vary similarly, ranging from “I don’t even try; my friend and I have agreed not to discuss our differences” to success stories. Many of the latter offer helpful suggestions. What an understanding of System 1 and System 2 adds is a shared recognition that we are all prey to System 1, a mutual agreement to avoid its heuristics and biases, permission to call each other out when we lapse into them, and a commitment to apply System 2 principles in our conversations.

In the end, it all comes down to each one of us. To defeat disinformation and throttle misinformation, we need to start with ourselves: recognize that we are operating in System 1 by default, be aware of the heuristics inherent in it, be on the lookout for errors and biases those heuristics can lead to, and take the trouble to invoke System 2 when it matters.

Dave Shumaker is a retired clinical associate professor at The Catholic University of America in Washington, D.C., and a former corporate information manager. He is also the author of The Embedded Librarian: Innovative Strategies for Taking Knowledge Where It’s Needed (Information Today, Inc., 2012), and he founded SLA’s Embedded Librarians Caucus in 2015.
