Supply and Demand: The Economy of Disinformation
by Dave Shumaker
Posted On October 15, 2019
This article appears in the November/December 2019 issue of Information Today as “Pineapple, Pizza, and Disinformation.”

Ever since the 2016 presidential election, American librarians have been focusing on combating misinformation and disinformation as never before. So have computer scientists, journalists, entrepreneurs, and all sorts of professional policy analysts. And while librarians have concentrated primarily on the “demand side” of the problem—educating students and citizens to become more discerning consumers and less susceptible to being misled—others are taking very different approaches. In many cases, they’re working on the “supply side”—labeling trustworthy content and choking off the spread of falsehoods.

CERTIFIED CONTENT COALITION

There’s so much going on that it’s impossible to keep track of it all. One group that has tried, though, is the Certified Content Coalition. Its Calendar and List page features an array of resources—not only a calendar of events, but also a list of key reports from government and academic sources, lists of newsletters and Twitter accounts to follow, and published standards of journalistic practice (more about these later). But its centerpiece is a list of more than 100 initiatives that are underway, sponsored by government, industry, and nonprofit organizations. Some will be familiar to many librarians (e.g., Snopes and FactCheck.org), but others may not be (e.g., Data & Society’s Media Manipulation Initiative and Nobias). If you don’t see your favorite initiative listed, there’s a form to submit it.

The Certified Content Coalition is a nonprofit organization founded by Denver-based cable industry entrepreneur Scott Yates. In addition to compiling the lists of resources, it has tackled the supply side of the misinformation problem by proposing a system to certify trustworthy news organizations. Its draft specification document seeks to define good practice in journalism. Ultimately, the coalition envisions a worldwide network of Journalist Certification Authorities (JCAs) that would implement the specification. News organizations would apply to a JCA, and successful applicants would be certified as trustworthy sources, so that content platforms, advertisers, and individual consumers could interact with their material with the assurance that it had been produced according to best practices. Conversely, reports from outlets lacking the certification would be subject to closer scrutiny.

Another project, the Journalism Trust Initiative (JTI), is doing similar work. Led by Reporters Without Borders, with other media and public interest partners, JTI is developing a standard in accordance with the requirements of the European Committee for Standardization (CEN). Its goal is to have it adopted as an official European standard. A public comment draft is available at jti-rsf.org/#!your-participation.

DECODING THE DISINFORMATION PROBLEM

In another development, the Wilson Center recently held a half-day seminar, Decoding the Disinformation Problem, which combined a historical perspective with a discussion of current efforts to address the problem. A video of the proceedings is available at wilsoncenter.org/event/decoding-the-disinformation-problem.

In her opening remarks, center president Jane Harman reminded the audience that disinformation—false information that has been deliberately spread—isn’t new. She cited the Soviet Union’s Operation Denver, a 1980s campaign that claimed HIV was a result of bioweapons research by the U.S. military, as one historical example. In the first of two panels, Jessica Beyer of the University of Washington made the key point that what’s different about today’s environment is not that disinformation itself is new, but that the tools by which information—both true and false—is spread are so much faster and more powerful.

The other panelists, Nina Jankowicz of the Wilson Center and Ginny Badanes of Microsoft, highlighted the global nature of the problem, with examples of misinformation and disinformation from countries such as Estonia and Myanmar. The panelists also agreed that the goal of disinformation isn’t always to get people to change their vote. The overarching goal may be to create chaos and disorder or to break down trust in society and its institutions. Further, disinformation is effective because it is crafted to exploit preexisting divisions in society and amplify them. It works not because it’s plausible, but because it appeals to people’s emotions and because our technology allows it to spread so widely and quickly.

Moreover, the panelists agreed that technical solutions, such as algorithms and the use of AI by social media platforms to detect and eliminate misinformation, aren’t enough. Human-based solutions are required to deal with both supply and demand issues. On the supply side, there’s a need for human employees to monitor content on the platforms. On the demand side, the education of both children and adults is required at all levels.

While the demand-side solution might seem to be good news for librarians, there's an implied message: the information literacy instruction librarians have traditionally offered isn't enough. Frameworks of information literacy built on rational assessment and designed for scholarly content won't be successful here. Instead, librarians have to deal with the emotional and social foundations of the problem. The first step is awareness: The panelists made the point that too many people don't see themselves as targets of disinformation. We all need to realize that we are indeed targets, all the time. Second, we have to understand what the goal of a disinformation campaign is. As the panelists noted, it's not just to spread falsehoods or influence actions; it's to sow fear, doubt, and mistrust. So when we teach information literacy, we have to teach people what disinformation is designed to do and how they can deal with the emotions they may feel when they are exposed to it.

PINEAPPLE ON PIZZA?

One useful resource to explain these points is the U.S. Cybersecurity and Infrastructure Security Agency’s infographic, The War on Pineapple: Understanding Foreign Interference in 5 Steps. Using the nonthreatening example of whether pineapple belongs on pizza, the infographic goes through the five steps of understanding how a disinformation campaign works:

  • It targets divisive issues.
  • It moves across multiple social media accounts.
  • It amplifies, distorts, and tries to raise the temperature of the discussion.
  • It worms its way into trusted media outlets.
  • It transitions from discussion into forms of action, including meetings, rallies, and sometimes violence.

NO FINISH LINE

The second panel at the Decoding the Disinformation Problem seminar amplified many of the first panelists’ points. It was most notable for the comments of Katie Harbath, public policy director for global elections at Facebook. Asked how Facebook has changed since the 2016 election, she asserted that it’s “a completely different company now.” As the discussion continued, she made two more comments indicating a significant shift in Facebook’s attitude. The first: “There will never be a finish line.” In other words, the problems of misinformation and disinformation will never be solved. The era we now live in will be characterized by an ongoing arms race between those who seek to spread falsehoods in order to undermine our society and those trying to stop them. We will continue to see new tactics that will call for new countermeasures. The second: Echoing the first panel, she acknowledged that technical solutions alone won’t be sufficient. In effect, human content review is needed, and Facebook has invested heavily in it. Harbath described in detail how the company now has a specific process to prepare for elections in countries all over the world and to monitor content during electoral campaigns.

The other panelist, David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation (EFF), pointed out the threats to freedom of speech that are posed by such a monitoring system, but he had no alternative to offer. The one thing that both panelists could agree on (along with the first panel) was the importance of education.

THE WORK CONTINUES

So what can we expect from the ongoing struggle for accurate, high-quality information in our society? First, there are many efforts to address the problem, including work by companies such as Facebook and Microsoft. (Where were Twitter and Google? Their absence from the seminar was notable.) Second, disinformation will continue, with new tactics emerging as responses are developed to the old ones. Third, the leaders of supply-side initiatives, including the Certified Content Coalition, JTI, Microsoft, and Facebook, all agree that the demand side, educating consumers at all levels, also has to be part of the response. And finally, if librarians want to be effective, we need to tailor our instruction to today's disinformation environment and not rely on generic information literacy concepts of the past.


Dave Shumaker is a retired clinical associate professor at The Catholic University of America in Washington, D.C., and a former corporate information manager. He is also the author of The Embedded Librarian: Innovative Strategies for Taking Knowledge Where It’s Needed (Information Today, Inc., 2012), and he founded SLA’s Embedded Librarians Caucus in 2015.


