In the past 2 years, fake news—with its grown-up synonyms, misinformation and disinformation—has emerged as one of the most critical issues confronting not only the information professions, but democratic society as a whole. It’s the topic of conversation and pontification ad nauseam, so it may seem as if misinformation has become something that everybody talks about but nobody does anything about. MisinfoCon DC (@misinfocon), held Aug. 6–7 at the Newseum in Washington, D.C., however, provided ample evidence to the contrary. It turns out that there’s a great deal going on in the effort to combat public misinformation.
Funded by the Knight Foundation, Craig Newmark Philanthropies, the Mozilla Foundation, and SAGE, and organized by Hacks/Hackers, MisinfoCon DC was the fourth conference in a series being held around the world. It brought together some 200 journalists, technologists, information scientists, and policy specialists to share insights and initiatives in an interactive format.
On both mornings, 10-minute “lightning presentations” filled the agenda. There were 17 in all, each one either describing and analyzing the misinformation problem or presenting an initiative to address some aspect of it. Afternoons were devoted to breakout workshops, during which every participant contributed to lively discussion. A group report from each workshop was presented to the conference at the end of the day.
With this kind of full and varied agenda, it’s impossible to provide a comprehensive account of the event. The following is a sampling of the experience.
In any dialogue about an emerging situation, it’s important to establish key concepts and define key terms. Two concepts that stood out were the “three elements of information disorder,” a framework for understanding the problem, and “solutionism.”
The three elements of information disorder led much of the discussion. This framework was proposed in the report “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making,” produced for the Council of Europe by First Draft and Harvard University’s Shorenstein Center on Media, Politics and Public Policy in October 2017. The three elements are 1) the agents who create, produce, and distribute the messages; 2) the messages themselves; and 3) the interpreters, those who receive and decode the messages.
Solutionism was the key theme of the third lightning presentation on the first day, delivered by Lisa-Maria Neudert (Oxford Internet Institute). In describing the depth and magnitude of the misinformation problem, she warned against the tendency to rush to do something, anything, without thinking it through. She termed this tendency solutionism and cautioned that it can make the problem worse. This proved a useful antidote to the inclination to consider any of the worthy initiatives and projects presented over the 2 days of the conference as “the solution” to the problem.
References to digital and published resources were frequent throughout the conference. There were two of particular note: “Truth Decay” and the “Report of the Attorney General’s Cyber Digital Task Force.”
“Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life,” released earlier this year, is a 324-page report by Jennifer Kavanagh and Michael D. Rich of the RAND Corp. It provides a multidisciplinary review of the misinformation problem, including the definition, a historical review, analysis of current developments, and a research agenda. The report is available for free digital download and can also be purchased in print.
Anyone especially interested in disinformation campaigns by state actors will want to consult the “Report of the Attorney General’s Cyber Digital Task Force,” released on July 2, 2018. It includes the federal government’s framework for countering “malign foreign influence operations” and is available for free download.
Algorithms and Tools
Several presentations described diverse tools being developed to address various aspects of the misinformation problem.
Jessica Leinwand (public policy/strategic response at Facebook) discussed her company’s policies and recent initiatives. Facebook does not remove content believed to be false, although it does remove content for other reasons and has recently adopted a policy to remove content promoting imminent violence. Going forward, it will rely on improved algorithms to identify falsified accounts—which it does remove—as well as offer increased options for users to fact-check content and post evaluations. (It remains to be seen how agents of misinformation will game these initiatives and what kind of “arms race” will develop between them and Facebook.)
The Credibility Coalition is focused on messages and interpreters. One project, in partnership with the W3C Credible Web Community Group, is defining the characteristics of information quality: What factors signal the truth or falsehood of a news report?
Filippo Menczer (Indiana University’s School of Informatics, Computing, and Engineering) discussed the role of bots in promoting misinformation and his research in identifying stories that are being promoted on social media by bots. One research product that’s available to the public is Hoaxy, which uses an algorithm to characterize Twitter accounts as bots or humans and monitors the distribution of news to show whether content is being promoted by accounts that are likely bots.
Information literacy, or, as it was more often called, media literacy, came up frequently. As a librarian, I was particularly interested in initiatives to improve people’s ability to evaluate news reports. Here are two initiatives, gleaned from side conversations as well as presentations.
The Digital Polarization Initiative is led by Mike Caulfield (Washington State University–Vancouver) for the American Association of State Colleges and Universities (AASCU). It involves 11 AASCU members piloting a curriculum to help students become more web-literate and to involve them in fact-checking digital news reports.
The Calling Bull: Data Reasoning in a Digital World curriculum was developed by Jevin West (University of Washington’s iSchool) and Carl Bergstrom (University of Washington’s biology department). The instructors note that they began developing the course in 2015, well before the 2016 election, and launched it as a one-credit class in spring 2017. The course’s goals are to equip students to analyze information, identify false information and faulty reasoning, and develop the skills to refute misinformation. Noting the challenges in “calling bull,” they cite Brandolini’s Law: “The amount of energy needed to refute [bull] is an order of magnitude bigger than to produce it.”
A Final Thought
Librarians, who deal with human information behavior and the truth and falsehood of information sources, have as good a grasp of misinformation as anyone. So why were they absent from MisinfoCon? I seem to have been the only practicing librarian in attendance. Several of the researchers and entrepreneurs I spoke with, including the conference emcee, expressed their admiration of librarians and desire to engage us more fully in their work. And yet, I get the impression that by and large, librarians are having parallel, librarians-only conversations, and our community is not engaging with the important work being done by others to address this critical issue.
A view of the Capitol Building from the Newseum, by Dave Shumaker