Have a Question About Government Data? Data USA Can Help!
by Nancy K. Herther
Posted On June 7, 2016


In April 2016, the MIT Media Lab, working with accounting firm Deloitte LLP and data mining and analysis company Datawheel, released Data USA, described as the most comprehensive website and visualization engine ever created for U.S. government data, built on an open source platform available to anyone. This marks a major milestone for open source data, visualization, and global access to key information.

In 2012, The New York Times posted a new type of immersive online experience—a multimedia feature called “Snow Fall: The Avalanche at Tunnel Creek,” which was created by a team of professional developers and reporters led by John Branch. The next year, the paper released “How Y’all, Youse and You Guys Talk,” a dialect quiz based on research from a 2002 Harvard Dialect Survey project. The importance of these initiatives and other, similar efforts from journalists in recent years is that they have been phenomenally successful: “A news app, a piece of software about the news made by in-house developers, generated more clicks than any article,” The Atlantic notes. “And it did this in a tiny amount of time: The app only came out on December 21, 2013. That means that in the 11 days it was online in 2013, it generated more visits than any other piece” (emphasis in original). This is an impressive feat that points to the public’s interest in deeper learning experiences.

Informing the Public

Alexis Lloyd of The New York Times Research & Development group believes the future will take us in new directions, based on the “particles” that make up information. “While news organizations have adapted to new media through the creative use of interactivity, video, and audio, even the most innovative formats are still conceived of as dispatches: items that get published once and don’t evolve or accumulate knowledge over time. Any sense of temporality is still closely tied to the rhythms of print,” Lloyd says.

“Creating news for the current and future media landscape means considering the time scales of our reporting in much more innovative ways,” she continues. “Information should accumulate upon itself; documents should have ways of reacting to new reporting or information; and we should consider the consumption behavior of our users as one that takes place at all cadences, not simply as a daily update.”

News organizations are looking at how to compete with newer media—whether entertainment programs such as The Daily Show or efforts from tech giants such as Facebook’s Instant Articles, Google News, Snapchat Discover, and Apple News—and at how to use technology more successfully to inform readers while providing the journalistic sensibility essential to reporting in the 21st century.

Writing on the Poynter Institute’s website, Ren LaForme describes four qualities needed in tools for the future of news. They should:

  • Explore new storytelling formats
  • Be templatable and reusable
  • Be easy to learn (because most organizations don’t have a Snow Fall-sized team)
  • Be free or low-cost (because most news organizations are lean)

With major newspaper layoffs and increasing dependence on social media, Bloomberg’s editor-in-chief, John Micklethwait, explains in a memo to staff members that “vibrant journalism, whether it is on the terminal, the web, print, radio or television, needs a core—an area where we excel, partly because we believe in it and understand it better than our competitors do. … On the web, almost 70% of our visitors now come directly to article pages through social media or search engines, rather than the homepage.” He continues, “Bloomberg is still too focused on developed markets, established finance and the Western world (especially America). By contrast capitalism is moving to private markets and the emerging world. To chronicle it, we must follow it.”

Today, citizens are reporting incidents such as police brutality and other first-person experiences through venues such as crowdsourcing platforms. However, individual first-hand accounts can’t easily provide context or go beneath the surface of the events and issues facing them or their communities. As Limelight CEO Sonny Tosco writes in AdWeek, “There are 7 [billion] people on the planet. As connectivity improves and more and more rely on smartphones, our news network will grow and grow. Real-time, on the ground news that comes directly from the people who are affected is more compelling and trustworthy than carefully constructed news reports and editorials. Where TV stations are slow to react and newspapers are a day late, people want their information here and now.” So in this environment, how can journalists, researchers, and the public delve deeper for meaning and context?

Data Visualization Comes of Age

In 1975, Edward Tufte created a statistics curriculum for a group of journalists who were visiting Princeton University to study economics. These lectures formed the basis of his book on information design, The Visual Display of Quantitative Information, which has become a classic resource for the design of statistical graphics, charts, and tables and the presentation of other complex information. His book clearly shows not only how graphic representations make information more easily understood, but also how graphic analysis can make it possible to find deeper meanings from available data. Tufte’s work linked the use of computers for information analysis and display in the 1970s to principles of statistics and representation that are still relevant. Interestingly, Tufte looked backward to 1854, when epidemiologist John Snow was able to create a map of the raging cholera epidemic in London and prove that contaminated water was the causative agent.

Today, research is reinforcing the fact that visualization can help policymakers, scientists, researchers, and the public better understand and analyze complex information. Datawheel, a small Massachusetts-based consultancy with strong ties to the Massachusetts Institute of Technology, consists of “a small crew of programmers and designers unveiling the big secrets in big data. Our mission is to make the world’s information accessible and digestible for the benefit of all.” The website points out that Data USA is “to date, the largest and most comprehensive representation of U.S. data online. The website uses over 200,000 publicly available government datasets to create clean, easy to navigate visualizations on everything from job markets to higher education and healthcare. For anyone from business executives to students, Data USA can be used as a platform to enhance understanding and inform decision making within the U.S.”
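Data USA exposes its underlying government datasets through a public JSON API, which makes the platform useful beyond its own visualizations. The sketch below, in Python, shows one way to pull a simple measure (national population by year) and tabulate it; the endpoint and query-parameter names (`drilldowns`, `measures`) and the response field names (`Year`, `Population`) reflect the publicly documented datausa.io API but should be treated as illustrative, not as the article's own example:

```python
"""Minimal sketch: querying Data USA's public JSON API for a simple measure.

Endpoint, parameters, and field names are assumptions based on the public
datausa.io API, not taken from the article itself.
"""
import json
from urllib.request import urlopen

# Hypothetical but representative query: U.S. population by year.
API_URL = "https://datausa.io/api/data?drilldowns=Nation&measures=Population"


def summarize(payload: dict) -> list:
    """Extract (year, population) pairs from a Data USA-style API response.

    The API wraps its rows in a top-level "data" key; each row is a dict
    keyed by the requested drilldowns and measures.
    """
    return [(row["Year"], row["Population"]) for row in payload.get("data", [])]


if __name__ == "__main__":
    with urlopen(API_URL) as resp:
        payload = json.load(resp)
    for year, population in sorted(summarize(payload)):
        print(f"{year}: {population:,}")
```

Keeping the parsing logic in its own function (`summarize`) means the network call is incidental: the same code works against cached or sample responses, which is how a newsroom developer might prototype a visualization offline.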




Nancy K. Herther is American studies, anthropology, Asian American studies, and sociology librarian at the University of Minnesota Libraries, Twin Cities campus.


