Another trendy term has arrived: altmetrics, a contraction of the phrase “alternative metrics.” The term covers a group of techniques and technologies meant to wrestle with the issues raised as scholarship moves to digital venues. Those issues include filtering good/better/best from bad/worse/worst, easing the “firehose as drinking fountain” challenge of keeping up with born-digital scholarship, and effectively replacing or reinforcing traditional measures of scholarly success (e.g., prestigious journals, positive peer review, citation counts).

A small new startup, Plum Analytics, has moved into the altmetrics field. It offers universities and other research institutions a way to track how their researchers have fared in the open access (OA) milieu. Where, and how often, is their work being referred to? What about the same information for co-authors, even those not working at the client institution? Which venues produce the best results for spreading the word? The motto and mission of the new company is “making research more assessable and accessible.”

The two co-founders, Michael Buschman and Andrea Michalek, formerly directed product management and technology for Serials Solutions’ Summon, a discovery service from ProQuest. The first client for Plum Analytics’ services is the University of Pittsburgh Library System (ULS), the 22nd largest academic library system in North America. The University of Pittsburgh will supply a list of its researchers, with profiles that should include lists of their writings and publications. In turn, Plum will enhance the profiles to build a directory that correlates the list with “usage and interaction metrics” from OA sources, social networks, data repositories, blogs, and other sources.
According to Rush Miller, university librarian and director at the University of Pittsburgh, the Plum service will “work in tandem with traditional measures to assess the impact of Pitt research in non-traditional venues. These days scholars are no longer waiting to publish their research in formal publications. They’re using Twitter, social networks, blogs, etc. to publish research and thoughts as they occur. Plum will match Pitt’s researchers to their own database.” Miller also indicated that the University of Pittsburgh is committed to OA, publishing about 20 OA journals itself. It also uses a portion of the resulting income to pay author fees at other high-quality OA journals (e.g., in the sadly underfunded humanities).
Buschman endorsed combining traditional measures with altmetrics. He pointed out that, with the rise of OA and other forms of digital scholarly communication, traditional citation measures had become “a lagging indicator.” The delay in getting content into publication, followed by the further delay before citing publications appear, makes the traditional approach better suited to measuring past years than the latest research, according to Buschman. And in more and more fields, born-digital content, such as datasets, is simply hard to track accurately using traditional techniques.
The Plum Research Directory models the affiliations and research outputs of researchers in a flexible, extensible manner. Plum also maps who follows or engages with researchers and their work, including co-authors. Plum then crawls the web, social networks, and university-hosted data repositories to collect and calculate metrics about the usage of each artifact. Plum’s Researcher Graph uses RDF, the same data model that underlies the semantic web.
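Plum has not published the details of its Researcher Graph, so the vocabulary and predicate names in the sketch below (hasAuthor, tweetCount, the example.org URIs) are purely hypothetical. The example simply illustrates, using the open source rdflib library for Python, how an RDF graph can link a researcher, a research artifact, and a usage metric in the kind of flexible triple structure described above.

```python
# Hypothetical sketch only: Plum's actual schema and predicates are not public.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

EX = Namespace("http://example.org/plum/")   # made-up vocabulary for illustration
g = Graph()
g.bind("ex", EX)
g.bind("foaf", FOAF)

researcher = URIRef("http://example.org/researcher/jane-doe")
artifact = URIRef("http://example.org/artifact/dataset-42")

# A researcher, an artifact she produced, and a usage metric gathered by crawling
g.add((researcher, RDF.type, FOAF.Person))
g.add((researcher, FOAF.name, Literal("Jane Doe")))
g.add((artifact, EX.hasAuthor, researcher))
g.add((artifact, EX.tweetCount, Literal(17)))

print(g.serialize(format="turtle"))
```

Because RDF stores everything as subject-predicate-object triples, adding a new source, metric, or relationship is just a matter of adding more triples, which fits the flexible, extensible modeling the directory is said to use.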
The social networks currently tracked by Plum are Twitter, Facebook, and Google+. Buschman said, “If scholarly social networks appear, we are open to adding anything.” He indicated that LinkedIn would also be covered, though its profiles do not accommodate researcher information as well as some other services do. Data repositories tapped at present include Dryad and Figshare.
Buschman admitted that, at present, no defined body of blogs is tapped: “If a blog is mentioned, our crawling will reach it. Once we have the target metrics, the database will grow. Right now we are doing our own crawling, but changing to a commercial one is open for discussion.”
Plum also works with its clients to expand the sources tapped (e.g., data repositories) and the kinds of data collected (e.g., data that could help improve the promotion of researcher content). Through such approaches, Buschman explained, Plum hopes to see its database expand almost organically.
Once a Plum directory is created, Plum clients can decide whether to make it available to the public. If they do, researchers listed in the directory, including those outside the client institution, can find their profiles and edit them. Buschman explained that once Plum confirms an individual as the one listed in the directory, it will let that person, whether within or outside the original client institution, revise and amend his or her own profile.
Buschman also expects Plum’s products to help researchers learn how best to promote their research. For example, they can track how effectively different methods of communication or outlets reach particular collegial communities. As the database grows, Buschman promises, it will help people measure how their work compares with that of other universities, departments, labs, and colleagues. He indicated that Plum would also produce special reports for clients (e.g., to gauge institutional strengths, in a sense providing competitive intelligence for academic research markets).
Headquartered in Philadelphia and Seattle, Plum currently has one full-time employee and a network of contracted specialists, in addition to its co-founders. It is too early to predict how successful it will be at generating revenue; at present, it hopes to rely on institutional subscriptions from universities and other research organizations. Miller of the University of Pittsburgh indicated that the price to the ULS was fairly low. (“If it hadn’t been, we couldn’t have bought it.”) Some systems, such as user interfaces, are still under development.
For more general information on altmetrics, check out Altmetrics: a Manifesto at Altmetrics.org. It includes some discussion of current players in the field, such as Zotero and Mendeley.