This past week, you might have noticed the many news stories about killer cats. The research study about domestic cats’ impact on nature concluded that cats kill up to 3.7 billion birds and 20.7 billion mammals every year. Did you happen to pick up that the senior author on the paper was Peter Marra of the Migratory Bird Center, a research unit of the National Zoo, and that one of his coauthors was Scott Loss, also of the MBC? While we are always excited by and proud of the research output of the Smithsonian, this is an example of a scholarly article having an impact in the public sphere, that is, beyond just the scientific community. Does that matter? How does it matter? Is there a way for the organization sponsoring that research to measure the impact of research output like this? These are the kinds of questions we can finally begin to tackle with the use of altmetrics.
Science is all about sharing methods and results. Traditionally, this has been done through publishing in peer-reviewed journals that took months to get from article to your lap. From there, any article of interest was cited by other researchers, and together these citations build a prevailing scientific field of inquiry. This method of knowledge production can be expensive and slow, with limits on who can access the research and what they can really do with it. Key metrics were also limited: an individual researcher could calculate their h-index, which balances how many papers they have published against how often each has been cited (an h-index of 10 means ten papers cited at least ten times each), or someone could consult Thomson Reuters for a journal’s Impact Factor, which reflects the average number of citations received by the articles that journal published over the previous two years. There are other bibliometrics for sure, but these are the big players for now. Yet both have limitations, such as being restricted in scope to scientific publishing.
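The h-index is simple enough to sketch in a few lines of code. Here is a minimal illustration (the citation counts below are made-up numbers, not real data for any researcher): the h-index is the largest number h such that h of a researcher’s papers have at least h citations each.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    # Rank papers from most cited to least cited.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        # The paper at position `rank` must itself have >= rank citations.
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for six papers:
print(h_index([25, 8, 5, 3, 3, 1]))  # → 3 (three papers with at least 3 citations each)
```

Note how one blockbuster paper barely moves the needle: the 25-citation paper counts the same as any other toward the h of 3, which is exactly the metric’s intent.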
Altmetrics has emerged as a response to the shifting ground on which scholarly output stands. No longer is it sufficient to count citations or get published in a specific journal. Funders like the NSF are increasingly requiring principal investigators to list “products” rather than just “publications.” This demands that the research community pay attention to the broader impact of their work. In response, a proliferation of alternative methods of gauging the impact of a particular article has emerged (thus altmetrics: alternative metrics). Capitalizing on big data and open APIs, altmetrics allow for quick, flexible analysis and insight into emerging trends, hot topics, and public interest stories. Altmetrics may even give us insight into research impact in ways we haven’t thought of before.
Many journals have taken note of this, and now provide article-level metrics. There are also standalone sites that aggregate altmetrics for articles, datasets, blog posts, and more. With these tools, we can begin to see at least a baseline measure of the impact research makes on the broader community.
So back to fluffy the kitty and her murderous ways. Traditional bibliometrics won’t tell us much at this stage in the life of our killer-cat article, which was published in Nature Communications on January 29, 2013. (The citation: Loss, Scott R., Will, Tom, and Marra, Peter P. 2013. The impact of free-ranging domestic cats on wildlife of the United States. Nature Communications, 4.) You can find it in research.si.edu (since we try to capture these publications once a publication date is set, we often have them listed before official publication). You should also note that the DOI (digital object identifier) is listed, too: 10.1038/ncomms2380 (it links to the article). We are in luck that Nature Communications offers metrics for this article, listed underneath the bibliographic information (just above the Abstract). Many other journals have similarly begun to embed altmetrics in their articles. Nature Communications gives us a variety of metrics to explore. Most prominent is the traditional count of how many people have cited this article. Would you be shocked to see no one has? No, because the article is only a few weeks old. But you can see from the rest of the metrics that this article has legs, with over 260 tweets about it, and numerous mentions in news articles and scientific blogs. It even has a chart showing the page views, which I’ve recreated here:
Sites like impactstory.org can be used, too: enter the DOI of the article and get an extensive look at the ways it is having an impact.
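Because these services are built on open APIs, the same lookup can be scripted. As a rough sketch only: the snippet below builds a request against Altmetric.com’s free public v1 DOI endpoint, using the killer-cat article’s DOI from above. The endpoint path is the one publicly documented; the exact response fields, and any registration or rate-limit requirements, are assumptions you should check against the current API documentation.

```python
import json
from urllib.request import urlopen

# Altmetric.com's public v1 lookup-by-DOI endpoint (check current docs before relying on it).
ALTMETRIC_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the lookup URL for a given DOI."""
    return ALTMETRIC_BASE + doi

def fetch_altmetrics(doi):
    """Fetch and parse the JSON record for a DOI (requires network access)."""
    with urlopen(altmetric_url(doi)) as response:
        return json.load(response)

# The DOI of the killer-cat article, as given in the text above:
print(altmetric_url("10.1038/ncomms2380"))
# → https://api.altmetric.com/v1/doi/10.1038/ncomms2380
```

From the returned dictionary you could then pull out whatever counts the service exposes (tweets, news mentions, and so on) with `.get()`, guarding against fields that may be absent for a given article.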
Mind you, this article is only weeks old, but it has already made an impact on the public discussion of issues like animal control and the behavior of pet owners. Perhaps that is because cats are the nation’s most popular pet, or perhaps what is showing through is our collective anxiety about the impact we have on the natural world: a lingering guilt over destroying the planet, some subconscious nihilistic death drive, playing out in our careless disregard for what fluffy does on her nightly jaunts through the neighborhood. While the Scholarly Communications department of the Libraries may not be able to answer the question of why kitties kill (though a reference librarian might point you in the right direction), what we can do is give insight into the impact an article has, and pay attention to trending research. As keepers of the bibliographic record of the research output of the Smithsonian Institution, we are in a key position to provide this.
PLoS Collections has started the Altmetrics Collection, an ever-expanding body of research on altmetrics.
There is also an altmetrics group on Mendeley.
The Open Society Foundations have funded Beyond Impact, which aims to bring together funders, developers, and service providers to work on more effective research assessment.