2016-04-01

by Lauren B. Collister, PhD  (Scholarly Communications Librarian, University Library System, University of Pittsburgh)

and Timothy S. Deliyannides, MSIS  (Director, Office of Scholarly Communication and Publishing and Head, Information Technology, University Library System, University of Pittsburgh)

As the scholarly communication system becomes increasingly diverse, new tools are arising that allow scholars to tell the story of their research and to evaluate how their work is being used after publication.  The set of tools known as altmetrics has had an impact on journal article evaluation in particular.

Altmetrics, a blend of the words “alternative” and “metrics,” show the use of an article beyond citation counts, the measure traditionally used to evaluate an article’s impact.  This usage can include appearances on Wikipedia, discussion in social media outlets, saves in bookmarking programs like Mendeley or Delicious, blog posts and Websites that reference the article, and many more.  While citation counts have traditionally been viewed as the purest measure of scholarly usage of an article, altmetrics present a more comprehensive view of how the article is used in less formal ways and in less academic outlets.

The usage tracked by altmetrics tends to accrue more quickly than citations, and some early research shows that certain altmetrics may actually predict future citation rates.  When a new journal issue is published online, altmetrics can track the number of times an article is downloaded immediately, within days of publication.  If a new article is widely downloaded and read, it seems plausible that it will eventually receive citations from those who have downloaded and read it.  Research by Ehsan Mohammadi and colleagues (2015) indicates that Mendeley usage by graduate students and faculty can be an early predictor of citation counts.  This does not hold for all altmetrics, however: social media altmetrics data correlate only weakly with citation counts, according to research by Costas, Zahedi, and Wouters (2015a).

Rather than thinking of altmetrics as a simple complement or alternative to traditional metrics, the wide range of available metrics can be viewed as lying along a continuum from scholarly impact on one end (traditional citations and bookmarks in reference management databases) to popular and societal impact on the other (tweets and Facebook mentions).  While more time and research are needed to understand the relationship between altmetrics and citations, altmetrics remain a powerful tool for assessing impact beyond citation counts and academia.  As more journals provide altmetrics data for their articles, these tools can provide a more complete picture of the impact of research in a variety of fields and help scholars tell the story of their research.
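The correlation studies cited above typically use rank correlations to compare an altmetric with later citation counts.  As a minimal sketch only, with made-up placeholder counts rather than data from any of the cited studies, the core of such a comparison looks like this in Python:

    # Rank correlation between an altmetric and citation counts.
    from scipy.stats import spearmanr

    # Hypothetical per-article counts; real studies use thousands of articles.
    mendeley_readers = [120, 45, 300, 8, 60, 15, 210, 95]
    citations = [30, 10, 85, 1, 12, 4, 55, 20]

    rho, p_value = spearmanr(mendeley_readers, citations)
    print("Spearman rho = %.2f, p = %.3f" % (rho, p_value))

A strong positive rho on real data would suggest that the altmetric tracks eventual citations; the weak correlations reported for social media metrics correspond to values of rho near zero.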

There are many providers of altmetrics data, each with different strengths and audiences.  Some providers show metrics directly on the publication page of a journal article, while others allow authors to create a profile and input their citations to begin altmetrics tracking.  In this short article, we will first share two brief examples of the different types of altmetrics providers, Altmetric and ImpactStory, followed by an examination of the University of Pittsburgh’s implementation of PlumX for journal article metrics as well as author profiles.

Altmetrics for Publishers and Researchers

In this section, we will share two examples of altmetrics providers, one that is aimed primarily at publishers and universities (Altmetric), and one that is targeted to individual researchers (ImpactStory).  Following this section, we will detail a case study of the use of a third provider, Plum Analytics, by the University Library System at the University of Pittsburgh.

One major player in the field is Altmetric (altmetric.com), whose data can be seen on journal articles from a number of major publishers.  The data from Altmetric are shown as a “doughnut” with a numerical score for an article; this information appears on articles from major publishers such as Nature Publishing Group and BioMed Central.  Institutions can also hold an institutional account with Altmetric, which allows tracking of altmetrics for researchers, research groups, and departments across an institution.  There is also a bookmarklet that lets individual researchers find altmetrics data about their papers.  See Figure 1.




Figure 1:  Altmetrics information from Altmetric for the article “Male and Female Brain Evolution is Subject to Contrasting Selection Pressures in Primates” published in BMC Biology, 2007.  dx.doi.org/10.1186/1741-7007-5-21
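For those who want the numbers behind the doughnut, Altmetric also offers a free public API.  The following is a minimal sketch in Python that looks up the article from Figure 1 by its DOI; the endpoint and field names reflect the public v1 API at the time of writing and may change:

    import requests

    # DOI of the article shown in Figure 1.
    doi = "10.1186/1741-7007-5-21"
    response = requests.get("https://api.altmetric.com/v1/doi/" + doi)
    response.raise_for_status()

    data = response.json()
    print(data.get("title"))
    print("Altmetric score:", data.get("score"))
    print("Tweeters:", data.get("cited_by_tweeters_count"))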

ImpactStory is another altmetrics provider, although this service is directed at authors rather than journals.  Users of ImpactStory can input their own citations and links to their articles, and ImpactStory will gather various metrics and generate a report for the user.  ImpactStory is often used to create an author profile that shows an overview of the impact of an individual scholar’s research.  Each article listed in a profile can be explored for altmetrics from a variety of sources, and both the article and the author can be compared with the rest of the ImpactStory user base as a percentile ranking.  See Figure 2.




Figure 2:  Altmetrics data and ranking for an article in ImpactStory. (https://impactstory.org/lbcollister/product/6ov0voc8w7lbuiv2tdgxbx3z/metrics)

These are just two of many altmetrics providers that can be used to obtain metrics at the journal, article, and researcher level.  For the remainder of this article, the focus will be on Plum Analytics and its PlumX product, which is being implemented at the University Library System, University of Pittsburgh.

Plum Analytics at the University of Pittsburgh

PlumX, the altmetrics product from Plum Analytics, is a robust tool that can be implemented by a research institution or publisher.  A dashboard view allows the user to view impact aggregated at the level of the entire institution or within individual schools, programs, or research labs.  Further, a series of easy-to-install widgets allows integration within any institutional repository or online publishing platform to display article level metrics on the abstract page for any document or article.  With an institutional repository, several international subject-based repositories, and a large Open Access journal publishing program, the University Library System at the University of Pittsburgh was very interested in showing altmetrics in all of its digital library platforms.  PlumX also allows for the creation of researcher profiles that can be viewed on an institutional Website or easily embedded as a widget on any Website.

PlumX makes use of standard numbers, including but not limited to DOIs, to facilitate the harvesting of metrics for digital objects.  Researcher profiles can also be enriched through systematic harvesting based on standard numbers such as ORCID iD, Scopus Author ID, and user identifiers in several popular content sharing sites.  Digital objects without a DOI or other standard number may also be tracked by entering individual URLs for each item in the user profile.  PlumX can track metrics for all items within an institutional repository and can also track less traditional scholarly outputs such as software, multimedia, blogs, videos, and presentations harvested from sharing sites like SlideShare, Vimeo, figshare, GitHub, and others.  Furthermore, if an object appears in two locations — such as an article published in a scholarly journal and an author’s version in an institutional repository — these separate manifestations are deduplicated and the metrics are combined to provide a full picture of use across all versions.
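To illustrate the idea behind this deduplication (a conceptual sketch only, not Plum Analytics’ actual implementation), metrics for manifestations that share a normalized identifier can simply be merged, as in this Python example with a hypothetical DOI:

    from collections import Counter, defaultdict

    def merge_manifestations(records):
        # records: iterable of (identifier, metrics-dict) pairs, where the
        # identifier is a normalized DOI shared by all versions of a work.
        combined = defaultdict(Counter)
        for identifier, metrics in records:
            combined[identifier].update(metrics)
        return combined

    records = [
        ("10.5195/example", {"downloads": 340, "tweets": 12}),  # publisher version
        ("10.5195/example", {"downloads": 95}),                 # repository copy
    ]
    print(merge_manifestations(records)["10.5195/example"])
    # Counter({'downloads': 435, 'tweets': 12})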

The PlumX article level widget is configurable to meet the needs of the system in which it is embedded, ranging from a comprehensive display of metrics to a very compact display based on the available screen real estate and the design needs.  In all cases, the widget links out to full tracking information for the article on the native PlumX site, where the user can drill down to view full detail on individual metrics.  Metrics are divided into different categories including Citations, Usage, Captures, Mentions, and Social Media.  The types of metrics shown in each of these categories are illustrated in Table 1, below.

Table 1:  PlumX categories and metrics.

Valuable to our users is the ability to see totals in each category as well as to drill down into the data for each category and view the individual uses, many of which link out to the events themselves.  For example, users can see not only the number of social media shares but also the individual tweets about the article.  This facilitates the sharing of numbers and profile information for CVs and author profiles, as well as real-time interaction with those who are talking about the article on social media, blogs, and news outlets.

The PlumX widget appears on the abstract page for all journal articles and repository documents as a “Plum Print,” a 5-part diagram that dynamically shows the ratios of the different categories.  The Plum Print can be expanded to show numbers in each category, with links to more information.  An example from the journal International Journal of Telerehabilitation is shown below in Figure 3.

Figure 3:  Plum Print and expanded PlumX details from the article “Tele-AAC Resolution.”  http://dx.doi.org/10.5195/ijt.2012.6106

PlumX provides both altmetrics and traditional metrics from a variety of sources.  Showing these side by side allows researchers to see all of the different impacts of an article in one place and to evaluate them as they see fit.  It does not place one metric above another as more valuable; instead, it presents categories that may be of interest in different situations.  A researcher interested in knowing who is talking about an article might find the Social Media or Mentions categories most valuable; a tenure and promotion committee member may be interested in evaluating the traditional metrics in the Citations category and the “buzz” around an author’s work using the other four categories.  PlumX provides numbers that can be evaluated according to the user’s needs; there are no composite “scores” like those in Altmetric, ImpactStory, and many other services, which can obscure the meaning of the underlying data.  Researchers at the University of Pittsburgh have responded positively to this versatile, data-driven approach, viewing it as a relief from the increasing pressure to apply scores, ratios, and rankings to a researcher’s output.  Generally, researchers appreciate the ability to see the data behind their metrics without being compared to an unknown number of others through a scoring system that may not be immediately transparent.

As a library publisher, the University Library System, University of Pittsburgh can take advantage of the many faces of PlumX, including implementation within the institutional repository and other subject-based repositories as well as use on the journals that are published by the library.  Because the forty journals published by the University Library System use the open source platform Open Journal Systems (OJS), we worked with Plum Analytics to design and implement a plugin for OJS that displays the widget on the abstract pages of articles published in our journals.  This plugin is available for any institution that uses both OJS and PlumX, and it offers different display options, including location on the page, amount of information displayed, and whether the widget will be shown if there are no metrics available.  In early 2015, we also contributed enhancements to the OJS software that allow PlumX (or indeed, any other altmetrics system) to harvest article-level usage metrics from OJS using NISO’s new SUSHI Lite protocol.  The editors of the journals we publish have appreciated the ability to see more robust metrics about their journal articles, and in some cases they have begun participating more in the social media discussions that the widget reveals.  As a publisher, we view this extra information about article usage as another way to evaluate our journals and to generate ideas about how to market and promote them.
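SUSHI Lite exposes COUNTER-style usage reports over simple HTTP requests that return JSON.  As a rough sketch only, with a hypothetical journal URL and with a report path and parameter names that will vary by OJS installation and draft protocol version, a harvester’s request might look like this:

    import requests

    # Hypothetical OJS journal URL; adjust for the actual installation.
    base = "https://example-journal.org/index.php/ijt"

    response = requests.get(
        base + "/sushiLite/v1_7/GetReport",
        params={
            "Report": "AR1",  # COUNTER Article Report 1
            "ItemIdentifier": "doi:10.5195/ijt.2012.6106",  # article from Figure 3
            "BeginDate": "2015-01-01",
            "EndDate": "2015-12-31",
        },
    )
    response.raise_for_status()
    print(response.json())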

There are further innovations in altmetrics on the horizon from Plum Analytics.  For example, a newly released product shows grant and funding information based on a scholar’s past work, which may open new opportunities for scholars to find funding and grants.  We look forward to seeing how altmetrics gathering and display will develop in the near future as more and more entities take advantage of this wealth of usage information.

Looking Forward

Altmetrics have demonstrated value in assessing the impact of research in a variety of fields and contexts.  As scholars face increased pressure from both their own institutions and from funders to show relevance for their research, altmetrics will become increasingly valued.  It is important for institutions to consider altmetrics as a valuable source of information about the research coming from their scholars.  Altmetrics show immediate impact — in some cases, years before citation counts can be gathered from the traditional publication cycle.  In our observation, this timeliness is increasingly important in today’s fast-paced research environment where rapid dissemination of new knowledge is key.  The value of these new metrics is immediately apparent to the author of a manuscript deposited in a preprint repository or to the editor of a fledgling Open Access journal who might otherwise need to wait years to see evidence of the impact of their work.  Altmetrics also have value in documenting a broader range of societal impact, demonstrating how research is used in outlets like journalism and Wikipedia, or even how new knowledge captures the popular imagination and curiosity.

Whether altmetrics can or should replace citation counts is not the question that needs to be asked.  Some research has shown that certain metrics can be an indicator of future citations, but the relationship is questionable because of the different contexts of each metric and has also been shown to vary widely by discipline (Costas, Zahedi, and Wouters, 2015b; Jobmann et al., 2014).  The question we should ask is how we can evaluate the different metrics available to us, how individual scholarly communities can best use the data, and how we can encourage consideration of the many different impacts of research beyond traditional citation counts.

Scholarship that impacts the world beyond academia should be valued.  If news outlets are picking up a research article or if it has been cited in a widely-used source like Wikipedia, then this shows that the scholarship is valuable in some way and tells a story about the impact of the research after it is done.  It is not the same as an article that is foundational to the body of research and receives many citations, but it still holds value in that it shows how academic research can connect with the world outside of academia.  If this kind of impact is valued, then it should be assessed with solid, quantifiable data available from one of the many altmetrics providers.

Academic libraries typically serve no more than a peripheral, supporting role in formulating tenure and promotion policies; however, library publishers have a special role to play in fostering new ways of evaluating research impact.  By making these new data available wherever we can, whether on the journals that we publish, through the repositories that we support, or on researcher profile systems at our institutions, we can present new possibilities and raise awareness and understanding of these new tools.  Libraries not engaged in the publishing enterprise themselves can urge publishers to incorporate article level metrics into their publications and consult with their faculty on how to interpret and present these metrics.  More than simply making the data available, we should seek to share the stories that are in the data about our researchers.

References

Costas, R., Zahedi, Z., and Wouters, P. (2015a).  Do “altmetrics” correlate with citations?  Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective.  Journal of the Association for Information Science and Technology, 66: 2003–2019.  doi: 10.1002/asi.23309

Costas, R., Zahedi, Z., and Wouters, P. (2015b).  The thematic orientation of publications mentioned on social media: large-scale disciplinary comparison of social media metrics with citations.  Aslib Journal of Information Management, 67(3), 260-288.  http://dx.doi.org/10.1108/AJIM-12-2014-0173

Jobmann, A., Hoffmann, C. P., Künne, S., Peters, I., Schmitz, J., and Wollnik-Korn, G. (2014).  Altmetrics for large, multidisciplinary research groups: Comparison of current tools.  Bibliometrie-Praxis und Forschung, 3.  http://www.bibliometrie-pf.de/article/view/205

Mohammadi, E., Thelwall, M., Haustein, S., and Larivière, V. (2015).  Who reads research articles?  An altmetrics analysis of Mendeley user categories.  Journal of the Association for Information Science and Technology.  doi: 10.1002/asi.23286
