We’re pleased to announce that, following an open call for altmetrics research project proposals generously supported by Thomson Reuters, researchers Zohreh Zahedi (Leiden University), Rodrigo Costas (Leiden University), and Martin Fenner (PLOS) have been selected to receive funding for their project. Their research will examine and compare the consistency of altmetrics data across different commercial and non-commercial providers.
The winning application was selected by the conference’s organising committee (Martin Fenner sits on the committee but was not involved in this decision). The committee felt that, of all the applications, this one offered the most in terms of potential practical application and usefulness to the wider community. Zohreh and her colleagues have contributed this description of what their work will aim to encompass:
How consistent are altmetrics data providers?
The recent emergence of “altmetrics” opens up novel ways of exploring and informing the reception of scholarly outputs by broader audiences, particularly audiences beyond the scholarly community (Priem, et al., 2010). Several altmetric data providers and aggregators have appeared in recent years. However, the quality and consistency of the altmetric data from these providers are still not clear, and careful systematic analysis is still needed. Recent research has observed that similar metrics differ across altmetrics providers due to differences in collection times, data sources, and collection methods (Chamberlain, 2013; Zahedi, Fenner & Costas, 2014). Hence, an extensive assessment of the quality, reliability, and consistency of altmetrics tools and their data is crucial (Wouters & Costas, 2012).
The metrics are only as good as the data behind them
The reliability and validity of any new indicator depend, of course, on the quality of the data collected and used for its calculation. Thus, an important challenge for these new altmetric providers relates to their transparency, reliability, and openness regarding their data. Our main motivation in this project is to investigate in detail the data extracted from different providers, particularly by studying their distinctive technical features and how their APIs work when extracting and collecting altmetric data for the same set of publications. This project will therefore focus on testing the accuracy and quality of the data made available by a set of different altmetric providers. This knowledge will help us better understand the metrics obtained through them, uncover their main differences and characteristics, and inform which providers are more useful and reliable for providing different types of metrics and data.
Consistency of altmetric data providers
In order to test the consistency of, and differences between, the data obtained from the different providers, in this project we will extend our analysis to a broad set of altmetric data providers, including: Altmetric.com (www.altmetric.com), Mendeley (www.mendeley.com/), Lagotto (https://github.com/articlemetrics/lagotto, open source software used by PLOS, CrossRef Labs and others), and Plum Analytics (www.plumanalytics.com/). A dataset of publications will be collected, and we will technically explore the altmetric data extracted for this same set of publications. Data will be extracted at exactly the same date and time, so that differences in collection time do not confound the comparison. Complementing our technical analysis, the views of the different altmetric providers will also be considered in order to discuss the potential differences and inconsistencies that might arise during the analysis. This will make it possible to report the reasons for any inconsistencies found, together with a discussion of their possible solutions.
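To give a concrete flavour of the kind of comparison involved, the sketch below shows one simple way to surface pairwise disagreements between providers for a single publication. The provider names, metric sources, and counts are entirely hypothetical placeholder data, not real API responses; in practice each provider’s API would be queried for the same DOI at the same time.

```python
from itertools import combinations

def compare_providers(counts_by_provider):
    """Given a snapshot {provider: {source: count}} for one publication,
    return the metric sources on which providers disagree, with the
    conflicting provider/count pairs for each source."""
    diffs = {}
    for a, b in combinations(sorted(counts_by_provider), 2):
        # Compare every source reported by either provider in the pair.
        for source in set(counts_by_provider[a]) | set(counts_by_provider[b]):
            ca = counts_by_provider[a].get(source)
            cb = counts_by_provider[b].get(source)
            if ca != cb:
                diffs.setdefault(source, []).append((a, ca, b, cb))
    return diffs

# Hypothetical same-time snapshot for one DOI (illustrative numbers only):
snapshot = {
    "provider_a": {"twitter": 12, "mendeley": 30},
    "provider_b": {"twitter": 12, "mendeley": 34},
    "provider_c": {"twitter": 15, "mendeley": 30},
}

print(compare_providers(snapshot))
```

Even this toy example shows the shape of the problem: two providers can agree on one source (Twitter counts for providers A and B) while disagreeing on another (Mendeley readers), and the analysis must attribute each disagreement to a cause such as a different data source or collection method.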
The results of this project will provide critical information on the different ways the altmetric data providers extract metrics from different data sources. It will also contribute to a better understanding of the analytical opportunities these new data can provide, and of the issues that should be taken into account when working with metrics from different providers. The project is also expected to provide a valuable benchmark for the quality assessment of altmetric sources in future studies. Most of the outcomes are expected to be presented in scientific publications and at international conferences. Moreover, this project depends greatly on the support and assistance of the various altmetric providers, so we ask for your help!
We gratefully acknowledge the Thomson Reuters proposal award for giving us the opportunity to explore this idea in more depth.