2:AM Session 1: Standards in altmetrics

Blogging of this year’s conference has begun! This first post is contributed by Natalia Madjarevic.

The 2:AM conference kicked off with a session on Standards in altmetrics. The first speaker, Geoff Bilder (Director of Strategic Initiatives at CrossRef), began by discussing the emergence of altmetrics to help track the evolving scholarly record, and made the case for altmetrics data standards. He described receiving an early email from the team at eLife discussing altmetrics: “[How can we] agree to do things in similar ways, so that the data is more comparable?” And how could CrossRef help? Bilder called for altstandards for altmetrics, and for data surrounding the research process to be treated in the same way as research data: open, comparable, auditable and quotable. CrossRef’s DOI Event Tracker pilot will be available in 2016 [bit.do/DETinfo].

Next up, Zohreh Zahedi of CWTS, Leiden University, shared the latest findings from the Thomson Reuters-supported project analysing altmetrics data across a number of providers: Mendeley, Altmetric.com and Lagotto (used by PLOS). The project looked at the altmetrics for the same set of 30,000 DOIs from 2013 publications (15k pulled from Web of Science and 15k from CrossRef), with the data extraction conducted at the same date and time across each provider. The study investigated coverage of specific sources across providers, found that results varied considerably, and suggested possible reasons for the altmetrics data inconsistencies. Zahedi called for increased transparency in altmetrics data collection processes and, furthermore, data transparency from the altmetrics sources themselves (e.g. Twitter, Facebook).
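To make the study design concrete, here is a minimal sketch of that kind of coverage comparison. The provider names are real, but the `fetch_counts` placeholder and the `source_coverage` helper are illustrative assumptions, not any provider's actual API:

```python
# A minimal sketch of the cross-provider coverage comparison described
# in the talk. fetch_counts is a hypothetical placeholder: each
# provider's real API differs and was not detailed in the session.

from datetime import datetime, timezone

def fetch_counts(provider, doi):
    """Placeholder: return {source_name: event_count} for one DOI."""
    raise NotImplementedError("wire up the provider's real API here")

def source_coverage(providers, dois, source):
    """Share of DOIs with at least one event from `source`, per provider.

    All providers are queried in one pass, so the extraction timestamp
    is effectively the same across providers, as in the study design.
    """
    extracted_at = datetime.now(timezone.utc)
    shares = {}
    for provider in providers:
        covered = sum(
            1 for doi in dois
            if fetch_counts(provider, doi).get(source, 0) > 0
        )
        shares[provider] = covered / len(dois)
    return extracted_at, shares

# e.g. source_coverage(["Mendeley", "Altmetric.com", "Lagotto"],
#                      dois, "twitter")
```

Querying every provider in a single pass is what keeps the counts comparable: altmetrics accumulate over time, so extractions taken at different dates would not measure the same thing.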

Martin Fenner (DataCite) provided an update on the NISO Alternative Assessment Metrics (Altmetrics) Initiative, currently in Phase II, with several working groups tackling the five topics identified in Phase I. The working group topics can be found here. All groups are currently finalising draft documents for public comment, to be made available in February 2016 and finalised in spring 2016 – some as standards, others as best practices, recommendations, definitions and codes of conduct – hopefully ready for discussion at 3:AM!

Finally, Gregg Gordon (SSRN) discussed the findings of a recent PLOS Biology paper, The Question of Data Integrity in Article-Level Metrics. The study conducted an in-depth analysis of article-level metrics and gaming across SSRN and PLOS outputs. In the case of SSRN, gaming of PDF download counts was detected by monitoring user IP ranges, and the site saw a 70% decrease in potentially fraudulent downloads after a pop-up warning message was added. Gordon closed by highlighting the importance of clean and auditable altmetrics data to ensure emerging metrics are trusted and used by the academic community.
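Gordon didn't go into implementation detail, but one simple form of IP-range monitoring might look like the sketch below. The /24 grouping and both thresholds are illustrative assumptions, not SSRN's actual detection rules:

```python
# Illustrative sketch of IP-range monitoring for download gaming.
# Thresholds are made up; SSRN's actual rules were not described.

from collections import Counter
from ipaddress import ip_address

def subnet_24(ip):
    """Collapse an IPv4 address to its /24 range, e.g. 192.0.2.0/24."""
    packed = int(ip_address(ip))
    return f"{ip_address(packed & 0xFFFFFF00)}/24"

def flag_suspicious(download_ips, min_downloads=50, max_share=0.5):
    """Flag a paper whose downloads cluster in one /24 IP range.

    download_ips: list of source IPs for one paper's downloads.
    Returns the dominant range if it accounts for more than
    `max_share` of the traffic, else None.
    """
    if len(download_ips) < min_downloads:
        return None  # too few downloads to judge
    ranges = Counter(subnet_24(ip) for ip in download_ips)
    top_range, count = ranges.most_common(1)[0]
    if count / len(download_ips) > max_share:
        return top_range
    return None

# e.g. flag_suspicious(["192.0.2.7"] * 80 + ["198.51.100.3"] * 20)
# -> "192.0.2.0/24"
```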

This session offered a pretty comprehensive overview of where we’re at with altmetrics standards: emerging standards work, in-depth data analysis, and the importance of auditable data in increasing researcher confidence in altmetrics.
