Bernie Folan is our blogger for this session. Bernie offers market research and consultancy to organisations within the scholarly communication sector. She has been involved in scholarly communications for over 20 years, previously in marketing at SAGE, London and serving on the UKSG main committee and as UKSG Marketing Officer. Earlier this year she launched Scholarly Social with Ginny Hendricks.
Bernie speaks at scholarly conferences and seminars and has published articles and reports. She has a passion for supporting academic research and teaching, and for understanding their changing nature and user needs.
Impact assessment in the funding sector: the role of altmetrics
Funders are continually seeking to better understand the impact and reach of the research they fund. In this session, four key players from a variety of funding bodies in the UK and Ireland considered the role altmetrics can play. The session was chaired by Jonathan Adams, Chief Scientist at Digital Science.
James Wilsdon (@jameswilsdon), Chair of the HEFCE Steering Group on Metrics and Professor of Science and Democracy in the Science Policy Research Unit (SPRU) at the University of Sussex.
Wilsdon was recently appointed by HEFCE to chair an independent review of the role of metrics in research assessment, which will report in 2015. Wilsdon spoke about how this review will be carried out, sharing an overview of the terms of reference, which importantly include understanding what cannot be measured as well as what can. He stressed that HEFCE are aware that, with the review and its outcomes, they need to walk a careful tightrope between the exciting innovations afforded by altmetrics to supplement the impact element of REF case studies and the extra burden they place on the research community.
A call for evidence was issued in May 2014, and 152 responses were received to the request for evidence “relating to the use of metrics in research assessment and management”. These will all be loaded to the review site, and Wilsdon explained that the responses run the gamut of attitudes on metrics, from anti to pro. A few examples were shared to illustrate the range, including contributions from @david_colquhoun and @PLOS. Wilsdon also cited the following useful inputs: the International Council for Science and the Leiden Manifesto.
Findings are about to be written up, and Wilsdon shared a timetable leading to the report's publication, expected in June 2015 (with preliminary findings released in March 2015), beginning with an In Metrics We Trust? workshop at the University of Sussex on October 7th.
Crucially, Wilsdon reassured us that all voices are being heard and there is no pre-cooked agenda.
Adam Dinsmore (@apDinsmore), Evaluation Officer, Wellcome Trust
Adam Dinsmore is an Evaluation Officer working within the Wellcome Trust’s Strategic Planning & Policy Unit.
Dinsmore began by explaining that the Wellcome Trust provides in the region of £600m in funding (£538m in 2012/13) and monitors the research it funds using a wide range of indicators – of which citations are one small part. Measurement is at the article level – looking at the content, not the container – and altmetrics are used to explore the stories around the funded research. Dinsmore shared a couple of examples of article “buzz”, cautioning on the importance of digging into the detail. A PLOS paper on alcohol policy created a lot of Twitter activity ahead of the academic citations timeline, and an impact on social policy can reasonably be anticipated, based on the identity of the organisations and individuals tweeting. In another example, Dinsmore suspected the article title may well have been the cause of the social media attention. The content of the conversations, not their volume, is the important thing, he cautioned.
The unit are currently monitoring citations of WT-funded research within academic domains to understand more about the educational impact of the research and its use in academic syllabi.
Dinsmore finished with a call for more consistency, transparency and availability within the altmetrics ecosystem, explaining that even citation counts vary between providers and cautioning that the lack of consistency and standards is an impediment to trust in the metrics.
Ruth Freeman (@abigailruthf), Director of Strategy & Communications, Science Foundation Ireland
Science Foundation Ireland (SFI) is the largest government funding body in Ireland and has just started thinking about how to use altmetrics. Defining impact as “the demonstrable contribution that excellent research makes to the economy and society”, Freeman explained that the past return-on-investment rhetoric, introduced during austerity, was felt as adversarial by researchers. When considering what impact means in a small country like Ireland (which is “not just a scaled down version of a larger country”), SFI came up with five streams – inputs, activity, output, outcomes and impact – and has broken each down into measurables.
SFI has assured researchers that they will be backed beyond purely financial impact stories, a reassurance needed following the climate of the financial crisis. Freeman explained that this is helped by the size of the country (just seven universities) and the ability to reach all researchers. Impact is defined within six categories: economic and commercial, public policy & services, environmental, societal, health, and professional services. SFI has developed a suite of impact statements to assist researchers in writing lay summaries, and these are subject to ongoing review, with annual reports of supporting metrics. Freeman speculated that altmetrics may find their way into these annual reports.
The promise of altmetrics to demonstrate reach, public engagement with science, and attitudes to science is real, but Freeman cautioned that altmetrics must be part of a suite of metrics – not used alone – and that opinion (e.g. in blog posts) should not be confused with fact.
Liz Philpots, Head of Research, Association of Medical Research Charities
AMRC has 130-135 member charities and Philpots’ role is to support member charities in their grant-giving activities, and oversee AMRC’s data and knowledge management and strategy.
Philpots started with the fact that more than a third of medical research in the UK is funded by charitable giving. AMRC are exploring how altmetrics can be used by charities and are monitoring media activity around their funded research to help understand how research is used by patients and medical staff.
Facebook activity around an article showed charities using it to engage and educate patient followers, explaining the science in lay terms, for instance.
Philpots asked what “impact” means and what behaviour we want to see. There are real risks to take into account (e.g. game-playing, or a lack of education on how to use social media). AMRC want to see metrics rewarding the research, not the social media or PR skills of the researcher or department.
Jonathan Adams, as moderator, kicked off the panel discussion by asking whether altmetrics have the potential to show us the exciting front of research – the ideas stage – before the inputs. Freeman wondered whether scientists are too protective of ideas, but Philpots referred us back to a previous session in which @reneehlozek discussed the #BICEP Twitter storm. The panel discussed the opportunities altmetrics could provide for ‘discipline hopping’, moving research beyond academic silos. Wilsdon suggested that the real-time, data-rich research environment may allow government to horizon-scan to inform strategic funding decisions.
A question from the audience: how do we attribute impact and establish links between impact and the funded research, given the time gap involved? Wilsdon pointed out that sometimes with government work the impact happens long before a paper is produced, and the REF panel know that. Freeman said that SFI go broad and claim any association, having stopped being too scientifically cautious about it. SFI also follow researchers for as long as possible and encourage them to stay in touch.
Finally, a question on how funders can address the research irreproducibility issue. Freeman (SFI) made the point that malevolent forces are not always, or even often, at work; other factors such as time pressures can create problems. SFI are auditing institutions where there have been research integrity issues to understand how they can help to prevent future issues – helped, she conceded, by the size and scale of the country.