This is a guest post contributed by Robin Champieux, Scholarly Communication Librarian at Oregon Health & Science University in Portland, Oregon. She leads efforts that contribute to the pace and impact of scholarly communication by partnering with OHSU research, teaching, and student communities on issues relating to publication, public access, data sharing, and scientific contribution. Robin is the co-founder of Advancing Research Communication & Scholarship, a multi-disciplinary conference focused on new modes and models of scholarly communication. She is a passionate advocate for open science and the success of early career researchers.
I have a confession to make. The Oregon Health & Science University (OHSU) Library is entering its third year of using altmetrics to answer impact questions, but some of the time I still feel like I'm winging it. I often have more questions than answers, and, if local trends continue, I may need to clone myself… but I love it! The uncertainty and interesting challenges motivate me to learn more, work closely and thoughtfully with researchers and my colleagues, and ultimately to do the experimentation and planning needed to build a successful service.
Like many of our peers, the OHSU Library began using altmetrics to gain a fuller understanding of the impact of our institution’s research. We wanted to provide faculty, students, labs, and administrators with data they could use to track and tell their impact stories. Initially, we focused on the creation of personal impact profiles, which users could update and pull from as needed. The service has evolved into something more consultative. I work with individuals and organizations to formulate impact questions, gather and analyze data, and translate this information into compelling stories.
I spend a lot of time thinking about this work, and transitioning trial and error efforts to replicable, trusted methodologies. Below I describe the themes that run through the questions, challenges, and progress I’ve encountered or made over the last two years. I think my local experience is echoed by many of the conversations and developments happening in the broader altmetrics, research, and library communities.
Context
My experience using altmetric and bibliometric data for impact assessment and storytelling has taught me that the circumstances of attention matter, really matter. Understanding this is easier for some data than for others, but it all requires work. Take a set of tweets, for example. To effectively and convincingly use Twitter attention to understand and communicate impact, I need to know (and show) who was talking about the research and the nature of their networks. If I'm working with a lab group interested in telling a story about public engagement, tweets among 100 specialized neuroscientists aren't the right evidence. Similarly, the originality and depth of attention are important. The scientists and administrators I'm advising may appreciate data showing how a journal's press release was picked up and republished, but they are not going to tell a story with it. Rather, they want to know about and promote original news coverage and social media discussions of their work.
I've also learned that some kinds of altmetric data are better for storytelling than others. High download counts help me identify engagement, especially for recently disseminated outcomes, but I would not use them to communicate impact in a promotion and tenure (P&T) dossier or NIH biosketch. What I would incorporate are the impacts the downloads helped me uncover, such as inclusion in curricula or the use of a measure in clinical trials.
Authority
One way or another, all of the individuals and organizations I've worked with have asked two questions upon reviewing their altmetric data: what does that mean, and why is it important? These are good questions, and ones we should be addressing for all of the data we use to uncover and tell stories of impact. It's not enough for the scientists, students, and institutes I work with to reference general statements about the growing importance of the web to scientific communication. Honestly, most of them don't care. Rather, they want to know, have confidence in, and communicate the relationships between online attention and specific kinds of impact. We should demand the same for bibliometric data, by the way.
I believe this necessitates both drawing on and conducting new research on how scholarly and scientific information is sought, endorsed, and applied by different communities. For example, my colleague Tracy Dana and I will be trying to suss out the potential relationships between the medical literature, research on how physicians seek and use information, and bibliometric and altmetric data, in order to better understand patterns and evidence of clinical impact. Along with existing resources like the Becker Medical Library Model for Assessment of Research Impact and the NISO Alternative Assessment Metrics (Altmetrics) Initiative, I believe this kind of work will inform the creation of more trustworthy and compelling methodologies for answering impact questions.
Sustainability
When I present on the impact assessment work I'm doing, the question of sustainability is inevitably raised. It's true, I can lose days analyzing tweets, slogging through Google Books or Patents, and playing in The Lens. That said, I'm optimistic about the viability of impact assessment services in libraries of all kinds and sizes. For one, this work is being incorporated into local and global discussions about the roles of libraries and librarians in scholarly production and communication, with several institutions creating full-time impact assessment positions. For example, look at the work Karen Gutzman is doing at Northwestern University. Additionally, I'm convinced the most successful service models will be team-based efforts that leverage the disciplinary knowledge of liaison librarians.
Finally, I'm confident that we'll begin to see more of the context and authority issues described above addressed through code, and reflected in the metrics and benchmarking from data providers. Already, tools like PlumX and the Altmetric Bookmarklet are essential to the work I do. As the intelligence of the data improves, I can focus more of my effort and hours on the human-centered work of analyzing and translating the numbers into meaningful and actionable stories of impact.