This document discusses innovative uses of research impact indicators and metrics. It provides examples of how research organizations such as Michigan Publishing, EPA Research Triangle Park, and the National Center for Atmospheric Research have used metrics to demonstrate the broader impact and value of research to various stakeholders. It also outlines some of the shared challenges these institutions face in gathering and contextualizing impact data, as well as opportunities for librarians to play a leadership role in these efforts through skills in project management, data analysis, and relationship building. Overall, the document argues that understanding and communicating research impact can help validate funding and build partnerships across organizations.
The New Metrics: conference presentation
2. Innovative Uses of Research Impact Indicators
Elaine M. Lasda, MLS
Coordinator of Scholarly Communication
Subject Librarian for Research Impact, Social Welfare, Gerontology and Dewey Reference
University at Albany
SLA On Demand Session
October 2020
3. Introduction
• Hi, I’m Elaine!
• I wear a lot of hats at my job
• Research impact is my jam
• I live on the Hudson River, where I can see the sun rise
• I have 2 cats, Iggy and Cyggie
10. Specialized Contexts
• Impact of research reaches beyond the academy
• Differing missions require differing metrics
• Purposes/benefits of metrics expand beyond employee promotion/tenure/institutional rankings, etc.
• Demonstrating value in broader arenas, to broader constituencies
11. Metrics for Engaging Stakeholders
• Communicating success
• Explaining relevance and utility
• Justifying funding
• Quantifying activity
• Demonstrating public engagement
• Demonstrating value and significance
12. Best Practices
• No rule of thumb for “good” scores
• Make sure comparisons are “apples to apples”
• The right metric for a given circumstance
• Use a dashboard of several metrics
• Contextualize the metrics you use
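The “apples to apples” point above can be made concrete with a minimal sketch: rather than comparing raw citation counts across disciplines, normalize each count against a baseline for the paper’s own field. The function name and all figures below are hypothetical illustration data, not metrics from any of the cases discussed.

```python
# Sketch of an "apples to apples" comparison: divide a paper's
# citation count by the average count for its field, so that
# scores from different disciplines become comparable.
# All baselines and counts below are hypothetical.

def field_normalized_score(citations, field_baseline):
    """Citations divided by the mean citation count for the
    paper's field (a crude, FWCI-style ratio)."""
    return citations / field_baseline

# Hypothetical baselines: average citations per paper, by field.
baselines = {"social_welfare": 4.0, "atmospheric_science": 16.0}

paper_a = field_normalized_score(8, baselines["social_welfare"])        # 2.0
paper_b = field_normalized_score(16, baselines["atmospheric_science"])  # 1.0

# Paper A has fewer raw citations but performs better relative to
# its field: the normalized scores, not the raw counts, compare fairly.
print(paper_a, paper_b)  # 2.0 1.0
```

A dashboard would place several such normalized indicators side by side rather than relying on any single number.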
13. Other Metrics May Get the Job Done
• Web statistics trackers (e.g., Google Analytics)
• Usage data
• Requests
• Availability of the publication (e.g., WorldCat)
• Four Quadrants Model
15. Institute for Transportation Studies
The Challenge: Tracking research through the transportation project lifecycle and beyond (justifying funding from the Legislature)
Stakeholders: California State Legislature, high-level UC administrators, transportation community (public and private)
Outputs/Content Assessed: Technical reports and other grey literature, PRJAs, project reports, etc.
Impact Deliverables: Created a method for tracking impact using Google Scholar and linking research output to project codes
Librarian Roles: Gathering and synthesizing data, communicating with the field and administrators, facilitating cooperation, technical expertise, subject matter knowledge
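The ITS deliverable above links research outputs to project codes so impact can be reported per funded project. A minimal sketch of that idea, assuming DOIs as output identifiers and citation counts already gathered (e.g., from Google Scholar); all DOIs, project codes, and counts are hypothetical:

```python
# Sketch of linking research outputs (keyed by DOI) to project
# codes, then rolling citation counts up to the project level.
# All identifiers and counts here are hypothetical examples.
from collections import defaultdict

output_to_project = {
    "10.0000/report-001": "CA-17-2890",
    "10.0000/article-002": "CA-17-2890",
    "10.0000/report-003": "CA-18-3012",
}

citation_counts = {
    "10.0000/report-001": 5,
    "10.0000/article-002": 12,
    "10.0000/report-003": 3,
}

# Aggregate citations per project so impact can be reported
# against the original funding line.
project_impact = defaultdict(int)
for doi, project in output_to_project.items():
    project_impact[project] += citation_counts.get(doi, 0)

print(dict(project_impact))  # {'CA-17-2890': 17, 'CA-18-3012': 3}
```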
16. Michigan Publishing
The Challenge: Demonstrate impact of published content; have content indexed in discoverable locations with appropriate metrics
Stakeholders: Editors, authors, UMichigan Libraries, Michigan Publishing administration, university-level administrators
Outputs/Content Assessed: All publishing output from the traditional academic press, the open access publishing platform, and the institutional repository
Impact Deliverables: Wide variety of impact metrics and testing of various tools and approaches for greatest insights
Librarian Roles: Compliance: we can work to ensure that our publications are consistently recognized by and included in the systems and datasets upon which existing metrics are calculated. Defiance: we can articulate new (alternative) metrics that are meaningful for us and our stakeholders
18. EPA-RTP
The Challenge: Demonstrate meaningful value of scholarly output and researcher accountability; meet administrative and researcher requests
Stakeholders: Researchers, funders, award committees, agency administration
Outputs/Content Assessed: Mainly PRJAs and other traditional scholarly output, using WoS/InCites, ImpactStory, PlumX, GS/PoP, Altmetric, news, etc.
Impact Deliverables: RIR/AIR: impact reports with high-quality visual appeal, context, and data synthesis
Librarian Roles: Graphic design and visualization; gathering, synthesizing, and contextualizing data; brought in on other data projects; educating stakeholders
20. NCAR/UCAR
The Challenge: Demonstrate impact in three arenas: annual report of activity, the EarthCube “ecosystem”, and scientist use of the supercomputer at NCAR
Stakeholders: Funders (government/university members), researchers, UCAR/NCAR Directorate, Library
Outputs/Content Assessed: Largely PRJAs, but other scholarly output will be included
Impact Deliverables: Annual report; use/impact beyond the journal-author-article levels; impact of NCAR equipment/services. Used WoS/InCites, Altmetric
Librarian Roles: Software engineers! Gathering data, technical expertise (developing an API), testing workflow, and delegating when appropriate
27. How the Cases Differ
● Range of mission & purpose
● Funding sources
● Parent organization activities
● Relationships to stakeholders
● Evaluated subject matter
● Impact data output formats
● Technical resources & staff skill sets
● Maturity & level of services provided
28. Shared Challenges
● Labor intensive
● Lack of standardized identifiers for all output types
● “Out of the box” tools insufficient
● Metrics do not stand alone/speak for themselves
● Need for stakeholder education
● Measuring impact outside disciplinary boundaries/publications
● User education: “Metric Literacy”
31. Benefits to the Greater Organization
● Confidence in the numbers
● Comparative advantage
○ wise use of human resources
● Improved communication
● Breaking down silos
● Owning the story
Mission is not solely to get promotion and tenure
Simple citation counts and bibliometrics don’t cut it
Public impact beyond the ivory tower
Practical benefits of research
Societal Stakeholders: Specific communities involved in (or the subjects of study for) social science research have a potential interest in measuring the impact of that work. The general public more broadly may have an interest in measuring the impact of social science, whether as a wise use of taxes or an effective way to address pressing problems, and policy makers can make the case for less or more funding to the social sciences based on that interest, as we see with What Works Centres in the UK.
Commercial Stakeholders: Companies that currently provide information to measure scholarly and traditional outputs—such as Elsevier, Clarivate, and others—could have an additional interest if measures of broader impact became important to their traditional markets. Entrepreneurs and entrepreneurial academics developing new metrics, tools, data sources, and other research infrastructure would have an interest in the progress of the initiative and opportunities they could address. People working on user interfaces, system design, or service design might be impacted if there are clearer ways of being able to understand society and measure wider impacts (e.g., the dynamics of followers and likes on Twitter).
Four Quadrants Model: Another model would ask about impact within specific and broadly orthogonal domains and, by so doing, support researchers whose work has different kinds of impact. The four following domains could help paint a fuller picture.
Academic. Has the work advanced our knowledge of the world? This domain is currently the best understood, with many indicators and metrics of impact already present.
Practitioner. Has the work led to an improvement in how fellow researchers are able to conduct their work? This might be indicated by creating new research methods, building tools that get adopted, or creating datasets that get reused.
Societal. Has the work led to changes in society? This might be indicated by being able to tie academic advice directly to changes in policy, regulation, or legislation. It could be related to economic output, productivity, or changes to GDP.
Public. Has the work changed public understanding? In contrast to societal change, this would look at changes in the public debate, public engagement and outreach activities, and possibly changes to the spread and nature of information spread on networks (from news networks to social and private networks).
We discussed that each of these four areas will have specific types of signals that evidence impact in those areas. We expect that some of these signals will be more robust and developed than others. We noted that the approach of looking at impact by different domains is not new, and there are other examples such as the Social Impact Open Repository (SIOR).
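The four domains above can be sketched as a simple grouping of indicators, which is how a dashboard might organize its signals by quadrant. The indicator names here are illustrative examples only, not a fixed taxonomy from the model.

```python
# Sketch of organizing impact indicators by the Four Quadrants
# Model's domains. Indicator names are hypothetical examples.
quadrants = {
    "Academic":     ["citations", "field-normalized citation score"],
    "Practitioner": ["method adoptions", "tool downloads", "dataset reuse"],
    "Societal":     ["policy citations", "regulation changes"],
    "Public":       ["news mentions", "social media engagement"],
}

def classify(indicator):
    """Return the quadrant an indicator belongs to, or None if unknown."""
    for quadrant, indicators in quadrants.items():
        if indicator in indicators:
            return quadrant
    return None

print(classify("policy citations"))  # Societal
```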
Indexing journals in key databases
Getting JIFs and other metrics for editors
Measuring impact as an accurate mark of success of a given publication
What is success in each of these venues?
Experimenting with all the tools!