Research Metrics: Data-informed
Strategic Planning for the Research
Enterprise
NISO Virtual Conference ♦ February 20, 2019
Is This Still Working? Incentives to Publish, Metrics, and New Reward Systems
Holly J. Falk-Krzesinski, PhD
Vice President, Research Intelligence ♦ Elsevier
Presentation Roadmap
• Research metrics for institutions
• Data for metrics in institutional research information systems
Strategic Context for Research Metrics
• Decreasing government grant funding for research
• Increasing competition for government research funding
• Rise of interdisciplinary and international grand challenge themes
• Increased team science and cross-sector collaboration
• Competition to attract the best research leaders globally
• Growing need to demonstrate both economic and social impact of research
Research Metrics at Different Levels
https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics
Journal Level
• CiteScore
• Journal Impact Factor
• Scimago Journal Rank (SJR)
• Source Normalized Impact per Paper (SNIP)
Article Level
• Citation count
• Citations per paper
• Field-Weighted Citation Impact (FWCI)
• Outputs in top quartile
• Citations in policy and medical guidelines
• Usage
• Captures
• Mentions
• Social media
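Of the article-level metrics above, FWCI is the least self-explanatory: it divides an output's actual citations by the average citations of outputs of the same field, document type, and publication year, so 1.00 always means "world average". A minimal sketch of the ratio, assuming the expected value is supplied from outside (in Scopus/SciVal it is computed over the global publication set, which is not reproduced here):

```python
def fwci(citations: int, expected_citations: float) -> float:
    """Field-Weighted Citation Impact: actual citations divided by the
    average citations of outputs with the same field, document type, and
    publication year. A value of 1.0 means exactly the world average."""
    if expected_citations <= 0:
        raise ValueError("expected citation rate must be positive")
    return citations / expected_citations

# A paper with 12 citations in a field/year/type averaging 8 citations
# is cited 50% above the world average:
print(round(fwci(12, 8.0), 2))  # 1.5
```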
Researcher Level
• Document count
• Total citations
• h-Index
• i10-Index
• g-Index
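The researcher-level indices above are all computed from the same input, a list of per-paper citation counts. A self-contained sketch of the three standard definitions (h, i10, g):

```python
def h_index(citations):
    """h = the largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

def g_index(citations):
    """g = the largest g such that the top g papers together have >= g^2 citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

papers = [25, 18, 12, 7, 6, 3, 1, 0]
print(h_index(papers), i10_index(papers), g_index(papers))  # 5 3 8
```

Note how g exceeds h for the same list: g rewards a few highly cited papers, which is exactly why using more than one metric gives a fuller picture.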
Research Metrics Throughout the Research Process
Categories of Metrics for Analysis
https://www.elsevier.com/__data/assets/pdf_file/0020/53327/ELSV-13013-Elsevier-Research-Metrics-Book-r5-Web.pdf
Research Metrics Use Cases
• Analyze research strengths
• Determine where research is a good potential investment
• Demonstrate impact (return on investment) of research funding
• Showcase researchers or identify rising stars
• Tell a better narrative about everything that is happening with research
A Building Need to Demonstrate Impact
https://report.nih.gov/nihdatabook/report/20
“Given how tight budgets are around the world, governments are rightfully demanding effectiveness in the programs they pay for. To address these demands, we need better measurement tools to determine which approaches work and which do not.”
Bill Gates, Gates Foundation Annual Letter 2013
Research Impact Frameworks
https://becker.wustl.edu/impact-assessment/
Diverse Set of Metrics for Demonstrating Impact
Types of impact: academic, educational, societal, commercial, innovation, informational, and promotion/attention/buzz.
Example metrics across these types of impact:
• Number of library holdings (WorldCat OCLC)
• Views on SlideShare
• Plays on YouTube
• Amazon book reviews
• Clinical citations or health policy/guideline citations
• Government policy citations
• News mentions
• Patent citations
• Academic–industry partnerships
• Licenses
• Business consultancy activities
• Number of patents filed and granted
• Wikipedia citations
• Blog mentions
• StackExchange links
• Downloads from GitHub, RePEc, IRs
• Citations (field-normalized, percentiles, counts)
• Collaborators on GitHub
• Full-text, PDF, and HTML views on ScienceDirect, Figshare, etc.
• Social media metrics (shares, likes, +1s, tweets)
Qualitative input: expert feedback on quality and impact of research
Wide Range of Research Output Types
• abstracts
• articles
• audio files
• bibliographies
• blogs
• blog posts
• books
• book chapters
• brochures/pamphlets
• cases
• catalogues
• clinical trials
• code/software
• collections
• commentaries
• conference papers
• corrections
• data sets
• designs/architectural plans
• editorials
• exhibitions/events
• expert opinions
• file sets
• figures
• government documents
• grants
• guidelines
• images
• interviews
• issues
• journals
• learning objects
• lectures/presentations
• letters
• live performances
• manuscripts
• maps
• media files
• musical scores
• newsletters
• news
• online courses
• papers
• patents
• policy
• posters
• preprints
• press releases
• projects
• recorded works
• reference entries/works
• reports
• research proposals
• reviews
• retractions
• speeches
• standards
• syllabi
• technical documentation
• textual works
• theses/dissertations
• videos
• visual arts
• volumes
• web pages
• web resources
• other
https://plumanalytics.com/learn/about-artifacts/
https://rdmi.uchicago.edu/papers/08212017144742_deWaard082117.pdf
Research Data Metrics
Goal | Metric | How to measure

Research Data is Shared:
1. Stored, i.e. safely available in a long-term repository | Number of datasets stored in long-term storage | Mendeley Data, Pure; Plum indexes Figshare, Dryad, and Mendeley Data, and is working on Dataverse
2. Published, i.e. long-term preserved, accessible via the web, with a GUID, citable, with proper metadata | Number of datasets published, in some form | Scholix, ScienceDirect, Scopus
3. Linked, to articles or other datasets | Number of datasets linked to articles | Scholix, Scopus
4. Validated, by a reviewer/curator | Number of datasets in curated databases or peer reviewed in data articles | ScienceDirect, DataSearch (for curated databases)

Research Data is Seen and Used:
5. Discovered | Number of datasets viewed in databases/websites/search engines | DataSearch, metrics from other search engines/repositories
6. Identified | DOI is resolved | DataCite has DOI resolution: made available?
7. Mentioned | Social media and news mentions | Plum and Newsflo
8. Cited | Number of datasets cited in articles | Scopus
9. Downloaded | Downloads from repositories | Downloads from Mendeley Data, access data from Figshare/Dryad
10. Reused | Mention of usage in an article or other dataset | ScienceDirect, access to other data repositories
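The "Identified" goal above counts whether a dataset's DOI actually resolves. Any DOI can be checked through the public doi.org proxy resolver; a minimal sketch using only the standard library (the example DOI is hypothetical):

```python
from urllib.parse import quote
import urllib.request

def doi_url(doi: str) -> str:
    """Build the public doi.org resolver URL for a DOI string."""
    return "https://doi.org/" + quote(doi.strip())

def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    """True if the doi.org resolver redirects the DOI to a landing page.
    Uses a HEAD request to keep the check lightweight; needs network access."""
    request = urllib.request.Request(doi_url(doi), method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except Exception:
        return False

# The DOI below is a made-up example, shown only for URL construction:
print(doi_url("10.5061/dryad.example"))
```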
Open Science Metrics
• Impact of Open Science
• Engagement in Open Science activities and impact of that engagement
https://ec.europa.eu/research/openscience/pdf/os_rewards_wgreport_final.pdf
Golden Rules for Using Research Metrics
• Always use both qualitative and quantitative input into your decisions
• Always use more than one research metric as the quantitative input
Why these rules matter:
• Using multiple metrics drives desirable changes in behavior
• There are many different ways of representing impact
• A research metric’s strengths can complement the weaknesses of other metrics
• Combining both approaches will get you closer to the whole story
• Valuable intelligence is available from the points where these approaches differ in their message
• This is about benefitting from the strengths of both approaches, not about replacing one with the other
Mechanisms for Gathering Data for Metrics
From the NISO Code of Conduct for Altmetrics: https://www.niso.org/press-releases/2016/02/niso-releases-draft-recommended-practice-altmetrics-data-quality-public
• Describe all known limitations of the data
• Provide a clear definition of each metric
• Describe how data are aggregated
• Detail how often data are updated
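One way to make these four points operational is to ship every metric with a machine-readable description alongside its values. The sketch below is my own illustration; the field names are hypothetical and are not taken from the NISO recommended practice:

```python
from dataclasses import dataclass

@dataclass
class MetricDescription:
    """Illustrative record covering the four NISO transparency points."""
    name: str
    definition: str          # a clear definition of the metric
    aggregation: str         # how the underlying data are aggregated
    update_frequency: str    # how often the data are updated
    known_limitations: str   # all known limitations of the data

m = MetricDescription(
    name="Tweet count",
    definition="Number of public tweets linking to the output's DOI",
    aggregation="Summed across tracked accounts, deduplicated by tweet ID",
    update_frequency="Daily",
    known_limitations="Deleted and protected tweets are not counted",
)
print(m.name)  # Tweet count
```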
Responsible Metrics
• Robustness: basing metrics on the best possible data in terms of accuracy and scope
• Humility: recognizing that quantitative evaluation should support – but not supplant – qualitative, expert assessment
• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
• Diversity: accounting for variation by field, and using a variety of indicators to support diversity across the research system
• Reflexivity: recognizing systemic and potential effects of indicators and updating them in response
http://www.hefce.ac.uk/pubs/rereports/year/2015/metrictide/
https://www.snowballmetrics.com/
https://www.elsevier.com/connect/new-metrics-will-make-journal-evaluation-easier-and-more-transparent
Transparency in Research Metrics
Presentation Roadmap
• Research metrics for institutions
• Data for metrics in institutional research information systems
Gathering Data for Evidence
• Gather data as you go along rather than retrospectively
• Think about what success would look like for each question or impact activity and how to evidence it
• Use all the data available, be clear and specific, and build a coherent narrative to provide context
Data in Institutional Systems
• Persons - researchers, postgraduate students, external persons
• Organizations - faculties, departments, research groups, external units
• Publications - peer-reviewed journal articles, books, chapters, theses, non-textual, etc.
• Publishers and journals - names, IDs, ratings
• Bibliometrics - citations, impact factors, altmetrics
• Activities - conferences, boards, learned societies, peer reviewing, prizes
• Narratives - narrative recordings of the impact of research
• Datasets - stored locally or in separate data repository
• Equipment - type, placement, ownership details
• Funding opportunities - funder, program, eligibility, etc.
• Grant applications - stage, funder, program, amount applied, documents attached
• Grant awards - funder, program, amount, dates, contract docs, applicants, budget
• Projects - budget, expenditure, participants, collaborators, students, outputs
• Press clippings - national and international papers, electronic media
Joachim Schöpfel et al. / Procedia Computer Science 106 (2017) 305 – 320
Sources of Data and Ingestion Options
Type of data | Source(s) of data | Ingestion into Pure, a CRIS
Persons | Internal HR system | Pure XML format (automatic recurring sync job)
Organizations | Internal HR system | Pure XML format (automatic recurring sync job)
Publications | Manual entry; online sources, e.g. Scopus; legacy systems | User-friendly templates; single-record import; automated import by person or by department; Pure XML format (single or repeated legacy import); Elsevier PRS service
Publishers and journals | Manual entry; online sources, e.g. Scopus; legacy systems | Automatically together with import; Elsevier PRS service; Pure XML format (single or repeated legacy import)
Bibliometrics | Scopus; Web of Science | Automatically together with import; automatic sync job for citations (Scopus and WoS); Pure XML format (single or repeated legacy import); Elsevier PRS service (Scopus bibliometrics only)
Activities | Manual entry; legacy systems | User-friendly templates; XML format for legacy import
Narratives | Manual entry | User-friendly templates
Datasets | Manual entry; legacy systems | User-friendly templates; XML format for legacy import
Based on the data model of Elsevier’s Pure
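The "automatic recurring sync job" pattern in the table can be sketched generically: fetch records from the source system, map them to the CRIS data model, and upsert by a stable identifier. The sketch below is purely illustrative; the record shape and field names are my own assumptions, and Pure's actual XML schema and endpoints are not reproduced here:

```python
# Illustrative sketch of a recurring HR -> CRIS person sync, using an
# in-memory dict as a stand-in for the CRIS store. All field names
# (employee_id, full_name, department) are hypothetical.
def sync_persons(hr_records, cris_store):
    """Upsert person records into the CRIS, keyed by a stable employee ID
    so that re-running the job updates rather than duplicates."""
    for record in hr_records:
        person = {
            "id": record["employee_id"],       # stable key shared with HR
            "name": record["full_name"],
            "org_unit": record["department"],  # links person to an organization
        }
        cris_store[person["id"]] = person      # insert or update in place
    return cris_store

hr = [{"employee_id": "e1", "full_name": "A. Researcher", "department": "Chemistry"}]
store = {}
sync_persons(hr, store)
print(store["e1"]["org_unit"])  # Chemistry
```

Keying on a stable identifier is what lets the job run on a schedule without manual reconciliation, which is the point of the "automatic recurring" rows in the table.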
Research Metrics Dashboard
https://www.osti.gov/biblio/1462196
Research Metrics in Research Information Systems
https://plumanalytics.com/integrate/load-your-data/pure-plumx-integration/
Thank you!
Holly J. Falk-Krzesinski, PhD
Email: h.falk-krzesinski@elsevier.com
Tel: +1 847-848-2953

ClimART Action | eTwinning Project
 
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdfVirtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
Virtual-Orientation-on-the-Administration-of-NATG12-NATG6-and-ELLNA.pdf
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
Paradigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTAParadigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTA
 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...
 
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
ENG 5 Q4 WEEk 1 DAY 1 Restate sentences heard in one’s own words. Use appropr...
 
USPSÂŽ Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPSÂŽ Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPSÂŽ Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPSÂŽ Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
 
Transaction Management in Database Management System
Transaction Management in Database Management SystemTransaction Management in Database Management System
Transaction Management in Database Management System
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
 

Falk-Krzesinski, "Administrator (Institutional Use of the Data): Data-informed Strategic Planning for the Research Enterprise"

  • 1. Research Metrics: Data-informed Strategic Planning for the Research Enterprise
NISO Virtual Conference ♦ February 20, 2019
Is This Still Working? Incentives to Publish, Metrics, and New Reward Systems
Holly J. Falk-Krzesinski, PhD, Vice President, Research Intelligence ♦ Elsevier
  • 2. Presentation Roadmap • Research metrics for institutions • Data for metrics in institutional research information systems
  • 3. Presentation Roadmap • Research metrics for institutions • Data for metrics in institutional research information systems
  • 4. Strategic Context for Research Metrics • Decreasing government grant funding for research • Increasing competition for government research funding • Rise of interdisciplinary and international grand challenge themes • Increased team science and cross-sector collaboration • Competition to attract the best research leaders globally • Growing need to demonstrate both economic and social impact of research
  • 5. Research Metrics at Different Levels (https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics)
Journal level: CiteScore • Journal Impact Factor • SCImago Journal Rank (SJR) • Source Normalized Impact per Paper (SNIP)
Article level: Citation count • Citations per paper • Field-Weighted Citation Impact (FWCI) • Outputs in top quartile • Citations in policy and medical guidelines • Usage • Captures • Mentions • Social media
Researcher level: Document count • Total citations • h-index • i10-index • g-index
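The researcher-level indices on slide 5 have simple definitions that are easy to verify by hand. A minimal sketch, computed from a plain list of per-paper citation counts (illustrative helper names, not any product's API):

```python
def h_index(citations):
    """h-index: the largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
    return h

def i10_index(citations):
    """i10-index: the number of papers with at least 10 citations."""
    return sum(1 for cites in citations if cites >= 10)

def g_index(citations):
    """g-index: the largest g such that the top g papers together
    have at least g**2 citations (capped here at the paper count)."""
    ranked = sorted(citations, reverse=True)
    running_total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        running_total += cites
        if running_total >= rank * rank:
            g = rank
    return g
```

For a researcher with citation counts `[10, 8, 5, 4, 3]`, these yield h-index 4, i10-index 1, and g-index 5, which illustrates why the g-index rewards a few highly cited papers more than the h-index does.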
  • 6. Research Metrics Throughout the Research Process
  • 7. Categories of Metrics for Analysis https://www.elsevier.com/__data/assets/pdf_file/0020/53327/ELSV-13013-Elsevier-Research-Metrics-Book-r5-Web.pdf
  • 8. Analyze research strengths Determine where research is a good potential investment Demonstrate impact (Return On Investment) of research funding Showcase researchers or identify rising stars Tell a better narrative about everything that is happening with research Research Metrics Use Cases
  • 9. A Building Need to Demonstrate Impact https://report.nih.gov/nihdatabook/report/20
  • 10. “Given how tight budgets are around the world, governments are rightfully demanding effectiveness in the programs they pay for. To address these demands, we need better measurement tools to determine which approaches work and which do not.” Bill Gates Gates Foundation Annual Letter 2013
  • 12. Diverse Set of Metrics for Demonstrating Impact
Types of impact: Educational • Societal • Commercial • Innovation • Informational • Academic • Promotion/attention/buzz
Example metrics: Number of library holdings (WorldCat OCLC) • Views on SlideShare • Plays on YouTube • Amazon book reviews • Clinical citations or health policy/guideline citations • Government policy citations • News mentions • Patent citations • Academic–industry partnerships • Licenses • Business consultancy activities • Number of patents filed and granted • Wikipedia citations • Blog mentions • StackExchange links • Downloads from GitHub, RePEc, IRs • Citations (field normalised, percentiles, counts) • Collaborators on GitHub • Full-text, PDF, and HTML views on ScienceDirect, Figshare, etc. • Social media metrics (shares, likes, +1s, tweets)
Qualitative input: Expert feedback on quality and impact of research
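Slide 12 lists field-normalised citations among the academic-impact metrics; the Field-Weighted Citation Impact named earlier divides a document's actual citations by the average citations of comparable documents (same field, document type, and publication year). A simplified sketch, assuming the comparison set of citation counts is already given (illustrative names, not Scopus's implementation):

```python
def fwci(doc_citations, comparable_citations):
    """Field-Weighted Citation Impact: actual citations divided by the
    mean citations of comparable documents (same field, type, and year).
    An FWCI of 1.0 means the document is cited at the world average."""
    if not comparable_citations:
        return 0.0  # no comparison set; undefined, returned as zero here
    expected = sum(comparable_citations) / len(comparable_citations)
    return doc_citations / expected if expected else 0.0
```

For example, a paper with 30 citations in a field whose comparable papers average 20 citations has an FWCI of 1.5, i.e., 50% above the world average for its field.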
  • 13. Wide Range of Research Output Types (https://plumanalytics.com/learn/about-artifacts/)
abstracts • articles • audio files • bibliographies • blogs • blog posts • books • book chapters • brochures/pamphlets • cases • catalogues • clinical trials • code/software • collections • commentaries • conference papers • corrections • data sets • designs/architectural plans • editorials • exhibitions/events • expert opinions • file sets • figures • government documents • grants • guidelines • images • interviews • issues • journals • learning objects • lectures/presentations • letters • live performances • manuscripts • maps • media files • musical scores • newsletters • news • online courses • papers • patents • policy • posters • preprints • press releases • projects • recorded works • reference entries/works • reports • research proposals • reviews • retractions • speeches • standards • syllabi • technical documentation • textual works • theses/dissertations • videos • visual arts • volumes • web pages • web resources • other
  • 14. Research Data Metrics (https://rdmi.uchicago.edu/papers/08212017144742_deWaard082117.pdf)
Goal | Metric | How to measure
Research data is shared:
1. Stored (i.e., safely available in a long-term repository) | Nr of datasets in long-term storage | Mendeley Data, Pure; Plum indexes Figshare, Dryad, Mendeley Data, and is working on Dataverse
2. Published (long-term preserved, accessible via the web, has a GUID, citable, with proper metadata) | Nr of datasets published, in some form | Scholix, ScienceDirect, Scopus
3. Linked, to articles or other datasets | Nr of datasets linked to articles | Scholix, Scopus
4. Validated, by a reviewer/curator | Nr of datasets in curated databases or peer reviewed in data articles | ScienceDirect, DataSearch (for curated databases)
Research data is seen and used:
5. Discovered | Nr of datasets viewed in databases, websites, and search engines | DataSearch, metrics from other search engines/repositories
6. Identified | DOI is resolved | DataCite DOI resolution (made available?)
7. Mentioned | Social media and news mentions | Plum and Newsflo
8. Cited | Nr of datasets cited in articles | Scopus
9. Downloaded | Downloads from repositories | Downloads from Mendeley Data; access data from Figshare/Dryad
10. Reused | Mention of usage in an article or other dataset | ScienceDirect, access to other data repositories
  • 15. Open Science Metrics • Impact of Open Science • Engagement in Open Science activities and impact of that engagement https://ec.europa.eu/research/openscience/pdf/os_rewards_wgreport_final.pdf
  • 16. Golden Rules for Using Research Metrics
Always use both qualitative and quantitative input into your decisions: combining both approaches will get you closer to the whole story; valuable intelligence is available from the points where these approaches differ in their message; this is about benefitting from the strengths of both approaches, not about replacing one with the other.
Always use more than one research metric as the quantitative input: using multiple metrics drives desirable changes in behavior; there are many different ways of representing impact; a research metric's strengths can complement the weaknesses of other metrics.
  • 17. Mechanisms for Gathering Data for Metrics. From the NISO Code of Conduct for Altmetrics (https://www.niso.org/press-releases/2016/02/niso-releases-draft-recommended-practice-altmetrics-data-quality-public): provide a clear definition of each metric; describe how data are aggregated; detail how often data are updated; describe all known limitations of the data.
  • 18. Responsible Metrics • Robustness: basing metrics on the best possible data in terms of accuracy and scope • Humility: recognizing that quantitative evaluation should support – but not supplant – qualitative, expert assessment • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results • Diversity: accounting for variation by field, and using a variety of indicators to support diversity across the research system • Reflexivity: recognizing systemic and potential effects of indicators and updating them in response http://www.hefce.ac.uk/pubs/rereports/year/2015/metrictide/
  • 20. Presentation Roadmap • Research metrics for institutions • Data for metrics in institutional research information systems
  • 21. Gathering Data for Evidence Gather data as you go along rather than retrospectively Think about what success would look like for each question or impact activity and how to evidence it Use all the data available, be clear & specific, and build a coherent narrative to provide context
  • 22. Data in Institutional Systems • Persons - researchers, postgraduate students, external persons • Organizations - faculties, departments, research groups, external units • Publications - peer-reviewed journal articles, books, chapters, theses, non-textual, etc. • Publishers and journals - names, IDs, ratings • Bibliometrics - citations, impact factors, altmetrics • Activities - conferences, boards, learned societies, peer reviewing, prizes • Narratives - narrative recordings of the impact of research • Datasets - stored locally or in separate data repository • Equipment - type, placement, ownership details • Funding opportunities - funder, program, eligibility, etc. • Grant applications - stage, funder, program, amount applied, documents attached • Grant awards - funder, program, amount, dates, contract docs, applicants, budget • Projects - budget, expenditure, participants, collaborators, students, outputs • Press clippings - national and international papers, electronic media Joachim SchĂśpfel et al. / Procedia Computer Science 106 (2017) 305 – 320
  • 23. Sources of Data and Ingestion Options (Elsevier Pure's data model)
Type of Data | Source(s) of Data | Ingestion into Pure, a CRIS
Persons | Internal HR system | Pure XML format (automatic recurring sync job)
Organizations | Internal HR system | Pure XML format (automatic recurring sync job)
Publications | Manual entry; online sources, e.g., Scopus; legacy systems | User-friendly templates; single-record import; automated import by person or by department; Pure XML format (single or repeated legacy import); Elsevier PRS service
Publishers and journals | Manual entry; online sources, e.g., Scopus; legacy systems | Automatically together with import; Elsevier PRS service; Pure XML format (single or repeated legacy import)
Bibliometrics | Scopus; Web of Science | Automatically together with import; automatic sync job for citations (Scopus and WoS); Pure XML format (single or repeated legacy import); Elsevier PRS service (Scopus bibliometrics only)
Activities | Manual entry; legacy systems | User-friendly templates; XML format for legacy import
Narratives | Manual entry | User-friendly templates
Datasets | Manual entry; legacy systems | User-friendly templates; XML format for legacy import
  • 25. Research Metrics in Research Information Systems https://plumanalytics.com/integrate/load-your-data/pure-plumx-integration/
  • 26. Thank you! Holly J. Falk-Krzesinski, PhD Email: h.falk-krzesinski@elsevier.com Tel: +1 847-848-2953

Editor's Notes

  1. https://www.niso.org/events/2019/02/still-working-incentives-publish-metrics-and-new-reward-systems Research Metrics: Data-informed Strategic Planning for the Research Enterprise Administrator (Institutional Use of the Data) perspective As competition for extramural research funding continues to increase and resources become more difficult to acquire and even maintain, universities and other research institutions are relying more heavily on data to help inform their decision-making.  In this session, I will address considerations for research leaders and institutional administrators using research information systems, data, metrics and analytics to support the strategic planning for their institutions’ research enterprise.  Holly J. Falk-Krzesinski, PhD is the Vice President, Research Intelligence on the Global Strategic Networks team at Elsevier, an information analytics company. Her key role is building and maintaining long term relationships with research institutions and funding bodies, giving voice to research leaders at those organizations within Elsevier to help the business deliver the most impactful solutions to support research globally. Dr. Falk-Krzesinski also focuses on how open science is advancing, in particular, how institutions are addressing issues of recognition and reward for research data sharing throughout the research life cycle. Prior to joining Elsevier, Dr. Falk-Krzesinski was a faculty member and administrator at Northwestern University. Notably, she launched the central Office of Research Development and examined the use of various tools to support intra- and inter-institutional scientific collaboration and demonstrate the impact of the university’s research programs. She also investigated how universities are changing structures to reward engagement in interdisciplinary research and team science.  
NORDP Workshop A Basket of Metrics for Research Evaluation Increasingly, institutions are interested in tracking and reporting on research outputs to understand their strengths, set goals, chart progress, and make budgetary decisions.  Universities globally are, more and more, investing in an evidence-based approach to develop a clear understanding of their position and progress.  When used correctly, research metrics, together with qualitative input, offer a balanced, multi-dimensional view for decision-making.  While metrics help illuminate the impact of research outputs, they can be a challenge for researchers, research development professionals, and other research leaders unfamiliar with when best to use which metrics.  And importantly, when research metrics are misunderstood, they have the potential to be misused, becoming a serious point of contention.  This session will provide an overview of four levels of research metrics: institutional (rankings), journal (e.g., CiteScore), article (e.g., citations), and author (e.g., h-index), to guide university decision makers to assemble the most appropriate "basket of metrics" for their institutions' research evaluation needs.  The need for research metrics is pervasive, to support research institutions and researchers. There are different types of metrics. Elsevier is evolving its research metrics strategy to empower scholars/researchers to claim the narrative of what they do and why it matters.
  2. Interpretation of Data/Metrics at Administrative Level (Provost, Univ Library, etc.) Research Information Systems (value-add, cautionary notes, etc.)
  3. Interpretation of Data/Metrics at Administrative Level (Provost, Univ Library, etc.) Research Information Systems (value-add, cautionary notes, etc.)
  4. Traditional article-related metrics Institutional aggregate
  5. Metrics can and are calculated across all areas of the research workflow from input metrics such as grant awards volume, process metrics such as income volume or amount spent through to metrics around outputs, outcomes and impacts. There are lots of metrics in the areas in orange and these are associated more with the traditional bibliometrics measures such as usage and citations and we provide many of these in our tools such as Scopus and SciVal. Looking further to the right though around areas such as engagement and impact, there is opportunity for innovation and this is where we are currently pushing so we can start to uncover more of the stories around research impact for example.   To enable us to achieve this, we acquired Plum Analytics early in 2017 which is allowing us to markedly extend the basket of metrics we can offer and the research outputs we can track and analyse. Plum currently tracks over 100 million research artifacts and has captured billions of interactions with these artifacts from over 40 different sources of metrics. By combining Elsevier’s rich data with Plum Analytics capabilities we can now monitor research across all disciplines much more effectively and extensively to help the research community to get closer to the whole story of how research is being engaged with and the impact the research is having.   In addition to expanding our technical capabilities in this area, we are a partner in an initiative called Snowball Metrics. The initiative has successfully developed robust and clearly defined metric methodologies, across the whole research workflow to enable confident comparison in an ‘apples to apples’ way. Importantly, the development of the metric methodologies or ‘recipes’ was sector led and is owned by research intensive universities around the world. The recipes are available free of charge for anyone to use and implement and are system and supplier agnostic. 
The initiative has now defined 32 metric methodologies which are available to download in the 3rd edition of the recipe book.
  6. Let’s talk about the second golden rule “Always use more than one research metric as the quantitative input”. Scopus aims to provide a basket of research metrics to measure research performance. This slide is showing a basket of metrics for measuring research excellence, where each theme has a different set of metrics associated with it. These research metrics are supported by qualitative input (remember golden rule #1) as represented by the grey bar on the left side. It’s important to make metrics available for all the different entities that you would want to measure: authors, institutions, journals, subject fields, etc.
  7. Let’s talk about the second golden rule “Always use more than one research metric as the quantitative input”. Scopus aims to provide a basket of research metrics to measure research performance. This slide is showing a basket of metrics for measuring research excellence, where each theme has a different set of metrics associated with it. These research metrics are supported by qualitative input (remember golden rule #1) as represented by the grey bar on the left side. It’s important to make metrics available for all the different entities that you would want to measure: authors, institutions, journals, subject fields, etc.
  8. Research metrics play a significant role in key decisions of institutions and funders Research metrics can be used for a variety of purposes such as to help analyze research strengths, identify hot topics or areas for investment, to demonstrate the return on an investment or program, showcase researcher performance or to identify rising stars or to help tell a narrative around research and demonstrate impact for example. In all cases however, we recommend using more than one metric and also complementing the quantitative information the metrics provide with qualitative judgment or expert opinion. Recognition and reward of individual researchers International rankings Institutional benchmarking Portfolio analysis Research evaluation Measuring collaboration Demonstrating impact
  9. Ever-Increasing Competition for Research Funding (NIH Data Book). Competition for research dollars is fiercer than ever: the number of applicants is up sharply; award funding is similar to 2003 levels; the percentage of submissions accepted is down year over year.
  10. Metrics to measure a return on investment
  11. Becker Medical Library Model for Assessment of Research Impact THE MODEL FOR ASSESSMENT OF RESEARCH IMPACT IS A FRAMEWORK FOR TRACKING DIFFUSION OF RESEARCH OUTPUTS AND ACTIVITIES TO LOCATE INDICATORS THAT DEMONSTRATE EVIDENCE OF BIOMEDICAL RESEARCH IMPACT.
  12. This slide is showing a basket of article-level metrics for measuring research excellence, importantly these research metrics are supported by qualitative input as represented by the grey bar on the right side of the figure In this slide, I have tried to demonstrate example metrics and the different things they can indicate for a piece of research. I will not go through all of the examples but ways you could use research metrics could be in demonstrating: educational impact through looking at the number of libraries holding your book, To demonstrate Societal impact you could look at the number of clinical citations in health policy documents or guidelines to demonstrate an effect on clinical practice. Or you could demonstrate more academic impact through citation metrics, usage metrics such as downloads on sites such as SSRN or for Computer Science downloads or code forks in GitHub.   One area we are focused on currently though is exploring how we can use our capabilities and technology to help demonstrate the impact research has in the policy space to further investigate the relationship between research and policy. We would like to explore, understand and so help actors in both the research and policy space understand this relationship more fully by leveraging our capabilities and technology more effectively.   Being able to extract the references to research in policy documents is one way to help but requires us to have access to policy documents so we can extract and analyze them.   We could then use our technology to identify research, researchers or institutions being cited or referenced in policy documents as well as potentially the context around the citation or reference. For example, was the research used instrumentally or as an idea to influence the policy climate.   Or was a researcher influencing policy by being a participant in a consultation
  13. Unlike Scopus, which is an objective set of curated content, PlumX considers a very broad range of research output that the research community considers, all of which can be collected within Pure
  14. Historically a focus on publications as the major research output, but Research Data are growing in their importance as a research output, one that also has an associated set of metrics, but different than publications. Credit for Sharing and Reuse of Research Data, Framework that affords credit throughout the data lifecycle Demonstrate the value of data sharing and reward data discovery behavior Work closely with NISO and BD2K working group Show value of data sharing and reward data discovery behavior Two forms of research data Research Datasets – experimental datasets, as stored in data.mendeley.com and in other repositories Research Data Entities –data objects defined by experimental research, for example, genes, proteins, astronomical bodies; these are often identified by type specific IDs, e.g., accession numbers, which usually take the form of URLs to particular domains Types of metrics Usage data Citation data Linking and sharing data (aka altmetrics) Development of a new Data Citation Index Working closely with NISO on their research data recommendations
  15. To provide a balanced, multi-dimensional view
  16. See more at: http://plumanalytics.com/niso http://www.niso.org/news/pr/view?item_key=72abb8f785b18bbe2cdfdb8b6a237c21f75e6a2f https://plumanalytics.com/wp-content/uploads/2018/10/NISO-Self-Reporting-Table-Plum-Analytics-October-15-2018.pdf
  17. CiteScore is a simple and transparent metric for all journals indexed in Scopus. CiteScore is essentially the average citations per document that a title receives over a three-year period. This is the calculation for the CiteScore 2015 value of a particular journal: A = sum of citations received in 2015 to documents published in the journal during 2012, 2013, and 2014; B = all of the documents indexed in Scopus published during 2012, 2013, and 2014. CiteScore metrics are available for all Scopus serial types: peer-reviewed journals, including supplements and special issues; book series; conference proceedings; and trade journals. Rather than using a two- or five-year citation window, CiteScore uses three. Research over the years has found that in slower-moving fields, two years' worth of data is too short, yet five years is too long to consider in faster-moving fields. The peer-reviewed bibliometric literature shows that three years is the best compromise for a broad-scope database such as Scopus: it incorporates a representative proportion of citations in all disciplines while also reflecting relatively recent data. Snowball Metrics: defined and agreed by research-intensive universities; commonly understood metrics to uncover research strengths by benchmarking and provide valuable input into strategic decision making; tested methodologies that are data- and tool-provider agnostic; recipes that are owned by universities and are open for the community to use without cost or restriction; a set of global standards covering the entire spectrum of research activities.
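The CiteScore calculation described in this note (A divided by B over a three-year publication window) can be sketched as follows. The input names are illustrative: `citations_received_by_pub_year` holds citations received in the target year, keyed by the cited document's publication year, and `docs_by_pub_year` holds the journal's document counts per publication year.

```python
def cite_score(year, citations_received_by_pub_year, docs_by_pub_year):
    """CiteScore for `year`: citations received in `year` to documents
    published in the prior three years (A), divided by the number of
    documents published in those three years (B)."""
    window = range(year - 3, year)  # e.g., 2012-2014 for CiteScore 2015
    a = sum(citations_received_by_pub_year.get(y, 0) for y in window)
    b = sum(docs_by_pub_year.get(y, 0) for y in window)
    return a / b if b else 0.0
```

For instance, a journal that received 300 citations in 2015 to its 100 documents published in 2012-2014 has a CiteScore 2015 of 3.0.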
  18. Interpretation of Data/Metrics at Administrative Level (Provost, Univ Library, etc.) Research Information Systems (value-add, cautionary notes, etc.)
  19. Underlying all metrics are data, this data-informed approach offer trustworthy evidence
  20. Harkening back to the Range of Research Output Types from the previous section Pure aggregates an organization's data from numerous internal and external sources, and ensures the data that drives strategic decisions is trusted, comprehensive and accessible in real time. Interrelated actors that are all connected A highly versatile centralized system, Pure enables your organization to build reports, carry out performance assessments, manage researcher profiles, enable research networking and expertise discovery and more, all while reducing administrative burden for researchers, faculty and staff.
  21. On this slide, simply walk your audience through the text on the slide –first the type of content, then the source where it comes from, and then the method of getting it into Pure Availability of APIs enable sharing of data and metrics into an institution’s own systems (e.g., dashboards, web sites, platforms, repositories) Enhanced transparency Ability to combine data and metrics from multiple sources
  22. Battelle Snowball Metrics working group: https://www.osti.gov/servlets/purl/1462196 "The purpose of the Battelle Snowball Metric Working Group is to use the Snowball Metrics framework to build and employ a method to calculate metrics that will enable Battelle-affiliated National Laboratories to better understand their strengths and weaknesses in a few representative areas." "There is an advantageous alignment of the recommended Snowball Metrics with the U.S. Department of Energy (DOE) Performance Evaluation and Measurement Plan (PEMP) framework. Using the PEMP as a foundation, an integrated set of metrics, which include Snowball Metrics, can help inform Battelle on the progress of delivering S&T results that contribute to and enhance DOE's mission by providing world-class scientific research capacity and advancing scientific knowledge through peer-reviewed scientific results." The working group, as a consensus, recommends the following subset of Snowball Metrics: Scholarly Output; Collaboration; Intellectual Property Volume; Citations per Output. Example dashboard: http://maryland.sla1.org/wp-content/uploads/2018/09/PNNL-Day_Justin.pdf
  23. Customizable dashboards provide administrators with an overview of strategically important metrics. Dashboards can be personalized, shared, and used for monitoring and reporting. User controls ensure that only data relevant to the user are visible. Metrics integrated within the RIS allow viewing at different aggregation levels, e.g., individual researchers, a departmental group of faculty, etc.