Research Impact in Specialized Settings: 3 Case Studies
Elaine M. Lasda, University at Albany
Rebecca Welzenbach, University of Michigan
Keith Maull, National Center for Atmospheric Research
Taylor Abernethy Johnson, RTP Library, Environmental Protection Agency
Specialized Contexts
• Impact of their research reaches beyond the academy
• Differing missions require differing metrics
• Purposes/benefits of metrics expand beyond employee
promotion/tenure/institutional rankings etc.
• Demonstrating value in broader arenas, to broader constituencies
Book Genesis
• Meeting of the Minds with Emerald
• Special(ized) Libraries and Librarians
• Personal Network
• Variety is the Spice of Life
Case Studies
• University / National Center for Atmospheric Research: Keith Maull & Matthew Mayernik
• Environmental Protection Agency: Taylor Abernethy Johnson & Anthony Holderied
• Michigan Publishing: Rebecca Welzenbach
• Institute for Transportation Studies: Kendra K. Levine
• LA Museum of Natural History: Richard P. Hulser
Scholarly Metrics
at NCAR
Keith E. Maull
Matthew Mayernik
NCAR Library Metrics Brief History
• 2015
• Origins in “NCAR Fact Sheets” (manual, laborious)
• Initial development of research metrics services
• Initial implementation of metrics software API
• 2016
• Continued API development and Dashboard conception
• First application of services for NSF site visit reports for NCAR labs
• 2017
• Additional services delivered to UCP/Unidata and others
• 2018-2019
• Continued expansion of metrics services and support
• 2020
• Exploration of dataset and software metrics (dataset DOIs, open-source software on GitHub, etc.)
NCAR Library Metrics Mission
Our objective is to deliver research metrics of value and interest to NCAR community stakeholders through services, software, and support.
Metrics Foci
• Develop workflows to collect, analyze, and report research and scholarly metrics.
• Develop partnerships and collaborations with institutional stakeholders to serve their specific metrics needs.
• Provide ongoing as well as on-demand/ad hoc services to institutional, lab, and group clients.
Research Metrics Services
Service levels: institution; lab/program; division/group; facility/instrument; model/platform/educational resources; individual.
Example services:
• Global benchmarking
• UCP Programs / Partners
• Funding sources
• UCP-related materials
• Unidata
• Models (WRF, etc.)
• Supercomputer
• Observation platforms
• UCP25
• NSF Site Visit Reviews
• FIM-COP Pilot
• ARG
Peer-reviewed publications:
• h-index analysis
• Publication profiles
• Collaboration analyses (geographic, author, institution)
Research Metrics Workflow
Stages: Plan → Collect → Organize → Manage → Analyze → Report → Interpret → Share
• Plan: identify and prioritize metrics needs; review existing data inputs; develop metrics plan and outcomes
• Collect: collect research paper bibliographies
• Organize: organize research papers for review and analysis
• Manage: provide web API for automated management; ongoing curation (e.g. citation count updates)
• Analyze: perform bibliometric analyses; synthesize with other data sources
• Report/Share: provide customized dashboards and overview reports; provide web API access to reporting data; identify higher-level reporting opportunities
Stakeholders: facilities and lab managers and institutional leadership; administrative and technical staff, research librarians, and interns; scientists.
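The "ongoing curation (citation count updates)" step lends itself to automation against a public metadata source. A minimal sketch in Python, assuming Crossref's REST API as the provider (the deck does not name one); the DOI and response below are stubbed for illustration, and no network call is made:

```python
import urllib.parse

def crossref_work_url(doi: str) -> str:
    """Build the Crossref REST API URL for one work's metadata."""
    return "https://api.crossref.org/works/" + urllib.parse.quote(doi)

def citation_count(work_json: dict) -> int:
    """Read Crossref's citation tally from a /works response."""
    return work_json["message"].get("is-referenced-by-count", 0)

# Offline demonstration with a stubbed response:
stub = {"message": {"DOI": "10.1000/example", "is-referenced-by-count": 42}}
print(crossref_work_url("10.1000/example"))
print(citation_count(stub))  # → 42
```

In a real curation pass, each stored DOI would be fetched periodically and the repository's citation counts refreshed from the response.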
Research Metrics Integrations
Research metrics capabilities link metrics inputs to metrics outputs through the metrics services layer:
• Metrics inputs: submission forms; data providers; NSF grant IDs; author lists; program names; …
• Metrics services: analytics + computation; Research Metrics EITL
• Metrics outputs: research metrics dashboards; web display / live data views; reports (formatted publication + DOI lists, unprocessed raw data files); programmatic API
Case studies
NCAR LIBRARY RESEARCH METRICS
Research Metrics Use Case Study: The CISL Supercomputer Survey
Data
• CISL supercomputer users surveyed annually (2008-2014)
• Survey included questions about peer-reviewed publications
• Data: semi-structured textual citations (un-normalized)
• Some manual effort required to clean the data
Automation and Analysis
• Automated extraction of publication data
• Automated discovery of citation data (CrossCite)
• Used TR-WOS data to extract a detailed data profile
• Developed analysis of the resulting data
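Un-normalized survey citations like these can be partially cleaned by machine before any manual pass. A minimal sketch of automated DOI extraction from free-text citations (the regex and sample row are illustrative, not the actual NCAR pipeline):

```python
import re

# Common DOI shape: "10.", a registrant prefix, "/", then a suffix.
DOI_RE = re.compile(r'10\.\d{4,9}/[-._;()/:A-Za-z0-9]+')

def extract_dois(citation_text):
    """Pull DOI strings out of un-normalized survey citation text."""
    # Strip trailing punctuation that respondents often append.
    return [m.rstrip('.,;') for m in DOI_RE.findall(citation_text)]

survey_row = ("Smith et al. (2013), J. Climate, "
              "doi:10.1175/JCLI-D-12-00001.1.")
print(extract_dois(survey_row))  # → ['10.1175/JCLI-D-12-00001.1']
```

Rows that yield no DOI would fall back to title lookup or manual cleaning, matching the "some manual effort required" caveat above.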
[Chart: subject categories of the surveyed publications: meteorology & atmospheric sciences; geosciences, multidisciplinary; oceanography; environmental sciences; astronomy & astrophysics; geochemistry & geophysics; physics, fluids & plasmas; multidisciplinary sciences; mechanics; geography, physical; computer science, interdisciplinary applications]
CISL Survey Analysis Highlights
• 803 total peer-reviewed publications
• 547 total unique institutions
• 116 total unique journals
• 31 distinct publishers
• 13,007 total citations
• 16.20 citations per publication
• 1,202 citations of the top 5 non-NCAR-led papers

Most Cited Lead Institutions (*: Int'l Affiliate, †: UCAR Member)
Institution | Lead Pubs | Cites per Lead Pub | Total Pubs
Georgia Inst Technol† | 3 | 89.67 | 10
Univ Sheffield | 1 | 79.00 | 2
Argonne Natl Lab | 1 | 67.00 | 4
WHOI† | 1 | 67.00 | 1
Univ Toronto† | 1 | 66.00 | 4
Seoul Natl Univ* | 2 | 53.00 | 7
NASA GSFC | 1 | 53.00 | 1
Royal Netherlands Meteorol Inst KNMI De Bilt | 1 | 52.00 | 1
Univ Maryland† | 13 | 50.54 | 28
Dalian Univ Technol | 1 | 49.00 | 1
Rutgers State Univ† | 6 | 47.83 | 26
Univ Lancaster | 1 | 46.00 | 4
Desert Res Inst† | 1 | 44.00 | 4

Top Lead Authors (by pub count)
8 | S. Vavrus | Univ Wisconsin
6 | T. Krishnamurti | Florida State Univ
6 | Z. Liu† | Univ Wisconsin
5 | Z. Wang | Univ Illinois
4 | G. Jin | Chinese Acad Sci
4 | B. Kirtman | Univ Miami

Prolific Authors (by pub count; †: ASP Fellow, *: NCAR Faculty Fellowship, 2005)
18 | L. Wang | Univ Delaware
17 | N. Mahowald | Cornell Univ
14 | B. Kirtman | Univ Miami
13 | E. Maloney† | Colorado State Univ
13 | S. Vavrus† | Univ Wisconsin

Top Journals (by pub count)
Journal | Total Pubs
Journal of Climate | 122
J. Geophys. Res. | 100
Geophysical Research Letters | 67
Mon. Wea. Rev. | 55
Atmos. Chem. Phys. | 51
Journal of the Atmospheric Sciences | 42
Research Metrics Use Case Study: 2016 NSF Site Visit Report for NCAR Modeling
Data
• Publications applicable to the various models developed by NCAR (WRF, CESM, etc.)
• Publications curated by directors, admins, and some scientists
• Span of 10 years of publications
• Some publications required manual and semi-automated lookup of publication titles
Automation and Analysis
• Automated extraction of publication data
• Automated discovery of citation data (CrossCite)
• Used TR-WOS data to extract a detailed data profile
• Developed analysis of the resulting data
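Curated lists arriving from directors, admins, and scientists overlap, so merging them requires deduplication. A hedged sketch (record format and function names are hypothetical, not NCAR's actual code), keying on DOI when present and on a normalized title otherwise:

```python
import re
import unicodedata

def normalize_title(title):
    """Fold case, accents, punctuation, and whitespace for matching."""
    t = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    t = re.sub(r"[^a-z0-9 ]", " ", t.lower())
    return re.sub(r"\s+", " ", t).strip()

def merge_sources(*source_lists):
    """Union publication records from several sources, skipping any
    record whose DOI or normalized title has already been seen."""
    seen, merged = set(), []
    for records in source_lists:
        for rec in records:
            keys = {k for k in (rec.get("doi"), normalize_title(rec["title"])) if k}
            if seen.isdisjoint(keys):
                merged.append(rec)
            seen |= keys
    return merged

repo = [{"title": "Weather Research and Forecasting Model Advances", "doi": "10.1000/demo"}]
wos = [{"title": "Weather research and forecasting model advances."}]  # same paper, no DOI
print(len(merge_sources(repo, wos)))  # → 1
```

Title-only matching is fragile (reprints, subtitle variants), which is one reason the case study still needed manual and semi-automated title lookup.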
Modeling and Data Assimilation Publications Fact Sheet
2005-2015 User Publication Analysis
About the publications:
• Publications were found by searching NCAR's OpenSky institutional repository; this collection represents NCAR's refereed publications from 2005 to 2015.
• Also collected from the Thomson Reuters Web of Science index.
• Models captured by the publication list include, but are not limited to, CAM-chem, MOZART, WACCM, WRF-chem, CESM, and WRF.
• Others were derived from internally maintained research bibliographies.
Highlights:
• 5,023 total peer-reviewed publications
• 105,076 total citations
• 20.9 avg. citations/pub (13.6 avg. citations/pub in atmospheric sciences)
• 6.2 avg. authors/pub
• 2,110 unique institutions
• 8,904 unique authors
Top 10 institutions by author count:
1. University of Colorado 848
2. NOAA 748
3. Pacific NW Natl Lab 699
4. Chinese Acad Sci 616
5. NASA 560
6. Univ Wisconsin 328
7. Univ Washington 288
8. Univ Calif Berkeley 236
9. Lawrence Livermore Natl Lab 235
10. Caltech 230
[Fact sheet also includes: a map of the top 100 institutions with the greatest number of authors citing NCAR resources (one pin per location, duplicates excluded), top 10 impact-factor journals by pub count, and most common topics.]
Key Takeaways
• Automation is essential for any metrics effort
• Stakeholders need to be partners in data collection and interpretation
• Calibrating what is possible helps keep expectations in line with capabilities
• Create a menu of products in line with metrics needs
• Developing a team dedicated to metrics is the only way to sustain success
• Stakeholders must understand that "metrics" extend much further than papers
• Incentives to collect data may help the effort: everyone wants the results, but few realize the requirements
The EPA-RTP Library
One of >20 libraries in the National Library Network
• One of three repositories
• Top research facility
• Highest foot traffic, reference transactions, and ILL requests
Staffed by contractors through UNC-SILS since 1975
• Five full-time staff
• Eight student interns
LITERATURE SEARCHING | REFERENCE | INTERLIBRARY LOAN | INSTRUCTION | CATALOGING | SERIALS | EPA DOCUMENT PUBLISHING | ENDNOTE SUPPORT | BIBLIOMETRIC SERVICES
The New Metrics: Timeline, 2016 to Now
• "Measuring Your Research Impact" webinar
• Research Impact Reports officially rolled out
• Production & prototyping
• "Article Impact Reports: A New EPA-RTP Library Service" webinar
• Library EXPO and STAA award season
• Investigating group impact & automation
Research Impact Reports
• Analyzes a set of publications
• Presents author-, article-, and journal-level metrics
• Packages data with graphical illustrations
Article Impact Reports
• For a single article, based on its set of citations
• Data from citing sources revealing:
• Geographic identities
• Journals & organizations
• Research areas
• Alternative metric activity (social media, news, usage)
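Author-level metrics such as the h-index mentioned earlier can be computed directly from per-publication citation counts. A minimal illustrative sketch, not the EPA-RTP Library's actual tooling:

```python
def h_index(citation_counts):
    """h-index: the largest h such that h publications
    each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this publication still clears the bar
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4, and 3 times: four papers have >= 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The same citation-count list feeds article- and journal-level summaries, so one cleaned dataset can drive all three tiers of a report.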
Since The New Metrics
• Building AIR Program
• Investigating Group Impact
• Bibliometric analysis of EPA
research, models, & tools
• Automation
Compliance & Defiance: Michigan Publishing's
Early Encounters with Research Impact Metrics
Rebecca Welzenbach
Research Impact Librarian
University of Michigan Library
SLA 2020
University of Michigan Press
● Founded in 1930
● Part of the U-M Library since 2009
● 80-100 monographs per year with
disciplinary strength in Classical
studies, performance studies,
political science, African and Asian
Studies, and more
● OA titles funded by Knowledge
Unlatched, TOME, and more
● Acquisitions, Production, Marketing
& Outreach, Business and
Administration, technology
● Relies (mostly) on a sales model
U-M Press and research impact metrics
Historically: little involvement. But now, more is expected, requested, mandated. How best to engage
effectively? Two options:
● Compliance: We can work to ensure that our publications are consistently recognized by and included
in the systems and datasets upon which existing metrics are calculated.
● Defiance: We can articulate new (alternative) metrics that are meaningful for us and our stakeholders
University of Michigan Press (Monographs)
Compliance
● Books have long been largely absent from the research impact metrics space. Where they've been indexed, the record is inconsistent: BKCI-SSH has indexed 194 UM Press titles while Scopus has indexed 916
● Newer players Google Scholar and Dimensions Plus are changing what's possible to know and show, but now we're turning up a lot of gaps.
● Citation counts for a single title are interesting to authors, but only if accurate. Otherwise, distressing!
University of Michigan Press (Monographs)
Defiance
● University presses tend to look to different metrics:
○ Financial (across press, not necessarily at book level)
○ Academic/disciplinary prestige/reputation (awards, reviews, repeat authors, attracting
prominent authors)
○ Use and persistence (course adoptions, new printings/editions)
● Mapping the Free eBook Supply Chain study of OA ebook usage revealed that--as presses have long
known--usage is “spiky” and unpredictable.
● Altmetric Explorer pilot sheds light on long-term engagement with books, syllabus citations
● Lots of data from many sources, but difficult to analyze and share meaningfully
Leading for Change in our Industry
New business models
Leading for Change in our Industry
Community-Owned Infrastructure
Leading for Change in our Industry
Leadership and Collaboration
Conclusion: Future Directions
Compliance
● DOIs and ORCIDs
● Consistent capture &
communication of data about
usage and impact
Defiance
● Interrogate what counts, and
what we count
● HuMetricsHSS
● Responsible Metrics policies
Levelling up
What new skills did you gain from working on
these efforts?
Challenges
What were the biggest challenges for you (the librarians or
infopros) in launching your projects/services?
And what were the biggest challenges for the organization as a
whole?
Benefits to the Greater Organization
Have there been any collateral benefits to your
organization that were not anticipated when you
started?
Elaine M. Lasda
Coordinator of Scholarly Communication
University at Albany
elasda@albany.edu
https://slideshare.net/librarian68
Keith Maull
Data Scientist
National Center for Atmospheric Research
kmaull@ucar.edu
Taylor Abernethy Johnson
Assistant Director
US Environmental Protection Agency Library (UNC)
tabernethy@live.com
https://www.linkedin.com/in/taylor-g-abernethy-johnson-66966195
Rebecca Welzenbach
Research Impact Librarian
University of Michigan
rwelzenb@umich.edu
https://tinyurl.com/NewMetrics
Questions?

Editor's Notes

  • #3 Mission is not solely to get promotion and tenure; simple citation counts and bibliometrics don't cut it. Public impact beyond the ivory tower; practical benefits of research.
  • #4 SLA
  • #5 Keith age 6. He wanted you to know he is not actually six now.
  • #6 Eno River State Park in Durham NC… Echo is the dog’s name
  • #7 Ren fair & ice cream Umich press, open access publishing unit, big blue the IR
  • #8 (Eddie) Kid Moonbeam
  • #9 Fort Lauderdale (frozen margarita)
  • #22 The renowned internship program allows library and information science students to conduct many of the day-to-day activities as they rotate through some of the service areas you see here, as well as some others like cataloging and e-resources management. The most recent addition to this realm is bibliometric services.
  • #24 You may have heard advertised before, The EPA-RTP Library has attempted to take these traditional and newer metrics and combine them with graphical illustrations that tell a more complete story of your career research impact in order to provide deeper meaning. This is an example of a Research Impact Report. A Research Impact Report is a portfolio created by the EPA-RTP Library using data about your research endeavors. We work with you on gathering information about your publications, research output, and field area to pull together a picture of your scientific contributions. We will add graphics and visuals better capable of communicating your value to the organization.