Scholarly Metrics in
Specialized Settings:
A View from the Trenches
Elaine M. Lasda, University at Albany, SUNY
Rebecca Welzenbach, University of Michigan
Presentation for Bibliometric and Research Impact Community
15-16 May 2019
Laval University, Quebec City, QC, CAN
Thank you for welcoming us here today!
Research Impact in the Academy
Metric "Tyranny" - Muller
Caution: Proceed With Care!
Image by OpenClipart-Vectors from Pixabay
Specialized Contexts
● Impact of their research reaches beyond the academy
● Differing missions require differing metrics
● Purposes/benefits of metrics expand beyond employee
promotion/tenure/institutional rankings etc.
● Demonstrating value in broader arenas, to broader constituencies
Book Genesis
● Meeting of the Minds with Emerald
● Special(ized) Libraries and Librarians
● Personal Network
● Variety is the Spice of Life
The Five Case
Studies
Image by PublicDomainPictures from Pixabay
Institute for
Transportation
Studies
Kendra K. Levine
ITS & Research Impact
The Challenge: Tracking research through the transportation project lifecycle and beyond (justifying funding from the Legislature)
Stakeholders: California State Legislature, high-level UC administrators, transportation community (public and private)
Outputs/Content Assessed: Technical reports and other grey literature, PRJAs, project reports, etc.
Impact Deliverables: Created a method for tracking impact using Google Scholar and linking research output to project codes.
Librarian Roles: Gathering and synthesizing data, communicating with the field and administrators, facilitating cooperation, technical expertise, subject matter knowledge
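The ITS approach above, linking research outputs to project codes and rolling citation counts gathered from Google Scholar up to the project level, can be sketched roughly as follows. The DOIs, project codes, and citation counts here are invented for illustration; they are not ITS data.

```python
from collections import defaultdict

# Hypothetical mapping of research outputs (by DOI) to project codes,
# with citation counts gathered from Google Scholar.
outputs = [
    {"doi": "10.0000/example-1", "project_code": "TR-2018-04", "citations": 12},
    {"doi": "10.0000/example-2", "project_code": "TR-2018-04", "citations": 3},
    {"doi": "10.0000/example-3", "project_code": "TR-2019-01", "citations": 7},
]

def citations_by_project(records):
    """Roll per-output citation counts up to the project level."""
    totals = defaultdict(int)
    for rec in records:
        totals[rec["project_code"]] += rec["citations"]
    return dict(totals)

print(citations_by_project(outputs))
# {'TR-2018-04': 15, 'TR-2019-01': 7}
```

In practice the counts would come from periodic Google Scholar lookups rather than a hard-coded list; the point is that once outputs carry a project code, project-level rollups are a simple aggregation.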
Environmental
Protection
Agency
Taylor Abernethy Johnson
&
Anthony Holderied
EPA & Research Impact
The Challenge: Demonstrate meaningful value of scholarly output and researcher accountability; meet administrative and researcher requests
Stakeholders: Researchers, funders, award committees, agency administration
Outputs/Content Assessed: Mainly PRJAs and other traditional scholarly output, using WoS/InCites, ImpactStory, PlumX, GS/PoP, Altmetric, news, etc.
Impact Deliverables: RIR/AIR: impact reports with high-quality visual appeal, context, and data synthesis
Librarian Roles: Graphic design and visualization; data gathering, synthesizing, contextualizing; brought in on other data projects; educating stakeholders
University/
National Center
for Atmospheric
Research
Keith Maull
&
Matthew Mayernik
NCAR/UCAR & Research Impact
The Challenge: Demonstrate impact in three arenas: annual report of activity, the EarthCube "ecosystem," and scientist use of the supercomputer at NCAR
Stakeholders: Funders (government/university members), researchers, UCAR/NCAR Directorate, Library
Outputs/Content Assessed: Largely PRJAs, but other scholarly output will be included
Impact Deliverables: Annual report; use/impact beyond the journal-author-article levels; impact of NCAR equipment/services. Used WoS/InCites, Altmetric
Librarian Roles: Software engineers! Gathering data, technical expertise (developing an API), testing the workflow, and delegating when appropriate
LA Museum of
Natural History
Richard P. Hulser
LA-MNH & Research Impact
The Challenge: Demonstrate interest in and visibility of research and activity beyond citing references and media hits
Stakeholders: Museum administration, media and marketing, donors, education program, researchers
Outputs/Content Assessed: PRJAs and nonscholarly publications
Impact Deliverables: Proof of concept of a wide range of uses of altmetrics for a museum/humanities environment
Librarian Roles: Leading, project management/coordination, liaison with vendor, promoter/champion, education
Live, and In Person:
Rebecca
Welzenbach
Compliance & Defiance: Michigan Publishing's
Early Encounters with Research Impact Metrics
Rebecca Welzenbach
Research Impact Librarian
University of Michigan Library
BRIC 2019, Laval University, Quebec City
What is Michigan Publishing doing?
1. Publishing and supporting excellent and innovative scholarship in the
humanities and social sciences, including:
a. Peer-reviewed scholarly monographs
b. Independent, open access journals
c. Institutional repository
2. Driving change in our industry to help university presses and library publishers
adapt and thrive into the future, including:
a. New business models
b. Community-owned infrastructure
c. Modeling leadership and collaboration
University of Michigan Press
● Founded in 1930
● Part of the U-M Library since 2009
● 80-100 monographs per year with
disciplinary strength in Classical
studies, performance studies, political
science, African and Asian Studies,
and more
● OA titles funded by Knowledge
Unlatched, TOME, and more
● Acquisitions, Production, Marketing
& Outreach, Business and
Administration, technology
● Relies (mostly) on a sales model
Michigan Publishing Services
● Established as Library Unit ~2000
under the name Scholarly Publishing
Office
● ~30 OA journals/serials and ~35
books
● Production and hosting support for
external clients Lever Press &
Humanities EBook
● Relies (mostly) on chargebacks
Deep Blue
● Launched in 2006
● >124,000 objects in document
repository (DSpace)
● 237 data sets in data repository
(Samvera/Fedora)
● ⅓ of items in Deep Blue are not
published/available anywhere else
● Sustained as a core service by U-M
Library
Michigan Publishing and research impact metrics
Historically: little involvement. But now, more is expected, requested, mandated.
How best to engage effectively? Two options:
● Compliance: We can work to ensure that our publications are consistently
recognized by and included in the systems and datasets upon which existing
metrics are calculated.
● Defiance: We can articulate new (alternative) metrics that are meaningful for
us and our stakeholders
University of Michigan Press (Monographs)
Compliance
● Books have long been largely absent from the research impact metrics space.
Where they have been indexed, the record is inconsistent: BKCI-SSH has indexed
194 UM Press titles, while Scopus has indexed 916
● Newer players Google Scholar and Dimensions Plus are changing what’s
possible to know and show--but now we’re turning up a lot of gaps.
● Citation counts for a single title are interesting to authors--but only if accurate.
Otherwise, distressing!
University of Michigan Press (Monographs)
Defiance
● University presses tend to look to different metrics:
○ Financial (across press, not necessarily at book level)
○ Academic/disciplinary prestige/reputation (awards, reviews, repeat authors, attracting
prominent authors)
○ Use and persistence (course adoptions, new printings/editions)
● Mapping the Free eBook Supply Chain study of OA ebook usage revealed that--
as presses have long known--usage is “spiky” and unpredictable.
● Altmetric Explorer pilot sheds light on long-term engagement with books,
syllabus citations
● Lots of data from many sources, but difficult to analyze and share meaningfully
Michigan Publishing Services (Journals)
Compliance
● For many of our journals (esp. in the humanities), JIF and comparable metrics
are not meaningful--but that can change suddenly based on author demand
● As we expand into new disciplines--especially health sciences--the requirement
for representation in indexes like WoS, Scopus, and Medline is a huge
challenge and learning curve
● Student journals pose their own unique challenges
● Often our role is to educate, provide context, manage expectations, facilitate
progress
Michigan Publishing Services (Journals)
Defiance
● We’re interested in success, stability of our program and services
○ How many journals?
○ Are they publishing consistently?
○ What proportion of our service supports campus publications vs. off-campus?
○ Are we succeeding in getting them indexed?
● Altmetric pilot applied to journals in 2015; in 2017, shareable reports made this
useful
● Google Analytics + Google Data Studio for sharing usage data with partners
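One way a Google Analytics export can be rolled up into per-journal usage figures for partners is sketched below. The CSV columns, journal paths, and numbers are hypothetical, not Michigan Publishing's actual schema or data.

```python
import csv
import io
from collections import defaultdict

# Hypothetical Google Analytics export: one row per (page path, month, pageviews).
ga_export = """page_path,month,pageviews
/journal-a/article-1,2019-01,120
/journal-a/article-2,2019-01,80
/journal-b/article-1,2019-01,45
"""

def pageviews_by_journal(csv_text):
    """Sum pageviews per journal, taking the journal from the top-level path segment."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        journal = row["page_path"].split("/")[1]  # e.g. "journal-a"
        totals[journal] += int(row["pageviews"])
    return dict(totals)

print(pageviews_by_journal(ga_export))
# {'journal-a': 200, 'journal-b': 45}
```

A Data Studio dashboard would typically do this grouping itself; the sketch just shows the aggregation step that turns page-level usage into a journal-level number a partner can act on.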
Deep Blue (Institutional Repository)
Compliance
● Many items in the repository were first published in scholarly journals with
citation counts, JIF, and other bibliometrics.
● Integration of IR with Research Information System will ensure that these
publications are preserved, accessible, and contextualized
Source: Byrne, Kate, and Stephen Cawley. Connections, Collaborations, & Impact: Data-Driven Approaches to
Understanding Institutional Research Expertise. Digital Science case study, October 2018.
Deep Blue (Institutional Repository)
Defiance
● Visibility of informal and non-traditional forms of scholarship
● Download statistics
● Altmetric engagement data
● Variety of types and forms of scholarship means that even the metrics we have
don’t work the same way for everyone, everything
Leading for Change in our Industry
New business models
Leading for Change in our Industry
Community-Owned Infrastructure
Leading for Change in our Industry
Leadership and Collaboration
Which is worse:
To be represented
inadequately, or not at all?
Which is better:
To show up where we know
others are counting? Or to
count what matters to us?
Conclusion: Future Directions
Compliance
● DOIs and ORCIDs
● Consistent capture &
communication of data about
usage and impact
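Consistent capture of identifiers starts with validating them at ingest. ORCID iDs carry an ISO 7064 mod 11-2 check digit, so malformed or mistyped iDs can be caught automatically; a minimal sketch:

```python
def orcid_check_digit(base_digits: str) -> str:
    """ISO 7064 mod 11-2 check digit for the first 15 digits of an ORCID iD."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Check length and checksum of a hyphenated ORCID iD string."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    return orcid_check_digit(digits[:15]) == digits[15]

# ORCID's well-known sample iD (Josiah Carberry) validates:
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```

The same idea applies to DOIs (syntactic checks plus resolution against doi.org): validating identifiers as they are captured is what makes downstream usage and impact data trustworthy.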
Defiance
● Interrogate what counts, and
what we count
● HuMetricsHSS
● Responsible Metrics policies
Synthesis
Image by Gerd Altmann from Pixabay
How the Cases Differ
● Range of library mission & purpose
● Funding sources
● Parent organization activities
● Relationships to stakeholders
● Evaluated subjects/objects
● Impact data output formats
● Technical resources & staff skill sets
● Maturity & level of services provided
Shared Challenges
● Labor intensive
● Lack of standardized identifiers for all output types
● “Out of the box” tools insufficient
● Metrics do not stand alone/speak for themselves
● Need for stakeholder education
● Measuring impact outside disciplinary boundaries/publications
● User education: “Metric Literacy”
Organizational Challenges/Opportunities
● “Canaries in the coal mine”
● Collaboration across the enterprise
● Embedded librarian/informationists
● Recognition of leadership
● Educational mission -> Information Literacy -> Metric Literacy
InfoPro/Librarian/(??) Challenges/Opportunities
● Project management
● Tech skills
● Negotiation
● Delegation
● Relationship building
● Valued team members
Benefits to the Greater Organization
● Confidence in the numbers
● Comparative advantage -> wise use of human resources
● Improved communication and breaking down silos
● Owning the story
Questions?
Release Date: August 19, 2019
https://tinyurl.com/NewMetrics


Editor's Notes

  • #3 Thank you for having us here today
  • #4 Publish or perish is twofold: Productivity Impact
  • #5 "Juking the stats," "teaching to the test": misaligned/unintended incentives for playing to the numbers; salami publishing
  • #6 Metrics as evaluative tools. DORA, the Altmetric manifesto, WoS's "Profiles, Not Metrics" report
  • #10 (Eddie) Kid Moonbeam
  • #12 Eno River State Park in Durham NC… Echo is the dog’s name
  • #14 Keith age 6. He wanted you to know he is not actually six now.
  • #16 Fort Lauderdale (frozen margarita)