This document discusses implementing ORCID identifiers at Northumbria University. It describes Northumbria as a research-rich university with over 1,300 academic staff across four faculties. The Scholarly Publications team provides support for research activities including the institutional repository and research data management. ORCID was first promoted in 2013 and is now integrated into the postgraduate researcher workflow and upcoming staff publishing workflows. ORCID helps with accurate attribution of authors in research metrics reporting and identifying collaborations. Maintaining central support and emphasizing the benefits to individuals have helped adoption.
Quick reference cards for research impact metrics (Library_Connect)
When meeting with students, researchers, deans or department heads, the metrics on these quick reference cards can serve as a jumping off point in conversations about where to publish, adding to researcher profiles, enriching promotion and tenure files, and benchmarking research outputs. The cards were co-developed by librarian Jenny Delasalle and Elsevier's Library Connect program. Learn more and download poster versions as well at: https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics
Impact Factor Journals as per JCR, SNIP, SJR, IPP, CiteScore (Saptarshi Ghosh)
Journal-level metrics
Metrics have become a fact of life in many, if not all, fields of research and scholarship. In an age of information abundance (often termed ‘information overload’), having a shorthand signal for where in the ocean of published literature to focus our limited attention has become increasingly important.
Research metrics are sometimes controversial, especially when in popular usage they become proxies for multidimensional concepts such as research quality or impact. Each metric may offer a different emphasis based on its underlying data source, method of calculation, or context of use. For this reason, Elsevier promotes the responsible use of research metrics encapsulated in two “golden rules”: always use both qualitative and quantitative input for decisions (i.e. expert opinion alongside metrics), and always use more than one research metric as the quantitative input. This second rule acknowledges that performance cannot be expressed by any single metric, and that all metrics have specific strengths and weaknesses. Using multiple complementary metrics can therefore provide a more complete picture and reflect different aspects of research productivity and impact in the final assessment. (Elsevier)
A tool for librarians to select metrics across the research lifecycle (Library_Connect)
These slides introduce a range of research impact metrics. They were presented at the ER&L Conference (April 2017) by Chris James, Product Manager Research Metrics, Elsevier.
ALTMETRICS: A Hasty Peep into New Scholarly Measurement (Saptarshi Ghosh)
The term ‘altmetrics’ was proposed in a tweet by Jason Priem, then a PhD student at the School of Information and Library Science, University of North Carolina at Chapel Hill [https://twitter.com/asnpriem/status/25844968813].
‘Altmetrics’ combines ‘alternative’ and ‘metrics’: the ‘alt-’ refers to alternative types of metrics, as opposed to traditional measures such as citation analysis, the impact factor, and download and usage data.
Altmetrics is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship (http://altmetrics.org/about/). It is the study of new indicators for the analysis of academic activity based on Web 2.0.
This presentation will show how to find journal metrics in Scopus such as CiteScore, SCImago Journal Rank (SJR) and Source Normalized Impact per Paper (SNIP).
Traditional metrics, such as the h-index and journal impact factors, are used to measure the scholarly impact of research. However, in the current climate of accountability by funding providers, fund recipients would benefit from a more comprehensive impact management system (IMS) to facilitate the capture and reporting of narratives (including metrics) about research impact in the academy, on social policy, in industry, and ultimately with the public.
Librarians have always been good at telling and facilitating stories. Research support librarians can use their storytelling skills to contribute to the implementation and administration of an impact management system. Being able to translate research impact into harvestable and reportable metadata is the key.
Showcasing your Research Impact using Bibliometrics (Ciarán Quinn)
Seminar to make academics aware of the bibliometric resources available to them and how to use them to improve their research impact. The session looked at:
• What are Bibliometrics and Altmetrics
• Why they are important for you
• How to identify your research impact and research profile
• How to improve your citations
• How to identify potential research collaborations
Updated 30/01/2015
This session included discussions around the value of bibliometrics for individual performance management/promotion and the REF.
What are bibliometrics?
Journal metrics
Personal metrics
Article level metrics and altmetrics
Discussion of alternatives to traditional bibliometric sources (many free), including Scopus, Eigenfactor, SNIP, SJR, altmetrics, Publish or Perish and Microsoft Academic Search
LITA’s Altmetrics and Digital Analytics Interest Group is proud to present Heather Coates, Richard Naples, and Lauren Collister in our second free webinar of the season. Heather will introduce the concept of altmetrics with a quick "Altmetrics 101," Richard will discuss the Smithsonian's implementation of Altmetric, and Lauren will share the University of Pittsburgh's experience with Plum Analytics.
This presentation was provided by Holly Falk-Krzesinski of Elsevier during the NISO event, "Is This Still Working? Incentives to Publish, Metrics, and New Reward Systems," held on February 20, 2019.
Gather evidence to demonstrate the impact of your research (IUPUI)
This workshop is the 3rd in a series of 4 titled "Maximize your impact" offered by the IUPUI University Library Center for Digital Scholarship. Faculty must provide strong evidence of impact in order to achieve promotion and tenure. Having strong evidence in year 5 is made easier by strategic dissemination early in your tenure track. In this hands-on workshop, we will introduce key sources of evidence to support your case, demonstrate strategies for gathering this evidence, and provide a variety of examples. These sources include citation metrics, article level metrics, and altmetrics as indicators of impact to support your narrative of excellence.
June 18, 2014
NISO Virtual Conference: Transforming Assessment: Alternative Metrics and Other Trends
Assessing and Reporting Research Impact – A Role for the Library
- Kristi L. Holmes, Ph.D., Director, Galter Health Sciences Library, Northwestern University, Feinberg School of Medicine
Presentation by Anne Catherine Rota, Research Intelligence Specialist at Elsevier, at the 2015 GFII Forum: http://forum.gfii.fr/forum/les-nouvelles-mesures-de-l-influence-scientifique-l-apport-des-metriques-alternatives-au-pilotage-de-la-recherche
With Great Power Comes the Responsible Use of Metrics (Claire Sewell)
Metrics have long been used as an indicator of academic success and as a way to make key decisions. As the measurement of impact becomes increasingly important within academia there has been something of a backlash against trusting purely quantitative methods of assessment. The Responsible Metrics movement aims to ensure that metrics are used fairly alongside other measures to gather a true assessment of impact.
This webinar will discuss what the Responsible Metrics movement is, why it was developed, its importance and how library staff can best educate their research staff.
This presentation was provided by Todd Carpenter, Executive Director of NISO, and Nettie Lagace, NISO, on June 25, during an ALA session devoted to altmetrics.
The New Dimensions in Scholcomm: How a global scholarly community collaborati... (NASIG)
Digital Science and 100+ global research institutions have spent the better part of the last two years collaborating to solve three distinct challenges in the existing research landscape:
* Research evaluation focuses almost exclusively on publications and citations data
* Research evaluation tools are siloed in proprietary applications that rarely speak to each other
* The gaps amongst proprietary data sources make generating a complete picture of impact extremely difficult (and expensive)
The goal of this collaboration amongst publishers, funders, research administrators, libraries, and Digital Science was to transform the research landscape by attempting to solve the problems caused by expensive, siloed research evaluation data.
Academics must provide evidence to demonstrate the impact and outcomes of their scholarly work. This webinar, presented by librarians, will help faculty explore various forms of documentary evidence to support their case for excellence. Sponsored by the IUPUI Office of Academic Affairs.
Note: The webinar included demonstrations of Web of Science & Scopus, which the slides do not reflect.
Practical applications for altmetrics in a changing metrics landscape (Digital Science)
"Practical applications for altmetrics in a changing metrics landscape" - Sara Rouhi, Altmetric product specialist, and Anirvan Chatterjee, Director Data Strategy for CTSI at UCSF
Presentation slides on Open Science and research reproducibility. Presented by Gareth Knight (LSHTM Research Data Manager) on 18th September 2018, as part of an Open Science event for LSHTM Week 2018.
Research support and open science services at the University of Eastern Finla... (Library_Connect)
This presentation from Helena Silvennoinen-Kuikka, Head of Learning and Information Services, University of Eastern Finland Library, shares their approach to the library as a leader in establishing open science within their institution. The presentation was part of a Library Connect webinar on open science on Oct. 11, 2018, which can be viewed at https://libraryconnect.elsevier.com/library-connect-webinars?commid=334301
Elevate the status of your library with data visualizations and multimedia me... (Library_Connect)
Webinar slides from:
- Todd Bruns, Institutional Repository Librarian, Eastern Illinois University
- Dudee Chiang, Senior Technical Librarian, NASA Jet Propulsion Laboratory
- Jean Shipman, Vice President of Global Library Relations, Elsevier
See the recorded webinar at: http://libraryconnect.elsevier.com/library-connect-webinars?commid=279911
An informatics perspective on health literacy (Library_Connect)
Professor Prudence Dalrymple, a leading health information professional, presented "An Informatics Perspective on Health Literacy: Challenges and Obstacles" at the Elsevier Luncheon for Medical Librarians concurrent with the 2017 Medical Library Association Annual Meeting and Exhibition in Seattle.
In these webinar slides, librarians share their inspiration and process for developing high-impact library services. Presentations from Katy Kavanagh Webb, Assistant Professor | Head, Research and Instructional Services, J.Y. Joyner Library, East Carolina University; Donna Gibson, Director of Library Services, Memorial Sloan Kettering (MSK) Cancer Center; and
J. William (Bill) Draper, Reference Librarian, Biddle Law Library, University of Pennsylvania Law School. View the webinar at: http://libraryconnect.elsevier.com/library-connect-webinars?commid=255645
Slides | Research data literacy and the library (Library_Connect)
Slides from the Dec. 8, 2016 Library Connect webinar "Research data literacy and the library" with Christian Lauersen, Sarah J. Wright and Anita de Waard. See the full webinar at: http://libraryconnect.elsevier.com/library-connect-webinars?commid=226043
Slides | Targeting the librarian’s role in research services (Library_Connect)
Slides from the Nov. 8, 2016 Library Connect webinar "Targeting the librarian’s role in research services" with Nina Exner, Amanda Horsman and Mark Reed. See the full webinar at: http://libraryconnect.elsevier.com/library-connect-webinars?commid=223121
Capturing and communicating the value of information management services in a... (Library_Connect)
Ulla de Stricker's webinar slides from the June 21 Library Connect webinar, "Capturing and communicating the value of information management services in a corporate culture." View the webinar: https://libraryconnect.elsevier.com/library-connect-webinars?commid=202895
Life in the Fast Lane: The Journey from Squid Axons to Alzheimer’s Brains (Library_Connect)
Slides from Dr. Scott Brady's presentation at the Elsevier luncheon during the Medical Library Association 2016 conference. Dr. Brady is Professor and Head of Anatomy and Cell Biology, College of Medicine, University of Illinois at Chicago, and an editor of "Basic Neurochemistry."
Research impact metrics for librarians: calculation & context (Library_Connect)
Slides from the May 19, 2016, Library Connect webinar "Research impact metrics for librarians: calculation & context" with Jenny Delasalle and Andrew Plume.
Watch the webinar at: https://libraryconnect.elsevier.com/library-connect-webinars?commid=199783
Wouter Haak's presentation on open science and research data management from the Elsevier Library Connect Event 2016 "Navigating the new publishing & open science terrain: what librarians need to know." Wouter is Elsevier's Vice President of Research Data Management Solutions.
Library Connect Webinar | Fostering research community through library spaces... (Library_Connect)
In this March 31, 2016 webinar three experienced librarians explored outreach activities to engage various user groups, and how services and a physical space - like a research commons or makerspace - can enhance collaboration, interdisciplinarity and raise the profile of the library.
View the webinar at:
http://libraryconnect.elsevier.com/library-connect-webinars?commid=192865
Presenters:
Yvonne Nobis, Head of Science Information Services, Betty and Gordon Moore Library, University of Cambridge
Danianne Mizzy, Head of Kenan Science Information Services, Kenan Science Library, University of North Carolina at Chapel Hill
Meris Mandernach, Associate Professor and Head of Research Services, University Libraries, The Ohio State University
Library Connect Webinar - Calculating sharing metrics: Possible approaches (Library_Connect)
This presentation from Lorraine Estelle, Director, Project Counter, was part of the Dec. 3, 2015 Library Connect Webinar, How researchers share articles: impact on library resources and services.
View the webinar recording: http://libraryconnect.elsevier.com/library-connect-webinars?commid=167539
Find out more about the Beyond Downloads project: http://libraryconnect.elsevier.com/beyond-downloads
Library Connect Webinar - The secret life of articles: From download metrics... (Library_Connect)
This presentation from Suzie Allard, Associate Dean for Research, University of Tennessee, Knoxville, was part of the Dec. 3, 2015 Library Connect Webinar, How researchers share articles: impact on library resources and services.
View the webinar recording: http://libraryconnect.elsevier.com/library-connect-webinars?commid=167539
Find out more about the Beyond Downloads project: http://libraryconnect.elsevier.com/beyond-downloads
Library Connect Webinar - Metrics Support at the Department level (Library_Connect)
University of Massachusetts Medical School’s Rebecca Reznik-Zellen talks about Metrics support at the department level. From the Nov. 12, 2015 webinar, Article, author and journal metrics: what librarians need to know.
6. 2. About metrics
Assumption → Reality:
• Citations measure impact → It’s complicated
• Citations are objective → It’s complicated
• Higher is better → It’s complicated
• Citations are what count → It’s complicated
7. 3. About peer review
You provide a CV / application → Reviewer looks at your CV → Reviewer (hopefully) reads some papers → A miracle occurs → Reviewer makes a decision
8. 3. Problems with peer review
• Conscious or unconscious bias
• Inconsistent or conflicting results
• Time- and resource-limited
9. 3. Pros and cons
Pros:
• Handles large data sets
• Produces reproducible results
• Impact according to a large sample
Cons:
• Requires expertise to generate and interpret
• Only measures publications
• Limited measure of impact
15. Researcher Profiles and Metrics that Matter
Andrea Michalek
Vice President of Research Metrics, Product Management, and Managing Director of Plum Analytics, Elsevier
June 8, 2017
16. Research Metrics Can Be Used to…
• Analyze the strengths of research at the institution
• Determine where research is a good potential investment
• Demonstrate ROI (Return On Investment) of research money
• Identify rising stars amongst the early career researchers
• Tell a better narrative about everything that is happening with research
17. Different Researchers Have Different Needs for Metrics
18. Research Metrics Throughout the Research Process
19. Examples of Metrics
Journal level:
• CiteScore
• Journal Impact Factor
• SCImago Journal Rank (SJR)
• Source Normalized Impact per Paper (SNIP)
Article level:
• Citation count
• Citations per paper
• Field-Weighted Citation Impact (FWCI)
• Outputs in top quartile
• Citations in policy and medical guidelines
• Usage
• Captures, e.g. bookmarking
• Mentions
• Social media
Researcher level:
• Document count
• h-index
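Of the researcher-level metrics above, the h-index is simple to compute from a list of per-paper citation counts: it is the largest h such that the researcher has at least h papers with at least h citations each. A minimal sketch, not tied to any particular data source:

```python
def h_index(citations):
    """h-index: largest h such that the author has at least h papers
    with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # paper at this rank still has >= rank citations
            h = rank
        else:
            break
    return h
```

For example, a researcher with papers cited [10, 8, 5, 4, 3] times has an h-index of 4: four papers with at least four citations each, but not five papers with at least five.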
20. Users in Different Countries Select Different Metrics
Rank of each metric by usage (1 = most used):

Metric                                   World  Australia  Canada  China  Germany  Japan  United Kingdom  United States
Field-Weighted Citation Impact             1        1        1       3       2       4          3               1
Outputs in Top Percentiles                 2        2        3       1       4       1          1               6
Publications in Top Journal Percentiles    3        4        2       2       6       2          2               5
Collaboration                              4        6        6       5       1       3          5               7
Citations per Publication                  5        3        7       6       3       5          4               3
Citation Count                             6        5        5       4       8       6          6               2
h-indices                                  7        7        4       8       7       7          7               4

Usage of metrics available in SciVal’s Benchmarking module from 11 March 2014 to 28 June 2015. A partial list of the metrics available at that time is shown, focusing on the most frequently used. Scholarly Output is excluded since this is the default. Note that recently added metrics based on e.g. media mentions and awards data were not available at this time and so are not represented in this analysis.
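Field-Weighted Citation Impact, which tops the table in several countries, normalizes a paper's citations against the world-average citations for outputs of the same field, publication year and document type; a value of 1.0 means exactly world average. The expected-citation baseline itself comes from the underlying database, so the sketch below takes it as an input; averaging per-output ratios across a portfolio is shown here as one common aggregation approach, not a definitive implementation:

```python
from statistics import mean

def fwci(actual, expected):
    """FWCI of one output: citations received divided by the average
    citations of comparable outputs (same field, year, document type).
    1.0 means cited exactly at the world average."""
    return actual / expected if expected > 0 else None

def portfolio_fwci(papers):
    """FWCI of a set of outputs, taken as the mean of per-output ratios.
    papers: iterable of (actual_citations, expected_citations) pairs."""
    ratios = [fwci(a, e) for a, e in papers if e > 0]
    return mean(ratios) if ratios else None
```

So a paper cited 3 times where similar papers average 2 citations has an FWCI of 1.5, i.e. 50% above world average for its field and year.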
22. Metrics timeline: From Idea to Impact
Old paradigm: it can take at least 2-5 years to go from an idea (via blog post, grant and conference) to a published peer-reviewed journal article. Due to the pace of scholarly publishing, it takes another 3-5 years from the time the work is published to reach a critical mass of citation counts. From idea to measurable citation counts can therefore take 5-10 years.
New paradigm: metrics are available immediately upon publication: tweets, shares, saved references, bookmarks, PDF downloads, clicks, video plays, dataset downloads and presentation views, alongside eventual citation counts.
24. Categorizing Metrics for Analysis
• USAGE (clicks, downloads, views, library holdings, video plays)
• CAPTURES (bookmarks, code forks, favorites, readers, watchers)
• MENTIONS (blog posts, comments, reviews, Wikipedia links)
• SOCIAL MEDIA (+1s, likes, shares, tweets)
• CITATIONS (citation indexes, patents, clinical, policy)
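A categorization like this can be applied programmatically when aggregating raw engagement events into the five buckets. A minimal sketch; the event names here are hypothetical illustrations, not the identifiers of any specific metrics API:

```python
# Hypothetical event-type names bucketed into PlumX-style categories.
CATEGORIES = {
    "usage": {"click", "download", "view", "library_holding", "video_play"},
    "captures": {"bookmark", "code_fork", "favorite", "reader", "watcher"},
    "mentions": {"blog_post", "comment", "review", "wikipedia_link"},
    "social_media": {"plus_one", "like", "share", "tweet"},
    "citations": {"citation_index", "patent", "clinical", "policy"},
}

def categorize(events):
    """Count events per category; unrecognized event types are ignored."""
    totals = {cat: 0 for cat in CATEGORIES}
    for event in events:
        for cat, names in CATEGORIES.items():
            if event in names:
                totals[cat] += 1
    return totals
```

Keeping the buckets explicit makes a report easy to explain: each headline number traces back to a named, documented set of raw event types, in line with the transparency principles discussed later in the deck.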
25. How Do You Measure Research Output?
26. Expanding Metrics: Measuring Policy and Clinical Citations
• Basic research is cited three to five times more than clinical research
• Early-career researchers are opting out of studying translational medicine
• New metrics, such as counting when an output is cited in a policy document or a clinical guideline, can help the researcher tell their story
27. Golden Rules for Using Research Metrics
Use both qualitative and quantitative input into your decisions:
• Combining both approaches will get you closer to the whole story
• Valuable intelligence is available from the points where these approaches differ in their message
• This is about benefitting from the strengths of both approaches, not about replacing one with the other
Use more than one research metric as the quantitative input:
• Using multiple metrics drives desirable changes in behaviour
• There are lots of different ways of being excellent
• A research metric’s strengths can complement the weaknesses of others
28. Responsible Metrics
• Robustness: basing metrics on the best possible data in terms of accuracy and scope
• Humility: recognizing that quantitative evaluation should support, but not supplant, qualitative, expert assessment
• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
• Diversity: accounting for variation by field, and using a variety of indicators to support diversity across the research system
• Reflexivity: recognizing the systemic and potential effects of indicators and updating them in response
http://www.hefce.ac.uk/pubs/rereports/year/2015/metrictide/
29. The Mechanism for Gathering Metrics Is Important
From the NISO Code of Conduct for altmetrics:
• Provide a clear definition of each metric.
• Describe how data are aggregated.
• Detail how often data are updated.
• Describe all known limitations of the data.
30. Thank you
www.elsevier.com/research-intelligence
Email: andrea@plumanalytics.com
Mobile: +1 215.280.1805
Twitter: @amichalek
31. ORCID implementation at Northumbria University
Ellen Cole, Scholarly Publications Librarian, Student and Library Services
http://orcid.org/0000-0002-1293-2599
32. Northumbria University is a research-rich, business-focused university in Newcastle-upon-Tyne
• Three campuses: two in Newcastle, one in London
• 27,167 students
• 3,200 employees / 1,385 academic staff
• Comprehensive subject coverage over four faculties: Engineering & Environment; Health & Life Sciences; Arts, Design & Social Sciences; Business & Law
33. Scholarly Publications
A team (one librarian, two FTE library assistants) providing a central support service for academic staff and postgraduate research students:
• Research Excellence Framework
• Bibliometrics
• Research Data Management
• Publications module of Pure
• Open Access: institutional OA fund, institutional repository, and compliance with Higher Education Funding Council for England and Research Councils UK OA policies
• Northumbria Journals
34. ORCID in 2013
• Adding ORCID to an existing programme of research training and research events
• Soft promotion: email signatures, flyers and posters
• Adding a secondary identifier to the institutional repository
• Gaining endorsement from senior research committees
35. Jisc-ARMA ORCID pilot projects
“The aim of the pilot project is to streamline the ORCID implementation process at universities and to develop the best value approach for a potential UK-wide adoption of ORCID in higher education.”
https://orcidpilot.jiscinvolve.org/wp/
36. “Embedding ORCID across researcher career paths”
• New centralised, online research process for postgraduate research students
• Optional submission of ORCID to the HESA (Higher Education Statistics Agency) annual return for students
• An opportunity to integrate ORCID at the most useful point for individual and institution
37. ORCID in 2017
• Established as part of the postgraduate researcher workflow
• About to roll out for staff using Pure
• Well embedded in skills training for staff and students: the Library’s Researcher Development Week training sessions, online guidance for publishing, and Research and Innovation Services and Graduate School training
• Important part of Scholarly Publications processes
38. Research metrics
Regular reporting to senior faculty staff and committees, focusing on benchmarking against other institutions and on collaboration. ORCID assists with accurate identification of Northumbria authors.
39. Embed information about ORCID wherever it’s relevant.
Maintain a central point of contact and support.
Technical integration is not necessarily required, but it definitely has advantages for accurate and timely capture.
Emphasise the benefit to the individual, not the institution.
40. It’s time for a new standard of journal citation impact. Don’t Speculate. Validate.
Comprehensive:
• Based on Scopus, the world’s largest abstract and citation database
• CiteScore metrics are available for 22,618 serial titles on Scopus: journals, book series, conference proceedings and trade publications
Transparent:
• CiteScore metrics are available for free
• CiteScore metrics are easy to calculate for yourself
• The underlying database is available for you to interrogate
Current:
• CiteScore Tracker is updated monthly
• New titles will have CiteScore metrics the year after they are indexed by Scopus
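The claim that CiteScore is easy to calculate yourself holds. Under the original 2016 definition current when these slides were produced (the methodology was later revised in 2020), a journal's CiteScore for year Y is the citations received in year Y to items published in years Y-1 to Y-3, divided by the number of items published in those three years:

```python
def citescore(citations_in_year, docs_prior_three_years):
    """CiteScore (original 2016 definition): citations received in
    year Y to items published in years Y-1 to Y-3, divided by the
    number of items published in those three years."""
    if docs_prior_three_years == 0:
        return 0.0
    return citations_in_year / docs_prior_three_years
```

For example, a journal whose last three years of output (300 items) drew 450 citations this year has a CiteScore of 1.5. Both inputs are visible in Scopus, which is what makes the metric straightforward to verify against the published value.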