DORA & Responsible Metrics in Research Assessment
ZUZANA ORIOU
OPEN RESEARCH TEAM
19/11/2020
1
Objectives
2
 To introduce DORA and its aims
 To understand publication metrics and their common misconceptions
 To introduce key arguments, numbers and data, to aid change in the research culture
 To understand which metrics to use, and how, as a researcher (e.g. in CVs and funding applications)
DORA
3
Declaration On Research Assessment
Improving how research is assessed
sfdora.org
@DORAssessment
• Addresses misuse of journal-based metrics in hiring, promotion and funding decisions
• Draws attention to the problem and fosters progressive solutions
• Demonstrates community support for change
Signed by ~2000 organizations and >16,000 individuals
Core principles of DORA for research institutions
• Be explicit about the criteria used to reach hiring, tenure, and promotion decisions
• Highlight that the scientific content of a paper is much more important than publication metrics
• For the purposes of research assessment, consider the value and impact of all research outputs
Bibliometrics
4
Bibliometrics = the statistical analysis of publications.
- Can provide valuable insights into aspects of research in some disciplines, if used critically
BUT when used in the wrong context…
- can be problematic for researchers and for research progress
- research “excellence” and “quality” are abstract concepts that are difficult to measure directly
- can incentivize undesirable behaviours:
- chasing publications in journals with high impact factors, regardless of whether this is the most appropriate venue for publication
- discouraging the use of open research approaches such as preprints or data sharing
 Inaccurate assessment of research can become unethical when metrics take precedence over expert judgement, because the complexities and nuances of research, or of a researcher’s profile, cannot be quantified
Questionable metrics
5
• Journal Impact Factor (JIF)
• Citation Count
• H-index
Journal Impact factor
6
What is Journal Impact Factor?
Journal Impact factor
7
JIF (2018) = (Citations in 2018 to items published in 2016 + 2017) / (Number of items published in 2016 + 2017)
= the mean number of citations per item over the previous two years
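As a hedged illustration of the formula above, the two-year calculation can be sketched in Python. The counts below are made-up numbers for a hypothetical journal, not real data.

```python
# Sketch of the two-year JIF calculation (illustrative numbers only).

def journal_impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """JIF(year) = citations received in `year` by items published in the
    two preceding years, divided by the number of citable items in those
    two years."""
    return citations_to_prev_two_years / items_prev_two_years

# Hypothetical journal: 1,200 citations in 2018 to the 300 items
# it published in 2016 + 2017.
jif_2018 = journal_impact_factor(1200, 300)
print(jif_2018)  # 4.0
```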
Journal Impact factor
8
Limitations:
• Citation distributions within journals are highly skewed, so the average is highly susceptible to outliers
• Field-specific: the number of citations received varies significantly between fields
• Can be manipulated by editorial policy, self-citations and citation ‘cartels’
• The data used to calculate the JIF are neither transparent nor openly available
• False precision
JIF
9
[Citation distributions for Physical Review Letters and Nature]
• Citation distributions within journals are highly skewed, so the average is highly susceptible to outliers.
An article being published in Nature does not mean that it has been, or will be, cited 43 times!
Editorial (2005). Not so deep impact. Nature 435, 1003–1004. https://www.nature.com/articles/4351003b.pdf
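The skewness point can be illustrated with a quick calculation. The citation counts below are invented for a hypothetical journal: most papers receive a handful of citations while two outliers receive hundreds.

```python
# Why a skewed citation distribution makes the mean (the JIF) a poor
# guide to a typical article. Citation counts are made up.
from statistics import mean, median

citations = [0, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 350, 600]

print(round(mean(citations), 1))  # 75.8 -- dragged up by the two outliers
print(median(citations))          # 3   -- what a "typical" paper receives
```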
JIF – practical example
10
• Do articles in high-JIF journals receive more attention?

Article | Journal | Altmetric score | Citation count
Evidence for a Bs0π± State | Physical Review Letters, July 2016 | — | 99
Correlated Event-by-Event Fluctuations of Flow Harmonics in Pb-Pb Collisions at sNN=2.76 TeV | Physical Review Letters, October 2016 | — | 77
Direct Evidence of Octupole Deformation in Neutron-Rich Ba144 | Physical Review Letters, March 2016 | — | 69
Observation of the 1S–2S transition in trapped antihydrogen | Nature, December 2016 | — | 69
Atom-at-a-time laser resonance ionization spectroscopy of nobelium | Nature, September 2016 | — | 57
An improved limit on the charge of antihydrogen from stochastic acceleration | Nature, January 2016 | — | 27

Nature JIF (2016) = 40.1; Physical Review Letters JIF (2016) = 8.4
JIF – practical example
11

Article | Journal | JIF | Altmetric score | Citation count
Hyperporous Carbons from Hypercrosslinked Polymers | Advanced Materials | 19.8 | — | 107
Three-dimensional protonic conductivity in porous organic cage solids | Nature Communications | 12.2 | — | 55
Reticular synthesis of porous molecular 1D nanotubes and 3D networks | Nature Chemistry | 21.7 | — | 51
A Perspective on the Synthesis, Purification, and Characterization of Porous Organic Cages | Chemistry of Materials | 9.5 | — | 40
Understanding static, dynamic and cooperative porosity in molecular materials | Chemical Science | 8.7 | — | 23
High Surface Area Sulfur-Doped Microporous Carbons from Inverse Vulcanised Polymers | Journal of Materials Chemistry A | 8.8 | — | 19
JIF
12
Adler, R., Ewing, J., and Taylor, P. (2008). Citation statistics. A report from the International Mathematical Union.
• Field-specific: the number of citations received varies significantly between fields.
JIF
13
• Can be manipulated by editorial policy, self-citations and citation ‘cartels’
• asking authors to cite articles from the same journal
• deliberately publishing more citable paper types, such as review articles
• reducing the number of "citable items" in the JIF calculation
• Men self-cite more than women*
• The data used to calculate the JIF are neither transparent nor openly available
*https://www.cambridge.org/core/journals/international-organization/article/gender-citation-gap-in-international-relations/3A769C5CFA7E24C32641CDB2FD03126A
Journal Impact Factor
JIF (2018) = (Citations in 2018 to items published in 2016 + 2017) / (Number of items published in 2016 + 2017)
14
= the mean number of citations per item over the previous two years
≠ Impact or quality
Citation Count
15
What does a high citation count tell us about a research paper?
Citation Count
16
What does a high citation count tell us about a research paper?
• Citations can be negative, positive, neutral or even quite random
• Citations are not reviewed, and are not removed if the cited paper is retracted
• Citation cartels / self-citations
• Certain types of articles tend to get more citations than others
Citation analysis
17
https://scite.ai/
Self-Citation Count
18
Retraction example
From 2017 to 2019, Yangkang Chen published 10 papers in Geophysical Journal International.
Chen cited himself 370 times (out of 898 references, 42%).
The journal has allowed this geophysicist, who cited his own work hundreds of times across 10 papers, to retract the articles and republish them with a fraction of the self-citations.
 Retraction  a more acceptable 105 self-citations in the new versions
https://retractionwatch.com/2020/07/31/cite-yourself-excessively-apologize-then-republish-the-papers-with-fewer-self-citations-journal-says-fine/
Other Journal Metrics

JIF (WoS)
• for journals listed in the Journal Citation Reports (JCR) – ca. 11,000

CiteScore (Scopus)
• Elsevier’s equivalent of the JIF
• for journals listed in Scopus – ca. 22,000
• looks at the previous 3 years instead of 2

EigenFactor (WoS)
• a rating of the total importance of a scientific journal
• citations from highly ranked journals are weighted to make a larger contribution to the EigenFactor than those from poorly ranked journals
• complex algorithm for calculating eigenvector centrality = percentage of the total influence in the network

Article Influence Score (WoS)
• EigenFactor adjusted for the number of papers published in each journal

SJR = SCImago Journal Rank (Scopus)
• a prestige metric similar to Google PageRank and the EigenFactor
• differences to EigenFactor: does not eliminate self-citation; uses a 3-year citation window (EigenFactor uses 5 years)

SNIP = Source Normalized Impact per Paper (Scopus)
• SNIP = JIF / (citation potential in its field)
• citation potential = a reflection of how likely a paper in that field is to be cited
• corrects for differences in citation practices between scientific fields
19
DO NOT MIX AND MATCH!!
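The SNIP normalisation described above can be sketched as a simple ratio. The impact and citation-potential figures below are invented for illustration; real values come from Scopus.

```python
# Hedged sketch of SNIP's field normalisation: a journal's impact per
# paper divided by the citation potential of its field. All numbers
# here are illustrative, not real Scopus values.

def snip(impact_per_paper, field_citation_potential):
    return impact_per_paper / field_citation_potential

# Two hypothetical journals with the same raw impact, sitting in fields
# that cite at very different rates:
print(snip(5.0, 5.0))  # 1.0 -- high-citing field
print(snip(5.0, 2.0))  # 2.5 -- low-citing field
```

The normalisation is what lets journals in low-citing fields (e.g. mathematics) be compared with journals in high-citing fields (e.g. cell biology).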
Bibliometrics – sources

Web of Science (Clarivate)
- Strengths: advanced citation searching and analysis; good coverage in sciences and health sciences; includes books & conference proceedings
- Weaknesses: limited coverage of non-English titles; limited for humanities, social sciences and arts; dissertations excluded

Scopus (Elsevier)
- Strengths: advanced citation search and analysis; better coverage of social sciences; 120k standalone books & 421 book series; conference proceedings included
- Weaknesses: books and conference proceedings limited; coverage of arts, humanities and social sciences limited; dissertations excluded

Google Scholar (statistical tool: Publish or Perish)
- Strengths: free; better international and multi-lingual coverage; includes books and conference papers
- Weaknesses: data quality can be an issue; journal coverage incomplete; can include non-scholarly content; sources indexed and timespans covered not disclosed

Dimensions
- Strengths: free; moving beyond citations to deliver a broader picture of impact; 360° view – related grants, article metrics, related patents, policy documents and datasets
- Weaknesses: not as widely known and used; some search limitations, e.g. searching in the main search box is restricted to only two options

DO NOT MIX AND MATCH!!
20
H-index
= the largest number h such that an author has h publications that have each been cited at least h times
• Can be manipulated by self-citations
• Discriminates against early-career researchers, women and others who may have taken a career break
• Field-specific
21
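The definition above translates directly into a few lines of code. This is a minimal sketch with made-up citation counts:

```python
# Minimal sketch of the h-index: the largest h such that the author has
# h papers each cited at least h times.

def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
print(h_index([90, 3, 2, 2, 1]))  # 2: one highly cited paper cannot raise h
```

The second call shows one of the index's quirks: a single landmark paper barely moves the h-index, which is part of why it disadvantages early-career researchers with short publication lists.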
H- index
Footballer Goals
Josef Bican 805
Romário 772
Pelé 767
Ferenc Puskás 746
Gerd Müller 735
Basketball Player Points
Kareem Abdul-Jabbar 38,387
Karl Malone 36,928
Kobe Bryant 33,643
Michael Jordan 32,292
Wilt Chamberlain 31,419
Other comparisons:
• National vs. local league
• Women's vs men's league
• total score of young vs. mature
players
22
Journal metrics and simple citation counts are indicators for publishers and librarians; they were not designed for assessing research(ers) – they do not reflect research quality!
23
To sum up, metrics are:
 limited.
 a reflection of popularity, not quality.
 short-sighted – the research power enabled by a diverse academic base cannot be captured by a single metric.
 prone to rewarding the established, which actively inhibits diversity of staff and research, and so innovation.
24
How else can we judge the quality of research(ers)?
• Look at the whole research profile, in a suitable context, using expert judgement (which can be supported by suitable quantitative indicators)
• Align assessment to the research mission
• Take into account all research outputs
• Take into account equality and diversity
• gender, FTE, career length, career breaks
CONSIDER NARRATIVES, NOT SOLELY METRICS!!
The use of metrics should be:
• Robust: based on a variety of suitable metrics in terms of accuracy, scope and context (avoiding the JIF and using a variety of metrics)
• Complementary to peer review: quantitative evaluation should support – but not supplant – qualitative, expert assessment
• Transparent: expectations, data collection and analytical processes should be open and transparent, so that those being evaluated can test and verify the results
• Inclusive: accounting for variation by field, institutional goals, and professional and personal circumstances
• Reflexive: recognise the effects of indicators, update them regularly and champion best practice in data collection
25
Which Metrics?
26
Which metrics should you use to demonstrate the impact of your research?
• No. of downloads: check the publisher website and your institutional repository
• Altmetric score: shows the online attention your research receives
• Field Weighted Citation Impact
• Remember to use metrics to your advantage and to illustrate narratives!
 Responsible use of metrics
DO NOT MIX AND MATCH THE DATA SOURCE!!
Responsible Metrics
• Bibliometrics
• statistical analysis of publications, e.g. #publications, #citations
• author/article-related metrics, not journal-related ones, should be used
• Altmetrics
• Other research outputs
https://www.metrics-toolkit.org/ 27
Bibliometrics
FWCI – Field Weighted Citation Impact
FWCI = (# citations) / (expected # citations)
An FWCI of 1.44 means the output was cited 44% more than expected.
+ measures the citation impact of the output itself, not the journal in which it is published
+ compares like with like (outputs of the same age and type, as classed by Scopus)
- may disadvantage multi-disciplinary work
Field Citation Ratio (FCR): a similar indicator to the FWCI
28
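The FWCI ratio is simple to compute once the expected citation count is known. In this sketch the "expected" value is a made-up stand-in for the Scopus benchmark:

```python
# Hedged sketch of Field Weighted Citation Impact: actual citations
# divided by the citations expected for outputs of the same field, age
# and type. The expected count would come from Scopus; here it is a
# made-up number.

def fwci(citations, expected_citations):
    return citations / expected_citations

# Hypothetical output: 36 citations against 25 expected.
print(fwci(36, 25))  # 1.44 -> cited 44% more than expected
```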
Altmetrics
Alternative metrics ("altmetrics") capture the online attention surrounding academic content, e.g. Twitter, Facebook and other social media activity; mentions in policy documents and registered patents; media coverage; etc.
• offer useful information about impact outside of scholarly publishing
• can serve as early indicators of possible intentions to cite a publication
• help create success narrative around research impact
https://www.altmetric.com
29
Altmetrics
30
Can help answer questions such as:
Who is talking about my research and where?
Who could be my potential collaborators?
Which of my papers are the most noticed/read by others?
What is the impact of my research outside of my discipline?
How do I compare to my peers in terms of public engagement?
Altmetric Explorer
31
https://www.altmetric.com/explorer/
Other research outputs
• Key datasets, software, novel assays and reagents
• Awards + Funding
• Conference papers, keynote speeches
• Knowledge transfers
• Science communication + Outreach
• Licenses or patents
• Training delivered + Mentoring
• Contribution to consortia
32
Change is underway
- a movement highlighting that current research assessment is flawed and that change needs to be made
2012 2015 2015
33
Change is happening: Research Inst.
34
UT Southwestern
Dallas, Texas
Charité University
Berlin, Germany
Candidates receive questions before Skype
interviews, so they have time to reflect. The
goal is to identify thoughtful individuals, in
addition to candidates who process
information quickly.
Application includes:
• Contribution to research field
• Open Science
• Team Science
• Interactions with stakeholders
Staff member sits on hiring deliberation meetings as a neutral
party to promote balanced discussions
Russell Group Universities
35
Statement
Principles
Policy
Cardiff University
University of Edinburgh
University of Glasgow
Imperial College London
University of Manchester
University of Nottingham
University of Oxford
Queen’s University Belfast
University of Birmingham
University of Exeter
King’s College London
University of Leeds
Newcastle University
Queen Mary University of London
University of Sheffield
University of Warwick
University of Bristol
University of Cambridge
London School of Economics
Durham University
University of Southampton
University of York
University College London
University of Liverpool
Responsible
Research
Assessment
*In red are institutions that did not sign DORA
Change is happening: Funders
36
Wellcome
London, United Kingdom
Cancer Research UK
London, UK
Application includes:
 List of research outputs
 Summary of 3-5 achievements
 Space to describe other measures of impact
Reminds peer reviewers and committee members of
DORA principles throughout funding process
Application includes:
 List of research outputs
 Contributions to mentorship
 Output sharing plan to advance potential
health benefits
 Plans for public engagement
Guidance provided to advisory panel members
What can you do?
37
• Sign DORA & become an advocate
DORA - Key recommendations For Researchers
38
When researching
• Focus on content
• do not cherry pick data
• be open about your research
• Cite primary literature when you can
• Consider carefully where to publish
• Subject relevance of journal (or publisher for monographs)
• Journal/publisher Open Access options
• Funder and university Open Access requirements
• Length of editorial and production processes
• University subscription to the journal
• Costs of publishing (e.g. page charges, colour charges)
• Perception of the publisher among peers
• Promote/track your research outputs
DORA - Key recommendations For Researchers
39
When presenting yourselves
• Use a range of metrics to show the impact of your work - Altmetrics, article level metrics,
check scite
• Relate your achievements to your goals
• In funding applications
• Contribution to research field
• Open Science
• Team Science
• Interactions with stakeholders
• other measures of impact
• Change the way you present your CV
CV
40
The Résumé for Researchers
41
Personal details
- education, key qualifications and relevant positions
4 modules:
• How have you contributed to the generation of knowledge?
• How have you contributed to the development of individuals?
• How have you contributed to the wider research community?
• How have you contributed to broader society?
Personal statement
Additions
https://royalsociety.org/topics-policy/projects/research-culture/tools-for-support/resume-for-researchers/
Developed by the Royal Society
DORA - Key recommendations For Researchers
42
Otherwise
• Change the culture!
• Champion good practices to implement change
• Share improved assessment / your ideas with others
• Support your institution to review assessment practices
Responsible Metrics at Liverpool?
We have been a signatory of DORA since 2018
The Wellcome Trust and other funders require it
We do not want to be left behind – the change is happening now
We want to do things right – ethics is the No. 1 principle
43
Responsible Metrics at Liverpool
Development of a policy and an implementation plan for the responsible use of metrics in research assessment is underway!
Draft policy to be ready for sign off by April 2021.
44
If you would like to get involved, please contact:
Dr Zuzana Oriou
Responsible Metrics Project Manager
Open Research Team - Libraries, Museums and Galleries
z.oriou@liverpool.ac.uk
Thank you
45

Responsible metrics in research assessment

  • 1.
    DORA & Responsible Metricsin Research Assessment ZUZANA ORIOU OPEN RESEARCH TEAM 19/11/2020 1
  • 2.
    Objectives 2  To introduceDORA and it’s aims  To understand publication metrics and their common misconceptions  To introduce key arguments, numbers and data, to aid change in the research culture  To understand which metrics and how to use them for researchers (eg. in CV and funding applications)
  • 3.
    DORA 3 Declaration On ResearchAssessment Improving how research is assessed sfdora.org @DORAssessment • Addresses misuse of journal-based metrics in hiring, promotion and funding decisions • Draws attention to the problem and fosters progressive solutions • Demonstrates community support for change Signed by ~2000 organizations and >16,000 individuals Core principles of DORA for research institutions • Be explicit about the criteria used to reach hiring, tenure, and promotion decisions, • highlighting that scientific content of a paper is much more important than publication metrics • for the purposes of research assessment, consider the value and impact of all research outputs
  • 4.
    Bibliometrics 4 Bibliometrics = statisticalanalysis of publications. - can provide valuable insights into aspects of research in some disciplines if used critically BUT! ….when used in the wrong context… - it can be problematic for researchers and research progress - research “excellence” and “quality” are abstract concepts, difficult to measure directly - can incentivize undesirable behaviours - chasing publications in journals with high impact factors regardless of whether this is the most appropriate venue for publication - discouraging the use of open research approaches such as preprints or data sharing  Inaccurate assessment of research can become unethical when metrics take precedence over expert judgement, where the complexities and nuances of research or a researcher’s profile cannot be quantified
  • 5.
    Questionable metrics 5 • JournalImpact Factor ( JIF) • Citation Count • H- index
  • 6.
    Journal Impact factor 6 Whatis Journal Impact Factor?
  • 7.
    Journal Impact factor 7 𝐽𝐼𝐹(2018) = 𝐶𝑖𝑡𝑎𝑡𝑖𝑜𝑛𝑠 𝑖𝑛 𝟏𝟖 𝑡𝑜 𝑖𝑡𝑒𝑚𝑠 𝑝𝑢𝑏𝑙𝑖𝑠ℎ𝑒𝑑 𝑖𝑛 𝟏𝟔 + 𝟏𝟕 𝑁𝑢𝑚𝑏𝑒𝑟 𝑜𝑓 𝑖𝑡𝑒𝑚𝑠 𝑖𝑛 𝟏𝟔 + 𝟏𝟕 = The mean average of citations over the last 2 years What is Journal Impact Factor?
  • 8.
    Journal Impact factor 8 Limitations: •Citation distributions within journals are highly skewed - average highly susceptible to outliers • Field-specific - The amount of citations received in different fields vary significantly. • Can be manipulated by editorial policy, self-citations, citation ‘cartels’ • Data used to calculate JIF are not transparent nor openly available • False precision
  • 9.
    JIF 9 Physical Review LettersNature •Citationdistributions within journals are highly skewed - average highly susceptible to outliers An article published in nature does not mean that is has been or will be cited 43x! Editorial (2005). Not so deep impact. Nature 435, 1003–1004. https://www.nature.com/articles/4351003b.pdf
  • 10.
    JIF- practical example 10 •Do articles in high JIF journals receive more attention? Altmetric score Article Journal Citation count Evidence for a Bs0π± State Physical Review Letters July 2016 99 Correlated Event-by-Event Fluctuations of Flow Harmonics in Pb-Pb Collisions at sNN=2.76 TeV Physical Review Letters October 2016 77 Direct Evidence of Octupole Deformation in Neutron-Rich Ba144 Physical Review Letters March 2016 69 Observation of the 1S–2S transition in trapped antihydrogen Nature December 2016 69 Atom-at-a-time laser resonance ionization spectroscopy of nobelium Nature September 2016 57 An improved limit on the charge of antihydrogen from stochastic acceleration Nature January 2016 27 Nature JIF(2016) = 40.1, Physical Review Letters JIF (2016)= 8.4
  • 11.
    JIF – practicalexample 11 Altmetric score Article Journal JIF Citation count Hyperporous Carbons from Hypercrosslinked Polymers Advanced Materials 19.8 107 Three-dimensional protonic conductivity in porous organic cage solids Nature Communications 12.2 55 Reticular synthesis of porous molecular 1D nanotubes and 3D networks Nature Chemistry 21.7 51 A Perspective on the Synthesis, Purification, and Characterization of Porous Organic Cages Chemistry of Materials 9.5 40 Understanding static, dynamic and cooperative porosity in molecular materials Chemical Science 8.7 23 High Surface Area Sulfur-Doped Microporous Carbons from Inverse Vulcanised Polymers Journal of Materials Chemistry, A 8.8 19
  • 12.
    JIF 12Adler, R., Ewing,J., and Taylor, P. (2008) Citation statistics. A report from the International Mathematical Union. •Field-specific. The amount of citations received in different fields vary significantly.
  • 13.
    JIF 13 •Can be manipulatedby editorial policy, self-citations, citation ‘cartels’ • asking authors to cite articles from the same journal • deliberately publishing more citable paper types such as review article • reducing the number of "citable items" in the JIF calculation • Men self-cite mote than women* • Data used to calculate JIF are not transparent nor openly available *https://www.cambridge.org/core/journals/international-organization/article/gender-citation-gap-in-international-relations/3A769C5CFA7E24C32641CDB2FD03126A
  • 14.
    Journal Impact Factor 𝐽𝐼𝐹(2018) = 𝐶𝑖𝑡𝑎𝑡𝑖𝑜𝑛𝑠 𝑖𝑛 𝟏𝟖 𝑡𝑜 𝑖𝑡𝑒𝑚𝑠 𝑝𝑢𝑏𝑙𝑖𝑠ℎ𝑒𝑑 𝑖𝑛 𝟏𝟔 + 𝟏𝟕 𝑁𝑢𝑚𝑏𝑒𝑟 𝑜𝑓 𝑖𝑡𝑒𝑚𝑠 𝑖𝑛 𝟏𝟔 + 𝟏𝟕 14 = The mean average of citations over the last 2 years ≠ Impact or quality
  • 15.
    Citation Count 15 What doeshigh citation count tells us about a research paper?
  • 16.
    Citation Count 16 What doeshigh citation count tells us about a research paper? • The citations can be negative, positive, neutral or even quite random • Citations are not reviewed and removed in case of retraction. • Citation cartels/ self-citations • Certain types of articles tend to get more citations than others
  • 17.
  • 18.
    Self-Citation Count 18 Retraction example https://retractionwatch.com/2020/07/31/cite-yourself-excessively-apologize-then-republish-the-papers-with-fewer- self-citations-journal-says-fine/ Ajournal has allowed a geophysicist who cited his own work hundreds of times across 10 papers to retract the articles and republish them with a fraction of the self- citations Chen cited himself 370 times (out of 898 ref, 42 %)  Retraction  more acceptable 105 times in the new versions From 2017 to 2019, Yangkang Chen published 10 papers in Geophysical Journal International
  • 19.
    Other Journal Metrics MetricMeaning Database JIF • for journals listed in the Journal Citation Reports (JCR) – ca. 11,000 WoS CiteScore • =JIF by Elsevier • for journals listed in Scopus – ca. 22,000 • Looks at previous 3 years instead of 2 Scopus EigenFactor • a rating of the total importance of a scientific journal • citations from highly ranked journals weighted to make a larger contribution to the eigenfactor than those from poorly ranked journals • Complex algoritm for calculation of eigenvectror centrality = percentage of the total influence in the network WoS Article Influence Score • Eigenfactor adjusted for the number of papers published in each journal WoS SJR = Scimago Journal Rank • It is a prestige metric similar Google page rank and Eigen Factor • Differences to Eigen – does not eliminate self-ciatation, uses 3 year window of citations (EF 5 year) Scopus SNIP = Source normalized impact per paper • 𝑆𝑁𝐼𝑃 = 𝐽𝐼𝐹 𝐶𝑖𝑡𝑎𝑡𝑖𝑜𝑛 𝑃𝑜𝑡𝑒𝑛𝑡𝑖𝑎𝑙 𝑖𝑛 𝑖𝑡𝑠 𝑓𝑖𝑒𝑙𝑑 • Citation potential = reflection of how likely paper is to be cited • corrects for differences in citation practices between scientific fields Scopus 19 DO NOT MIX AND MATCH!!
  • 20.
    Owner Database StrengthsWeaknesses Statistical Tool - Advanced citation searching and analysis - Good coverage in Sciences and Health sciences - Includes books & conference proceedings - Limited coverage of non-English titles - Limited for Humanities, social sciences and Arts - Dissertations excluded - Advanced citation search and analysis - Better coverage of Social sciences - 120k standalone books &421 book series - Conference proceeding included - Books and conference proceeding limited - Coverage of Arts, Humanities and social sciences limited - - dissertations excluded - Free - Better international and multi-lingual coverage - Incl. books and conference papers - Data quality can be an issue - Journal coverage incomplete - Can include non-scholarly content - Sources indexed and timespans covered not disclosed - Free - Moving beyond citations to deliver a broader picture of impact - 360` view - related grants; article metrics; the related patents, policy documents, and datasets - Not as widely known and used - Some search limitations, e.g. Searching in the main search box restricted to only two options Bibliometrics - sources Publish or Perish DO NOT MIX AND MATCH!! 20
  • 21.
    H-index • Can bemanipulated by self-citations • Discriminates against early career researchers, women and other who may have taken a career break • Field specific = the number of publications for which an author has been cited at least that same amount of times 21
  • 22.
    H- index Footballer Goals JoselfBican 805 Romário 772 Pelé 767 Ferenc Puskas 746 Gert Muller 735 Basketball Player Points Kareem Abdul-Jabbar 38,387 Karl Malone 36,928 Kobe Bryant 33,643 Michael Jordan 32,292 Wilt Chamberlain 31,419 Other comparisons: • National vs. local league • Women's vs men's league • total score of young vs. mature players 22
  • 23.
    Journal metrics andsimple citations counts are indicators for Publishers and Librarians, not designed for Assessing research(ers) – they do not reflect research quality! 23 Metrics are:  limited.  a reflection of popularity, not quality.  short sighted - the research power enabled by the diverse academic base cannot be captured by a single metric.  often rewarding the established and actively inhibit diversity of staff and research and so innovation. To sum up:
  • 24.
    24 • Looking atthe whole research profile within suitable context using expert judgement ( can be supported by suitable quantitative indicators) • Aligned to the research mission • Taking into account all research outputs • Taking into account equality and diversity • gender, FTE, career length, career break CONSIDER NARRATIVES, NOT SOLELY METRICS!! How else can we judge quality of a research(ers)?
  • 25.
    The use ofMetrics should be: •based on variety of suitable metrics in terms of accuracy, scope and context (avoiding JIF and using variety of metrics)Robust •quantitative evaluation should support – but not supplant – qualitative, expert assessment Complementary to Peer review •expectations, data collection and analytical processes should be open and transparent, so that those being evaluated can test and verify the resultsTransparent •accounting for variation by field, institutional goals as well as professional and personal circumstances.Inclusive • recognise the effects of indicators, update them regularly and champion best practice in data collectionReflexive 25
  • 26.
    Which Metrics? 26 • No.of downloads: check the publisher website and your institutional repository • Altmetric score: shows the online attention your research online • Field Weighted Citation Impact • Remember to use metrics to your advantage and illustrate narratives!  Responsible use of Metrics Which metrics should you use to demonstrate the impact of your research? DO NOT MIX AND MATCH THE DATA SOURCE!!
  • 27.
    Responsible Metrics • Bibliometrics •statistical analysis of publications, eg. #publications, #citations • Author/article(s) related, not journal related should be used • Altmetrics • Other research outputs https://www.metrics-toolkit.org/ 27
  • 28.
    Bibliometrics Field Citation Ratio(FCR) similar to FWCI FWCI - Field Weighted Citation Impact FWCI of 1.44 means 44% more cited than expected + measures the citation impact of the output itself, not the journal in which it is published + compares like-with-like (outputs of the same age and type as classed by Scopus) - may disadvantage multi-disciplinary work 𝐹𝑊𝐶𝐼 = # 𝑐𝑖𝑡𝑎𝑡𝑖𝑜𝑛𝑠 𝑒𝑥𝑝𝑒𝑐𝑡𝑒𝑑 # 𝑐𝑖𝑡𝑎𝑡𝑖𝑜𝑛𝑠 28
  • 29.
    Altmetrics Alternative metrics ("altmetrics")captures online attention surrounding academic content e.g. Twitter, Facebook and Social Media activity; mentions in Policy documents and registered Patents; Media coverage etc. • offer useful information about impact outside of scholarly publishing • can serve as early indicators of possible intentions to cite a publication • help create success narrative around research impact https://www.altmetric.com 29
  • 30.
    Altmetrics 30 Can help answerquestions such as: Who is talking about my research and where? Who could be my potential collaborators? Which of my papers are the most noticed/read by others? What is the impact of my research outside of my discipline? How do I compare to my peers in terms of public engagement?
  • 31.
  • 32.
    Other research outputs •Key datasets, software, novel assays and reagents • Awards + Funding • Conference papers, keynote speeches • Knowledge transfers • Science communication + Outreach • Licenses or patents • Training delivered + Mentoring • Contribution to consortia 32
  • 33.
    Change is underway -movement highlighting that the current research assessment is flawed and change needs to be made 2012 2015 2015 33
  • 34.
    Change is happening:Research Inst. 34 UT Southwestern Dallas, Texas Charité University Berlin, Germany Candidates receive questions before Skype interviews, so they have time to reflect. The goal is to identify thoughtful individuals, in addition to candidates who process information quickly. Application includes: • Contribution to research field • Open Science • Team Science • Interactions with stakeholders Staff member sits on hiring deliberation meetings as a neutral party to promote balanced discussions
  • 35.
    Russel Group Universities 35 Statement Principles Policy CardiffUniversity University of Edinburgh University of Glasgow Imperial College London University of Manchester University of Nottingham University of Oxford Queen’s University BelfastUniversity of Birmingham University of Exeter King’s College London University of Leeds Newcastle University Queen Mary University of London University of Sheffield University of Warwick University of Bristol University of Cambridge London School of Economics Durham University University of Southampton University of York University College London University of Liverpool Responsible Research Assessment *In red are institutions that did not sign DORA
Change is happening: funders 36
Wellcome, London, United Kingdom
• Application includes: list of research outputs; summary of 3-5 achievements; space to describe other measures of impact
• Reminds peer reviewers and committee members of DORA principles throughout the funding process
Cancer Research UK, London, UK
• Application includes: list of research outputs; contributions to mentorship; output sharing plan to advance potential health benefits; plans for public engagement
• Guidance provided to advisory panel members
What can you do? 37
• Sign DORA & become an advocate
DORA - Key recommendations for researchers 38
When researching
• Focus on content
  • do not cherry-pick data
  • be open about your research
• Cite primary literature when you can
• Consider carefully where to publish
  • subject relevance of the journal (or publisher for monographs)
  • journal/publisher Open Access options
  • funder and university Open Access requirements
  • length of editorial and production processes
  • university subscription to the journal
  • costs of publishing (e.g. page charges, colour charges)
  • perception of the publisher among peers
• Promote/track your research outputs
DORA - Key recommendations for researchers 39
When presenting yourself
• Use a range of metrics to show the impact of your work: Altmetrics, article-level metrics; check scite
• Relate your achievements to your goals
• In funding applications, include:
  • contribution to research field
  • Open Science
  • team science
  • interactions with stakeholders
  • other measures of impact
• Change the way you present your CV
The Résumé for Researchers 41
Developed by the Royal Society.
• Personal details: education, key qualifications and relevant positions
• Four modules:
  • How have you contributed to the generation of knowledge?
  • How have you contributed to the development of individuals?
  • How have you contributed to the wider research community?
  • How have you contributed to broader society?
• Personal statement
• Additions
https://royalsociety.org/topics-policy/projects/research-culture/tools-for-support/resume-for-researchers/
DORA - Key recommendations for researchers 42
Otherwise
• Change the culture!
• Champion good practices to implement change
• Share improved assessment practices and your ideas with others
• Support your institution in reviewing its assessment practices
Responsible Metrics at Liverpool? 43
• We have been a signatory of DORA since 2018 (see the Russell Group "Responsible Research Assessment" chart above)
• Wellcome Trust and other funders require it
• We do not want to be left behind! The change is happening now
• We want to do things right! Ethics is the No. 1 principle
Responsible Metrics at Liverpool 44
Development of a policy and implementation plan for the responsible use of metrics in research assessment is underway! Draft policy to be ready for sign-off by April 2021.
If you would like to get involved, please contact:
Dr Zuzana Oriou
Responsible Metrics Project Manager
Open Research Team - Libraries, Museums and Galleries
z.oriou@liverpool.ac.uk