Photo credit: © ZB MED / Sima Deghani, published under CC BY-ND 4.0
https://creativecommons.org/licenses/by-nd/4.0/legalcode
From Impact Factors to Responsible Research Assessment and Open Metrics – which suggestions are currently on the table?
IFLA satellite conference 18-19 August 2023
If not indicated otherwise, the content on these slides is licensed under a
Creative Commons Attribution 4.0 International License:
https://creativecommons.org/licenses/by/4.0/
Why Research Assessment anyway?
• Career progression in research
• Funding decisions
• Monitoring of ongoing projects
• Evaluation of finished projects
• Evaluation of the performance of institutions
Further aspects:
• assessment is fundamental for a functioning research system
• the kind of measurement has a direct influence on research culture and the quality of research
• assessment should secure quality and impact and strengthen trust in science
Based on Science Europe: Recommendations on Research Assessment Processes (2020): https://www.scienceeurope.org/media/3twjxim0/se-position-statement-research-assessment-processes.pdf, pp. 6, 8
Necessary because applicants and proposals outnumber the available positions and funding!
European Commission, Directorate-General for Research and Innovation, Towards a reform of the research assessment system: scoping report, Publications Office, 2021, https://data.europa.eu/doi/10.2777/707440, p. 4
What's wrong with the current Research Assessment Practices?
• focus on the number of publications and citations
• striving for high-JIF journals
• publish-or-perish culture
→ these hamper quality, integrity, and trust in science, plus…
European Commission, Directorate-General for Research and Innovation, Towards a reform of the research assessment system: scoping report, Publications Office, 2021, https://data.europa.eu/doi/10.2777/707440, pp. 3-5
What's wrong with the current Research Assessment Practices?
Changes through digitisation:
• new tasks that require new skills
• information overload
• multiple kinds of output
• openness/accessibility as issues and as keys to reliability and reproducibility
• collaboration and trans-/multidisciplinary approaches are needed to solve complex problems
→ these are not well represented in current research assessment practices
European Commission, Directorate-General for Research and Innovation, Towards a reform of the research assessment system: scoping report, Publications Office, 2021, https://data.europa.eu/doi/10.2777/707440, p. 4
From Research Assessment to Responsible Research Assessment (RRA)
“As attention shifts from describing these problems, towards designing and implementing solutions,
efforts are coalescing around the idea of responsible research assessment (RRA). This is an
umbrella term for approaches to assessment which incentivise, reflect and reward the plural
characteristics of high-quality research, in support of diverse and inclusive research cultures”.
Curry, Stephen; de Rijcke, Sarah; Hatch, Anna; Pillay, Dorsamy (Gansen); van der Weijden, Inge; Wilsdon,
James (2020). The changing role of funders in responsible research assessment: progress, obstacles and
the way ahead (RoRI Working Paper No.3). Research on Research Institute. Report.
https://doi.org/10.6084/m9.figshare.13227914.v2 , p. 7
From Research Assessment to Responsible Research Assessment (RRA)
Examples of RRA guidelines:
– DORA (https://sfdora.org/): assessment of research publications and beyond
– Leiden Manifesto (https://doi.org/10.1038/520429a): accountability in metrics-based
research assessment
– Hong Kong Principles (https://www.wcrif.org/guidance/hong-kong-principles): focus on
research integrity
– INORMS/ SCOPE (https://inorms.net/scope-framework-for-research-evaluation):
improvement of research assessment processes
Based on: Marianne Gauffriau: Navigating Responsible Research Assessment Guidelines. Leiden Madtrics 02/02/2023: https://www.leidenmadtrics.nl/articles/navigating-responsible-research-assessment-guidelines
Positions & perspectives – UNESCO (multinational organisation)
UNESCO (2021): UNESCO Recommendation on Open Science: https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en, pp. 20-22
IV. AREAS OF ACTION
[…]
(ii) Developing an enabling policy environment for open science
17. Member States, according to their specific conditions, governing structures and
constitutional provisions, should develop or encourage policy environments […]
including policies to incentivize open science practices among researchers. Through a
transparent participatory, multi-stakeholder process that includes dialogue with the
scientific community, especially early-career researchers, and other open science
actors, Member States are encouraged to consider the following:
[…]
h. Encouraging responsible research and researcher evaluation and assessment
practices, which incentivize quality science, recognizing the diversity of research outputs,
activities and missions.
→ acknowledging/incentivising open science practices
Positions & perspectives – universities
EUA (2022): The EUA Open Science Agenda
https://www.eua.eu/downloads/publications/eua%20os%20agenda.pdf, p.6
Positions & perspectives – research performing institutions
G6 statement on Open Science (2021): https://os.helmholtz.de/assets/open_science/Downloads/G6_statement_on_Open_Science.pdf, p. 2
→ evaluation of open science practices and
transparency of the assessment process as such
Positions & perspectives – researchers
Initiative for Science in Europe (2022): Centrality of researchers in reforming research assessment:
https://initiative-se.eu/wp-content/uploads/2022/03/2022-03-16_ise_report_online_final.pdf, p. 2
→ one of the key aspects: assess with
those who are assessed!
Positions & perspectives – researchers
Bregt Saenen (EUA), Anna Hatch (DORA), Stephen Curry (DORA), Vanessa Proudman (SPARC Europe) and Ashley Lakoduk (DORA) (2021): https://eua.eu/downloads/publications/eua-dora-sparc_case%20study%20report.pdf, p. 11, own highlighting
- “Academic career assessment practices should become more
open, accurate, transparent, and responsible. Key to meeting
this goal is institutions developing and instilling their own
standards and structure into assessment processes”.
- “However, more accurate, transparent, and responsible
approaches to academic evaluation should not primarily or
even necessarily aim to add more indicators, but rather
seek to find dynamic, context-sensitive, and above all
holistic approaches that allow researchers and universities
the freedom to pursue/manage academic activities in any
way they believe is most effective in service to society”.
→ autonomy of researchers and institutions when
assessing careers
Positions & perspectives – funders
Science Europe: Recommendations on Research Assessment Processes (2020): https://www.scienceeurope.org/media/3twjxim0/se-position-statement-research-assessment-processes.pdf; licensed under CC BY 4.0, p. 7
• Recommendations:
- Processes must be transparent (guidelines, right-to-reply, publication of results,
assessment reports for applicants etc.), criteria well-defined (e.g. quality);
- Constant evaluation of the assessment processes (also: consider changes in the
research system)
- Share best practices
- Demonstrate/evaluate how bias, discrimination and unfair treatment are addressed
- Broader criteria for the selection of reviewers should be developed (more diverse profiles,
national and international), also to overcome reviewer fatigue
- Guidelines and training should be provided
- Streamline processes to reduce the burden on reviewers; the time and effort of applicants should be considered as well
- Focus on the content of the application (qualitative assessment instead of quantitative), but also consider a broad spectrum of research outputs and activities
- Implementation of novel assessment techniques, sharing methodologies and outcomes
→ the focus is on assessment processes, not so much on the criteria
Initiatives: CoARA – Coalition for Advancing Research Assessment
Agreement on Reforming Research Assessment (2022):
https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf; p. 2, own highlighting
“As signatories of this Agreement, we agree on
the need to reform research assessment
practices. Our vision is that the assessment of
research, researchers and research
organisations recognises the diverse outputs,
practices and activities that maximise the
quality and impact of research. This requires
basing assessment primarily on qualitative
judgement, for which peer review is central,
supported by responsible use of quantitative
indicators. Among other purposes, this is
fundamental for: deciding which researchers to
recruit, promote or reward, selecting which
research proposals to fund, and identifying which
research units and organisations to support”.
“As of 31 July 2023, there are 527 CoARA
member organisations from across the world”.
https://coara.eu/coalition/membership/
Initiatives: CoARA – Coalition for Advancing Research Assessment
Agreement on Reforming Research Assessment (2022):
https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf, p. 13, own highlighting
“Collaboration on the basis of common
principles will facilitate progress in research
assessment reform – Thus far, progress across
research organisations and countries has
been uneven, and ongoing efforts are
fragmented. Collaboration on research
assessment reform will allow signatories to
move forward on the basis of common
principles. This will also diminish the
perceived ‘first-mover-disadvantage’ involved
in changing a culture of research assessment
based on quality, trust and risk-taking that is
applied globally”.
Initiatives: CoARA – Coalition for Advancing Research Assessment
Agreement on Reforming Research Assessment (2022):
https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf; p. 3-4
Principles for assessment:
- ethics and integrity take priority over any counter-incentives
- freedom of scientific research
- autonomy of research organisations
- data infrastructure and data on which assessment is based should be
clear and transparent; control and ownership by research community
- focus on quality (peer review)
- reward the variety of research missions
- quality through transparency; openness contributes to quality
- support by quantitative indicators only when appropriate
- recognition of results that advance science or have an impact
- acknowledge diversity & ensure equality
- of outputs
- of research activities
- of approaches
- of openness
- of culture in research fields
- of roles and career stages
Initiatives: CoARA – Coalition for Advancing Research Assessment
Agreement on Reforming Research Assessment (2022): https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf; pp. 4-10, own highlighting
Core Commitments:
“[…]
1. Recognise the diversity of contributions to, and careers in, research in accordance with the needs
and nature of the research […]
2. Base research assessment primarily on qualitative evaluation for which peer review is central,
supported by responsible use of quantitative indicators […]
3. Abandon inappropriate uses in research assessment of journal- and publication-based metrics,
in particular inappropriate uses of Journal Impact Factor (JIF) and h-index […]
4. Avoid the use of rankings of research organisations in research assessment […]
5. Commit resources to reforming research assessment as is needed to achieve the organisational
changes committed to […]
6. Review and develop research assessment criteria, tools and processes […]
7. Raise awareness of research assessment reform and provide transparent communication,
guidance, and training on assessment criteria and processes as well as their use […]
8. Exchange practices and experiences to enable mutual learning within and beyond the Coalition […]
9. Communicate progress made on adherence to the Principles and implementation of the
Commitments […]
10. Evaluate practices, criteria and tools based on solid evidence and the state-of-the-art in research
on research, and make data openly available for evidence gathering and research […]
Timeline:
- Review practices by the end of 2023 or within a year
- Complete the first cycle by the end of 2027 or within 5 years
Need for open infrastructure for responsible research assessment
UNESCO Recommendation on Open Science (2021): https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en, p. 12
“Open science infrastructures refer to shared research infrastructures
(virtual or physical, including […] current research information systems, open
bibliometrics and scientometrics systems for assessing and analysing scientific
domains […]) that are needed to support open science and serve the needs of
different communities [...]”.
4) “Keep data collection and analytical processes open, transparent and simple.
The construction of the databases required for evaluation should follow clearly
stated rules, set before the research has been completed. This was common
practice among the academic and commercial groups that built bibliometric
evaluation methodology over several decades. Those groups referenced protocols
published in the peer-reviewed literature. This transparency enabled scrutiny. […]
Recent commercial entrants should be held to the same standards; no one should
accept a black-box evaluation machine […]”.
Hicks et al. (2015), The Leiden Manifesto for research metrics (principle 4): https://doi.org/10.1038/520429a
Open infrastructure for research assessment
https://i4oc.org/
https://opencitations.net/datasets
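These open citation sources can be queried programmatically, which supports the Leiden Manifesto's call for transparent, scrutinisable data collection. As a minimal sketch, not part of the original slides and assuming the publicly documented OpenCitations COCI REST API plus the Python `requests` package (endpoint paths may change, so check opencitations.net before relying on it), the following retrieves the open citation count for a DOI:

```python
# Minimal sketch (assumption: the OpenCitations COCI REST API v1 is available
# at this path; consult https://opencitations.net/index/coci/api/v1 for the
# authoritative documentation).
import requests


def coci_citation_count(doi: str) -> int:
    """Return the number of open citations that COCI records for a DOI."""
    url = f"https://opencitations.net/index/coci/api/v1/citation-count/{doi}"
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    data = response.json()  # expected shape: [{"count": "<n>"}]
    return int(data[0]["count"]) if data else 0


if __name__ == "__main__":
    # Example: the Leiden Manifesto DOI cited on the previous slide.
    print(coci_citation_count("10.1038/520429a"))
```

Because both the data and the access route are open, anyone can re-run such a query and check the numbers that feed into an assessment.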
Open infrastructure for research assessment
https://www.dimensions.ai/why-dimensions/
https://openalex.org/about
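OpenAlex likewise exposes its records through an open API, so the indicators used in an assessment can be inspected and reproduced rather than taken from a black box. A minimal sketch, again an illustration rather than part of the slides, assuming the OpenAlex REST API at api.openalex.org and the field names in its current public schema (see https://docs.openalex.org/ for the authoritative documentation):

```python
# Minimal sketch (assumption: the OpenAlex works endpoint and the fields
# display_name, open_access.oa_status and cited_by_count; verify against
# https://docs.openalex.org/ before use).
import requests


def openalex_work(doi: str) -> dict:
    """Fetch an OpenAlex work record by its DOI."""
    url = f"https://api.openalex.org/works/https://doi.org/{doi}"
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    work = openalex_work("10.1038/520429a")  # Leiden Manifesto as an example
    print(work.get("display_name"))
    print("Open access status:", work.get("open_access", {}).get("oa_status"))
    print("Cited by:", work.get("cited_by_count"))
```

Because the underlying data is openly licensed, such queries can be rerun by reviewers and by those being assessed, which supports the transparency demanded above.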
For further information, please do not hesitate to contact me:
Dr Jasmin Schmitz
PUBLISSO Advisory Services
ZB MED – Information Centre for Life Science
Gleueler Straße 60
50931 Cologne
schmitz@zbmed.de
@jasminschmitz12 (please use email to get in touch)
+49 (0) 221 478-32795
https://www.publisso.de/en/advice/