
Unveiling the Ecosystem of Science: How can we characterize and assess diversity of profiles in science

Online seminar given for the Academic Career Hub at CWTS (Leiden University) on May 14, 2020

Published in: Education


  1. Unveiling the Ecosystem of Science: How can we characterize and assess diversity of profiles in science. Nicolas Robinson-Garcia, Delft Institute of Applied Mathematics, TU Delft
  2. http://ecosci.nrobinsongarcia.com/wp
  3. Outline
     • Methodological design for individuals' assessment: combining experts' judgments and metrics in research evaluation; definition of a valuation model; application of Structured Expert Judgment using the valuation model
     • Motivations and values of scientists to promote diversity in science: multiple case-study analysis (CVs, bibliometric data and interviews)
     • Testing "diversity": effects of task specialization on research careers, based on bibliometric data and contribution statements
  5. Why a valuation model? Current individual assessments are…
     • … based on the notion of excellence and efficiency or career prospects
     • … considered in isolation, without acknowledging that science is a team game
     • … conducted on universal criteria, without considering context or needs
  8. Why a valuation model? Critiques of individual assessment
     • DORA: the Impact Factor
     • Leiden Manifesto: misuse of indicators
     • Metric Tide: indicators are not adequate
     Threats from current research evaluation:
     1. Pervasive effects on the scientific workforce (De Rijcke et al., 2016; Milojevic et al., 2018)
     2. Pervasive effects on knowledge production (Nosek et al., 2015; Sarewitz, 2016)
  10. Alternative models of evaluation
     • S&T Human Capital model (Bozeman et al., 2001): shift from 'outputs' to 'capacities'; too vague to operationalize consistently
     • ACUMEN portfolio: distinction between expertise, outputs and impacts; unclear how to put into practice
     • Evaluative Enquiry approach (Fochler & De Rijcke, 2017): emphasis on context; designed for improvement, but not for distribution
  11. Valuation model
  12. Evaluative dimensions: sets of activities which are valued in an individual's CV, related to:
     • Background
     • Scientific work
     • Social engagement
     • Capacity to attract resources and train new scholars
     • …
  13. Evaluative dimensions
     • Scientific engagement: publications, peer review activities, …
     • Social engagement: service, outreach, …
     • Background: international experience, non-academic experience, …
     • Capacity building: personal gain (grants, awards, resources…); institutional gain (funding, resources…)
     • Level of openness: transparency in research practices; accessibility and reuse of outputs
     • CRediT (Contributor Roles Taxonomy)
  16. Evaluative dimensions
  17. Personal features: nationality, ethnicity, age, gender
  18. External factors
  20. In progress… Case studies:
     • How do reported activities of scientists fit within this model?
     • How do dimensions and activities relate to each other?
     • Should we account for diversity within dimensions?
     • How do activities relate to seniority?
     • Why do scientists perform activities which a priori are not considered in assessment?
     • Are these "other" activities actually valued?
  21. Next steps: implementing an experimental assessment with Structured Expert Judgment to:
     • test evaluators' perceptions of the value of scientists
     • test the gap between evaluators' statements and their real practice
     • test the capacity to adapt to different needs and roles
     Potential application?
  22. Questions? (Constructive) feedback needed! Nicolas Robinson-Garcia, Delft Institute of Applied Mathematics, TU Delft
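The core idea behind the Structured Expert Judgment step in slide 21 is that experts are not weighted equally: each expert's influence on the combined judgment depends on how well they perform on calibration ("seed") questions with known answers. The talk does not specify an implementation, so the sketch below is a deliberately simplified illustration of that principle only (it is not Cooke's full classical model, which uses quantile-based calibration and information scores); all names and numbers are made up for illustration.

```python
# Simplified performance-weighted aggregation in the spirit of Structured
# Expert Judgment: experts who are more accurate on seed questions with
# known answers receive more weight when combining target-question judgments.
# Illustrative sketch only; not the full classical (Cooke) model.

def performance_weights(seed_truths, expert_seed_estimates):
    """Weight each expert by inverse mean absolute error on seed questions."""
    weights = []
    for estimates in expert_seed_estimates:
        mae = sum(abs(e - t) for e, t in zip(estimates, seed_truths)) / len(seed_truths)
        weights.append(1.0 / (mae + 1e-9))  # small constant avoids division by zero
    total = sum(weights)
    return [wt / total for wt in weights]  # normalize so weights sum to 1

def combine(target_estimates, weights):
    """Performance-weighted average of the experts' target-question estimates."""
    return sum(wt * x for wt, x in zip(weights, target_estimates))

# Three hypothetical experts, two seed questions with known answers 10 and 20:
seed_truths = [10.0, 20.0]
expert_seed_estimates = [
    [9.0, 21.0],   # expert A: close to the truth
    [15.0, 30.0],  # expert B: far off
    [10.5, 19.5],  # expert C: closest
]
w = performance_weights(seed_truths, expert_seed_estimates)
score = combine([0.6, 0.2, 0.7], w)  # experts' scores for one target criterion
```

The combined score leans toward experts A and C, whose seed-question answers were accurate, and largely discounts expert B; that is the behavior the experimental assessment would rely on when testing how evaluators' stated judgments compare with their calibrated performance.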
