Towards indicators for 'opening up' science and technology policy
Published on: Ismael Rafols' presentation from the ORCID and CASRAI joint conference, Barcelona, May 2015

  1. Towards indicators for ‘opening up’ science and technology policy
     Ismael Rafols
     Ingenio (CSIC-UPV), Universitat Politècnica de València
     SPRU (Science Policy Research Unit), University of Sussex, Brighton, UK
     Observatoire des Sciences et des Techniques (OST-HCERES), Paris
     ORCID-CASRAI conference, Barcelona, May 2015
     Building on work with Tommaso Ciarli and Andy Stirling (SPRU), Loet Leydesdorff (Amsterdam) and Alan Porter (Georgia Tech, Atlanta)
  2. Pressing demands of research management and evaluation
     • Increasing size of the research endeavour
       – 1.5 M papers per year in Web of Science alone
       – Globalisation: many middle-income countries have multiplied their publication output (e.g. China)
       – Within a country: 3,000 postgraduate programmes are evaluated in 48 panels in Brazil
     • Increasing competition for funding, globally and locally
       – Success rates of research calls are very low in the US and EU (10%-20%)
     • Increasing societal demands
       – Interactions with industry and social actors (NGOs)
       – Grand challenges (climate change, epidemics, water & food security)
     Traditional qualitative techniques of management cannot cope; the hope is that the use of indicators can help...
  3. Can indicators help?
     Yes, indicators can help make decisions...
       – Increase transparency and a sense of objectivity
       – Reduce complexity
       – Reduce time and costs
     The dream of rationality, “the science of science policy” (De Solla Price and Garfield in the 1960s… Marburger and Julia Lane in the 2000s). But do they lead to the “right” decisions?
     Evaluation gap (Wouters): “discrepancy between evaluation criteria and the social and economic functions of science”
  4. Perverse effects of conventional indicators
     Conventional indicators (such as IFs or the h-index) are (often) biased against:
       – Field research (epidemiology)
       – Applied research
       – Social sciences and humanities
       – Peripheral countries
       – Non-English publications and authors
       – Some topics outside the mainstream (e.g. preventive medicine)??
     ... reinforcing existing power structures in S&T, reducing diversity, making S&T less relevant to society
     (Q: would the use of peer review lead to the same biased outcomes?)
  5. Current use of S&T indicators
     Use of conventional S&T indicators is *problematic*:
       – Narrow inputs (only pubs!)
       – Scalar outputs (rankings!)
       – Aggregated solutions, missing variation
       – Opaque selections and classifications (privately owned databases)
       – Large, leading scientometric groups embedded in government / consultancy, with limited possibility of public scrutiny
       – Sometimes even mathematically debatable: the journal Impact Factor (only 2 years, large error bar); the average number of citations per publication in skewed distributions
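A minimal sketch of the last point, using synthetic lognormal counts rather than any real dataset: in a skewed citation distribution the mean is pulled up by a few highly cited papers, so most publications sit below the "average".

```python
# Illustrative only: synthetic, lognormally distributed citation counts.
import numpy as np

rng = np.random.default_rng(0)
citations = rng.lognormal(mean=1.0, sigma=1.2, size=5000)

print("mean citations  :", round(float(citations.mean()), 1))   # inflated by outliers
print("median citations:", round(float(np.median(citations)), 1))
print("share of papers below the mean:",
      round(float((citations < citations.mean()).mean()), 2))
```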
  6. The Leiden Manifesto (in the “making”) on the use of indicators
     Metrics:
     • Should support, not replace, expert evaluation
     • Should match the institutional mission
     • Should not suppress locally relevant research
     • Should be simple, transparent, accessible and verifiable by those evaluated
     • Should take into account field and country differences/contexts
     • Metrics for individual researchers must be based on qualitative judgment
     • Intended and unintended effects of metrics should be reflected upon before use
     Hicks, Wouters, Waltman, de Rijcke and Rafols (Nature, in press)
  7. How can S&T indicators help in science policy?
     What type of “answer” should indicators provide?
     • Model 2: Plural and conditional. Exploring complementary choices; facilitating options/choices in landscapes
     • Model 1: Unique and prescriptive. Proposing “best choices”; rankings (a ranked list of preferences)
  8. From S&T indicators for justification and disciplining…
     Justification in decision-making:
     • Weak justification: “Give me a number, any number!”
     • Strong justification: “Show in numbers that X is the best choice!”
     S&T indicators have a performative role:
       – They don’t just measure, and don’t ‘just happen to be used’ in science policy (neutral)
       – They are a constitutive part of the incentive structure for “disciplining” (loaded)
       – They signal to stakeholders what is important; institutions use these techniques to discipline subjects
       – They articulate framings, goals and narratives on performance, collaboration, interdisciplinarity…
  9. … towards S&T indicators as tools for deliberation
     Yet it is possible to design indicators that foster plural reflection rather than justifying or reinforcing dominant perspectives.
     This shift is facilitated by trends pushed by ICT and visualisation tools:
       – More inputs (pubs and patents, but also news, websites, etc.)
       – Multidimensional outputs (interactive maps)
       – Institutional repositories
       – Multiple solutions, highlighting variation and confidence intervals
       – More inclusive and contrasting classifications (by-passing private data ownership? PubMed, arXiv)
       – More possibilities for open scrutiny (new research groups)
  10. 1. Conceptual framework: “broadening out” vs. “opening up” policy appraisal
  11. Policy use of S&T indicators: Appraisal
      Appraisal: ‘the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments’ (Leach et al., 2008)
      • Breadth: the extent to which the appraisal covers diverse dimensions of knowledge
      • Openness: the degree to which outputs provide an array of options for policies
  12. Policy use of S&T indicators: Appraisal
      Appraisal: ‘the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments’ (Leach et al., 2010)
      Example: allocation of resources based on research “excellence”
      • Breadth: the extent to which the appraisal covers diverse dimensions of knowledge
        – Narrow: citations/paper
        – Broad: citations, peer interviews, stakeholder views, media coverage, altmetrics
      • Openness: the degree to which outputs provide an array of options for policies
        – Closed: a fixed composite measure of variables → unitary and prescriptive
        – Open: consideration of various dimensions → plural and conditional
  13. Appraisal methods: broad vs. narrow & closing vs. opening (Leach et al., 2010)
      [2x2 diagram: horizontal axis = range of appraisal inputs (issues, perspectives, scenarios, methods), from narrow to broad; vertical axis = effect of appraisal ‘outputs’ on decision-making, from closing-down to opening-up]
  14. Appraisal methods: broad vs. narrow & closing vs. opening (Stirling et al., 2007)
      [Same 2x2 diagram, populated with appraisal methods: cost-benefit analysis, open hearings, consensus conferences, scenario workshops, citizens’ juries, multi-criteria mapping, q-method, sensitivity analysis, narrative-based approaches, participant observation, decision analysis, risk assessment, structured interviews]
  15. Appraisal methods: broad vs. narrow & closing vs. opening
      [Same 2x2 diagram, asking where most conventional S&T indicators sit: “Most conventional S&T indicators??”]
  16. Broadening out S&T indicators
      [Same 2x2 diagram: from “Conventional S&T indicators??”, an arrow labelled “Broadening out” towards broader inputs]
      Broadening out: incorporation of plural analytical dimensions (global & local networks, hybrid lexical-actor networks, etc.); new analytical inputs (media, the blogosphere)
  17. Appraisal methods: broad vs. narrow & closing vs. opening
      [Same 2x2 diagram, with journal rankings, university rankings and the European Innovation Scoreboard given as examples]
      Unitary measures that are opaque, with a tendency to favour established perspectives… and easily translated into prescription
  18. Opening up S&T indicators
      [Same 2x2 diagram: from “Conventional S&T indicators??”, an arrow labelled “opening-up” towards more open outputs]
      Making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration.
      NOT about the uniquely best method, the unitary best explanation, or the single best prediction.
  19. 2. Examples of Opening Up
      a. Broadening out AND opening up
      b. Opening up WITH NARROW inputs
  20. 1. Preserving multiple dimensions in broad appraisals (Leach et al., 2010)
      [Same 2x2 diagram: from “Conventional S&T indicators??”, arrows labelled “Broadening out” and “opening-up”]
  21. Composite innovation indicators (25-30 indicators): the European (Union) Innovation Scoreboard
      Grupp and Schubert (2010) show that the ranking order is highly dependent on the weightings of the indicators (sensitivity analysis).
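A minimal sketch of the sensitivity point, with invented country scores and weights (not the Scoreboard's): the same component indicators yield different rankings under equally plausible weightings.

```python
# Illustrative only: three normalised component indicators for four countries.
import numpy as np

countries = ["A", "B", "C", "D"]
scores = np.array([[0.9, 0.2, 0.5],
                   [0.5, 0.8, 0.4],
                   [0.4, 0.5, 0.9],
                   [0.7, 0.6, 0.3]])

for weights in ([1/3, 1/3, 1/3], [0.6, 0.2, 0.2], [0.2, 0.2, 0.6]):
    composite = scores @ np.array(weights)            # weighted composite score
    ranking = [countries[i] for i in np.argsort(-composite)]
    print([round(w, 2) for w in weights], "->", ranking)
```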
  22. Solution: representing multiple dimensions (critique by Grupp and Schubert, 2010)
      The use of spider diagrams allows comparing like with like.
      [Figures: spider diagram of Scoreboard indicators (e.g. the Community trademarks indicator); U-rank university performance comparison tools (Univ. Twente)]
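A sketch of the spider-diagram alternative (dimension names and values are invented, not the Scoreboard's): each unit's full profile is shown instead of a single composite number.

```python
# Illustrative radar (spider) chart comparing two units on five dimensions.
import numpy as np
import matplotlib.pyplot as plt

dims = ["R&D intensity", "Publications", "Patents", "Trademarks", "Exports"]
units = {"Unit X": [0.8, 0.6, 0.4, 0.7, 0.5],
         "Unit Y": [0.5, 0.9, 0.7, 0.3, 0.6]}

angles = np.linspace(0, 2 * np.pi, len(dims), endpoint=False).tolist()
angles += angles[:1]                      # repeat first angle to close the polygon

ax = plt.subplot(polar=True)
for name, values in units.items():
    vals = values + values[:1]
    ax.plot(angles, vals, label=name)
    ax.fill(angles, vals, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dims)
ax.legend(loc="upper right")
plt.show()
```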
  23. 2. Examples of Opening Up
      b. Opening up WITH NARROW inputs
  24. Opening up S&T indicators (Leach et al., 2010)
      [Same 2x2 diagram: from “Conventional S&T indicators??”, an arrow labelled “opening-up”]
      Making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration.
      NOT about the uniquely best method, the unitary best explanation, or the single best prediction.
  25. 1. Excellence: Opening Up Perspectives
      Provide different perspectives on scientific impact
  26. Measures of “scientific excellence”
      [Figure: bar charts comparing ISSTI, SPRU, MIoIR, Imperial, WBS and LBS on three measures: ABS rank, Journal Impact Factor, and citations per publication (journal-field normalised)]
      Which one is more meaningful?? Rafols et al. (2012, Research Policy)
  27. Measures of “scientific excellence”
      [Figure: the same bar charts plus a fourth measure, citations per publication (citing-paper normalised)]
      Which one is more meaningful?? Rafols et al. (2012, Research Policy)
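A sketch of how two of these measures can diverge for the same unit (invented papers and field baselines; the field-normalised score below follows the common average-of-ratios convention, which may not be exactly the one used in Rafols et al., 2012): raw citations per publication vs. citations normalised by the field average.

```python
# Illustrative only: (citations, field) per paper, and field citation baselines.
from statistics import mean

papers = [(12, "econ"), (3, "econ"), (45, "biomed"), (0, "soc"), (8, "soc")]
field_avg = {"econ": 6.0, "biomed": 30.0, "soc": 4.0}   # world average per field

cits_per_pub = mean(c for c, _ in papers)
field_normalised = mean(c / field_avg[f] for c, f in papers)  # average of ratios

print("citations/pub          :", round(cits_per_pub, 2))
print("field-normalised score :", round(field_normalised, 2))
```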
  28. 2. Interdisciplinarity: Opening Up Perspectives
      Explore different concepts of the same policy notion
  29. Multiple concepts of interdisciplinarity
      There is a conspicuous lack of consensus, but most indicators aim to capture the following concepts:
      • Integration (diversity & coherence)
        – Research that draws on diverse bodies of knowledge
        – Research that links different disciplines
      • Intermediation
        – Research that lies between or outside the dominant disciplines
      [Figures: a diversity vs. coherence plane distinguishing monodisciplinary, multidisciplinary and interdisciplinary research, and an intermediation axis running from monodisciplinary (low) to interdisciplinary (high)]
  30. Assessing interdisciplinarity: Diversity
      [Figure: Web of Science categories of the references of ISSTI Edinburgh]
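A sketch of a Rao-Stirling diversity calculation over reference categories (category shares and distances are invented): Delta = sum over pairs i != j of p_i * p_j * d_ij, combining variety, balance and disparity, as in Stirling (2007).

```python
# Illustrative only: shares of a unit's references over three WoS categories
# and pairwise cognitive distances between those categories (e.g. 1 - cosine
# similarity of their citation profiles).
import itertools

p = {"Economics": 0.4, "Env. Science": 0.3, "Psychology": 0.3}
d = {("Economics", "Env. Science"): 0.8,
     ("Economics", "Psychology"): 0.6,
     ("Env. Science", "Psychology"): 0.7}

# Sum over ordered pairs i != j of p_i * p_j * d_ij (distances are symmetric).
rao_stirling = sum(p[i] * p[j] * d[tuple(sorted((i, j)))]
                   for i, j in itertools.permutations(p, 2))
print("Rao-Stirling diversity:", round(rao_stirling, 3))
```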
  31. Assessing interdisciplinarity: Coherence
      [Figure: observed/expected cross-citations between categories for ISSTI Edinburgh]
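A sketch of the observed/expected idea behind the coherence measure (the cross-citation matrix is invented): actual citation flows between categories are compared with what the marginal citation volumes alone would predict.

```python
# Illustrative only: cross-citation counts between three categories
# (rows cite columns).
import numpy as np

cats = ["Economics", "Psychology", "Env. Science"]
C = np.array([[120, 15, 30],
              [10, 200, 25],
              [40, 20, 150]])

expected = np.outer(C.sum(axis=1), C.sum(axis=0)) / C.sum()
obs_over_exp = C / expected           # > 1: more cross-citation than expected

for i, j in [(0, 1), (0, 2), (1, 2)]:
    print(f"{cats[i]} -> {cats[j]}: O/E = {obs_over_exp[i, j]:.2f}")
```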
  32. Assessing interdisciplinarity: Intermediation
      [Figure: journal map of ISSTI Edinburgh references, with journals such as ResPolicy, SocStudSci, Scientometrics, Nature, Lancet and EnergPolicy]
  33. Summary: IS (blue) units are more interdisciplinary than BMS (orange)
      [Figure: units compared on three measures: Rao-Stirling diversity (more diverse), observed/expected cross-citation distance (more coherent), and average similarity (more interstitial)]
  34. 3. Research focus: Opening Up Perspectives
      Explore directions of research
  35. Thinking in terms of research portfolios: the case of rice (Ciarli and Rafols, 2014, unpublished)
      [Figure: map of rice research topics, including rice varieties (classic genetics, transgenics, molecular biology, genomics), plant protection (pests, weeds), plant nutrition, production & socioeconomic issues, and consumption (human nutrition, food technologies)]
  36. Rice research: US, 2000-12 (Ciarli and Rafols, 2014, unpublished) [portfolio map]
  37. Rice research: India, 2000-12 (Ciarli and Rafols, 2014, unpublished) [portfolio map]
  38. Rice research: Thailand, 2000-12 (Ciarli and Rafols, 2014, unpublished) [portfolio map]
  39. Rice research: Brazil, 2000-12 (Ciarli and Rafols, 2014, unpublished) [portfolio map]
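A sketch of the portfolio comparison behind these country maps (publication counts are invented, not the study's data): for each country, the share of rice-related publications in each topic, so that contrasting research directions become visible.

```python
# Illustrative only: publications per topic for three countries.
counts = {
    "US":       {"Genomics": 420, "Plant protection": 150, "Socioeconomic": 60},
    "India":    {"Genomics": 180, "Plant protection": 310, "Socioeconomic": 140},
    "Thailand": {"Genomics": 90,  "Plant protection": 120, "Socioeconomic": 70},
}

for country, topics in counts.items():
    total = sum(topics.values())
    shares = {t: round(n / total, 2) for t, n in topics.items()}
    print(country, shares)
```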
  40. 3. Summary and conclusions
  41. S&T indicators as tools to open up the debate
      • ‘Conventional’ use of indicators (‘Pure scientist’, Pielke):
        – Purely analytical character (i.e. free of normative assumptions)
        – Instruments of objectification of dominant perspectives
        – Aimed at legitimising/justifying decisions (e.g. excellence)
        – Unitary and prescriptive advice
      • Opening up scientometrics (‘Honest broker’, Pielke):
        – Aimed at locating the actors in their context and dynamics
        – Not predictive or explanatory, but exploratory
        – Construction of indicators based on a choice of perspectives
        – Makes explicit the possible choices on what matters
        – Supports debate, making science policy more ‘socially robust’
        – Plural and conditional advice
      Barré (2001, 2004, 2010), Stirling (2008)
  42. Strategies for opening up, or how to “keep it complex” yet “manageable”
      • Presenting contrasting perspectives
        – At least TWO, in order to give a taste of choice
      • Simultaneous visualisation of multiple properties/dimensions
        – Allowing users to take their own perspective
      • Interactivity
        – Allowing users to give their own weight to criteria/factors
        – Allowing users to manipulate visuals
  43. Is ‘opening up’ worth the effort? (1) Sustaining diversity in the S&T system
      A decrease in diversity is a potential unintended consequence of the evaluation machine.
      Why diversity matters:
      • A systemic (‘ecological’) understanding of S&T
        – S&T outcomes depend on synergistic interactions between disparate elements
      • A dynamic understanding of excellence and relevance
        – New social needs, challenges and expectations from S&T
      • Managing diverse portfolios to hedge against uncertainty in research
        – Office of Portfolio Analysis (National Institutes of Health), http://dpcpsi.nih.gov/opa/
      • Opening the possibility for S&T to work for the disenfranchised
        – Topics outside dominant science (e.g. neglected diseases)
