It’s not how you measure – it’s what you measure

Fiona Wood

Presentation to the ESF Member Organisation Forum on Peer Review
6 December 2011

Copyright © 2011, Fiona Wood
Why invest in research?

To change the world

- Economically
- Socially
- Environmentally


Where to invest?
- High-risk, high-return, curiosity-driven research (from the internet to lasers to the sex life of the screw worm)
- Research geared to the grand societal challenges
- Creation of new innovation and entrepreneurship skills

How to invest?
- Support basic/frontier and applied research separately
- Develop performance measures that match the different expectations




How well do bibliometrics measure the value of investment in these areas?




Not well…




Image source: http://nuclear-news.net/2010/10/16/the-emperors-new-nuclear-clothes/
Use and abuse of Bibliometrics
- Count publications, patents and citations to develop S&T performance indicators (a toy sketch follows below)
- Treat them as proxies for performance
- Use them to guide policy development and the allocation of funds, grants, promotion etc.

OECD Frascati Manual; R. N. Kostoff, ‘Use and Misuse of Metrics in Research Evaluation’, Science and Engineering Ethics (1997) 3, 109–120
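
As a toy illustration of how mechanical these indicators are, here is a minimal Python sketch (invented data; no agency’s actual pipeline) that computes three common proxies – publication count, total citations and the h-index – from a list of papers:

```python
# Toy bibliometric indicators from an invented publication list.
# Illustrative only: real S&T indicators draw on curated databases,
# field normalisation and citation windows.

def h_index(citations: list[int]) -> int:
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(ranked, start=1) if c >= i)

papers = [  # (title, citation count) -- invented data
    ("Paper A", 42), ("Paper B", 10), ("Paper C", 7),
    ("Paper D", 3),  ("Paper E", 0),
]

counts = [c for _, c in papers]
print("publications:", len(papers))    # a measure of quantity...
print("total citations:", sum(counts))
print("h-index:", h_index(counts))     # -> 3; still says nothing about quality
```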




BUT…

‘…bibliometrics does not properly take into account the originality of the academic’s work, conceptual innovation, research applications, scientific and industrial utility’

Institut de France, Académie des sciences, 2011



…and furthermore…
- Citation counts are a measure of quantity, not quality
- Bias towards English-language publication, established researchers and low-risk research
- Induce conformity to the ‘exigencies’ of the measure (salami publications, risk-averse research, fraud)
- Demoralise and disenfranchise researchers

….leading to….
- enormous, fragmented data sets of dubious meaning
- a massive, disaggregated literature
- an aggressive band of metrics zealots




HOW NOT TO FIX IT…




Excellence in Research for Australia (ERA) – Objectives
- establish an evaluation framework that gives government, industry, business and the wider community assurance of the excellence of research conducted in Australia’s institutions;
- provide a national stocktake of discipline-level areas of research strength and areas where there is opportunity for development in Australia’s higher education institutions;
- identify excellence across the full spectrum of research performance;
- identify emerging research areas and opportunities for further development; and
- allow for comparisons of Australia’s research nationally and internationally for all discipline areas.




ERA indicators
1. research quality (publications, citations, research income, peer review)
2. research volume and activity
3. research application
4. esteem measures

Unit of analysis – the research discipline at each institution
Rating scale (magically) derived from the indicators:
5  – well above world standard
4  – above world standard
3  – at world standard
2  – below world standard
1  – well below world standard
NA – research outputs did not meet the volume threshold

- No operational definition of ‘world standard’ (see the sketch below)
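
Because no operational definition was published, any mapping from the underlying indicators to this scale is essentially arbitrary. A hypothetical sketch (the benchmark and thresholds below are invented, not ERA’s) makes the problem concrete:

```python
# Hypothetical mapping from a citation indicator to the ERA 1-5 scale.
# The benchmark and thresholds are invented for illustration only;
# ERA published no operational definition of 'world standard'.

def era_rating(citations_per_paper: float, world_benchmark: float,
               n_outputs: int, volume_threshold: int = 50) -> str:
    if n_outputs < volume_threshold:
        return "NA (below volume threshold)"
    ratio = citations_per_paper / world_benchmark
    if ratio >= 1.6: return "5 (well above world standard)"
    if ratio >= 1.2: return "4 (above world standard)"
    if ratio >= 0.8: return "3 (at world standard)"
    if ratio >= 0.5: return "2 (below world standard)"
    return "1 (well below world standard)"

# The same unit, judged against two equally defensible benchmarks:
print(era_rating(6.0, world_benchmark=5.0, n_outputs=120))  # -> 4
print(era_rating(6.0, world_benchmark=7.5, n_outputs=120))  # -> 3
```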
Problems
- Assessed basic and applied research together
- Difficulties with apportioning MICT to classification codes
- Used a ranked journals list
- Did not measure impact
- Process not transparent

Undesirable outcomes
- Attributing meaning to a meaningless classification system
- Discontinuation of fields of research and researcher positions
- Poaching of ‘star research teams’




Undesirable outcomes (cont.)
- Committed just over $35 million, occupied over 600 people’s time, and assessed 330,000 research outputs from 41 higher education institutions without addressing any of the real questions

Fiona Wood (2011) ‘ERA: an ailing emperor’s new clothes’, R&D Review, Feb–Mar: 12–13 (invited op-ed)


HOW TO FIX IT
- Develop measures that reflect the different purposes of investments in curiosity-driven and mission-driven research
- Capture data at sufficiently granular levels
- Increase global collaboration between funding agencies in developing and trialling new measures
Good Models
- Lattes Platform (Brazil)
- SIAMPI pilot (KNAW)
- STAR-METRICS (USA)




Lattes Platform, Brazil
- Used in making funding decisions by government and in universities for tenure and promotion
- High-quality data on the history of scientific, academic and professional activities of researchers and institutions
- Incentives for compliance
- Unique researcher identifiers (see the sketch below)

http://lattes.cnpq.br/english/index.htm
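
A minimal sketch of the kind of identifier-keyed CV record such a platform maintains (an invented shape for illustration, not the actual Lattes schema):

```python
# Invented record shape, NOT the actual Lattes schema. The point is
# that every activity is linked to one unique researcher identifier,
# which is what makes the data reliable enough for funding decisions.
from dataclasses import dataclass, field

@dataclass
class ResearcherCV:
    lattes_id: str                       # unique researcher identifier
    name: str
    institution: str
    publications: list[dict] = field(default_factory=list)
    projects: list[dict] = field(default_factory=list)

cv = ResearcherCV(lattes_id="1234567890123456",
                  name="A. Researcher", institution="Universidade X")
cv.publications.append({"title": "Some paper", "year": 2010})
print(cv.lattes_id, len(cv.publications))
```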
SIAMPI
- Social Impact Assessment Methods for research and funding instruments through the study of Productive Interactions between science and society
- Offers a framework to assist in the identification and assessment of the social impact of research activities, programmes and organizations
- Case studies in Health, ICT, Nanotechnology and SS&H

NB: see also the ESRC’s initiatives in the UK on capturing impact
http://www.siampi.eu/Pages/SIA/12/643.bGFuZz1FTkc.html


STAR-METRICS
- Science and Technology in America’s Reinvestment – Measuring the Effects of Research on Innovation, Competitiveness and Science
- A joint initiative of NSF, NIH and OSTP

https://www.starmetrics.nih.gov/Star/Participate#about
Phase 1 – focus on job creation from stimulus spending
1. Build a clean database
2. Use university administrative records to calculate the employment impact of federal science spending through the American Recovery and Reinvestment Act and agencies’ existing budgets (a simplified sketch follows below)
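
A minimal sketch of the Phase 1 calculation, assuming a simplified payroll extract (field names, data and the FTE logic are invented; the real STAR-METRICS pipeline is considerably richer):

```python
# Simplified sketch: estimate jobs supported by federal awards from
# university payroll records. Data and field layout are invented;
# real records also cover subawards, vendors, overheads, etc.
from collections import defaultdict

payroll = [  # (award_id, employee_id, FTE fraction charged to award)
    ("NSF-001", "emp-1", 1.00),
    ("NSF-001", "emp-2", 0.50),
    ("NIH-042", "emp-2", 0.50),
    ("NIH-042", "emp-3", 0.25),
]

fte_by_award = defaultdict(float)
people_by_award = defaultdict(set)
for award, emp, fte in payroll:
    fte_by_award[award] += fte
    people_by_award[award].add(emp)

for award in sorted(fte_by_award):
    print(f"{award}: {fte_by_award[award]:.2f} FTE across "
          f"{len(people_by_award[award])} people")
```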



Phase 2 – focus on impact
1. Economic growth – through indicators such as patents and business start-ups
2. Workforce outcomes – measured by student mobility into the workforce and employment markers
3. Scientific knowledge – measured through publications and citations
4. Social outcomes – measured by the long-term health and environmental impact of funding




STAR-METRICS’ use of visual analytics
- Visualisation techniques applied to complex data sets, both to understand and describe the scientific and innovation enterprise and to explain outcomes to policy makers
- Utilising visual-analytics techniques developed by Homeland Security to describe terrorist networks in order to describe scientific ones (a toy example follows below)
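
A minimal sketch of the network structure such visual analytics start from – a co-authorship graph built from publication records (toy data; the actual STAR-METRICS tooling is far richer):

```python
# Toy co-authorship network of the kind visual-analytics tools render.
# Invented data; requires networkx (pip install networkx).
from itertools import combinations
import networkx as nx

papers = [  # invented author lists
    ["Ada", "Ben", "Cho"],
    ["Ben", "Cho"],
    ["Cho", "Dev"],
]

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):      # every co-author pair
        w = G.get_edge_data(a, b, default={}).get("weight", 0)
        G.add_edge(a, b, weight=w + 1)         # count joint papers

# A simple centrality score stands in for "who holds the network together".
print(nx.degree_centrality(G))
```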
Where next for European funding councils?
- Important to develop protocols about what data to collect, what they mean and how to use them
- Need a better balance between accountability and risk, requiring major cultural changes in the people and organizations involved in the sponsorship and evaluation of research
- Collaborate electronically in the development of a universal template for reporting scientific achievements, as proposed by Julia Lane (NSF)



ERAB has made a great start
- Recommendations and advice provided by the European Research Area Board in its 2011 contribution to the ‘Common Strategic Framework for Research and Innovation’ consultation




Including…
- Be ambitious and be prepared to take managed risks for the sake of the European economy.
- Divide support between curiosity-driven and mission-driven research, the latter to include both high-risk enabling technologies and further support for European competitiveness.
- Create a number of independent, arm’s-length funding agencies to support and govern different types of excellent research and innovation.



THANK YOU FOR YOUR ATTENTION!

              fwood@une.edu.au




