Issues in Priority Setting
Wolfgang Polt, Joanneum Research, [email_address]
OECD Workshop on Rethinking Evaluation in Science and Technology, Paris, 30.10.2007
Definitions
Priority setting (PS): the conscious/deliberate selection of certain activities/actors/policies at the expense of others, with an impact on resource allocation.
Evaluation: the systematic assessment of the rationale, implementation and impact of a policy intervention.
Priority Setting
Typical questions
“Shall we invest more in basic research instead of innovation?”
“Which technologies have the highest private and social returns?”
“Shall we rather set up a lab for earth observation or a new particle collider?”
…
Dimensions of the PS process
Types of priorities: thematic (technologies, societal missions) vs. functional/generic
Levels: the hierarchical position of the different priority-setting actors and/or institutions
Nature of the priority-setting process (e.g. top-down/expert-based vs. bottom-up/participatory, degree of formalization, mechanisms for implementation, evaluation)
Paradigms of Priority Setting Source: Gassler, Polt, Rammer 2007
Paradigms of Priority Setting Source: Gassler, Polt, Rammer 2007
Actors in Priority Setting
A multitude of actors (as a function of the size, development and complexity of the research, innovation and policy systems):
Federal and Regional Governments, supported by S&T policy councils or other advisory bodies
Research Councils and Funding Agencies
Research Performers (Enterprises, PROs, Research Teams, …)
…with different needs, perspectives and capacities with respect to priority setting
Actors in Priority Setting – ministries and agencies
Means of Priority Setting
Government White Papers
Budget plans & allocations
Targeted Research and Technology Programmes
Government Procurement
Institutions (Profiling, Specialisation)
Performance-Based Contracting
Clusters / Technology Platforms
Strategic Research Agendas
…
Conceptual underpinning of technology-centred PS: Lists of Technologies
Strategic, Critical, Key, Emerging, Pathbreaking, Infrastructural, Generic, General Purpose, Disruptive
“…most of these lists of technologies remain at a level which makes them only a poor guide for policy…” Branscomb (1994)
Trends in Priority Setting
Technology planning and forecasting (1960s, 70s)
Technology Assessment, Technology Foresight and Roadmapping (1980s, 90s)
Trend towards “expertise-supported consultation mechanisms”
Trend towards programme- and performance-based funding (instead of institutional block grants) as a means of priority setting for PROs
…but always (and mainly!) a process of political bargaining
Evaluation
Evaluation in the Policy Cycle
Evaluation (ex ante / ex post / interim) is one of several tools of ‘strategic policy intelligence’ that is used for PS. Others that have been used include (various forms of):
Technology Assessment
Technology Foresight
Technology Roadmapping
Evaluation in the Policy Cycle
[Diagram: Foresight, TA and Technology Roadmapping positioned in the policy cycle]
Evaluations in Priority Setting
So far, evaluations have not been used systematically for PS
Ex-ante evaluation is less developed than interim and ex-post evaluation
The trend towards programme- and performance-based funding might increase the role of evaluations for PS
Future: PS as an outcome of a broader, systemic and continuous ‘Strategic Policy Intelligence’ (Foresight, Monitoring, Evaluation, Assessment)?
The challenge for Evaluations in Priority Setting
Main challenge: to be able to compare between alternatives. In the ideal case:
At a given point in time (e.g. the budget decision)
Using metrics that ensure comparability
At reasonable cost
The main question: can we assess the future impacts (on economy and society) of technological developments, and of the policy interventions addressing them, sufficiently well to allow for prioritization?
Dimensions of Evaluation
Appropriateness (Are we doing the right thing? Is the policy intervention justified?)
Quality and efficiency of implementation (Are we doing it well? Is the programme management working?)
Effects and impacts (What happens as a result of the programme?)
Conclusions and feedback for policy (policy learning to improve appropriateness and implementation)
Scope and Limits of Evaluations
Scope:
Appropriateness: progress in applying ‘logic chart’ models (see e.g. Jordan 2004)
Quality and efficiency of implementation: most evaluations have centred on this aspect; sufficiently well-developed qualitative approaches
Effects and impacts: macro / meso / micro level analysis – progress in some instances
Additionality (input/output/behavioural): substantial methodological progress (e.g. micro-econometric work including control-group approaches; see OECD 2006)
Approaches to Impact Assessment
Macro (aggregate impact on productivity / GDP): effects of R&D on productivity and growth, see e.g. GUELLEC and VAN POTTELSBERGHE (2001, 2004)
Meso (on the level of industries, technologies or programmes): MANSFIELD (1977 ff.), BEISE and STAHL (1998), FIER (2004)
Micro (on the level of individual R&D projects, enterprises or institutions): see e.g. FELLER and RUEGG (2003) for an overview of results from the ATP programmes
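The macro-level approach above can be sketched in a few lines of code. The snippet below is an illustration only, not material from the presentation: it generates synthetic data with an assumed "true" R&D elasticity and recovers it with a log-log OLS regression, the basic shape of production-function estimates in the Guellec/van Pottelsberghe tradition, here heavily simplified.

```python
import numpy as np

# Illustrative sketch only: synthetic data with an assumed "true" elasticity,
# recovered by OLS on the log-log production function log(y) = a + b*log(RD).
rng = np.random.default_rng(0)

true_elasticity = 0.13                                  # assumption for the fake data
rd_stock = rng.uniform(1.0, 100.0, size=200)            # hypothetical R&D capital stocks
productivity = 5.0 * rd_stock ** true_elasticity * np.exp(rng.normal(0.0, 0.05, size=200))

# OLS via least squares on the design matrix [1, log(RD)]
X = np.column_stack([np.ones_like(rd_stock), np.log(rd_stock)])
coef, *_ = np.linalg.lstsq(X, np.log(productivity), rcond=None)
print(f"estimated elasticity of productivity w.r.t. R&D stock: {coef[1]:.3f}")
```

With clean synthetic data the estimate lands close to the assumed 0.13; with real country or firm panels, measurement problems and spillovers are what produce the wide ranges discussed later in the talk.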
Approaches to Impact Assessment
Econometric Models
(Social) Cost-Benefit Analysis
Surveys (beneficiaries, innovation surveys)
Case Studies
Sociometric and social network analysis
Bibliometrics – counts, citations, content analysis
Historical tracing of ‘critical technological events’
Expert judgment
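To make one entry in this list concrete: a (social) cost-benefit analysis ultimately reduces competing programmes to comparable net-present-value figures. The sketch below uses invented cash flows and an assumed 4% social discount rate; it shows only the mechanics, not real programme data.

```python
# Hedged illustration: comparing two hypothetical programmes by net present value.
# All cash flows and the discount rate are invented for the example.

def npv(cash_flows, discount_rate):
    """Net present value of yearly cash flows, year 0 first (costs negative)."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

rate = 0.04  # an assumed social discount rate

programme_a = [-100, -20, 10, 40, 60, 60, 50]    # purely illustrative numbers
programme_b = [-100, -50, -20, 30, 80, 100, 90]

for name, flows in [("A", programme_a), ("B", programme_b)]:
    print(f"programme {name}: NPV = {npv(flows, rate):.1f}")
```

The arithmetic is trivial; the hard part, as the following slides stress, is obtaining credible benefit streams in the first place.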
Macro and Meso estimates of Return on R&D Source: Godin and Doré
Meso/Micro level: Social rates of return of individual technologies. Source: LINK (1999)
Micro level: ‘Stylized results’ from short-term ex-post CBA for individual projects
…but are these differences clearly a guide for public investment? Source: Bessette, 2003
Scope and Limits of Evaluations
Problems:
Timing of effects → timing of evaluation
Uncertainty about outcomes → wide range of estimates
Attribution to causes & the ‘project fallacy’
Data availability
Costs of monitoring and evaluation
Limits: mostly in ex-ante and ex-post impact assessment (example: FP assessments); less so, but still considerable, in the identification of additionality
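The point about uncertain outcomes producing wide ranges of estimates can be made concrete with a small Monte Carlo sketch. All numbers and distributions below are assumptions chosen for illustration: even under a single "true" model of an uncertain benefit stream, the plausible NPV interval spans a wide range.

```python
import numpy as np

# Sketch: Monte Carlo over an uncertain benefit stream. Cost, timing, discount
# rate and the lognormal spread are all invented for illustration.
rng = np.random.default_rng(42)
n_draws = 10_000
cost = 100.0

years = np.arange(3, 9)                                   # benefits assumed in years 3-8
benefits = rng.lognormal(mean=np.log(30.0), sigma=0.6,    # uncertain yearly benefits
                         size=(n_draws, years.size))
discount = 1.04 ** years
npvs = (benefits / discount).sum(axis=1) - cost

lo, hi = np.percentile(npvs, [5, 95])
print(f"90% interval for the NPV estimate: [{lo:.0f}, {hi:.0f}]")
```

The interval is wide even though every parameter here is known by construction; real evaluations face the additional problems of attribution, timing and data availability listed above.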
Timing of expected economic effects from ATP projects (stylized)
[Timeline chart, from announcement of the competition through award, project completion and the post-project period (year −1 to year 10 or more): short-term effects on awardees (proposal preparation, joint-venture formation, resource identification, increased R&D spending, expanded goals, acceleration, collaboration/R&D efficiencies, technological advances); mid-term effects (spin-off products, product improvements in costs and quality, new applications identified, new products and processes, new business opportunities and alliances, company growth, early spillover benefits); long-term effects (more new products and processes, intra- and inter-industry diffusion, market expansion, employment opportunities in production and distribution, private ROI, spillover benefits, taxpayer ROI, total economic benefits).]
Scope and Limits of Evaluations
They give us a good idea about the rationale, implementation and goal attainment of programmes,
…but only in a few cases quantitative evidence about the economic and wider social impacts, and in any case not in a way that would allow strict comparison.
They are able to demonstrate positive private returns and externalities of R&D on the macro, meso and micro levels,
…but only in terms of orders of magnitude and with considerable ranges of estimates.
Scope and Limits of Evaluations
Thus, they have mainly been instruments for programme management and for (ex-ante and ex-post) legitimization of policy intervention.
Alas, what they do not provide us with is a robust basis for exact resource allocation,
…because most were not of a quantitative nature (see Licht 2007)
…and almost all were done with a limited scope (in terms of technologies, impact classes, instruments, …)
Towards realistic expectations
Thus, “it is clear that the information requirements […] far exceed what is likely to be available in any practical situation and may in themselves place undue transaction costs upon the subsidy” (Georghiou/Clarisse in OECD 2006)
“…the precise allocation […] is not important, as long as it is sufficiently diversified. Rather than attempting to refine the allocations, energy and resources may be more productively focused on ways to improve links within the research system.” (PANNELL, 1999)
Where shall we go from here? Path 1: Push the envelope
Further improve evaluation methods and practice, e.g. in the following directions:
Option values (see Vonortas 2003)
Micro-econometric modelling (e.g. the current OECD project)
‘Evaluations in context’: evaluation of different instruments in a systemic perspective (e.g. the forthcoming evaluation of the Austrian funding system)
…but there is an inherent limit to how far we can get!
Where shall we go from here? Path 2: Turn the question around
Evaluate the priority-setting processes themselves!
Which PS processes were able to influence the directions of R&D and the scientific and technological specialisation patterns of an innovation system?
Can ‘good practices’ for PS processes be identified?


Editor's Notes

  • #3 Why important that it is ‚conscious‘: because ‚patterns‘ and ‚specialisations‘ are emergent properties of a system
  • #18 To answer this question we must take a short look into the state of the art in evaluation
  • #23 Numerous studies have been carried out in recent decades about the impacts of R&D
  • #30 This is a task for the sociologists and the historians of science