WP Priority Setting, Paris, 29-30 Oct. 2007

Notes
  • Why is it important that the selection is 'conscious'? Because 'patterns' and 'specialisations' are emergent properties of a system
  • To answer this question, we must take a brief look at the state of the art in evaluation
  • Numerous studies of the impacts of R&D have been carried out in recent decades
  • This is a task for the sociologists and the historians of science

    1. Issues in Priority Setting
       • Wolfgang Polt, Joanneum Research
       • [email_address]
       • OECD Workshop on Rethinking Evaluation in Science and Technology
       • Paris, 30.10.2007
    2. Definitions
       • Priority setting (PS): the conscious/deliberate selection of certain activities/actors/policies at the expense of others, with an impact on resource allocation
       • Evaluation: the systematic assessment of the rationale, implementation and impact of a policy intervention
    3. Priority Setting
    4. Typical questions
       • "Shall we invest more in basic research instead of innovation?"
       • "Which technologies have the highest private and social returns?"
       • "Shall we set up a lab for earth observation, or rather a new particle collider?"
       • …
    5. Dimensions of the PS process
       • Types of priorities:
         • thematic priorities (technologies, societal missions)
         • functional/generic priorities
       • Levels: the hierarchical position of the different priority-setting actors and/or institutions
       • Nature of the priority-setting process (e.g. top-down/expert-based vs. bottom-up/participatory, degree of formalisation, mechanisms for implementation and evaluation)
    6. Paradigms of Priority Setting (Source: Gassler, Polt, Rammer 2007)
    7. Paradigms of Priority Setting, continued (Source: Gassler, Polt, Rammer 2007)
    8. Actors in Priority Setting
       • A multitude of actors (as a function of the size, development and complexity of the research, innovation and policy systems):
         • federal and regional governments
         • … supported by S&T policy councils or other advisory bodies …
         • research councils and funding agencies
         • research performers (enterprises, PROs, research teams, …)
       • … with different needs, perspectives and capacities with respect to priority setting
    9. Actors in Priority Setting: ministries and agencies
    10. Means of Priority Setting
       • government white papers
       • budget plans and allocations
       • targeted research and technology programmes
       • government procurement
       • institutions (profiling, specialisation)
       • performance-based contracting
       • clusters / technology platforms
       • strategic research agendas
       • …
    11. Conceptual underpinning of technology-centred PS
       • Lists of technologies: strategic, critical, key, emerging, pathbreaking, infrastructural, generic, general-purpose, disruptive
       • "… most of these lists of technologies remain at a level which makes them only a poor guide for policy …" (Branscomb 1994)
    12. Trends in Priority Setting
       • technology planning and forecasting (1960s, 1970s)
       • technology assessment, technology foresight and roadmapping (1980s, 1990s)
       • a trend towards "expertise-supported consultation mechanisms"
       • a trend towards programme- and performance-based funding (instead of institutional block grants) as a means of priority setting for PROs
       • … but always (and mainly!) a process of political bargaining
    13. Evaluation
    14. Evaluation in the Policy Cycle
       • Evaluation (ex ante / interim / ex post) is one of several tools of 'strategic policy intelligence' used for PS. Others that have been used include (various forms of):
         • technology assessment
         • technology foresight
         • technology roadmapping
    15. Evaluation in the Policy Cycle (diagram relating foresight, technology assessment and technology roadmapping to the policy cycle; figure not reproduced)
    16. Evaluations in Priority Setting
       • So far, evaluations have not been used systematically for PS
       • Ex-ante evaluation is less developed than interim and ex-post evaluation
       • The trend towards programme- and performance-based funding might increase the role of evaluations for PS
       • Future: PS as an outcome of a broader, systemic and continuous 'strategic policy intelligence' (foresight, monitoring, evaluation, assessment)?
    17. The challenge for evaluations in priority setting
       • Main challenge: to be able to compare between alternatives. In the ideal case:
         • at a given point in time (e.g. the budget decision)
         • using metrics that ensure comparability (one illustrative formalisation follows below)
         • at reasonable cost
       • The main question: can we assess the future impacts (on economy and society) of technological developments, and of the policy interventions addressing them, sufficiently well to allow for prioritisation?
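One way to make the comparability requirement concrete (an illustration added here, not a metric proposed in the talk) is to rank alternatives by expected social net present value per unit of public cost, estimated with the same information set at the common decision date:

```latex
% Illustrative only: rank funding alternatives i by expected social NPV
% per unit of public cost C_i, conditioning all estimates on the same
% information set I_{t_0} available at the decision date t_0.
\[
  s_i \;=\; \frac{\mathrm{E}\!\left[\,\mathrm{NPV}^{\mathrm{social}}_{i} \mid \mathcal{I}_{t_0}\right]}{C_i}
\]
% A higher score s_i ranks alternative i higher; the slide's point is
% precisely that estimating the numerator well enough is the hard part.
```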
    18. Dimensions of Evaluation
       • Appropriateness (Are we doing the right thing? Is the policy intervention justified?)
       • Quality and efficiency of implementation (Are we doing it well? Is the programme management working?)
       • Effects and impacts (What happens as a result of the programme?)
       • Conclusions and feedback for policy (policy learning to improve appropriateness and implementation)
    19. Scope and Limits of Evaluations
       • Scope:
         • Appropriateness: progress in applying 'logic chart' models (see e.g. Jordan 2004)
         • Quality and efficiency of implementation: most evaluations have centred on this aspect; qualitative approaches are sufficiently well developed
         • Effects and impacts: macro-, meso- and micro-level analysis, with progress in some instances
         • Additionality (input/output/behavioural): substantial methodological progress, e.g. micro-econometric work including control-group approaches (see OECD 2006); a minimal sketch follows below
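To illustrate the control-group idea behind the additionality work cited above, here is a minimal difference-in-differences sketch in Python. All names and figures are hypothetical; real studies use matched samples and regression-based estimators rather than simple group means.

```python
# Minimal difference-in-differences sketch for input additionality:
# compare the change in R&D spending of funded firms with the change
# in a control group of comparable non-funded firms.

def did_estimate(treated_before, treated_after, control_before, control_after):
    """Difference-in-differences: (change in treated) - (change in controls)."""
    mean = lambda xs: sum(xs) / len(xs)
    delta_treated = mean(treated_after) - mean(treated_before)
    delta_control = mean(control_after) - mean(control_before)
    return delta_treated - delta_control

# Hypothetical R&D outlays (million EUR) before/after a funding programme
treated_before = [1.0, 1.2, 0.9, 1.1]
treated_after  = [1.6, 1.9, 1.4, 1.7]
control_before = [1.0, 1.1, 1.0, 0.9]
control_after  = [1.2, 1.3, 1.1, 1.1]

effect = did_estimate(treated_before, treated_after, control_before, control_after)
print(f"Estimated additionality: {effect:+.2f} m EUR per firm")
```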
    20. Approaches to Impact Assessment (by level)
       • Macro (aggregate impact on productivity / GDP):
         • effects of R&D on productivity and growth, see e.g. Guellec and van Pottelsberghe (2001, 2004)
       • Meso (industries, technologies or programmes):
         • Mansfield (1977 ff.)
         • Beise and Stahl (1998), Fier (2004)
       • Micro (individual R&D projects, enterprises or institutions):
         • see e.g. Feller and Ruegg (2003) for an overview of results from the ATP programmes
       • A stylised version of the underlying framework follows below.
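The macro and meso studies cited above typically estimate variants of the standard Griliches-type R&D-productivity framework. A stylised version (not the exact specification of any of these papers) is:

```latex
% Stylised R&D-productivity regression; not the exact specification
% used by the studies cited on this slide.
\[
  \Delta \ln \mathrm{TFP}_{it}
    \;=\; \alpha \;+\; \rho \,\frac{R_{it}}{Y_{it}} \;+\; \varepsilon_{it}
\]
% R_{it}/Y_{it} is R&D intensity (R&D spending over output); the
% coefficient \rho is read as the (gross, excess) rate of return on R&D,
% i.e. the quantity tabulated in the macro/meso estimates on slide 22.
```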
    21. Approaches to Impact Assessment (by method)
       • econometric models
       • (social) cost-benefit analysis (a minimal sketch follows below)
       • surveys (of beneficiaries, innovation surveys)
       • case studies
       • sociometric and social network analysis
       • bibliometrics: counts, citations, content analysis
       • historical tracing of 'critical technological events'
       • expert judgment
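As a minimal sketch of the social cost-benefit calculus behind the return estimates on the following slides: discount a stream of social benefits and compare it with the public cost. The benefit profile and discount rate below are hypothetical.

```python
# Minimal social cost-benefit sketch for an R&D programme.
# All figures are hypothetical.

def npv(cashflows, rate):
    """Net present value of cashflows[t] received in year t (t = 0, 1, ...)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

public_cost = 10.0                       # m EUR, spent in year 0
social_benefits = [0, 0, 2, 4, 6, 6, 4]  # m EUR per year, hypothetical profile
discount_rate = 0.05

benefits_pv = npv(social_benefits, discount_rate)
print(f"PV of social benefits: {benefits_pv:.1f} m EUR")
print(f"Benefit-cost ratio:    {benefits_pv / public_cost:.2f}")
print(f"Social NPV:            {benefits_pv - public_cost:+.1f} m EUR")
```

Note how sensitive such figures are to the assumed benefit profile and discount rate: this is exactly the 'wide range of estimates' problem flagged on slide 25.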
    22. Macro and meso estimates of the return on R&D (table not reproduced; Source: Godin and Doré)
    23. Meso/micro level: social rates of return for individual technologies (table not reproduced; Source: Link 1999)
    24. Micro level: 'stylised results' from short-term ex-post CBA for individual projects. Are these differences really a guide for public investment? (Source: Bessette 2003)
    25. Scope and Limits of Evaluations
       • Problems:
         • timing of effects → timing of evaluation
         • uncertainty about outcomes → wide range of estimates
         • attribution to causes and the 'project fallacy'
         • data availability
         • costs of monitoring and evaluation
       • Limits:
         • mostly in ex-ante and ex-post impact assessment (example: the FP assessments)
         • less so, but still considerable, in the identification of additionality
    26. Timing of expected economic effects from ATP projects (stylised timeline from year -1 to 10+; milestones: announce competition, announce award, complete project, post-project period)
       • Short-term (benefits to awardees): proposal preparation, joint venture formation, resource identification, increased R&D spending, expanded goals, acceleration, collaboration (R&D efficiencies), technology employment opportunities, technological advances
       • Mid-term: spin-off products; product improvements (costs, quality); new applications identified; new products and processes; new business opportunities; new business alliances; company growth; early spillover benefits
       • Long-term (total economic benefits): more new products and processes; intra- and inter-industry diffusion; market expansion; employment opportunities (production, distribution); private ROI; spillover benefits; taxpayer ROI
    27. Scope and Limits of Evaluations
       • Evaluations give us a good idea about the rationale, implementation and goal attainment of programmes,
         • … but only in a few cases quantitative evidence about the economic and wider social impacts, and in any case not in a way that would allow strict comparison
       • They are able to demonstrate positive private returns and externalities of R&D at the macro, meso and micro levels,
         • … but only in terms of orders of magnitude and with a considerable range of estimates
    28. Scope and Limits of Evaluations
       • Thus, evaluations have mainly been instruments for programme management and for (ex-ante and ex-post) legitimisation of policy intervention
       • What they do not provide is a robust basis for exact resource allocation,
         • … because most were not of a quantitative nature (see Licht 2007)
         • … and almost all were done with a limited scope (in terms of technologies, impact classes, instruments, …)
    29. Towards realistic expectations
       • "It is clear that the information requirements […] far exceed what is likely to be available in any practical situation and may in themselves place undue transaction costs upon the subsidy." (Georghiou/Clarysse in OECD 2006)
       • "… the precise allocation […] is not important, as long as it is sufficiently diversified. Rather than attempting to refine the allocations, energy and resources may be more productively focused on ways to improve links within the research system." (Pannell 1999)
    30. Where shall we go from here?
       • Path 1: push the envelope. Further improve evaluation methods and practice, e.g. in the following directions:
         • option values (see Vonortas 2003; a minimal sketch follows below)
         • micro-econometric modelling (e.g. the current OECD project)
         • 'evaluations in context': evaluation of different instruments in a systemic perspective (e.g. the forthcoming evaluation of the Austrian funding system)
       • … but there is an inherent limit to how far we can get!
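To illustrate the 'option value' idea: a staged programme treats a pilot grant as buying the option to fund full deployment only if the intermediate result is good. The one-period sketch below is an illustration under hypothetical numbers, not the method of Vonortas (2003).

```python
# Minimal real-options sketch: a pilot grant buys the option, one period
# later, to fund full deployment only if the pilot looks promising.
# All figures are hypothetical.

p_success   = 0.4    # probability the pilot looks promising
payoff_good = 30.0   # m EUR social payoff if deployed after a good pilot
deploy_cost = 12.0   # m EUR cost of full deployment
pilot_cost  = 2.0    # m EUR cost of the pilot (the option premium)
rate        = 0.05   # discount rate over the pilot period

# With the option, deployment happens only in the good state:
staged = p_success * max(payoff_good - deploy_cost, 0.0) / (1 + rate) - pilot_cost
print(f"Expected value of the staged programme: {staged:+.2f} m EUR")

# Pre-committing to deployment ignores the option to abandon:
committed = p_success * payoff_good / (1 + rate) - deploy_cost - pilot_cost
print(f"Value if deployment is pre-committed:   {committed:+.2f} m EUR")
```

The difference between the two figures is the value of the option to abandon, which conventional ex-ante NPV comparisons miss.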
    31. Where shall we go from here?
       • Path 2: turn the question around. Evaluate the priority-setting processes themselves!
         • Which PS processes were able to influence the directions of R&D and the scientific and technological specialisation patterns of an innovation system?
         • Can 'good practices' for PS processes be identified?
