This document summarizes a study on innovation system evaluation practices across EU member states. The researchers conducted interviews with heads of innovation policy in 28 EU countries to develop a typology of system evaluation approaches. They identified four types of approaches: 1) permanent evaluation structures with centralized institutional support, 2) permanent structures with decentralized support, 3) ad hoc evaluation exercises, and 4) no dedicated evaluation. The majority of countries fell into types 3 and 4. The study aims to provide empirical evidence on different system evaluation models used in policymaking across Europe.
1. Towards a Typology of Innovation
System Evaluation Practices:
Evidence from EU Member States
Mart Laatsit (Copenhagen Business School, Denmark)
Susana Borrás (Copenhagen Business School, Denmark)
3. Background
• System of innovation concept, but no corresponding translation to
evaluation approaches
• Demand from policy-makers
• Supply efforts by European Commission and OECD
• Insufficient academic attention
• Lack of empirical evidence
4. Conceptualisation
• Previous work
• System evaluation as sum of individual evaluations (Edler et al 2008, Magro
and Wilson 2013)
• System evaluation as an ‘analysis of system health’ (Arnold 2004)
• Our definition of system evaluation
• Evaluation framework that assesses the innovation policies and institutional
frameworks of the innovation system in a way that provides an encompassing
evaluative view
5. System evaluation
• Combining information from three levels relevant to policy-making:
the programme level, policy-mix level and economic performance
level
[Diagram: programme, policy-mix and economic performance levels feeding into the system evaluation]
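The three-level structure above can be expressed as a small data model. The following is a minimal, hypothetical Python sketch; the class name, field names and example entries are illustrative assumptions, not part of the study:

```python
# Hypothetical sketch: a system evaluation combines evidence from three
# levels relevant to policy-making. All names and sample data below are
# illustrative, not taken from the study.
from dataclasses import dataclass, field


@dataclass
class SystemEvaluation:
    """Evidence pooled from the three levels of the innovation system."""
    programme_evaluations: list = field(default_factory=list)   # micro level
    policy_mix_reviews: list = field(default_factory=list)      # meso level
    performance_indicators: dict = field(default_factory=dict)  # macro level

    def is_encompassing(self) -> bool:
        # In this sketch, an evaluation counts as "system evaluation"
        # only when all three levels contribute evidence.
        return bool(self.programme_evaluations
                    and self.policy_mix_reviews
                    and self.performance_indicators)


ev = SystemEvaluation(
    programme_evaluations=["R&D grant scheme review"],
    policy_mix_reviews=["instrument balance study"],
    performance_indicators={"BERD share of GDP": 1.4},
)
print(ev.is_encompassing())  # True
```

The point of the sketch is only that the three levels are complementary: dropping any one of them leaves the evaluative view incomplete.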
7. Data collection and analysis
• Semi-structured interviews with 28 EU member states
• Heads of innovation policy/ innovation policy analysis
• Two-stage analysis
[Diagram: stage 1, YES/NO screening; stage 2, TYPOLOGY]
8. Empirical findings I
[Table: per-country coverage of OECD country-reviews, ERAC/CREST peer-reviews and national evaluations across all 28 member states (AT, BE, BG, HR, CY, CZ, DK, EE, FI, FR, DE, EL, HU, IE, IT, LV, LT, LU, MT, NL, PL, PT, RO, SI, SK, ES, SE, UK)]
9. Towards a typology
• Peter Dahler-Larsen (2012), "The Evaluation Society"
• Permanence: institutional support; routines; continuity
• Coverage: scope of phenomena studied
• Organizational responsibility: accountability, reporting, placement
• Prospective use: planned form of use
• Epistemological perspective: assumptions about concepts, methods and data
Arnold (2004) describes system evaluation as an 'analysis of system health'. While unclear on the exact methods to be used in this analysis, he suggests it should cover the performance of the major institutional blocks in the system, connectivity within the system, as well as knowledge and capabilities.
In other words, system evaluations are those with a wide coverage of the object under assessment (the innovation system), taking into consideration individual policy programmes (micro-level), policy mixes (meso-level) and innovative performance (macro-level).
It can be claimed that an evaluation can be called systemic when it helps to assess the two dimensions that make up an innovation system: the policy dimension (i.e. the institutional set-up) and the innovation performance dimension (i.e. aggregate data on firm performance). Furthermore, assessing the institutional set-up comprises both policy-level evaluations and an evaluation of the overall policy-mix. In fact, it is the policy-mix dimension that provides the connection between programme evaluations and overall economic indicators on innovation performance. Therefore, we see the policy-mix dimension as crucial in understanding the link between policy and its economic effects.
Does the money pay off? Is the balance between policy instruments optimal? What is the role of individual instruments in the system?
Semi-structured interviews with policy-makers from the 28 EU member states: heads of innovation policy or other senior officials in the field of innovation policy.
Data on general evaluation practices and system evaluations, specifically:
• how is the performance of a country's innovation policy evaluated?
• who are the actors involved?
• what are the main characteristics of the evaluations in terms of scope, regularity and methods?
Two-stage analysis:
1. determine whether robust system evaluation practices exist in a country
2. identify the types of system evaluation practices and potential clustering
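The two-stage logic above, screening for dedicated evaluation and then assigning one of the four types from the typology, can be illustrated with a short, hypothetical sketch. The dictionary keys and the decision order are illustrative assumptions, not the authors' actual coding scheme:

```python
# Hypothetical sketch of the two-stage analysis: stage 1 is the YES/NO
# screen (is there any dedicated evaluation?), stage 2 assigns one of the
# four types identified in the study. Keys and ordering are illustrative.

def classify(country: dict) -> str:
    """Return one of the four evaluation-approach types for a country."""
    # Stage 1: YES/NO screen.
    if not country.get("dedicated_evaluation"):
        return "4: no dedicated evaluation"
    # Stage 2: typology.
    if not country.get("permanent_structure"):
        return "3: ad hoc evaluation exercises"
    if country.get("centralised_support"):
        return "1: permanent structure, centralised institutional support"
    return "2: permanent structure, decentralised support"


sample = {"dedicated_evaluation": True,
          "permanent_structure": True,
          "centralised_support": False}
print(classify(sample))  # 2: permanent structure, decentralised support
```

The sketch also mirrors the empirical finding that most countries fall into types 3 and 4: any country failing the stage-1 screen or lacking a permanent structure lands there.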
OECD and ERAC/CREST (CREST 2005-2008, ERAC 2010-2014, PSF 2015)
8 countries with OECD country-reviews
16 countries with ERAC/CREST peer-reviews
5 with both OECD and ERAC/CREST, 8 with none
Feature and operationalisation:
Time/Permanence:
* How stable is the evaluation system?
* Institutional support?
* Rules, regulations, incentives, routines?
* Experiences with more than one round of evaluation?
Organizational responsibility:
* Who is held accountable for what in the evaluation system?
* Reporting structures?
* Political and organizational placement of the evaluation machine?
Epistemological perspective:
* Theoretical assumptions about the innovation system and how it works
* Assumptions about indicators (of what)
* Methodological assumptions
* Data collection processes