Cluster Evaluation:
Learning for Clusters and Policy-makers –
Completing the Virtuous Circle
TCI European Conference | Sofia, Bulgaria
21 March 2018
Exploring what happens and why?
• Better understanding the “magic in the middle”
What is Evaluation?
• Not audit – all about learning
• Evidencing the difference we make
• Influencing policy and delivery improvement simultaneously
Evaluation Questions
• What are we proposing to do?
• What are we aiming to achieve?
• Who cares or wants to know?
• Who needs to be involved?
• How much should we do?
• When should we start?
Evidencing Value
• Evidencing value – return on investment (ROI); a minimal worked example follows this list
– for participants
– for partners and stakeholders
– for funders
– for policy-makers
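A minimal sketch of the arithmetic behind an ROI figure (the programme numbers below are entirely hypothetical):

# Hypothetical sketch: return on investment as a simple ratio.
def roi(benefits, costs):
    # ROI = (benefits - costs) / costs
    return (benefits - costs) / costs

# Invented example: 1.2m of evidenced benefits from 0.4m of funding.
print(f"ROI: {roi(1_200_000, 400_000):.0%}")  # ROI: 200%

The harder part in practice is evidencing the benefits figure itself for each of the audiences above.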
Evaluation Scope
[Diagram: Why (overall objective) – What (activities/tasks undertaken) – How (partnership & collaboration)]
SEEING THINGS
• Effectiveness
• Efficiency
• Relevance
• Process
TCI CLUSTER EVALUATION WG
• A forum for learning collectively around common, complex & important cluster evaluation challenges
• Sharing learning
• Identifying gaps
• Trying new approaches
[Timeline 2013–2017: working group workshops in Forres, Belfast, Rzeszow, Barcelona and Oslo, alongside TCI Conferences in Kolding (2013), Monterrey (2014), Daegu (2015), Eindhoven (2016) and Bogota (2017)]
[Photo: Kolding TCI 2013]
SOME KEY OUTPUTS
• Conference presentations
• Booklet on Designing Cluster Evaluation (2014)
• Cluster evaluation boardgame (2014)
SOME KEY OUTPUTS
• ‘Perfect cluster’ evaluation framework (2016)
• Principles to guide cluster evaluation (2016)
• Set of firm-level survey questions aimed at capturing the human element of clusters (2016)
• Poster & short paper presented at OECD Blue Sky Forum (2016)
• Collected together on the website: www.tci-network.org/evaluation
Principles to Guide Evaluation
1. Evaluation for change
• Evaluation is about learning, not just audit
2. Different audiences need different outputs
3. Evaluation needs to reflect real world context
4. Capture evidence against
• Why (overall ambitions),
• What (projects and programmes) and
• How (collaborative behaviours)
5. Timing of evaluation – reflect the maturity of the cluster
6. Social capital and trust are often fundamental, so find ways to evidence softer issues
7. Causality is challenging, so gather a basket of evidence
* From In search of indicators to support the ‘perfect cluster’: Where evaluation theory collides with policy practice, Smith, M., Wise, E. and Wilson, J., OECD Blue Sky Forum on Innovation Indicators, Ghent, October 2016
KEY EMERGING ISSUES
• Capturing the ‘human element’ of cluster dynamics
– Use of survey data vs/alongside other types of data (firm-level data, big data, data from other policy programmes, participatory approaches to data collection …) – see the sketch after this list
• Balance & relationship between programme-level evaluations and cluster initiative evaluations
• Cluster evaluation and new industrial policies (incl. smart specialisation strategies) for structural transformation
• Cluster evaluation ‘beyond GDP’ as clusters contribute to broader societal challenges
(Core issues … looking to the future …)
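A minimal sketch of what ‘survey data alongside other types of data’ might look like in practice, assuming an invented survey table and invented firm-level records (pandas is used purely for illustration; all names and figures are hypothetical):

# Hypothetical sketch: combining 'human element' survey evidence
# with harder firm-level data. All values are invented.
import pandas as pd

# Survey responses: trust in the cluster, scored 1-5.
survey = pd.DataFrame({
    "firm_id": [1, 2, 3],
    "trust_score": [4, 2, 5],
})

# Administrative firm-level records.
firms = pd.DataFrame({
    "firm_id": [1, 2, 3],
    "employees": [12, 140, 35],
    "exporter": [True, False, True],
})

# Join the two evidence bases and compare softer and harder measures.
merged = survey.merge(firms, on="firm_id")
print(merged.groupby("exporter")["trust_score"].mean())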
Emerging Approaches to Cluster Evaluation
• Social network analysis (see the sketch below)
• Participatory evaluation
• Realist evaluation
• Annual reports
• In-depth studies
• External assessments
• Quantitative impact studies
• Qualitative case studies
→ Towards mixed methods & evidence
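As an illustration of the first item above, a minimal sketch of how social network analysis might evidence collaboration in a cluster, assuming the networkx library and an entirely invented network of firms:

# Hypothetical sketch: social network analysis of cluster collaboration.
# Firm names and links are invented for illustration only.
import networkx as nx

# Each edge represents a reported collaboration between two firms.
G = nx.Graph()
G.add_edges_from([
    ("FirmA", "FirmB"), ("FirmA", "FirmC"),
    ("FirmB", "FirmC"), ("FirmC", "FirmD"),
    ("FirmD", "FirmE"),
])

# Density: the share of possible collaborations actually observed.
print("Density:", nx.density(G))

# Degree centrality: which firms broker the most relationships.
print("Centrality:", nx.degree_centrality(G))

Metrics like these let evaluators track whether collaborative behaviour (the ‘how’) is deepening over time, rather than relying on output counts alone.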
Cluster evaluation as learning
New techniques and approaches are needed to address these challenges – a paradigm shift is needed:
– Evaluation is too often seen as a threat or challenge
– See evaluation as learning
– An essential part of policy governance
– Part of the policy process (not something that is done ‘afterwards’ to justify activities)
– Influence policy improvement and cluster management improvement simultaneously
“Policy-makers need to become more comfortable with strategies that aim to influence rather than control”
OECD (2009)
Cluster Evaluation 2018
• Sofia, March 2018
• Cork: Working Group meeting, May 2018
• Toronto: Global Conference, October 2018
• Human element surveys
• Publications
Evaluation and policy
World is Faster
Speed of change > ability to learn and adapt
‘innovation is … the novelty that emerges from conversations and collaborations in dynamic, non-linear, networked communities.’
Stacey, 2005
New collaborations
New opportunities
New business models
We need to Reinvent
Scotland…
1940s
• 25 shipyards
• 113 coal mines
• 6 steel works
• Financial services: 3% of GDP
• 0% of the world’s oil
2004
• 3 shipyards
• 2 coal mines
• 0 steel works
• Financial services: 9%
• 6% of the world’s oil
• Large PC producer
We need to Reinvent – and faster
Scotland…
1940s
• 25 shipyards
• 113 coal mines
• 6 steel works
• Financial services: 3% of GDP
• 0% of the world’s oil
2010
• 1 shipyard
• 0 coal mines
• 0 steel works
• Financial services: ?%
• 6% of the world’s oil
• Renewables
• Large PC producer
• Biotech / digital media
We need to Reinvent – and even faster
Scotland…
1940s
• 25 shipyards
• 113 coal mines
• 6 steel works
• Financial services: 3% of GDP
• 0% of the world’s oil
2015
• 1 shipyard
• 0 coal mines
• 0 steel works
• Financial services: ?%
• ?% of the world’s oil (but £)
• Renewables (offshore)
• Digital health
• Rate of change
• Impact of globalisation
• Food and Drink / Tourism / Creative Industries
Focus of Policy Evaluation
• Project
• Organisation
• Policy/programme
• System
[Slide: AGRICULTURAL REVOLUTION → INDUSTRIAL REVOLUTION]
PANEL: INNOVATION NORWAY & VINNOVA
NEXT STEPS
• Working Group Meeting: Cork, 14–15 May 2018
• TCI Global Conference: Toronto, 16–18 October 2018
Come Join us!
Madeline Smith
m.smith@gsa.ac.uk