1. Evidence-based and open: myths and reality of policy-making
David Osimo, Open University of Catalunya
Cristiano Codagnone, London School of Economics
ICPP 2015, Milan
2. Two “recent” trends
Evidence-based policy-making
• Long tradition, but emerging in the 1990s under New Labour as an answer to the “end of ideology”
• “New Labour is a party of ideas and ideals but not of outdated ideology. What counts is what works”
• Further developed into automated, data-intensive decision-making
Open policy-making
• Roots in direct democracy, but emerging as part of the “open government” and “government 2.0” trend around 2008, with a strongly techno-driven approach
• Open data, crowdsourcing and policy co-design to increase the effectiveness of public policy
3. Failed expectations?
• No evidence of increased quality of policy-making
• Evaluations carried out keep mentioning the same systemic weaknesses (Hallsworth, 2011)
• Persistently low participation in online initiatives and a loudest-voice effect (National Audit Office, 2012; Prieto-Martín, Marcos, & Martínez, 2011)
• The financial crisis showed the limitations of the tools: “in the face of the crisis, we felt abandoned by conventional tools” (Trichet, 2010)
4. Objectives
• As practitioners and advocates of open and evidence-based policy, we aim to set the correct expectations about what “science” and “openness” can bring.
• We aim to understand the factors behind the failed expectations of evidence-based and open policy-making, and to identify possible common patterns and lessons learnt.
5. Method
• Umbrella review (Grant & Booth, 2009)
• Scientific literature, grey literature such as reports of the UK National Audit Office, and online blog entries
• Complemented by the personal experience of the authors
6. Bottlenecks of EBPM
• Adoption of a rigid linear model from empirical data to scientific evidence to policy decisions (Pielke, 2007). Policy is considered the continuation of science by other means.
• The positivist assumption of a predictable world, governed by laws that are fully understandable and discoverable, without recognizing the complexity of science and the rise of post-normal science.
• The obsessive removal of values from the equation and the trend towards technocracy and scientization (Jasanoff, 1990). “Normative” has today become a “killer criticism” in any policy discussion within government.
• The notion that full scientific knowledge will lead to optimal policy decisions, failing to recognize the importance of the “hiding hand” (Hirschman, 1967), where ignorance can be a prerequisite of ambitious decisions.
• The sterilization of the relationship between scientific advisors and policy-makers, failing to understand the complexity of the factors that lead to policy decisions (Strassheim & Kettunen, 2014)
• The importance of “framing the debate” to manipulate and manage policy decisions (Lakoff, Dean, & Hazen, 2005)
• The difficulty of moving from decision to implementation in a highly distributed system. Too often “right”, evidence-based decisions encounter resistance in their implementation at the level of civil servants or intermediate institutions (Hallsworth, 2011)
7. Bottlenecks of open policy-making
• Top-down policy mandates on openness struggle to be implemented at lower levels of government
• Inflated expectations of citizens’ willingness to participate: unequal participation is normal in online initiatives (Shirky, 2003). The key is looking for insight, not representativeness; one open data re-user can be enough, as in the Reinhart and Rogoff case (Konczal, 2013)
• Excessive focus on the “decision”, rather than on what happens before and after. Expectation that crowdsourcing substitutes for government decisions, rather than supporting them
• Need to account for different levels of openness between and within initiatives
• Metrics are excessively focused on the quantity of participation rather than its quality
8. [Diagram: open tools mapped onto the policy cycle]
Policy cycle phases: set agenda, design, implement, evaluate.
Activities: identify problems, collect evidence, set priorities, analyze data; brainstorm solutions, draft proposals, revise proposals, simulate the impact of options; induce behavioural change, take collaborative action, ensure buy-in, monitor execution; collect feedback.
Tools: Uservoice, Ideascale, Etherpad, Co-ment.com, social networks, persuasive technologies, Challenge.gov, open data, participatory sensing, open data visualization, Evidencechallenge.com, collaborative visualization, open discussion, models and simulation.
Source: CROSSOVER roadmap
9. It doesn’t have to be totally open to the crowd
Open Declaration on European Public Services: open to all
Digital Agenda Mid-Term Review: open to all; 2,000 comments received, 1,500 participants
Pledge Tracker: only organisations committing to the Grand Coalition for Digital Jobs
OpenIdeo: members of the OpenIdeo community
DAE implementation: collaborative platform for EU Member State representatives
Young Advisors to VP Neelie Kroes: appointed Young Advisors
Need for restricted online spaces
10. Not all the time open
EU Open Declaration process: open brainstorming → small-group drafting → open commenting → small-group re-drafting → open endorsement
Source: http://ebiinterfaces.wordpress.com/2010/11/29/ux-people-autumn-2010-talks/
11. Common issues
• Open and evidence-based are not contradictory but complementary
• The expectation that evidence, data or the “crowd” could substitute for the role of governments as decision-makers, rather than supporting them
• Decision-making in the public sphere is never done in a vacuum, but under the competing pressures of different forces: the “Bermuda Triangle” composed of scientific evidence, politics and values (ECGT, 2014)
• Stereotyped perspectives on the different stakeholders and lack of a systemic view of their interaction
• Need to consider the full policy cycle, not only decisions
• Lack of a robust evaluation framework
12. Conclusions
• Integrate the open and evidence-based approaches. The challenges are similar and the benefits mutual.
• Adopt a systemic perspective on the interactions in the decision-making process, covering the different stakeholders and the different phases of the policy cycle
• Understand both approaches as a support for, not a substitute for, the role of policy-makers
• Recognize the limitations of scientific knowledge and human behaviour
14. There’s elite and elite: who benefits?
[2x2 matrix: participation in the policy debate vs. quality of ideas]
• Participate in the policy debate, low quality of ideas: usual suspects
• Participate in the policy debate, high quality of ideas: no problem
• Don’t participate in the policy debate, low quality of ideas: not interested/interesting
• Don’t participate in the policy debate, high quality of ideas: missed opportunity
Source: adapted from the Kublai evaluation