PPWNov13- Day 2 keynote- S.Sridharan- U Toronto

Day 2 keynote: Sanjeev Sridharan, University of Toronto: “Research and evaluation in global health policy processes”

Workshop on Approaches and Methods for Policy Process Research, co-sponsored by the CGIAR Research Programs on Policies, Institutions and Markets (PIM) and Agriculture for Nutrition and Health (A4NH) at IFPRI-Washington DC, November 18-20, 2013.


  1. Towards a Transformative View of Evaluation: Evaluating Global Health Interventions. Presentation at the International Food Policy Research Institute. Sanjeev Sridharan, The Evaluation Centre for Complex Health Interventions, University of Toronto & St. Michael's Hospital. November 19, 2013.
  2. Overview: How do we make evaluations matter? Two examples. What is evaluation? Evaluating research: pathways of influence. Summing up.
  3. What is evaluation? A useful but perhaps incomplete definition:
     • Evaluation is defined both as a means of assessing performance and as a way to identify alternative ways to deliver programs.
     • "Evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results."
     • What role can evaluation and evaluative thinking play in navigating interventions?
  4. Purposes of evaluation (Mark, Henry and Julnes, 2000):
     • Assessing merit and worth
     • Causal questions: RCTs, observational studies
     • Program and organizational improvement (formative evaluation)
     • Oversight and compliance
     • Knowledge development: a neglected purpose of many evaluations
  5. An example of primary prevention: Have a Heart Paisley.
  6. Features of complex interventions (Pawson et al., 2004):
     • The intervention is a theory or theories.
     • The intervention involves the actions of people.
     • The intervention consists of a chain of steps.
     • These chains of steps or processes are often not linear, and involve negotiation and feedback at each stage.
     • Interventions are embedded in social systems, and how they work is shaped by this context.
     • Interventions are prone to modification as they are implemented.
     • Interventions are open systems and change through learning as stakeholders come to understand them.
  7. System dynamics approaches (Sterman, 2006). Social systems are:
     • Constantly changing
     • Governed by feedback
     • Non-linear and history-dependent
     • Adaptive and evolving
     • Characterized by trade-offs
     • Subject to policy resistance: "the tendency for interventions to be defeated by the system's response to the intervention itself"
  8. "Solutions" can also create new problems. "Policy resistance is the tendency for interventions to be delayed, diluted, or defeated by the response of the system to the intervention itself" (Meadows, Richardson and Bruckmann).
     Meadows DH, Richardson J, Bruckmann G. Groping in the dark: the first decade of global modelling. New York, NY: Wiley, 1982.
     Merton RK. The unanticipated consequences of purposive social action. American Sociological Review 1936;1:894-904.
     Forrester JW. Counterintuitive behavior of social systems. Technology Review 1971;73(3):53-68.
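A minimal simulation sketch of policy resistance, offered as an illustration rather than anything from the talk: a single outcome is pushed toward a policy target by an intervention loop while a compensating feedback loop pulls it back toward baseline. All names and parameter values (policy_strength, resistance_strength, the target of 100) are assumptions chosen for the example.

```python
# Minimal sketch of policy resistance (illustrative, not from the talk).
# One outcome stock is pushed toward a policy target by an intervention
# loop, while a compensating feedback loop pulls it back toward its
# baseline. All parameter values here are assumptions for the example.

def simulate(years=20, dt=0.1, target=100.0, baseline=60.0,
             policy_strength=0.5, resistance_strength=0.35):
    x = baseline                  # outcome stock, e.g. programme coverage
    trajectory = []
    for _ in range(int(years / dt)):
        policy_push = policy_strength * (target - x)      # intervention loop
        pushback = -resistance_strength * (x - baseline)  # compensating feedback
        x += (policy_push + pushback) * dt                # Euler integration
        trajectory.append(x)
    return trajectory

result = simulate()
# The system settles where the two loops balance (about 83.5 here),
# well short of the target of 100: the intervention is diluted.
print(f"Outcome after 20 years: {result[-1]:.1f} (target: 100)")
```

The point of the sketch is Sterman's: the equilibrium is set by the balance of loops, not by the planner's target, so the intervention is diluted by the system's own response.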
  9. System-as-cause. References:
     Forrester JW. Counterintuitive behavior of social systems. Technology Review 1971;73(3):53-68.
     Meadows DH. Leverage points: places to intervene in a system. Sustainability Institute, 1999. Available at <http://www.sustainabilityinstitute.org/pubs/Leverage_Points.pdf>.
     Richardson GP. Feedback thought in social science and systems theory. Philadelphia, PA: University of Pennsylvania Press, 1991.
     Sterman JD. Business dynamics: systems thinking and modeling for a complex world. Boston, MA: Irwin McGraw-Hill, 2000.
  10. So why are evaluations so often not very useful?
  11. UN Office of Internal Oversight Services, 2008: A Critique of Results-Based Management. "Results-based management at the United Nations has been an administrative chore of little value to accountability and decision-making."
  12. The UN critique of performance management and evaluation:
      • Lack of strategic direction and cross-organizational performance incentives
      • Problems of attribution, and trivializing innovation
      • Trivializing outcomes
      • The practice lacks rigor
      • A lack of purpose
  13. The UN critique (2):
      • Lack of clarity on the consequences of good and poor performance
      • Lack of clarity on the capacity needed to build a results-based management system
      • Technical solutions are not a substitute for substantive clarity
  14. The logic of an evolutionary strategy. Box et al. (1978, p. 303): "... the best time to design an experiment is after it is finished, the converse is that the worst time is the beginning, when least is known. If the entire experiment was designed at the outset, the following would have to be assumed as known: (1) which variables were the most important, (2) over what ranges the variables should be studied... The experimenter is least able to answer such questions at the outset of an investigation but gradually becomes more able to do so as a program evolves."
  15. What kind of evaluation will you be doing? Formative, developmental, or summative?
  16. A ten-step approach to evaluation, organized in four clusters:
      A. Intervention theory and developing expectations of impacts over time: the key components of the complex intervention; the program theory of the complex intervention; learning from the evidence base; the anticipated timeline of impact.
      B. Learning frameworks and pathways of influence: the pathways of influence of an evaluation; a learning framework for the evaluation.
      C. Impacts and learning: assessing the impact of the intervention (design); learning about the intervention over time.
      D. Spread and sustainability: spreading learning from an evaluation; reflections on performance and sustainability.
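A hedged sketch of how this framework might be operationalized, my illustration rather than part of the talk: the steps encoded as structured data so a draft evaluation plan can be checked for coverage. The cluster and step names follow the slide; the grouping into A-D and the coverage check are my own assumptions.

```python
# Hedged sketch: the ten-step framework as a planning checklist.
# Stage/step names follow the slide; the coverage check is an
# illustrative addition, not part of the original approach.

TEN_STEPS = {
    "A. Intervention theory and expectations of impacts over time": [
        "Key components of the complex intervention",
        "Program theory of the complex intervention",
        "Learning from the evidence base",
        "Anticipated timeline of impact",
    ],
    "B. Learning frameworks and pathways of influence": [
        "Pathways of influence of the evaluation",
        "Learning framework for the evaluation",
    ],
    "C. Impacts and learning": [
        "Assessing the impact of the intervention: design",
        "Learning about the intervention over time",
    ],
    "D. Spread and sustainability": [
        "Spreading learning from the evaluation",
        "Reflections on performance and sustainability",
    ],
}

def unaddressed(plan_sections):
    """Return the framework steps a draft evaluation plan has not covered."""
    covered = {s.lower() for s in plan_sections}
    return [step for steps in TEN_STEPS.values() for step in steps
            if step.lower() not in covered]

# Example: a draft plan that so far covers only two of the ten steps.
draft = ["Program theory of the complex intervention",
         "Anticipated timeline of impact"]
for gap in unaddressed(draft):
    print("Missing:", gap)
```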
  17. An example: primary prevention, Have a Heart Paisley. On what basis do we make a decision to SUSTAIN the program? What gets SPREAD as a result of the evaluation? What are the key LEARNINGS from the evaluation?
  18. [Diagram] An evaluation framework linking the policy landscape (dynamic, changing over time), evidence, programmes, theory and the evaluator. Elements include: hypothesised pathways; the evidence base linked to pathways; areas of uncertainty; evaluation design, methods and approaches; process learning; the risk landscape; organisational learning; individual impacts; policy impact; alignment (moving beyond programmes as the level of analysis); and interpersonal, individual and collective mechanisms leading to action.
  19. Example 1.
  20. Background:
      • Project in collaboration with the China National Health Development Research Centre to build evaluation capacity for evaluating health system reform efforts
      • Focus on health inequities
      • Developing guidelines to evaluate health inequity initiatives
      • Relationship to complexity
  21. [Diagram] Project process: literature reviews, surveys of innovations, and determining evaluation assets and needs; create guidelines; evaluate 3 projects and revise the guidelines; test and refine the guidelines. Levels of evaluation influence: individual, inter-personal, collective. Cross-cutting themes: evaluation capacity building and knowledge translation.
  22. [Diagram] Evaluation approach mapped against complexity of intervention.
  23. [Diagram] Approach to evaluation mapped against complexity of intervention.
  24. Review of articles: context; expectations and timelines; understanding and changing understanding; organizational structures for learning; multiple understandings of impacts; sustainability and spread.
  25. Describe the intervention:
      • What was the setting of the intervention? What was the context?
      • What was the duration of the intervention? Was there discussion of timelines and trajectories of impacts?
      • Was there a discussion of the evidence informing the program? Was the evidence from multiple disciplines?
      • Is the program a pilot? What were the challenges of adaptation to the specific setting?
      • Were there changes in the intervention over time? How did the evaluation explore this?
      • Was the theory of change described? Did it change over time?
  26. Describe the intervention (continued):
      • What was the intervention planners' view of success?
      • Were there formal structures to learn and modify the program over time?
      • Was the organizational structure of the intervention described?
      • How were impacts studied; what designs were implemented?
      • Was the program a success? Were there unintended outcomes? Differential impacts for different groups?
      • Did the evaluation help with decisions about sustainability?
      • Was there a formal process of ruling out threats to internal and external validity?
      • Was there a discussion of what can be spread as part of learnings from the evaluation?
  27. Example 2.
  28. Goals: "To contribute to improving health and strengthening health systems in low and middle income countries (LMICs), by supporting innovative international approaches to integrating health knowledge generation and synthesis (including consideration of environmental, economic, socio-cultural, and public policy factors) through research, health research capacity development, and the use of research evidence for health policy and practice."
  29. Objectives:
      • Foster international partnerships and collaboration to promote the generation and effective communication and use of relevant health research (including consideration of environmental, economic, socio-cultural, and public policy factors) in, for and by low- and middle-income countries (LMICs);
      • Train and support researchers responsive to policy and practice priorities of LMICs relating to or influencing health; and
      • Support active collaboration between researchers and research users (e.g. policymakers, practitioners, civil society organizations, and community members) to support health priorities of LMICs.
  30. Basic questions: How complete is our knowledge of how these initiatives are intended to work? Do we have clarity on the timeline of impact of such initiatives? How do we ensure that we don't get caught up in the 'activity space' of such initiatives? How do we align metrics to provide incentives to focus on the outcomes?
  31. Key concepts:
      • Limited sphere of control
      • Flexible sphere of control
      • Timeline of impact
      • Metrics and incentives
      • Integrated planning and evaluation
  32. The problem of insufficient imagination.
  33. [Diagram] Granting mechanisms → research, capacity building and knowledge translation activities → health impacts.
  34. [Diagram] Teasdale-Corti theory of change: numbered stages connected by hypothesised pathways (lettered links A-CC on the slide) and grouped into zones of direct control, direct influence and indirect influence:
      1. Drivers for the development of Teasdale-Corti
      2. Commissioning research, capacity building and knowledge translation grants: team and leadership grants, KTE grants
      3. North-South researchers and knowledge users respond to request for proposals
      4. Merit review
      5. Implementation of projects
      6. Building teams: initial relationship building
      7. Research, capacity building and knowledge translation processes and outputs
      8. Knowledge generation: focussed and cross-cutting themes
      9. Enhanced research capacity and capacity to use evidence
      10. Strengthened relationships between North and South researchers
      11. Knowledge users enabled to implement learnings
      12. Enabled researchers
      13. Research dissemination
      14. Greater awareness of salience of research and capacity building for local policy and practice concerns
      15. Enhanced local foundations for future research and capacity building
      16. Planning for implementation: alignment of research and capacity building with local policy and practice needs
      17. Mechanisms to sustain relationships and foundations
      18. Local use and influence of research and capacity building
      19. Strengthened health systems and enhanced health equity
      20. Improved health
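To show how a hypothesised pathway like this can be made explicit and checkable, here is a hedged sketch (my construction, not the evaluators' tooling): a handful of the numbered stages encoded as a directed graph with sphere-of-control labels, traversed from what the funder directly controls toward the distal health impacts. The particular edges chosen are an illustrative simplification of the slide, not its full set of lettered links.

```python
# Hedged sketch: part of the theory of change as a directed graph.
# Node numbers follow the slide; the edge list is an illustrative
# simplification of one chain, not the full set of lettered links.

from collections import deque

sphere = {
    2: "direct control",       # commissioning grants
    5: "direct influence",     # implementation of projects
    9: "direct influence",     # enhanced research capacity
    18: "indirect influence",  # local use and influence of research
    20: "indirect influence",  # improved health
}

# hypothesised chain: commissioning -> projects -> capacity -> use -> health
edges = {2: [5], 5: [9], 9: [18], 18: [20], 20: []}

def reachable(graph, start):
    """Breadth-first traversal: which stages are reachable from `start`?"""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# From what the funder directly controls (node 2), list each reachable
# stage with its sphere label: a quick check that the pathway to the
# distal outcome is explicit rather than assumed.
for node in sorted(reachable(edges, 2)):
    print(node, "->", sphere[node])
```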
  35. Data:
      • Interviews with Teasdale-Corti planners in multiple funding organizations, including CIHR and IDRC
      • Formal analysis of 8 final reports
      • Interviews with grantees at the October 2012 GHRI Ottawa meeting, including video interviews with grantees
      • Brief case studies of three grantees, including Skype interviews with Southern partners
      • Formal analysis of 8 proposals
      • Bibliometric analysis
      • Surveys of Teasdale-Corti grantees: separate surveys were conducted with Canadian researchers, Southern researchers and knowledge users
      • Data collection to support theory of change work
  36. Support from evidence:
      • There was strong support from the evidence for the overall goal of Teasdale-Corti: there was support for how Teasdale-Corti contributed to building research and practice that can contribute to improving health and strengthening health systems.
      • There was limited evidence that the Teasdale-Corti initiative directly impacted health outcomes (but we need to reflect on the timeline of impact).
  37. Some learnings (1):
      • Greater upfront clarity on the anticipated timeline of impact
      • Greater upfront thinking about planning for sustainability
      • Greater clarity around who counts as a knowledge user
      • Focused monitoring of knowledge-user engagement: possible metrics
      • Greater clarity on what equity in partnerships means
      • Boundaries and contexts of the Teasdale-Corti model
  38. Some learnings (2):
      • Limited focus on health equity
      • Greater clarity on the amount and types of support grantees need
      • Focus on the end-user and not just the knowledge user
      • Greater focus on the benefits of the North-South relationships
      • Greater clarity on what progress and accountability mean given the complexity of Teasdale-Corti
      • Incentives in the system of metrics to encourage collaboration and co-creation
  39. Refining the theory of change: examples of learnings:
      • Greater clarity (by providing models, exemplars, narratives of successful implementation, etc.) on how best to create synergy between knowledge generation, knowledge translation and capacity building
      • More explicitly clarify the role of the knowledge user in the implementation of Teasdale-Corti
      • Explicitly recognize the 'values' that guide the implementation of Teasdale-Corti
      • The implementation of the theory of change needs to be supported by an M&E system that pays attention to the complexities and heterogeneities of the implementation of Teasdale-Corti
  40. Towards a more imagined plan.
  41. [Diagram] Elements 1-6 of a more imagined plan: problem definition; LMIC priorities; promoting research use and influence; designing locally-useful research; potential for useful capacity building; collaboration.
  42. [Diagram] Elements 7-12: equity; sustainability; timeline of impact; clarity and coherence of outcomes and the specific theory of change; monitoring/evaluation plan; gender and ethical considerations.
  43. Ideas for improvement: recommendations based on proposal analysis:
      • More explicit focus on barriers
      • Have a clearer focus on the diversity of knowledge users
      • Equity should have a more explicit focus
      • Encourage thinking about sustainability upfront
      • Planning for relationship-building over time
      • Monitoring and evaluation of complex interventions
  44. Challenges: the 'black box' of capacity building; synergizing activities; the long timeline of impact that may exist for research to influence health outcomes; consistency in values and metrics; non-linear models of the research/policy interface.
  45. Unpacking the plan: theory of change; developmental (what capacities are needed?); sphere of control; timeline of impact; metrics and incentives; phased approach to commissioning.
  46. Evaluating research:
      • What are the pathways of influence by which research impacts policies?
      • Can a results-based culture help with greater understanding of the pathways of influence?
      • Approaches: conceptualizing research as an intervention with short- and long-term goals. One of the intermediate goals is to influence policy; the long-term goal is to improve individual lives.
  47. [Diagram] Research as intervention: individual research projects (and multiple research projects) lead, with a question mark, to policy influence and on to social betterment, with monitoring and evaluation and evaluative thinking applied at every stage.
  48. [Diagram] The monitoring and evaluation system in relation to research quality and pathways of policy influence.
  49. [Diagram] Research as intervention: context, mechanisms, outcomes.
  50. Model 1: From knowledge to policy. See Knowledge to Policy: Making the Most of Development Research, Fred Carden, 2009.
  51. [Diagram] From knowledge to policy (after Carden):
      • Overall context: stability of decision-making institutions; policy absorption capacity; nature of governance; opportunities for countries in transition; economic crisis and pressures on government.
      • Research as intervention: research quality; research networks; action research.
      • Government context: clear government demand; new tools; government interest but leadership absent; interest but no capacity; interesting research but research uninteresting for government; government disinterest or...
      • Mechanisms (so, what can you do?): relationship building; communication; networks of influence; institutional mechanisms.
      • Process outcomes: policy capacity; broader policy horizons; policy regimes.
      • Outcomes: policy influence; "better" decisions; social betterment.
  52. Model 2: Pathways of influence. Mel Mark and Gary Henry, Evaluation, 2003.
  53. Research initiatives in specific places: local research for lasting solutions. Some key questions:
      • What is local about your research? How does your research respond to local needs? What is different about your local networks? What is the value added of the local research?
      • Are there local pathways of influence?
      • How different are the local pathways of influence from other, more general pathways?
  54. [Diagram] Research as intervention situated in a global context and a local context, working through global mechanisms and local mechanisms to produce outcomes.
  55. Local context:
      • Are your theories locally generated? Are the needs local?
      • Examples of local context that have interfered with standard influence processes
      • Does your research address local needs in a timely manner?
      • Are the local networks different from other, more general networks of influence?
      • Example: What are some of the challenges of policy influence in specific contexts in Africa?
  56. Summing up and looking ahead.
  57. Looking ahead:
      • Models of causation (successionist vs. generative models of causation)
      • Integrating knowledge translation with evaluation
      • Developmental evaluation in complex dynamic settings
      • Ecology of evidence
      • Capacity building
      • Portfolio of designs and approaches
      • Program theory and incompleteness
      • Time horizons and functional forms
      • Spread, scaling up and generalization