PPWNov13- Day 2 keynote- S.Sridharan- U Toronto

Day 2 keynote: Sanjeev Sridharan, University of Toronto: “Research and evaluation in global health policy processes”

Workshop on Approaches and Methods for Policy Process Research, co-sponsored by the CGIAR Research Programs on Policies, Institutions and Markets (PIM) and Agriculture for Nutrition and Health (A4NH) at IFPRI-Washington DC, November 18-20, 2013.

Presentation Transcript

    • Towards a Transformative View of Evaluation: Evaluating Global Health Interventions. Presentation at the International Food Policy Research Institute. Sanjeev Sridharan, The Evaluation Centre for Complex Health Interventions, University of Toronto & St. Michael's Hospital. November 19, 2013
    • How do we make evaluations matter? • Two examples • What is evaluation? • Evaluating research: pathways of influence • Summing up
    • What is evaluation? A useful but perhaps incomplete definition: evaluation is both a means of assessing performance and a way to identify alternative ways to deliver. "Evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results." What role can evaluation and evaluative thinking play in navigating interventions?
    • Purpose of evaluation (Mark, Henry and Julnes, 2000): • Assessing merit and worth o Causal questions, RCTs, observational studies • Programme and organizational improvement o Formative evaluation • Oversight and compliance • Knowledge development o A neglected purpose of many evaluations
    • An Example of Primary Prevention: Have a Heart Paisley
    • Features of complex interventions (Pawson et al., 2004): The intervention is a theory or theories. The intervention involves the actions of people. The intervention consists of a chain of steps. These chains of steps or processes are often not linear, and involve negotiation and feedback at each stage. Interventions are embedded in social systems, and how they work is shaped by this context. Interventions are prone to modification as they are implemented. Interventions are open systems and change through learning as stakeholders come to understand them.
    • System dynamic approaches (Sterman, 2006): constantly changing; governed by feedback; non-linear; history-dependent; adaptive and evolving; characterized by trade-offs; prone to policy resistance: "The result is policy resistance, the tendency for interventions to be defeated by the system's response to the intervention itself."
    • "Solutions" can also create new problems. Policy resistance is the tendency for interventions to be delayed, diluted, or defeated by the response of the system to the intervention itself (Meadows, Richardson and Bruckmann). Meadows DH, Richardson J, Bruckmann G. Groping in the dark: the first decade of global modelling. New York, NY: Wiley, 1982. Merton RK. The unanticipated consequences of purposive social action. American Sociological Review 1936;1:894-904. Forrester JW. Counterintuitive behavior of social systems. Technology Review 1971;73(3):53-68.
    • System-as-Cause Forrester JW. Counterintuitive behavior of social systems. Technology Review 1971;73(3):53-68. Meadows DH. Leverage points: places to intervene in a system. Sustainability Institute, 1999. Available at <http://www.sustainabilityinstitute.org/pubs/Leverage_Points.pdf>. Richardson GP. Feedback thought in social science and systems theory. Philadelphia, PA: University of Pennsylvania Press, 1991. Sterman JD. Business dynamics: systems thinking and modeling for a complex world. Boston, MA: Irwin McGraw-Hill, 2000.
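Sterman's notion of policy resistance can be made concrete with a toy simulation. The sketch below is not from the presentation; the model, variable names and parameter values are illustrative assumptions. An intervention pushes an outcome toward a target while a feedback loop pushes the system back toward its baseline, so the final state settles well short of the target.

```python
# Toy stock-and-flow model of "policy resistance" (Sterman, 2006).
# All names and parameter values here are illustrative assumptions,
# not taken from the presentation.

def simulate(steps=50, push=0.5, resistance=0.4, target=10.0):
    """An intervention pushes `state` toward `target`, while the
    system responds in proportion to how far it has been displaced
    from its baseline, diluting the intervention's effect."""
    state, baseline = 0.0, 0.0
    history = []
    for _ in range(steps):
        intended_effect = push * (target - state)           # planners' intended push
        system_response = -resistance * (state - baseline)  # feedback pushing back
        state += intended_effect + system_response
        history.append(state)
    return history

trajectory = simulate()
# The state settles well short of the target; the persistent gap is
# the system's "response to the intervention itself".
print(f"final state: {trajectory[-1]:.2f} (target was 10.0)")
```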
    • So why are evaluations so often not very useful?
    • UN Office of Internal Oversight Services, 2008 • A Critique of Results-Based Management (2008): • "Results-based management at the United Nations has been an administrative chore of little value to accountability and decision-making."
    • The UN critique of performance management and evaluation • Lack of strategic direction and cross-organizational performance incentives • Problems of attribution and trivializing innovation • Trivializing outcomes • The practice of results-based management lacks rigor • A lack of purpose
    • The UN Critique (2) • Lack of clarity on the consequences of good and poor performance • Lack of clarity on the capacity needed to build a results-based management system • Technical solutions are not a substitute for substantive clarity
    • The logic of an evolutionary strategy. Box et al. (1978, p. 303): "... the best time to design an experiment is after it is finished; the converse is that the worst time is the beginning, when least is known. If the entire experiment were designed at the outset, the following would have to be assumed as known: (1) which variables were the most important, (2) over what ranges the variables should be studied... The experimenter is least able to answer such questions at the outset of an investigation but gradually becomes more able to do so as a program evolves."
    • What kind of evaluation will you be doing? Formative • Developmental • Summative
    • A ten-step approach to evaluation: A. Intervention theory and developing expectations of impacts over time: the key components of the complex intervention; the program theory of the complex intervention; learning from the evidence base; the anticipated timeline of impact. B. Impacts and learning: assessing the impact of the intervention (design); learning about the intervention over time. C. Learning frameworks and pathways of influence: the learning framework for the evaluation; the pathways of influence of an evaluation. D. Spread and sustainability: spreading learning from an evaluation; reflections on performance and sustainability.
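A minimal sketch of the ten-step framework as a trackable checklist follows. The grouping mirrors the slide above, but the data structure and status values are illustrative assumptions, not part of the talk.

```python
# Minimal sketch: the ten-step framework as a trackable checklist.
# The grouping follows the slide; the structure and status values
# are illustrative assumptions, not part of the talk.

TEN_STEPS = {
    "A. Intervention theory and expectations of impacts over time": [
        "Key components of the complex intervention",
        "Program theory of the complex intervention",
        "Learning from the evidence base",
        "Anticipated timeline of impact",
    ],
    "B. Impacts and learning": [
        "Assessing the impact of the intervention: design",
        "Learning about the intervention over time",
    ],
    "C. Learning frameworks and pathways of influence": [
        "Learning framework for the evaluation",
        "Pathways of influence of the evaluation",
    ],
    "D. Spread and sustainability": [
        "Spreading learning from the evaluation",
        "Reflections on performance and sustainability",
    ],
}

# Start every step as "not started", then record progress as the
# evaluation proceeds.
status = {s: "not started" for steps in TEN_STEPS.values() for s in steps}
status["Program theory of the complex intervention"] = "drafted"

for group, steps in TEN_STEPS.items():
    print(group)
    for s in steps:
        print(f"  - {s}: {status[s]}")
```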
    • An Example: Primary Prevention Have a Heart Paisley On what basis do we make a decision to SUSTAIN the program? What gets SPREAD as a result of the evaluation? What are the key LEARNINGS from the evaluation?
    • [Diagram: the policy landscape (dynamic, changes over time). The evaluator works across evidence, programmes and theory; hypothesised pathways, the evidence base linked to those pathways, and areas of uncertainty feed the evaluation (design, methods, approaches). The evaluation contributes to process learning, organisational learning, individual impacts and policy impact through interpersonal, individual and collective mechanisms, within a risk landscape; alignment moves beyond programmes as the level of analysis, leading to action.]
    • Example 1
    • Background • Project in collaboration with the China National Health Development Research Centre to build evaluation capacity for evaluating health system reform efforts • Focus on health inequities • Developing guidelines to evaluate health inequity initiatives • Relationship to complexity
    • [Diagram: guideline development process. Literature reviews, surveys of innovations, and determining evaluation assets and needs feed the creation of guidelines; three projects are evaluated to test and refine the guidelines, which are then revised. Levels of evaluation influence: individual, inter-personal, collective. Cross-cutting: evaluation capacity building and knowledge translation.]
    • [Two diagrams mapping the approach to evaluation against the complexity of the intervention.]
    • Review of articles: context; expectations and timelines; understanding and changing understanding; organizational structures for learning; multiple understandings of impacts; sustainability and spread
    • Describe the intervention: What was the setting of the intervention? What was the context? What was the duration of the intervention? Was there a discussion of timelines and trajectories of impacts? Was there a discussion of the evidence informing the program? Was the evidence from multiple disciplines? Is the program a pilot? What were the challenges of adaptation to the specific setting? Were there changes in the intervention over time? How did the evaluation explore this? Was the theory of change described? Did it change over time?
    • What was the intervention planners' view of success? Were there formal structures to learn and modify the program over time? Was the organizational structure of the intervention described? How were impacts studied; what designs were implemented? Was the program a success? Were there unintended outcomes? Were there differential impacts for different groups? Did the evaluation help with decisions about sustainability? Was there a formal process of ruling out threats to internal and external validity? Was there a discussion of what can be spread as part of learnings from the evaluation?
    • Example 2
    • Goals “To contribute to improving health and strengthening health systems in low and middle income countries (LMICs), by supporting innovative international approaches to integrating health knowledge generation and synthesis (including consideration of environmental, economic, socio-cultural, and public policy factors) through research, health research capacity development, and the use of research evidence for health policy and practice.”
    • Objectives • Foster international partnerships and collaboration to promote the generation and effective communication and use of relevant health research (including consideration of environmental, economic, socio-cultural, and public policy factors) in, for and by low- and middle-income countries (LMICs); • Train and support researchers responsive to policy and practice priorities of LMICs relating to or influencing health; and • Support active collaboration between researchers and research users (e.g. policymakers, practitioners, civil society organizations, and community members) to support health priorities of LMICs.
    • Basic questions: How complete is our knowledge of how these initiatives are intended to work? Do we have clarity on the timeline of impact of such initiatives? How do we ensure that we don't get caught up in the 'activity space' of such initiatives? How do we align metrics to provide incentives to focus on the outcomes?
    • Key concepts • Limited sphere of control • Flexible sphere of control • Timeline of impact • Metrics and incentives • Integrated planning and evaluation
    • The Problem of Insufficient Imagination
    • [Diagram: granting mechanisms → research, capacity building and knowledge translation activities → health impacts]
    • [Theory-of-change diagram: numbered elements connected by lettered pathways (A-CC), grouped into zones of direct control, direct influence and indirect influence: 1. Drivers for the development of Teasdale-Corti; 2. Commissioning research, capacity building and knowledge translation grants: team and leadership grants, KTE grants; 3. North-South researchers and knowledge users respond to request for proposals; 4. Merit review; 5. Implementation of projects; 6. Building teams: initial relationship building; 7. Research, capacity building and knowledge translation processes and outputs; 8. Knowledge generation: focussed and cross-cutting themes; 9. Enhanced research capacity and capacity to use evidence; 10. Strengthened relationships between North and South researchers; 11. Knowledge user enabled to implement learnings; 12. Enabled researchers; 13. Research dissemination; 14. Greater awareness of salience of research and capacity building for local policy and practice concerns; 15. Enhanced local foundations for future research and capacity building; 16. Planning for implementation: alignment of research and capacity building with local policy and practice needs; 17. Mechanisms to sustain relationships and foundations; 18. Local use and influence of research and capacity building; 19. Strengthened health systems and enhanced health equity; 20. Improved health.]
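The pathways-of-influence framing lends itself to representing a theory of change as a directed graph, so that candidate pathways from funded activities to improved health can be enumerated and compared with the evidence. A minimal sketch follows; the node names abbreviate a few of the diagram's elements and the edge list is an illustrative assumption, far from the full A-CC set.

```python
# Hypothetical sketch: a slice of the theory of change as a directed
# graph, so that pathways of influence from funded activities to
# improved health can be enumerated and compared with evidence.
# Node names abbreviate elements of the diagram above; the edge list
# is illustrative and far from the full A-CC set.

THEORY_OF_CHANGE = {
    "commissioning grants": ["implementation of projects"],
    "implementation of projects": ["knowledge generation",
                                   "enhanced research capacity"],
    "knowledge generation": ["research dissemination"],
    "research dissemination": ["local use of research"],
    "enhanced research capacity": ["local use of research"],
    "local use of research": ["strengthened health systems"],
    "strengthened health systems": ["improved health"],
}

def pathways(graph, start, goal, path=None):
    """Depth-first enumeration of all acyclic paths from start to goal."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    found = []
    for nxt in graph.get(start, []):
        if nxt not in path:  # avoid revisiting nodes
            found.extend(pathways(graph, nxt, goal, path))
    return found

for p in pathways(THEORY_OF_CHANGE, "commissioning grants", "improved health"):
    print(" -> ".join(p))
```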
    • Data: interviews with Teasdale-Corti planners in multiple funding organizations, including CIHR and IDRC; formal analysis of 8 final reports; interviews with grantees at the October 2012 GHRI Ottawa meeting, including video interviews; brief case studies of three grantees, including Skype interviews with Southern partners; formal analysis of 8 proposals; bibliometric analysis; surveys of Teasdale-Corti grantees (separate surveys were conducted with Canadian researchers, Southern researchers and knowledge users); data collection to support theory of change work.
    • Support from evidence • There was strong support from the evidence for the overall goal of Teasdale-Corti: there was support for how Teasdale-Corti contributed to building research and practice that can contribute to health and strengthening health systems. • There was limited evidence that the Teasdale-Corti initiative directly impacted health outcomes (though this needs to be weighed against the anticipated timeline of impact).
    • Some learnings (1) • Greater upfront clarity of the anticipated timeline of impact • Greater upfront thinking about planning for sustainability • Greater clarity around who counts as a knowledge user • Focused monitoring on knowledge-user engagement: possible metrics • Greater clarity on what equity in partnerships means • Boundaries and contexts of the Teasdale-Corti model
    • Some learnings (2) • Limited focus on health equity • Greater clarity on the amount and types of support grantees need • Focus on the end-user and not just the knowledge user • Greater focus on the benefits of the North-South relationships • Greater clarity on what progress and accountability mean given the complexity of Teasdale-Corti • Incentives in the system of metrics to encourage collaboration and co-creation
    • Refining the theory of change: examples of learnings • Greater clarity (by providing models, exemplars, narratives of successful implementation, etc.) on how best to create synergy between knowledge generation, knowledge translation and capacity building • More explicitly clarify the role of the knowledge user in the implementation of Teasdale-Corti • Explicitly recognize the 'values' that guide the implementation of Teasdale-Corti • The implementation of the theory of change needs to be supported by an M&E system that pays attention to the complexities and heterogeneities of the implementation of Teasdale-Corti
    • Towards a more imagined plan
    • [Slide: elements of a more imagined plan (1): problem definition; LMIC priorities; promoting research use and influence; designing locally-useful research; potential for useful capacity building: collaboration]
    • [Slide: elements of a more imagined plan (2): equity; sustainability; timeline of impact: clarity and coherence of outcomes and a specific theory of change; monitoring/evaluation plan; gender and ethical considerations]
    • Ideas for improvement Recommendations based on proposal analysis • More explicit focus on barriers • Have a clearer focus on the diversity of knowledge users • Equity should have a more explicit focus • Encourage thinking about sustainability upfront • Planning for relationship-building over time • Monitoring and evaluation of complex interventions
    • The 'black box' of capacity building: synergizing activities; the long timeline of impact that may exist for research to influence health outcomes; consistency in values and metrics; non-linear models of the research/policy interface
    • Unpacking the plan: theory of change; developmental (what capacities are needed?); sphere of control; timeline of impact; metrics and incentives; phased approach to commissioning
    • Evaluating research • What are the pathways of influence by which research impacts policies? • Can a results-based culture help with greater understanding of the pathways of influence? • Approaches: o Conceptualizing research as an intervention with short- and long-term goals. One of the intermediate goals is to influence policy; the long-term goal is to improve individual lives.
    • [Diagram: research as intervention. An individual research project (one of multiple research projects) → ? → policy influence → social betterment, with monitoring and evaluation and evaluative thinking applied at each stage.]
    • [Diagram: research quality and a monitoring and evaluation system feed the pathways of policy influence.]
    • Research as intervention: context, mechanisms, outcomes
    • Model 1: From knowledge to policy. See Knowledge to Policy: Making the Most of Development Research, Fred Carden, 2009.
    • [Diagram: Carden's knowledge-to-policy model. Overall context: stability of decision-making institutions; policy absorption capacity; nature of governance; opportunities for countries in transition; economic crisis and pressures on government. Research as intervention: research quality; research networks; action research. Government context: clear government demand; new tools; government interest but leadership absent; interest but no capacity; interesting research but research uninteresting for government; government disinterest or ... Mechanism (so, what can you do?): relationship building; communication; networks of influence; institutional mechanisms. Process outcomes: policy capacity; broader policy horizons; policy regimes. Outcomes: policy influence; "better" decisions; social betterment.]
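Model 1 pairs contexts with the mechanisms available to researchers and the outcomes they may yield, which is essentially a realist context-mechanism-outcome (CMO) configuration. Below is a minimal sketch of recording such configurations; the data structure and example entries are illustrative assumptions drawn loosely from the categories on the slide, not an API or data from the presentation.

```python
# Minimal sketch of recording context-mechanism-outcome (CMO)
# configurations for "research as intervention". The dataclass and
# the example entries are illustrative assumptions loosely drawn
# from the categories in the model above, not an API or data from
# the presentation.
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    context: str    # e.g. the state of government demand for research
    mechanism: str  # what the research team can actually do in response
    outcome: str    # the process outcome or policy influence observed

examples = [
    CMOConfiguration(
        context="government interest but leadership absent",
        mechanism="relationship building with officials",
        outcome="broader policy horizons",
    ),
    CMOConfiguration(
        context="clear government demand",
        mechanism="communication of findings through networks of influence",
        outcome="policy influence and 'better' decisions",
    ),
]

for cmo in examples:
    print(f"Context: {cmo.context} | Mechanism: {cmo.mechanism} "
          f"| Outcome: {cmo.outcome}")
```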
    • Model 2: Pathways of Influence Mel Mark and Gary Henry, Evaluation, 2003
    • Research initiatives in specific places • Local research for lasting solutions • Some key questions: o What is local about your research? How does your research respond to local needs? What is different about your local networks? What is the value added of the local research? o Are there local pathways of influence? o How different are the local pathways of influence from other, more general pathways?
    • [Diagram: research as intervention operates in a global context through global mechanisms and in a local context through local mechanisms, both leading to outcomes.]
    • Local Context • Are your theories locally generated? Are the needs local? • Examples of local context that have interfered with standard influence processes • Does your research address local needs in a timely manner? • Are the local networks different from other more general networks of influence? • Example: What are some of the challenges of policy influence in specific contexts in Africa?
    • Summing up and looking ahead
    • Models of causation (successionist vs. generative models of causation); integrating knowledge translation with evaluation; developmental evaluation in complex dynamic settings; ecology of evidence; capacity building; portfolio of designs and approaches; program theory and incompleteness; time horizons and functional forms; spread, scaling up and generalization