Workshop: Monitoring, evaluation and impact assessment

  • Speaker note: Outcome Mapping (OM) was developed by IDRC, based in Ottawa, in response to some of the challenges it faced when using a more conventional results chain. Projects were having difficulty with the linear character of that logic, as well as with its cause-and-effect nature.

    1. Monitoring, evaluation and impact assessment. C. Crissman, Science Week Workshop, 22 July 2011, Penang. Photo: A. Gordon
    2. Workshop objectives
       • Introduce monitoring and evaluation in Results Based Management
       • Introduce the identification of outcomes in projects
    3. What is Results Based Management?
       • The goal of RBM: “…a management strategy aimed at achieving important changes in the way organizations operate, with improving performance in terms of results…”
       • Its purpose is “…to improve efficiency and effectiveness through organizational learning, and secondly to fulfill accountability obligations through performance reporting.”
       • Key success factors in RBM are “the involvement of stakeholders throughout the management lifecycle in defining realistic expected results, assessing risk, monitoring progress, reporting on performance and integrating lessons learned into management decisions.”
       Source: Meyer, 2003
    4. RBM rests on four pillars
       • Planning: defining clear and measurable results and indicators, based on a logic model or framework.
       • Monitoring: measuring and describing progress towards results, and the resources consumed, using appropriate indicators.
       • Reporting: internally and externally, on progress towards results.
       • Managing: using results information (and evaluation) for lesson-learning and management decision making.
       Source: Meyer, 2003
    5. The CG Consortium Strategy and Results Framework
       • “…is deliberately results driven, meaning that the drivers for planning will be real-world impacts—the outcomes that make a difference to global development goals…”
       • “…is a results-oriented research system—in contrast with, for instance, a results-oriented development program…”
    6. CG use of RBM tools in planning, monitoring and evaluation
       • Strategic Plan
       • Medium-term Plan
         - Review of MTP before implementation
         - Performance measurement after implementation
       • Performance Measurement System
       • Research indicators
         - Outputs (output targets, publications)
         - Outcomes
         - Impact
       • Center Commissioned External Review (a program-level evaluation)
       • EPMR (a center-level evaluation)
    7. The results chain (diagram). Source: ADB, 2006
    8. WorldFish Strategic Plan: Research in Development
       Our strategic results:
       • Reduce poverty and vulnerability through fisheries and aquaculture.
       • Increase food and nutrition security through fisheries and aquaculture.
    9. Multiple dimensions of poverty: a complex problem…
       • Marginalization: certain groups are systematically disadvantaged due to discrimination based on gender, ethnicity, race, religion, caste, age, HIV status or migrant status.
       • Vulnerability: people’s exposure to risks; sensitivity of livelihoods to risks; capacity to use assets and capabilities to cope and adapt.
       • Income and asset poverty: little access to means to make a decent standard of living.
    10. Interventions to reduce poverty: …requiring complex solutions
       • Marginalization: organisational development, labour rights, migrants’ rights, gender equity.
       • Vulnerability: improved access to health services, secure land rights, aquatic property rights.
       • Income and asset poverty: diversification, microfinance, education and skills.
    11. Selecting paradigms for implementation: AR4D (diagram)
    12. Strategic Plan – Implementation strategy
    13. Outputs, outcomes and impacts
       • Research outputs: change in knowledge; change in capacity; change in technology; change in materials; change in policy options; change in awareness/understanding.
       • Research outcomes: recognition/appreciation of research knowledge; use of knowledge by partners; mobilisation of new capacity; extension of technology/materials; change in policy environment.
       • Development outcomes: change in actions/behaviour of stakeholders; change in productivity; change in equity/empowerment; change in market conditions; change in investments; change in security of assets/habitats.
       • Impacts: change in problem; change in opportunities.
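    As a minimal illustration of the classification on slide 13 (not part of the original deck; all indicator names are invented), one might tag M&E indicators by the result type they evidence:

        # Hypothetical sketch: tagging indicators by result type, following the
        # outputs -> research outcomes -> development outcomes -> impacts ladder.
        from enum import Enum

        class ResultType(Enum):
            OUTPUT = "output"                            # e.g. change in knowledge
            RESEARCH_OUTCOME = "research outcome"        # e.g. use of knowledge by partners
            DEVELOPMENT_OUTCOME = "development outcome"  # e.g. change in productivity
            IMPACT = "impact"                            # e.g. change in the problem

        indicators = {
            "training manual published": ResultType.OUTPUT,
            "partner agency adopts guidelines": ResultType.RESEARCH_OUTCOME,
            "farm productivity rises": ResultType.DEVELOPMENT_OUTCOME,
            "poverty rate falls in target area": ResultType.IMPACT,
        }

        for name, rtype in indicators.items():
            print(f"{rtype.value}: {name}")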
    14. Ex-post impact assessment: measuring ‘the change in the problem’
       • Reduce poverty and vulnerability through fisheries and aquaculture.
       • Increase food and nutrition security through fisheries and aquaculture.
    15. Did we make it happen?
    16. Simple, complicated and complex (diagram from Zimmerman, 2003)
       Simple: following a recipe
       • The recipe is essential
       • Recipes are tested to assure replicability of later efforts
       • No particular expertise required; knowing how to cook increases success
       • Recipes produce standard products
       • Certainty of same results every time
       Complicated: a rocket to the moon
       • Formulae are critical and necessary
       • Sending one rocket increases assurance that the next will be ok
       • High level of expertise in many specialized fields, plus coordination
       • Rockets are similar in critical ways
       • High degree of certainty of outcome
       Complex: raising a child
       • Formulae have only a limited application
       • Raising one child gives no assurance of success with the next
       • Expertise can help but is not sufficient; relationships are key
       • Every child is unique
       • Uncertainty of outcome remains
    17. What should be monitored?
       • Simple: What works?
       • Complicated: What works in what contexts? (implementation environments and participant characteristics)
       • Complex: What works here and now? What do we mean by ‘works’?
    18. Types of interventions
       Simple intervention:
       • Single causal strand; the intervention is sufficient to produce the impacts
       • Universal mechanism; the intervention is necessary to produce the impacts
       • Linear causality, proportional impact
       • Pre-identified outcomes
       Complicated or complex intervention:
       • Multiple simultaneous causal strands required to produce the impacts
       • Different causal mechanisms operating in different contexts
       • Recursive, with feedback loops, leading to disproportionate impact at critical levels
       • Emergent outcomes
    19. Challenges for impact assessment
       Deciding impacts:
       • Simple: likely to be agreed
       • Complicated: likely to differ, reflecting different agendas
       • Complex: may be emergent
       Describing impacts:
       • Simple: more likely to have standardised measures developed
       • Complicated: evidence needed about multiple components
       • Complex: harder to plan for, given emergence
       Analysing cause:
       • Simple: likely to be a clear counterfactual
       • Complicated: causal packages and non-linearity
       • Complex: unique, highly contingent causality
       Reporting:
       • Simple: clear messages
       • Complicated: complicated message
       • Complex: uptake requires further adaptation
    20. M&E system interaction with project implementation
       • Monitoring captures what happened.
       • Evaluation explains why.
    21. M&E implementation strategy
    22. Evaluations and the R4D results chain (diagram). The diagram arranges evaluation types along a time axis (input, output, outcome, impact, objectives, goals) and a scale axis (pilot/small to global, with the unit of impact analysis moving from project to system to program). Outcome evaluation studies measure the effect size; outcome evaluations at scale measure the extent of output adoption/uptake; program M&E, impact pathway analysis and adoption constraints analysis sit along the chain. Ex-post impact assessment is then a function of (effect size * scale). Source: taken from a presentation by M. Maredia
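    The closing relation on slide 22, ex-post impact as a function of (effect size * scale), can be read as a simple product. A minimal sketch in Python, with all figures hypothetical:

        # Sketch of the "effect size * scale" framing from slide 22.
        # All figures are hypothetical, for illustration only.

        def ex_post_impact(effect_per_adopter: float, n_adopters: int) -> float:
            """Aggregate impact as per-adopter effect size times scale of adoption."""
            return effect_per_adopter * n_adopters

        # E.g. a technology that raises household income by USD 120/year,
        # adopted by 25,000 households:
        print(ex_post_impact(120.0, 25_000))  # -> 3000000.0 (USD/year)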
    23. Evaluations
       • Process (implementation) evaluation: what did or did not get implemented as planned?
       • Rapid appraisals: provide timely, relevant information to decision-makers on pressing issues they face in the project and program setting. The aim of applied research is… to facilitate a more rational decision-making process in real-life circumstances.
       • Impact evaluation: …the classic evaluation that attempts to find out the changes that occurred, and to what they can be attributed.
       • Performance logic chain assessment: used to determine the strength and logic of the causal model behind the policy, program, or project.
       • Pre-implementation assessment: Are the objectives well defined so that outcomes can be stated in measurable terms? Is there a coherent and credible implementation plan that provides clear evidence of how implementation is to proceed and how successful implementation can be distinguished from poor implementation? Is the rationale for the deployment of resources clear and commensurate with the requirements for achieving the stated outcomes?
       • Case studies: …use when a manager needs in-depth information to understand more clearly what happened with a policy, program, or project.
    24. Logic models and theory of change. Photo: World Bank
    25. Conventional logic for achieving results (diagram: inputs → activities → outputs → outcomes → impact, over time). Are we efficient? Are we effective? Inspired by Jeff Conklin; source: cognexus.org
    26. Works well for outputs (same diagram). Outputs: workshops, training manuals, research and assessment reports, guidelines and action plans, strategies, and technical assistance packages, amongst others. Inspired by Jeff Conklin; source: cognexus.org
    27. But not so well for outcomes and impact (same diagram, now spanning from intentional design to vision and mission). Outcomes and impact involve multiple pathways of change in the behavior of different, interacting actors; this is where we have the possibility of collecting and making sense of evidence that sustains the impact we are aiming to contribute to. Social change is long-term, complex and the result of what many actors do (their actions and interactions). Source: cognexus.org
    28. Logic models and theories of change
       • Terms often used interchangeably
       • Confusion among funders and grantees about expectations
       • Limited knowledge on how to use them
       • TOCs and LMs can “blend” into each other
    29. Logic models
       • A 30-year history
       • Clear identification of goals (outcomes)
       • First widespread attempt to depict program components so that activities matched outcomes
    30. What is a logic model? The basic United Way format, 1996:
       Inputs → Activities → Outputs → Intermediate Outcomes → Long-term Outcomes
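    The five-column format lends itself to a simple data structure. A minimal sketch (not from the deck; the field names mirror the columns above, and the example entries are hypothetical):

        # Sketch of the basic United Way logic model as a dataclass.
        from dataclasses import dataclass, field

        @dataclass
        class LogicModel:
            inputs: list[str] = field(default_factory=list)
            activities: list[str] = field(default_factory=list)
            outputs: list[str] = field(default_factory=list)
            intermediate_outcomes: list[str] = field(default_factory=list)
            long_term_outcomes: list[str] = field(default_factory=list)

        lm = LogicModel(
            inputs=["research staff", "donor funding"],
            activities=["on-farm trials", "farmer training"],
            outputs=["training manual", "improved practices"],
            intermediate_outcomes=["partners adopt the practices"],
            long_term_outcomes=["higher household incomes"],
        )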
    31. Theories of change
       • Popularized in the 1990s to capture complex initiatives
       • Outcomes-based
       • Causal model
       • Articulate underlying assumptions
    32. What is a theory of change? (diagram) A long-term outcome sits at the top of a network of necessary preconditions: all the outcomes that must be achieved BEFORE the long-term outcome. The links explain WHY each precondition leads to the next; activities are also shown against the outcomes they support.
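    One way to make the ‘necessary preconditions’ structure concrete is as a graph in which each outcome lists the outcomes that must hold before it. A minimal sketch, with hypothetical node names (not from the deck):

        # Sketch of a theory of change as a precondition graph.
        from collections import deque

        preconditions = {
            "higher household incomes": ["farmers adopt improved practices"],
            "farmers adopt improved practices": ["practices available locally",
                                                 "farmers trained"],
            "practices available locally": [],
            "farmers trained": [],
        }

        def required_before(outcome: str) -> list[str]:
            """All outcomes that must be achieved before `outcome`."""
            seen, queue = [], deque(preconditions.get(outcome, []))
            while queue:
                node = queue.popleft()
                if node not in seen:
                    seen.append(node)
                    queue.extend(preconditions.get(node, []))
            return seen

        print(required_before("higher household incomes"))
        # -> ['farmers adopt improved practices', 'practices available locally',
        #     'farmers trained']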
    33. How are they different?
       • Logic models graphically illustrate program components, and creating one helps stakeholders clearly identify outcomes, inputs and activities.
       • Theories of change link outcomes and activities to explain HOW and WHY the desired change is expected to come about.
    34. How are they different? (1)
       • Logic models usually start with a program and illustrate its components.
       • Theories of change may start with a program, but are best when starting with a goal, before deciding what programmatic approaches are needed.
    35. How are they different? (2)
       • Logic models require identifying program components, so you can see at a glance whether outcomes are out of sync with inputs and activities, but they don’t show WHY activities are expected to produce outcomes.
       • Theories of change also require justifications at each step: you have to articulate the hypothesis about why something will cause something else.
    36. How are they different? Summary
       • Logic models: representation; a list of components; descriptive.
       • Theories of change: critical thinking; a pathway of change; explanatory.
    37. When to use? Logic models, when you need to:
       • Show someone something they can understand at a glance
       • Demonstrate you have identified the basic inputs, outputs and outcomes for your work
       • Summarize a complex theory into basic categories
    38. When to use? Theories of change, when you need to:
       • Design a complex initiative and want to have a rigorous plan for success
       • Evaluate appropriate outcomes at the right time and in the right sequence
       • Explain why an initiative worked or did not work, and what exactly went wrong
