Making a difference: M&E of policy research

Notes for slides:
  • You need to think strategically about what you are trying to achieve through your engagement work before you can decide what to measure.
  • Outcome Mapping is a project planning, monitoring and evaluation tool developed by IDRC. It includes a series of steps before, during and after a project. More information can be found on the IDRC website.
  • After Action Reviews were invented by the US Army and are used by all troops after each action; they are now firmly embedded in Army culture and are part of the training program.

    1. Making a difference: M&E of policy research
       John Young, ODI, London. [email_address]
    2. What should you measure?
       • It depends what you're trying to do...
       • "If you don't know where you are going, any road will get you there"
    3. Whatever you measure should be:
       • Specific
       • Measurable
       • Achievable
       • Realistic
       • Time-bound
       • (Objective)
    4. M&E of policy research
       • Strategy and direction: Logframes; Social Network Analysis; Impact Pathways; Modular Matrices
       • Management: 'Fit for Purpose' Reviews; 'Lighter Touch' Quality Audits; Horizontal Evaluation; Appreciative Inquiry
       • Outputs: evaluating academic articles and research reports; evaluating policy and briefing papers; evaluating websites; evaluating networks; After Action Reviews
       • Uptake: Impact Logs; New Areas for Citation Analysis; User Surveys
       • Outcomes and impacts: Outcome Mapping; RAPID Outcome Assessment; Most Significant Change; Innovation Histories; Episode Studies
       www.odi.org.uk/RAPID/Publications/RAPID_WP_281.html
    5. Logical frameworks
       Goal     | Indicator | MOV | Assumptions/Risks
       Purpose  | Indicator | MOV | Assumptions/Risks
       Output 1 | Indicator | MOV | Assumptions/Risks
       Output 2 | Indicator | MOV |
       Output 3 | Indicator | MOV |
       Output 4 | Indicator | MOV |
    6. Change takes a long time
       • ...and many projects fail when the inputs cease...
       (Diagram: Inputs → Activities → Outputs → Outcomes → Impact, with Project Effort, Other Actors and Behaviour Change alongside)
    7. Focusing on change
       www.odi.org.uk/RAPID/Tools/Toolkits/KM/Outcome_mapping.html
    8. Emphasis on "learning"
       • "...every time we do something again, we should do it better than the last time..."
       (Diagram labels: Learn before, Learn during, Learn after; Goals, Activities, Results; external networks, colleagues, information assets, own knowledge)
       www.odi.org.uk/RAPID/Tools/Toolkits/KM/Index.html
    9. Learning before: Peer Assist
       • Starts with the attitude that "someone has probably already done what I am about to do. I wonder who?"
       www.odi.org.uk/RAPID/Tools/Toolkits/KM/Peer_assists.html
    10. Learning during: Stories
       Stories of change:
       • What was the situation?
       • What was the challenge?
       • What was done?
       • What was the result?
       • What lessons can be drawn?
       Most Significant Change:
       • Best stories at each level
       • Synthesis
       www.odi.org.uk/RAPID/Tools/Toolkits/KM/Stories.html
       www.mande.co.uk/docs/MSCGuide.pdf
    11. Horizontal evaluation
       • Peer review
         - Choose the moment
         - Choose your peers
         - Limited criteria
         - e.g. ODI Peer Review
       • Appreciative enquiry
         - Self-evaluation
         - CGIAR/CIAT
         - Workshop
    12. Learning after: AAR
       An after action review asks 4 simple questions in a 15-minute team debrief, conducted in a "rank-free" environment:
       • What was supposed to happen?
       • What actually happened?
       • Why was there a difference?
       • What can we learn from it?
       www.odi.org.uk/RAPID/Tools/Toolkits/KM/AAR.html
    13. Case & Episode Studies
       • Classical case studies: how did evidence shape policy decisions?
         - e.g. IFPRI & IDRC
         - Overestimate the role of research
       • Episode studies: retrospective tracking back from policy change
         - e.g. PRSPs, SL, AHC
         - Underestimate the role of research
       www.idrc.ca/en/ev-26606-201-1-DO_TOPIC.html
       www.ifpri.org/impact/impact.htm
       www.odi.org.uk/RAPID/Publications/BRP_ITDG.html
       www.odi.org.uk/RAPID/Projects/PPA0104/Index.html
       www.gdnet.org/middle.php?oid=175
    14. Social Network Analysis
       www.odi.org.uk/RAPID/Tools/Toolkits/KM/Social_network_analysis.html
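       One way to put numbers on such a network of actors is to compute simple centrality measures. The sketch below is illustrative only (not from the deck); it uses the third-party networkx library, and the actors and links are hypothetical:

          import networkx as nx  # third-party: pip install networkx

          # Hypothetical stakeholder network: who exchanges research findings with whom
          G = nx.Graph()
          G.add_edges_from([
              ("Research team", "Ministry of Finance"),
              ("Research team", "Local think tank"),
              ("Local think tank", "Ministry of Finance"),
              ("Research team", "Donor agency"),
          ])

          # Degree centrality as a rough indicator of which actors sit at the
          # centre of the policy network before and after engagement work
          for actor, score in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
              print(f"{actor}: {score:.2f}")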
    15. RAPID Outcome Mapping
       www.odi.org.uk/RAPID/Publications/RAPID_WP_266.html
    16. Impact log
       • Partner's research quoted in the media or on a blog: ebpdn research on Japanese aid on BBC Afrique; Waldo Mendoza of CIES quoted in Gestión
       • ebpdn used to find a position: an MSc student in the US got an intern position in Uganda; a member cites active membership of ebpdn in his online CVs (on other websites)
       • RAPID framework in academia: Garrett, J., "Improving Results for Nutrition", Journal of Nutrition (2008)
       • RAPID approaches incorporated into organisations: CIDA research studies, to make them more participatory
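       A lightweight way to keep such an impact log is a dated file that anyone on the team can append to. A minimal Python sketch, assuming a hypothetical CSV file name and field layout:

          import csv
          from datetime import date

          LOG_FILE = "impact_log.csv"  # hypothetical file name

          def log_impact(category, description, source=""):
              """Append one dated entry to the impact log."""
              with open(LOG_FILE, "a", newline="") as f:
                  csv.writer(f).writerow([date.today().isoformat(), category, description, source])

          # An entry of the kind listed on the slide
          log_impact("media", "Partner research on Japanese aid quoted on BBC Afrique", "ebpdn")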
    17. Stories
    18. RSS Feeds
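       Monitoring RSS feeds is one way to spot where outputs are being picked up. A minimal, standard-library-only Python sketch that lists the latest items in an RSS 2.0 feed (the feed URL is a placeholder, not one from the deck):

          import urllib.request
          import xml.etree.ElementTree as ET

          FEED_URL = "https://example.org/feed.rss"  # placeholder feed URL

          # Fetch the feed and print the title and date of each item, e.g. to
          # see when and where a briefing paper has been mentioned
          with urllib.request.urlopen(FEED_URL) as resp:
              root = ET.fromstring(resp.read())

          for item in root.findall("./channel/item"):
              title = item.findtext("title", default="(no title)")
              pub_date = item.findtext("pubDate", default="(no date)")
              print(f"{pub_date}  {title}")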
    19. Webstats
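       Web statistics for a specific output can also be pulled straight from a server access log rather than a dashboard. A rough sketch, assuming the common Apache/Nginx "combined" log format and a hypothetical briefing-paper path:

          import re
          from collections import Counter

          LOG_PATH = "access.log"                      # hypothetical log file
          TARGET = "/publications/briefing_paper.pdf"  # hypothetical output to track

          # Combined log format: IP ident user [date] "METHOD /path HTTP/x" status bytes ...
          line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" (\d{3})')

          downloads = Counter()
          with open(LOG_PATH) as f:
              for line in f:
                  m = line_re.match(line)
                  if m and m.group(2) == TARGET and m.group(3) == "200":
                      downloads[m.group(1)] += 1  # successful requests per IP

          print(f"{sum(downloads.values())} downloads from {len(downloads)} distinct IPs")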
    20. Other approaches: Public
       • Citations, webstats, media logs, etc.
       • Surveys
         - Quantitative
         - Qualitative
       • Distribution lists and attendance records
       • Meeting evaluations
       • Logs:
         - The expected
         - The unexpected
         - How you have changed
       Evaluation: Practical Guidelines, Research Councils UK, 2002
       www.rcuk.ac.uk/cmsweb/downloads/rcuk/publications/evaluationguide.pdf
