Making a difference: M&E of policy research
John Young, ODI, London [email_address]
What should you measure?
It depends what you’re trying to do… “If you don't know where you are going, any road will get you there.”
Whatever you measure should be: Specific, Measurable, Achievable, Realistic, Time-bound (Objective).
M&E of policy research
Strategy and direction: Logframes; Social Network Analysis; Impact Pathways; Modular Matrices
Management: ‘Fit for Purpose’ Reviews; ‘Lighter Touch’ Quality Audits; Horizontal Evaluation; Appreciative Inquiry
Outputs: Evaluating academic articles and research reports; Evaluating policy and briefing papers; Evaluating websites; Evaluating networks; After Action Reviews
Uptake: Impact Logs; New Areas for Citation Analysis; User Surveys
Outcomes and impacts: Outcome Mapping; RAPID Outcome Assessment; Most Significant Change; Innovation Histories; Episode Studies
www.odi.org.uk/RAPID/Publications/RAPID_WP_281.html
Logical frameworks
Goal      | Indicator | MOV | Assumptions/Risks
Purpose   | Indicator | MOV | Assumptions/Risks
Output 1  | Indicator | MOV | Assumptions/Risks
Output 2  | Indicator | MOV |
Output 3  | Indicator | MOV |
Output 4  | Indicator | MOV |
(MOV = means of verification)
...and many projects fail when the inputs cease... Change takes a long time.
[Diagram: results chain from Inputs → Activities → Outputs → Outcomes → Impact, showing where project effort ends and behaviour change by other actors must carry the change forward]
Focusing on change www.odi.org.uk/RAPID/Tools/Toolkits/KM/Outcome_mapping.html
Emphasis on “learning”: “…every time we do something again, we should do it better than the last time…”
[Diagram: learn before, during and after the cycle of Activities → Results → Goals, drawing on external networks, colleagues, information assets and your own knowledge]
www.odi.org.uk/RAPID/Tools/Toolkits/KM/Index.html
Learning before: Peer Assist
Starts with the attitude that “someone has probably already done what I am about to do. I wonder who?”
www.odi.org.uk/RAPID/Tools/Toolkits/KM/Peer_assists.html
Learning during: Stories
What was the situation? What was the challenge? What was done? What was the result? What lessons can be drawn?
Most Significant Change: collect stories of change, select the best stories at each level, and synthesise them.
www.odi.org.uk/RAPID/Tools/Toolkits/KM/Stories.html
www.mande.co.uk/docs/MSCGuide.pdf
Horizontal evaluation
Peer review: choose the moment; choose your peers; limited criteria (e.g. ODI Peer Review)
Appreciative enquiry and self-evaluation (e.g. CGIAR/CIAT workshop)
Learning after: After Action Review (AAR)
An AAR is a 15-minute team debrief, conducted in a “rank-free” environment, that asks four simple questions:
What was supposed to happen? What actually happened? Why was there a difference? What can we learn from it?
www.odi.org.uk/RAPID/Tools/Toolkits/KM/AAR.html
Case & episode studies
Classical case studies ask how evidence shaped policy decisions (e.g. IFPRI & IDRC); they tend to overestimate the role of research.
www.idrc.ca/en/ev-26606-201-1-DO_TOPIC.html
www.ifpri.org/impact/impact.htm
Episode studies track retrospectively back from a policy change to its origins (e.g. PRSPs, SL, AHC); they tend to underestimate the role of research.
www.odi.org.uk/RAPID/Publications/BRP_ITDG.html
www.odi.org.uk/RAPID/Projects/PPA0104/Index.html
www.gdnet.org/middle.php?oid=175
Social Network Analysis www.odi.org.uk/RAPID/Tools/Toolkits/KM/Social_network_analysis.html
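To make this concrete, here is a minimal sketch of a network map in Python using the networkx library; the actors and ties are invented for illustration, not drawn from any ODI study:

import networkx as nx

# Build a who-talks-to-whom map of a hypothetical policy network
G = nx.Graph()
G.add_edges_from([
    ("Research team", "Ministry adviser"),
    ("Research team", "NGO coalition"),
    ("NGO coalition", "Journalist"),
    ("Ministry adviser", "Minister"),
    ("Journalist", "Minister"),
])

# Centrality scores suggest who brokers information between groups
print(nx.degree_centrality(G))       # how connected each actor is
print(nx.betweenness_centrality(G))  # who sits on paths between others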
RAPID Outcome Mapping www.odi.org.uk/RAPID/Publications/RAPID_WP_266.html
Impact log
Partner’s research quoted in the media or on a blog: ebpdn research on Japanese aid on BBC Afrique; Waldo Mendoza of CIES quoted in Gestión.
ebpdn used to find a position: an MSc student in the US got an intern position in Uganda; a member cites active membership of ebpdn in his online CVs (on other websites).
RAPID framework in academia: Garrett, J., “Improving Results for Nutrition”, Journal of Nutrition (2008).
RAPID approaches incorporated into organisations: e.g. CIDA research studies redesigned to be more participatory.
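A shared impact log needs little more than a dated, categorised record that anyone on the team can append to. A minimal sketch in Python, where the field names and file name are illustrative rather than a prescribed format:

import csv
from datetime import date

FIELDS = ["date", "category", "description", "source"]

def log_impact(category, description, source, path="impact_log.csv"):
    """Append one uptake event to a shared CSV impact log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # empty file: write the header first
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "category": category,      # e.g. "media", "citation", "policy"
            "description": description,
            "source": source,          # URL or reference
        })

log_impact("media", "Partner research quoted on radio", "http://example.org/story")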
Stories
RSS Feeds
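One practical use of RSS feeds in uptake monitoring is to poll a search feed for new mentions of a programme and feed the hits into the impact log. A rough sketch using the Python feedparser library; the feed URL and search term are placeholders:

import feedparser

# Poll a search feed (e.g. from a news or blog search engine) for mentions
FEED_URL = "http://example.org/search.rss?q=%22evidence-based+policy%22"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    # Each new entry is a candidate item for the impact log
    print(entry.get("published", ""), entry.title, entry.link)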
Webstats
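Webstats need not mean a full analytics package: even counting downloads of a single paper from the web server's access log says something about uptake. A sketch assuming the standard combined log format; the log path and paper file name are illustrative:

import re
from collections import Counter

# Matches "IP ... [timestamp] "GET /path HTTP..." in a combined-format log
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "GET (\S+) HTTP')

downloads = Counter()
with open("access.log") as log:
    for line in log:
        m = LOG_LINE.match(line)
        if m and m.group(2).endswith("RAPID_WP_281.pdf"):
            downloads[m.group(1)] += 1  # count requests per visitor IP

print(f"{sum(downloads.values())} downloads from {len(downloads)} visitors")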
Other approaches
Publications: citations, webstats, media logs etc.
Surveys: quantitative and qualitative
Distribution lists and attendance records
Meeting evaluations
Logs: the expected; the unexpected; how you have changed
Evaluation: Practical Guidelines, Research Councils UK, 2002. www.rcuk.ac.uk/cmsweb/downloads/rcuk/publications/evaluationguide.pdf
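For the quantitative side of surveys and meeting evaluations, a simple tally and a mean per question is often all that is needed. A sketch with hypothetical 1–5 ratings:

from statistics import mean

# Hypothetical 1-5 ratings from a meeting evaluation form
ratings = {
    "Relevance to my work": [5, 4, 4, 5, 3],
    "Quality of presentations": [4, 4, 3, 5, 4],
    "Likelihood of applying what I learned": [3, 4, 4, 4, 5],
}

for question, scores in ratings.items():
    print(f"{question}: mean {mean(scores):.1f} (n={len(scores)})")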


Editor's Notes

  • #3 So you need to be thinking strategically about what you’re trying to achieve through your engagement work before you can figure out what to measure.
  • #8 Outcome Mapping is a project planning, monitoring and evaluation tool developed by IDRC. It includes a series of steps before, during and after a project. More information can be found on the IDRC website.
  • #13 After Action Reviews were invented by the US Army and are used by all the troops after each action; they are now firmly embedded in Army culture and form part of the training program.