Measuring policy influence: like measuring thin air?


Presentation on principles, tools, and experiences in measuring the policy influence of research. Developed by the RAPID programme at ODI.


  1. Measuring policy influence: like measuring thin air? John Young and Arnaldo Pellini.
  2. RAPID: power, politics and evidence use; knowledge intermediaries and interactions; evidence production and communication.
  3. Policy processes are... [policy-cycle diagram: Agenda Setting → Policy Formulation → Decision Making → Policy Implementation → Monitoring and Evaluation, surrounded by the actors Cabinet, Donors, Parliament, Civil Society, Ministries, and Private Sector].
  4. The Cynefin Framework.
  5. Policy change: discursive (client-focused services); attitudinal (farmers have good ideas); procedural (participatory approaches to service development); content (UU20, UU25; new guidelines); behavioural (approach being applied in practice).
  6. Focus on behaviour change [results-chain diagram: Inputs → Activities → Outputs → Outcomes → Impact, with behaviour change linking the project team's outputs to outcomes among other actors].
  7. RAPID Outcome Mapping Approach: define your policy objectives.
  8. RAPID Outcome Mapping Approach: media strategy; online communications; develop a network or partnership; academic research communications; policy advocacy coalition; more research.
  9. Why do M&E? To learn about what works; to manage better; to account, both to donors and to recipients.
  10. Methods: classical case studies (IDRC, IFPRI); episode studies (ODI/RAPID); Stories of Change (Denning); micro-narratives (Snowden); impact matrices (Davies); peer evaluations (CHSRF); systematic reviews?; RCTs?
  11. Outcome Mapping: Building Learning and Reflection into Development Programs, by Sarah Earl, Fred Carden, and Terry Smutylo.
  12. Social Network Analysis.
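As a rough illustration of what social network analysis can contribute to measuring influence, the sketch below computes degree centrality (the number of links touching each actor) over a hypothetical policy network. The actor names and links are invented for the example, not taken from the presentation.

```python
from collections import defaultdict

# Hypothetical edge list for a policy network: who exchanges research
# evidence with whom. Actor names and links are illustrative only.
edges = [
    ("ministry", "donor"),
    ("ministry", "think_tank"),
    ("think_tank", "donor"),
    ("think_tank", "ngo"),
]

def degree_centrality(edge_list):
    """Count the links touching each actor in an undirected network."""
    degree = defaultdict(int)
    for a, b in edge_list:
        degree[a] += 1
        degree[b] += 1
    return dict(degree)

print(degree_centrality(edges))
# think_tank touches 3 links, so it is the best-connected actor here
```

In practice a dedicated library would be used for larger networks and richer measures (betweenness, clusters), but even this simple count shows which actors sit at the centre of evidence flows.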
  13. RAPID Outcome Assessment.
  14. A systematic approach: (1) strategy and direction: are you doing the right thing? (2) management: are you doing what you planned to do? (3) outputs: are the outputs appropriate for the audience? (4) uptake: are people aware of your work? (5) outcomes and impacts: are you having any impact?
  15. Conclusions. Research to influence: clear objectives; understand the context; theory of change; iterative/learning approach. Measuring impact: clear objectives; theory of change; the five levels; multiple methods; triangulation; expect the unexpected.
  16. Recommendations. Strategy: theory of change, impact pathways, peer review, log frame. Management: process records, appreciative inquiry, AARs, PRINCE2. Output: logs, peer review. Uptake: logs, webstats, surveys. Impact: outcome mapping, stories of change, episode studies, peer review.
  17. VDR 2010: assessing policy influence.
  18. VDR 2010: Modern Institutions. Research output of a large governance programme (DFID); 12 months of research and consultations; presented by the World Bank and 13 donors in December 2009; policy influence assessment in March 2011.
  19. Confirm the policy change: attitudinal change; procedural change; policy content; behaviour change.
  20. Policy influencing approaches [2x2 diagram: evidence/science based vs. interest/values based on one axis, inside track (cooperation) vs. outside track (pressure) on the other, with advising, advocacy, lobbying, and activism in the quadrants and the VDR 2010 plotted on the diagram]. Source: Start and Hovland (2004).
  21. Areas of M&E: strategy and direction (logframes, ToC); management processes (project governance, research management); outputs (peer reviews, After Action Reviews); uptake (citation analysis, uptake logs, user surveys, ...); outcomes and impacts (Stories of Change, episode studies, ...). Source: Hovland (2007).
  22. Areas of M&E, as planned for the VDR 2010: uptake (data on uptake of VDR 2010); outcomes and impacts (Stories of Change); nothing planned under strategy and direction, management processes, or outputs.
  23. What we did: collaborated with a local research/advocacy organization; citation analysis via Google search; interviews with donor representatives; interviews with GoV advisers involved in the VDR 2010 process; interviews with GoV officials.
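The citation-analysis step above amounts to tallying where mentions of the report turn up. A minimal sketch of that bookkeeping, assuming hand-collected search hits: the records and source labels below are invented for illustration, not data from the VDR 2010 assessment.

```python
from collections import Counter

# Hypothetical citation records from a manual Google search: each entry
# notes where a mention of the report was found. All values here are
# illustrative assumptions, not findings from the VDR 2010 study.
hits = [
    {"source": "government site", "year": 2010},
    {"source": "news article", "year": 2010},
    {"source": "news article", "year": 2011},
    {"source": "academic paper", "year": 2011},
]

def tally_by_source(records):
    """Aggregate citation counts by source type for an uptake log."""
    return Counter(r["source"] for r in records)

print(tally_by_source(hits))
# Counter({'news article': 2, 'government site': 1, 'academic paper': 1})
```

A tally like this feeds directly into the "uptake" row of the M&E matrix: it shows which audiences are picking the report up, even before any claim about influence can be made.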
  24. What we did/learned: cultural norms influence the process; citation, Google search, and time; uptake and web traffic data; management of the process emerged as an important theme; perception of the quality of the report; too early to speak of impact.
  25. What we did/learned, by area of evaluation: strategy and direction (perception about the timing of the VDR 2010); management processes (perception of the process); outputs (perception of the quality of the VDR 2010); uptake (data on uptake of the VDR 2010, as planned); outcomes and impacts (Story of Change / episode study).
  26. What we did/learned: Assessing the policy influence of research: a case study of governance research in Viet Nam, ODI Background Notes, May 2012; a review of the delivery of the Vietnam Development Report 2010 (full report).
  27. M&E in ODI: ODI has 12 core research programmes; 145 publications, 74 blogs, and 13 issues of Development Policy Review and Disasters (2009-2010); 20 public events with 4,500 participants (2009-2010); advisory work.
  28. M&E in ODI: strategy and direction (ODI strategy / programme strategy / reviews by SMT / ToCs); management processes (institute-wide management processes / staff surveys / project completion reports / after action reviews / internal peer review); outputs (peer reviews / quality assurance / communication team input / user surveys / end-of-project report plus after action reviews).
  29. M&E in ODI: uptake (note by communication team / CommStat / regular updates on uptake / uptake log).
  30. M&E in ODI: CommStat, ODI RAPID, July-September 2011.
  31. M&E in ODI.
  32. M&E in ODI: outcomes/impact (Stories of Change page / review by TT heads (2008) / programme evaluations).
  33. M&E in ODI.
  34. Thank you.