IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Patricia Rogers Handout 1

Towards a Research Agenda for Impact Evaluation of Development

Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future
Institute of Development Studies, Brighton, UK, 26-27 March 2013

Professor Patricia Rogers, RMIT University
www.betterevaluation.org
patricia.rogers@rmit.edu.au
1. Introduction

This presentation discusses how this conference could be part of developing a research agenda for impact evaluation in development. It discusses the processes that will be needed, the types of research that are needed, and a framework for thinking about burning research questions. While many discussions about impact evaluation have focused mainly on challenges in causal inference, this paper argues that the research agenda needs to address a broader range of issues covering all aspects of impact evaluation practice, including issues of values clarification, measurement, synthesis and management, especially in joint projects and ongoing programs. It also recognises the challenges presented by development that goes beyond discrete donor-driven projects to include partnerships and community involvement.

While the practice agenda will focus on doing what we know, the research agenda focuses on building knowledge to do impact evaluation of development better. This includes building evidence from practice.

2. What the research agenda needs to cover

Research needs to address the full range of impact evaluation in development:
• Conducted by – external evaluators, internal evaluators, managers and staff, communities
• For decision making by – donors, managers, staff, partner governments, regional associations, communities
• Type of impact evaluation – individual evaluations, evaluations of projects within an overall program, evaluation systems
• Impact evaluation for different purposes – to identify ‘best buys’ for comparative funding decisions, to understand how to scale up and translate effective programs to new contexts, to understand ways of improving the effectiveness of ongoing programs and policies

Research also needs to address the full range of tasks involved in impact evaluation for development. The BetterEvaluation Rainbow Framework (see attached document) is one way of representing the different clusters of tasks involved in effective impact evaluation – all of which might be important for research.
3. What is needed to develop the research agenda

In addition to the ideas being shared at this conference, serious development of a research agenda for impact evaluation in development will need:
• Consultation with the different parties involved in conducting, managing, using and being affected by impact evaluation for development, to identify questions, issues, gaps in knowledge, and what will constitute credible research evidence
• Identifying ‘undiscussables’ and ‘wicked problems’ in impact evaluation for development
• Review of documentation outlining the different aspects of evaluation generally – and impact evaluation for development specifically
• Review of issues and challenges in impact evaluation for development
• Review of previous research into impact evaluation in development to identify gaps, promising practices, and potentially appropriate research methods
• Review of potential methods, tools and approaches from other areas of evaluation and research

4. Types of research

It is important to identify the different types of research needed to build knowledge and improve theory and practice in impact evaluation in development. These include:

a) Documenting practice retrospectively – describing patterns in existing practice; identifying good practice and documenting it retrospectively; identifying challenging issues and documenting and analysing them retrospectively
b) Documenting practice concurrently – tracking evaluations and documenting the micro-interactions involved in key processes
c) Positive deviance – identifying good practitioners, especially in relation to enduring challenges, and engaging intended research users in analysing their practice and examining its potential for translation to their context
d) Exemplary evaluations trialling specific tools/methods/strategies – identifying possible applications of promising methods or strategies, and both implementing them and studying them
e) Trials of particular tools to identify possible uses – identifying potentially useful tools and trialling small-scale applications (‘eolithic’ research)
f) Longitudinal studies of the impact of impact evaluation – tracking uses and other influences of impact evaluations across time and different users
5. Some burning research questions about impact evaluation in development

The framework presented in this paper draws on descriptions of evaluation planning processes [1] and an earlier research project with a government department on unmet needs for impact evaluation methods and strategies [2], and was further developed through Subgroup 2 contributions to the NONIE guidelines and the development of the BetterEvaluation website.

The suggested research questions arise from evaluation challenges identified in BetterEvaluation workshops with development organisations, including sessions at FAO, SAMEA and AfrEA conferences, and seminars with Pact, ILAC and CLEAR Anglophone Africa. Each question below is followed, where available, by potentially useful types of research, cases or evaluation methods/processes.

Overall questions

1. How do we do impact evaluation of development that actually supports development – that is, is not intrinsically disempowering, patronising and likely to reinforce inequality, but supports and strengthens agency, equity and transparency?
2. How do we support all agents of development, including communities, to be reflective and empirical about the impact of their work?
3. Why does so much development impact evaluation fail to be informed by what has been learned about effective evaluation (for example, the importance of clarifying purpose and intended users, and of identifying what questions are feasible to answer within limited resources)?

[1] Kemmis, S. A Guide to Evaluation Design: An Evaluation Planner for Curriculum Development Projects. Reprinted (1) as A Guide to Evaluation Design, School of Education, Deakin University, 1981, and (2) in S. Kemmis and L. Bartlett (eds) Case Study Methods, Deakin University Press, 1982, 1983.
[2] Rogers, P. and McDonald, B. (2001) Impact Evaluation Research Project. Melbourne: Department of Natural Resources and Environment, Victoria.
Questions about managing impact evaluations or impact evaluation systems

4. Establish decision making processes – What are effective ways to support communities to have genuine involvement in decision making about evaluations?
5. Define quality evaluation standards – How can an evaluation accommodate intended users' different ideas about what constitutes credible evidence?
6. Develop evaluation plan or framework – When should particular strategies be used to develop the evaluation design (e.g. developed as part of the ToR, as part of a response to an RFP, or as a separate project)?
7. Develop evaluation plan or framework – How can an evaluation design best accommodate emerging issues?
8. Determine and secure resources – How can organisations working in the same region share information and data collection?

Questions about defining what is to be evaluated

9. Develop program theory/logic model – How can a theory of change/program theory effectively represent complicated aspects of interventions (multiple layers, components and partners) and complex aspects (adaptability, emergence)?
10. Develop program theory/logic model – How can an organisation support projects to have locally specific theories of change/program theory that are still coherent across a program, an organisation or a sector?
11. Identify potential unintended results – What are effective strategies for identifying potential unintended results in advance? Potentially useful: negative program theory.
12. What investments and activities are the subjects of evaluation? What examination is made of other investments and activities? Potentially useful: survey of evaluations.
Questions about framing the boundaries for an evaluation

13. Specify the key evaluation questions – What are effective processes for developing good key evaluation questions, ones that are likely to be useful and feasible to answer?
14. Determine what ‘success’ looks like – How can implicit values about results, processes and the distribution of benefits be made explicit? Potentially useful: rubrics (see the illustrative sketch below).

Questions about describing activities, outcomes, impacts and context

15. Collect and/or retrieve data – When is purposeful sampling most appropriate, and how can it be used validly and effectively?
16. Collect and/or retrieve data – How can long-term results be followed up?
17. Collect and/or retrieve data – How can unanticipated negative outcomes and impacts be identified and addressed in data collection and reporting?
18. Collect and/or retrieve data – How can reasonable intermediate outcomes be identified for an evaluation that will end before impacts are evident?
19. Collect and/or retrieve data – How can Big Data be used effectively in impact evaluation of development? Potentially useful: Big Data.
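As a minimal illustration of the rubrics mentioned for question 14 (and again for synthesis in question 21), the sketch below expresses a rubric as a simple data structure: each criterion has explicit, ordered level descriptors, and the rule for combining criteria is written down rather than left implicit. The criteria, descriptors and the 'weakest link' synthesis rule are hypothetical, included only to make the idea concrete.

```python
# A hypothetical evaluative rubric expressed as data, to make implicit values
# explicit (question 14). Criteria and level descriptors are invented.
RUBRIC = {
    "reach of benefits": {
        4: "benefits reach the most disadvantaged groups",
        3: "benefits reach most intended groups",
        2: "benefits concentrated among better-off participants",
        1: "little evidence of benefit for any group",
    },
    "quality of process": {
        4: "community members shape decisions throughout",
        3: "community members consulted at key points",
        2: "community members informed only",
        1: "no community involvement",
    },
}

OVERALL_LABELS = {4: "excellent", 3: "good", 2: "adequate", 1: "poor"}

def judge(ratings: dict) -> str:
    """Turn per-criterion ratings (1-4) into a transparent overall judgement."""
    for criterion, level in ratings.items():
        print(f"{criterion}: level {level} – {RUBRIC[criterion][level]}")
    # One possible synthesis rule: the weakest criterion caps the overall rating.
    return OVERALL_LABELS[min(ratings.values())]

print("Overall:", judge({"reach of benefits": 3, "quality of process": 2}))
```

Writing the synthesis rule explicitly (here, the lowest-rated criterion caps the overall judgement) is one way of surfacing the value choices that question 21 asks about; other rules, such as weighting criteria, would embody different values.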
Questions about understanding causes of outcomes and impacts

20. Check results match the theory of change / compare results to the counterfactual / rule out alternative explanations – What are credible and feasible methods and strategies for non-experimental causal inference in development impact evaluation? Potentially useful: process tracing, qualitative comparative analysis (see the illustrative sketch below), comparative case studies, Multiple Lines and Levels of Evidence.

Questions about synthesising evidence from one or more evaluations

21. Synthesise evidence from a single evaluation – How can different values be accommodated in developing an overall evaluative judgement? Potentially useful: rubrics, co-existive evaluation.
22. Synthesise evidence from multiple evaluations – How can systematic reviews that do not exclude material on the basis of a hierarchy of evidence deal with the large number of potentially relevant sources? Potentially useful: realist synthesis.

Questions about reporting and supporting use

23. Identify intended users and intended uses – How can an evaluation respond to significant changes in the intended users during the course of an evaluation?
24. Ensure accessibility – How can an evaluation provide a coherent message without focusing only on average effects?
25. Support use – What are effective strategies for supporting use of impact evaluation, especially in difficult situations, e.g. fragile states or changing decision makers? Potentially useful: longitudinal study, PROGRESA.
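To make one of the non-experimental strategies named in question 20 more concrete, the sketch below shows only the truth-table step of a crisp-set qualitative comparative analysis (QCA): cases are coded 0/1 on a small set of conditions and on the outcome, cases sharing the same configuration of conditions are grouped, and the consistency with which each configuration is associated with the outcome is reported. The cases, conditions and codings are hypothetical; a real QCA would continue with logical minimisation and coverage measures, and would return to the cases to resolve contradictory rows.

```python
# Illustrative truth-table step of crisp-set QCA (question 20).
# Cases, conditions and 0/1 codings below are hypothetical.
from collections import defaultdict

CONDITIONS = ["community_led", "long_duration", "stable_funding"]

# case name -> (condition scores, outcome achieved?)
CASES = {
    "Project A": ({"community_led": 1, "long_duration": 1, "stable_funding": 1}, 1),
    "Project B": ({"community_led": 1, "long_duration": 0, "stable_funding": 1}, 1),
    "Project C": ({"community_led": 0, "long_duration": 1, "stable_funding": 1}, 0),
    "Project D": ({"community_led": 0, "long_duration": 0, "stable_funding": 0}, 0),
    "Project E": ({"community_led": 1, "long_duration": 1, "stable_funding": 1}, 0),
}

# Group cases that share the same configuration of conditions.
rows = defaultdict(list)
for name, (scores, outcome) in CASES.items():
    configuration = tuple(scores[c] for c in CONDITIONS)
    rows[configuration].append((name, outcome))

# Report each configuration, how many cases show it, and how consistently it is
# linked to the outcome (1.00 = every case with this configuration achieved the
# outcome; values in between flag contradictory rows needing closer case work).
print("configuration (" + ", ".join(CONDITIONS) + ")  n  consistency  cases")
for configuration, members in sorted(rows.items(), reverse=True):
    consistency = sum(outcome for _, outcome in members) / len(members)
    names = "; ".join(name for name, _ in members)
    print(f"{configuration}  {len(members)}  {consistency:.2f}  {names}")
```

Contradictory rows (consistency between 0 and 1) are not averaged away: they point to the specific cases that need closer, theory-informed comparison, which is where this approach connects back to process tracing and comparative case studies.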
Further Reading

NONIE Subgroup 2 (2008) Impact Evaluation Guidance.
http://valutazioneinvestimenti.formez.it/sites/all/files/NONIE%20IMPACT%20EVALUATION%20GUIDANCE.pdf
Discusses the nature of development that impact evaluation needs to address and outlines four components of impact evaluation that need to be undertaken appropriately:
1. Identifying impacts that are valued
2. Gathering evidence of impacts
3. Assessing causal attribution or contribution
4. Managing the impact evaluation (whether conducted as an internal or external evaluation)

Stern, E., Stame, N., Mayne, J., Forss, K., Davies, R. and Befani, B. (2012) Broadening the Range of Designs and Methods for Impact Evaluations: Report of a Study Commissioned by the Department for International Development. DfID Working Paper No. 38.
http://www.dfid.gov.uk/Documents/publications1/design-method-impact-eval.pdf
Outlines five types of impact evaluation design – Experimental, Statistical, Theory-based, Case-based and Participatory – and attributes of programmes that impact evaluation needs to address: duration and time scale; non-linearity and unpredictability; local customisation of programmes; indirect delivery through intermediate agents such as funds; and multiple interventions that influence each other.

IE4D Group (2011) Impact Evaluation for Development.
http://www.scalingimpact.net/files/Impact%20Evaluation%20for%20Development%20-%20Principles%20for%20Action.pdf
Argues for impact evaluation for development – impact evaluation that not only assesses development but consciously and demonstrably contributes to improving development. Sets out a seven-point agenda of rethinking, reshaping and reforming impact evaluation to improve development. Development evaluation should: (1) contribute to sustainable improvements in development; (2) suit the nature of development; (3) draw on the full range of methods and designs for systematic and rigorous empirical investigation; (4) produce a comprehensive analysis of impacts and outcomes; (5) explain how and why impacts occur; (6) be an integral part of robust systems of monitoring, assessment and learning; and (7) involve fundamental rethinking, reshaping and reform of existing practice.
