
IDS Impact, Innovation and Learning Workshop March 2013: Day 1, Keynote 1, Robert Picciotto


Published in: Technology, Economy & Finance


  1. Institute for Development Studies
     International Workshop, March 26-27, 2013
     Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future
     “Are development evaluators fighting the last war?”
     Robert Picciotto, King’s College London
     “Evaluation cannot hope for perfect objectivity but neither does this mean that it should slump into rampant subjectivity” – Ray Pawson
  2. The story line
     • The past is not always prologue
     • The impact evaluation debate is over
     • Experimental methods have formidable advantages… and disadvantages
     • Complexity/systems thinking: a theory?
     • A new development evaluation agenda: values; human well-being; new development architecture; evaluation beyond aid
  3. The risks of “path dependence”
     • Past decisions lose their relevance, distort today’s decisions or impose limits on them:
       – What worked yesterday may not work tomorrow
       – What did not work yesterday may, with luck, work tomorrow
     • Summative evaluation does not necessarily translate into formative evaluation: good evaluators exercise caution when offering recommendations
     • Path dependence also affects evaluation models and approaches – this is why we are here!
  4. The impact evaluation debate is over
     • Methods have to do with tactics
     • Even the best tactics fail without a strategy
     • Attribution analysis of discrete interventions has dominated recent debates
     • The Stern gang has produced a definitive “state of the art” review: many questions beyond attribution need to be tackled by development evaluators
  5. The potential of randomisation
     • Experimental methods have formidable advantages: they can help establish causality by providing a plausible measure of what results would have been observed had the intervention not taken place
     • This approach is ideal for tackling selection bias and addresses the counterfactual dilemma by ensuring that all the other factors that may affect outcomes are identical except for stochastic errors
     • The statistical significance of findings can be assessed with standard statistical tests
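The logic of this slide can be illustrated with a small simulation (all numbers invented for illustration): random assignment balances confounders in expectation, and a permutation test, which re-randomises the treatment labels to build the null distribution, quantifies the significance of the estimated effect.

```python
# Sketch of randomisation inference for a hypothetical RCT.
# Assumed for illustration: 200 units, a true treatment effect of 5,
# and baseline outcomes drawn from a normal distribution.
import random
import statistics

random.seed(42)

n = 200
treated = [True] * (n // 2) + [False] * (n // 2)
random.shuffle(treated)  # random assignment removes selection bias in expectation

# Outcome = baseline noise + effect if treated (true effect = 5)
outcomes = [random.gauss(50, 10) + (5 if t else 0) for t in treated]

def diff_in_means(assign, y):
    t = [v for a, v in zip(assign, y) if a]
    c = [v for a, v in zip(assign, y) if not a]
    return statistics.mean(t) - statistics.mean(c)

observed = diff_in_means(treated, outcomes)

# Permutation test: shuffle the labels to simulate "no effect" worlds
reps = 2000
labels = treated[:]
null = []
for _ in range(reps):
    random.shuffle(labels)
    null.append(diff_in_means(labels, outcomes))

p_value = sum(abs(d) >= abs(observed) for d in null) / reps
print(f"estimated effect: {observed:.2f}, p-value: {p_value:.3f}")
```

The same simulation also makes the slide's caveats concrete: shrink the sample or add effect heterogeneity across contexts and the estimate quickly loses precision and external validity.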
  6. Society is not a laboratory
     • Redundancy or feasibility issues
     • External validity: context matters
     • Failure to address the distinctive accountabilities of partners, replication, implementation (who? why? how?)
     • Ethical issues
     • Experiments are costly and require superior skills, large studies, large samples and specialized quality assurance
  7. Qualitative approaches contribute to evaluation rigour
     • Without theory there is no credible causal explanation that adds to development knowledge
     • Understanding of causal relationships calls for substantive knowledge of the intervention and its context
     • Qualitative methods involve participation, observation, text-based information, village meetings, open-ended interviews, etc.
     • Quality can and should be quantified
     • It is also important to figure out why things did not happen
  8. Mixed methods together with (or beyond) the experiment
     – Regression and factor analysis
     – Quasi-experimental designs
     – Multivariate statistical modelling
     – Participatory approaches
     – Qualitative impact assessment
     – Beneficiary surveys
     – General elimination methodology
     – Expert panels
     – Benchmarking
     – Case studies
  9. Complexity models and systems thinking: not yet a theory?
     • Pluralism and inclusivity in methods
     • Networks as privileged units of account
     • Interrelationships
     • Engagement with multiple actors
     • Symbols and diagrams to guide evaluative inquiry
     • From ‘attribution’ to ‘contribution’
     • Rely selectively on all evaluation traditions – theories of change, realist and critical evaluation, democratic evaluation, etc.
  10. Towards a new development evaluation agenda
     A relevant development evaluation strategy for our troubled times should:
     • put values at the very centre of the strategy
     • reflect the new development agenda likely to emerge beyond the MDGs
     • reconsider the ultimate goals of development cooperation
  11. Why should values come first?
     • The social arrangements that allow a few individuals to accumulate enormous wealth while depriving large segments of the population of basic necessities have spawned widespread popular anger and resentment
     • The richest 0.5% hold well over a third of the world’s wealth while 68% share only 4%
     • Differences in income attributable to effort, skill or entrepreneurship are not widely resented – unless the rules of the game are rigged
     • Evaluators should examine these rules!
  12. Evaluators can no longer sit on the fence
     • Current utilization-driven evaluation models are client-controlled and tend to be value free
     • Evaluators should be independent and seek inspiration from the ethical imperatives of social justice
     • They should master recent policy research findings about inequality
     • They should revise their guiding principles and metrics in ways consistent with the pursuit of more equitable, inclusive and environmentally sustainable growth
     • They should give higher priority to the political dimensions of policies and programs
  13. We need new metrics
     • GNI measures fail to capture services provided within the household or to reflect environmental losses, social inequities and human insecurities
     • Five planets would be needed to bring the whole world up to US living standards
     • The spectre of global warming is haunting the poorest and most vulnerable countries
     • The reality of poverty encompasses factors that reach well beyond material deprivation
     • The profile of risk management should rise given the increase in natural disasters and the insecurities of the interconnected global system
  14. Towards a 3D (McGregor/Sumner) evaluation model

     Evaluation characteristic     | Material well-being   | Relational well-being    | Perceptual well-being
     Major discipline              | Economics             | Sociology                | Psychology
     Dominant evaluation approach  | Cost-benefit analysis | Participatory evaluation | Empowerment evaluation
     Investment focus              | Physical capital      | Social capital           | Human capital
     Main unit of account          | Countries             | Communities              | Individuals
     Main types of indicators      | Socio-economic        | Resilience               | Quality of life
  15. Taking account of a new aid architecture
     • Aid is once again an instrument of geopolitics and new public and private actors have joined the fray
     • Aid is increasingly channelled through large, diffuse and adaptable country-based programmes or through vertical multi-country initiatives involving multiple actors
     • Aid coalitions are increasingly operating across borders to tackle “problems without passport”
     • The complex political equations that underlie their operations cannot be solved through abstract, politically neutral approaches
  16. Evaluating development cooperation beyond aid
     The footprint of development cooperation is far more significant than the aid footprint:
     • Exports (about $5.8 trillion) – 45 times official aid flows
     • Remittances ($283 billion) – 2.2 times
     • FDI ($594 billion) – 4.6 times
     • Royalty and licence fees ($27 billion) – about a fourth
     • Huge damage caused by climate change as a result of OECD countries’ energy policies
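The ratios on this slide are internally consistent: dividing each flow by its stated multiple recovers roughly the same official-aid baseline, about $129 billion. A quick arithmetic check (figures as stated on the slide, converted to US$ billions):

```python
# Verify the slide's comparisons: flow value / stated multiple
# should imply one common official-aid baseline (US$ billions).
flows = {
    "exports": (5_800, 45),     # $5.8 trillion, 45x official aid
    "remittances": (283, 2.2),  # $283 billion, 2.2x
    "fdi": (594, 4.6),          # $594 billion, 4.6x
}
implied_aid = {name: round(value / multiple, 1)
               for name, (value, multiple) in flows.items()}
print(implied_aid)  # each entry close to 129 ($ billions)
```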
  17. Conclusions
     • Even the best methods cannot compensate for a misguided strategy
     • The new development research and evaluation agenda should:
       – emphasize ethical standards and democratic values
       – reflect the new human well-being paradigm
       – come to terms with the new aid architecture
       – reach well beyond aid
       – experiment with complexity and systems thinking
       – promote independence in evaluation processes
       – devote more attention to the political dimension