IDS Impact, Innovation and Learning Workshop, March 2013: Day 1, Keynote 1 – Robert Picciotto
 

    Presentation Transcript

    • Institute for Development Studies
      International Workshop, March 26-27, 2013
      Impact, Innovation and Learning: Towards a Research and Practice Agenda for the Future
      Are development evaluators fighting the last war?
      Robert Picciotto, King's College London
      "Evaluation cannot hope for perfect objectivity but neither does this mean that it should slump into rampant subjectivity" – Ray Pawson
    • The story line
      • The past is not always prologue
      • The impact evaluation debate is over
      • Experimental methods have formidable advantages… and disadvantages
      • Complexity/systems thinking: a theory?
      • A new development evaluation agenda: values; human well-being; new development architecture; evaluation beyond aid
    • The risks of "path dependence"
      • Past decisions lose their relevance, distort today's decisions or impose limits on them:
        – What worked yesterday may not work tomorrow
        – What did not work yesterday may, with luck, work tomorrow
      • Summative evaluation does not necessarily translate into formative evaluation: good evaluators exercise caution when offering recommendations
      • Path dependence also affects evaluation models and approaches – this is why we are here!
    • The impact evaluation debate is over
      • Methods have to do with tactics
      • Even the best tactics fail without a strategy
      • Attribution analysis of discrete interventions has dominated recent debates
      • The Stern gang has produced a definitive "state of the art" review: many questions beyond attribution need to be tackled by development evaluators
    • The potential of randomisation
      • Experimental methods have formidable advantages: they can help establish causality by providing a plausible measure of what results would have been observed had the intervention not taken place
      • This approach is ideal for tackling selection bias and addresses the counterfactual dilemma by ensuring that all the other factors that may affect outcomes are identical except for stochastic errors
      • The statistical significance of findings can be measured by statistical testing techniques
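    The randomisation logic behind this slide can be sketched with a simple permutation test: if treatment labels really were assigned at random, reshuffling them many times shows how often a difference as large as the observed one would arise by chance. The outcome numbers below are invented purely for illustration; they are not from any study cited in the talk.

    ```python
    import random

    # Hypothetical outcome data for a randomised trial (illustrative only):
    # 'treat' received the intervention, 'control' did not.
    treat = [12.1, 13.4, 11.8, 14.2, 12.9, 13.7, 12.5, 13.1]
    control = [11.2, 10.9, 11.5, 12.0, 10.7, 11.8, 11.1, 11.4]

    def mean(xs):
        return sum(xs) / len(xs)

    observed = mean(treat) - mean(control)

    # Permutation (randomisation) test: under the null hypothesis the
    # group labels are exchangeable, so we reshuffle them repeatedly and
    # count how often a difference at least as large as the observed one
    # appears by chance alone.
    random.seed(0)
    pooled = treat + control
    n_treat = len(treat)
    n_perm = 10_000
    count = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        diff = mean(pooled[:n_treat]) - mean(pooled[n_treat:])
        if abs(diff) >= abs(observed):
            count += 1

    p_value = count / n_perm
    print(f"observed difference: {observed:.2f}, p-value: {p_value:.4f}")
    ```

    With well-separated groups like these the p-value comes out very small, which is exactly the "plausible measure of what would have been observed absent the intervention" the slide describes.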
    • Society is not a laboratory
      • Redundancy or feasibility issues
      • External validity: context matters
      • Failure to address the distinctive accountabilities of partners, replication, implementation (who? why? how?)
      • Ethical issues
      • Experiments are costly and require superior skills, large studies, large samples and specialized quality assurance
    • Qualitative approaches contribute to evaluation rigour
      • Without theory there is no credible causal explanation that adds to development knowledge
      • Understanding of causal relationships calls for substantive knowledge of the intervention and its context
      • Qualitative methods involve participation, observation, text-based information, village meetings, open-ended interviews, etc.
      • Quality can and should be quantified
      • It is also important to figure out why things did not happen
    • Mixed methods together with (or beyond) the experiment
      – Regression and factor analysis
      – Quasi-experimental designs
      – Multivariate statistical modelling
      – Participatory approaches
      – Qualitative impact assessment
      – Beneficiary surveys
      – General elimination methodology
      – Expert panels
      – Benchmarking
      – Case studies
    • Complexity models and systems thinking: not yet a theory?
      • Pluralism and inclusivity in methods
      • Networks as privileged units of account
      • Interrelationships
      • Engagement with multiple actors
      • Symbols and diagrams to guide evaluative inquiry
      • From 'attribution' to 'contribution'
      • Rely selectively on all evaluation traditions – theories of change, realist and critical evaluation, democratic evaluation, etc.
    • Towards a new development evaluation agenda
      A relevant development evaluation strategy for our troubled times should:
      – put values at the very centre of the strategy
      – reflect the new development agenda likely to emerge beyond the MDGs
      – reconsider the ultimate goals of development cooperation
    • Why should values come first?
      • The social arrangements that allow a few individuals to accumulate enormous wealth while depriving large segments of the population of basic necessities have spawned widespread popular anger and resentment
      • The richest 0.5% hold well over a third of the world's wealth while 68% share only 4%
      • Differences in income attributable to effort, skill or entrepreneurship are not widely resented – unless the rules of the game are rigged
      • Evaluators should examine these rules!
    • Evaluators can no longer sit on the fence
      • Current utilization-driven evaluation models are client-controlled and tend to be value-free
      • Evaluators should be independent and seek inspiration from the ethical imperatives of social justice
      • They should master recent policy research findings about inequality
      • They should revise their guiding principles and metrics in ways consistent with the pursuit of more equitable, inclusive and environmentally sustainable growth
      • They should give higher priority to the political dimensions of policies and programs
    • We need new metrics
      • GNI measures fail to capture services provided within the household or to reflect environmental losses, social inequities and human insecurities
      • Five planets would be needed to get the world up to US living standards
      • The spectre of global warming is haunting the poorest and most vulnerable countries
      • The reality of poverty encompasses factors that reach well beyond material deprivation
      • The profile of risk management should rise given the rise of natural disasters and the insecurities of the interconnected global system
    • Towards a 3D (McGregor/Sumner) evaluation model

      Evaluation characteristic    | Material well-being   | Relational well-being    | Perceptual well-being
      Major discipline             | Economics             | Sociology                | Psychology
      Dominant evaluation approach | Cost-benefit analysis | Participatory evaluation | Empowerment evaluation
      Investment focus             | Physical capital      | Social capital           | Human capital
      Main unit of account         | Countries             | Communities              | Individuals
      Main types of indicators     | Socio-economic        | Resilience               | Quality of life
    • Taking account of a new aid architecture
      • Aid is once again an instrument of geopolitics and new public and private actors have joined the fray
      • Aid is increasingly channelled through large, diffuse and adaptable country-based programmes or through vertical multi-country initiatives involving multiple actors
      • Aid coalitions are increasingly operating across borders to tackle "problems without passport"
      • The complex political equations that underlie their operations cannot be solved through abstract, politically neutral approaches
    • Evaluating development cooperation beyond aid
      The footprint of development cooperation is far more significant than the aid footprint:
      • Exports (about $5.8 trillion) – 45 times official aid flows
      • Remittances ($283 billion) – 2.2 times
      • FDI ($594 billion) – 4.6 times as large
      • Royalty and licence fees ($27 billion) – a fourth
      • Huge damage caused by climate change as a result of OECD countries' energy policies
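    The slide's multipliers can be cross-checked with simple arithmetic. The flow figures are taken from the slide itself; the implied aid volume (roughly $129bn) is a back-calculation from the 45x exports ratio, not a number quoted in the talk.

    ```python
    # Back-of-the-envelope check of the slide's ratios, all in $bn.
    exports_bn = 5_800       # slide: exports "about $5.8 trillion"
    remittances_bn = 283     # slide: remittances $283 billion
    fdi_bn = 594             # slide: FDI $594 billion
    royalties_bn = 27        # slide: royalty and licence fees $27 billion

    # Implied official aid, back-calculated from "exports = 45x aid".
    aid_bn = exports_bn / 45

    print(f"implied official aid: ${aid_bn:.0f}bn")
    print(f"remittances / aid = {remittances_bn / aid_bn:.1f}x")  # slide: 2.2x
    print(f"FDI / aid = {fdi_bn / aid_bn:.1f}x")                  # slide: 4.6x
    print(f"royalties / aid = {royalties_bn / aid_bn:.2f}x")      # slide: ~a fourth
    ```

    The ratios reproduce the slide's figures, so the four flow multiples are mutually consistent with a single official aid total of about $129bn.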
    • Conclusions
      • Even the best methods cannot compensate for a misguided strategy
      • The new development research and evaluation agenda should:
        – emphasize ethical standards and democratic values
        – reflect the new human well-being paradigm
        – come to terms with the new aid architecture
        – reach well beyond aid
        – experiment with complexity and systems thinking
        – promote independence in evaluation processes
        – devote more attention to the political dimension