Katherine West NGO cooperation program


  • Pleased to have the opportunity to present some highlights from the recent ANCP Meta-evaluation.
  • ANCP is a DFAT grant, providing money to accredited Australian NGOs. ANCP funding is unique in supporting ANGOs' own programmes (an important point of difference from the rest of the aid program). The meta-evaluation is just one component of a broader framework for understanding ANCP effectiveness. Key to the MELF is continuous learning and improvement.
  • In summary, this presentation will cover the purpose, the methodology and some key findings, and take a brief look at one of the NGO evaluations reviewed as part of the meta-evaluation – in particular, examining how they addressed the 'L' in MEL.
  • In order to take a 'futures thinking' stance, ANGOs need to make sure that they are adequately investing in finding out what works, what doesn't and why from their own programs – i.e. undertaking quality evaluations, and thus understanding what constitutes quality. With a few notable exceptions, individual NGOs don't have the capacity to stand back and evaluate this themselves. The key purpose of this meta-evaluation was to look at how ANCP NGOs are assessing their programs – identifying the strengths and weaknesses of NGO evaluations, and the key 'quality' factors.
  • But first, what is a meta-evaluation – are people familiar with the term? It is a recognised evaluation methodology which has been around for more than 50 years: basically, a formal assessment of evaluations (an evaluation of evaluations) against a set of recognised evaluation standards, adapted for purpose (here drawing on OECD DAC, ALNAP humanitarian and AusAID standards). The assessment framework comprised eight domains against which evaluations were assessed and scored (good, very good, just satisfactory, poor, etc.). It covered a small sample – nine evaluations, 'promoting opportunities for all' – Cambodia.
  • So what did we find out? The key finding was that overall quality was adequate and fit for purpose. All bar one were of adequate quality to provide information back to the implementing NGOs and their respective communities – good, solid pieces of evaluation work, with a quality of evidence that we can trust for future thinking and programming. There were three clear strengths to ANCP evaluations: a demonstrated consultative process; strong, targeted participatory methodologies; and specific capacity and methods to engage vulnerable groups, e.g. children. Among the key quality factors, I would highlight three. First, the skill of the evaluation team – evaluation is a technical field in itself, including the roles and responsibilities of head office and field partner. Evaluation spend averaged less than 3%, around $7,000 per evaluation event – quite low – so we need to be realistic about what that can deliver. Second, ToRs that are realistic and clear – this was the strongest factor: a quality ToR went with a quality evaluation process and result. There is a whole other discussion waiting to be had around M&E standards (pros and cons). Third, adequate consideration of relevant cross-cutting issues and inclusive development – most notably, gender was not well analysed in this sample. Not surprisingly, the bulk of the evaluations were low-cost, project-specific evaluations with fairly defined parameters for learning, but we were able to discern some emerging trends: significant evidence of 'targeted partnering', particularly for technical areas such as micro-credit; working with government to build capacity (beyond advocacy); and policy analysis – again, implementing policy, not just advocating for it. In general, there was overwhelming evidence of capacity building and systems thinking in terms of strategy and approach.
  • In conclusion – reassurance regarding the quality of the evidence being used for future thinking. Next steps include a focus on partner capacity in evaluation implementation, and perhaps improved 'management' of, or guidance for, evaluations from Australian NGO partners. Also key is to ensure better capture of broader learning. The majority of the evaluations reviewed were, as mentioned, routine reviews or end-of-project evaluations – sound, but proportional to expenditure, with limited focus, application and consideration of broader learning. One exception was the IWDA thematic review, which was an evaluation in addition to those other processes, aimed both at bringing partners along in the process and at more overtly addressing the 'L' in MEL. Very briefly – Jo from IWDA.
  • Key messages: There are significant benefits to looking across an area of programming with partners to surface and explore deep enablers, constraints, challenges, and key strategy and practice learnings that can otherwise remain hidden. Organisations may think issues or experiences are one-offs, specific to their context; dialogue with others working on similar issues can identify common experiences, patterns and influences, and strategies and approaches to address them. The opportunity to safely share experiences, challenges, enablers and resources is particularly important in safety and security work, where the operating context is demanding, dynamic and risky. Bringing together a cohort working on similar issues across geographic contexts means much can be assumed about the everyday challenges involved in the work, enabling a deeper focus and more strategic analysis – including on the underlying foundations for work addressing violence against women, and on the combinations of policies, services and approaches that are needed for change. Partnership is a key factor in strengthening women's agency at both the individual and collective level.
  • Key messages: Despite rapid ICT developments, there are significant challenges (especially time) for small-to-medium agencies in effectively sharing learnings internally. There are always benefits to looking across a program area to draw out and consolidate issues and learning, even if time and budget only enable a desk review. This review of evaluations, and the locating of current agency and partner practice in an international context, was helpful in identifying where approaches and strategies developed in a particular context might be usefully extended, and in 'benchmarking' against international thinking and good practice.
  • Key issues: It is vital to share learnings with the broader sector, especially given the ANCP meta-evaluation findings regarding poor standards of gender analysis. It is important for Australian aid/ANCP to drive and support improved accountability and performance on gender in future, given its centrality to effective, sustainable development. Specialist and niche agencies that combine programming, research, policy and technical advisory services can provide practical advice, resources and support to other organisations, enabling better, more inclusive and more effective development. Collaboration between organisations with different expertise and priorities is central to the future of development practice. Mechanisms such as the Australian Development Research Awards have provided important support for collaborative action research and the development of practical tools that support more effective development.

    1. NGO Cooperation Program (ANCP) 2013 Meta-Evaluation Presenters: Katherine West (DFAT), Deb Hartley (Independent), Jo Hayter (IWDA) ACFID University Network Conference 2013
    2. INTRODUCTION NGO Cooperation Program (ANCP) Monitoring, Evaluation and Learning Framework (MELF) • Performance Reporting • Thematic Review • Meta-Evaluation *commitment to a process of continuous learning and improvement*
    3. META-EVALUATION Overview • Why – evidence, quality • How – methodology, rigour and validity • What – key findings • Case Study
    4. WHY?
    5. HOW? Planning (ToRs) • Implementation • Reporting • Follow up • Outcomes • Cross Cutting • Capacity Building • Lessons Learned
    6. WHAT?
    7. CASE STUDY: IWDA THEMATIC REVIEW Reflections on participatory review with partners • Benefits of collectively identifying barriers and enablers
    8. • Value of exploring different approaches to learning • Challenges of reviewing practice thematically • Consolidating and Sharing Learning Reflections Challenges