This document provides guidance for consultants conducting evaluations and impact assessments of WaterAid's Governance and Transparency Fund (GTF) programme. It outlines the purpose and key stakeholders for the evaluation and impact assessment. Consultants have 25 days to complete both exercises. The evaluation will assess programme performance against objectives, while the impact assessment focuses on understanding changes in people's lives resulting from the programme. Guidance is provided on evaluation questions, methodology, timelines, and the differences between evaluations and impact assessments. Countries will take different approaches depending on whether a full or small-scale evaluation is required.
3. www.wateraid.org
Background and purpose of the two exercises
• The evaluation is primarily for Accountability; the impact assessment is primarily for Learning
• The two are complementary exercises
• CPs and partners are the primary users of the results
• Results also serve communication, fundraising, etc.
• Progress against the baseline data is critical
4. Overview of all evaluation and learning processes and how they link together
• Mid Term Review – WHAT so far?
• Evaluation – WHAT?
• Impact assessment – So WHAT?
• Learning review – HOW?
• Most significant change analysis – the WHAT about the SO WHAT?
7. Different levels and how to deal with this
• 7 countries are doing a full-scale evaluation
• 9 countries are doing a small-scale evaluation
• All countries are doing an impact assessment except Kenya
• Small scale = updating the Mid Term Review
• Full scale = in-depth assessment based on key areas
8. Length of the consultancy and how to use your time
• Total number of days = 25, to be shared between the two exercises
• Rough guideline:
Step 1: Understanding the context – understanding the problem in country that GTF is addressing
• Background reading – 1 day
• Working with country programme staff and key informants – reinforcing understanding of the programme, stakeholders and intervention design, and of the findings and conclusions of the MTR (if there was one) – up to 3 days
Step 2: Enquiry – conducting the self-assessment, semi-structured interviews, FGDs, etc. – 8 days
Step 3: Analysis – collation of self-assessment results, coding of qualitative data – 2.5 days for accountability analysis and 2.5 days for learning analysis
Step 4: Write the first draft of the report – 4 days
Step 5: Revisions and redraft of the report – 4 days
9. Timeline
Dates – Actions
4th April – ToR for Evaluation and Impact Assessment sent out to all countries
19th April – Each country to sign contracts with local consultants
23rd April – Webinar with all consultants
17th May – Local consultants to submit draft impact assessment section of report
31st May – Local consultants to submit draft Evaluation reports
24th June – Local consultants to submit final Evaluation report including impact assessment
11th July – MoF to submit global consolidated impact assessment
Week of 22nd July – CM to submit draft Evaluation report and share report with KPMG
Week of 29th July – Last Annual Learning Meeting
Mid-September – CM to submit Final Global Evaluation report
End of September / mid-October – Papa to share report with KPMG
End of October – Submission of WaterAid PCR to KPMG
11. In summary....
Why do we do it?
• Monitoring: measures ongoing activities
• Evaluation: measures performance against objectives
• Impact assessment: assesses change in people's lives
What is the main focus?
• Monitoring: focus on programme interventions
• Evaluation: focus on programme interventions
• Impact assessment: focus on stakeholders
At what level?
• Monitoring: outputs
• Evaluation: outcomes/impact
• Impact assessment: impact and change
What are the key questions to ask?
• Monitoring: What is being done? Is our programme progressing as planned?
• Evaluation: What happened? Did we achieve what we set out to achieve in terms of effectiveness, efficiency, relevance, sustainability and impact?
• Impact assessment: So what actually changed? For whom? How significant is it for them? Will it last? What, if anything, did our programme contribute?
13. GTF Summative Evaluation
• We are conducting a critical analysis of the GTF programme in order to assess whether or not it achieved its goals:
– whether the planned activities occurred;
– whether the activities led to achievement of goals;
– how effective the project was;
– how costly the project was; etc.
• This is a summative, or end-of-programme, evaluation
14. Purpose of evaluation
• For accountability – to enable beneficiaries, board members, etc. to know how funds have been used
• Our country evaluations will assess:
– objectives against logframe targets and milestones;
– programme performance by the OECD DAC criteria of effectiveness, efficiency, relevance, sustainability, replicability and impact
15. A useful Global Evaluation Report
• Top tips cover: i) the process of collation, analysis and write-up; and ii) enhanced rigour and comparability of results and reports, so:
• Take a consistent stance
• Seek support and advice through the online forum
• Verify evidence
• Validate evaluation results step by step
• Apply quality assurance processes
• Prioritise the use of country systems
• Use a set of agreed working definitions for key terms
• Use the WaterAid report template
16. www.wateraid.org
Evaluation Questions
Relevance:
• What we expect: Details of the programme’s significance
with respect to increasing voice, accountability and
responsiveness within the local context.
Evaluation Questions:
1. How well did the programme relate to governance priorities at local, national or international levels? Please demonstrate with examples in relation to: i) increasing voice; ii) accountability; and iii) responsiveness within the local context.
2. How well did the programme relate to the Country Strategy Paper aims and objectives of WaterAid and, where applicable, of the FAN network (i.e. regional secretariats) and of DFID?
3. How logical is the current theory of change?
17. www.wateraid.org
Effectiveness
• What we expect: An assessment of how far the
intended outcomes were achieved in relation to
targets set in the original logical framework.
Evaluation Questions:
1. Have interventions achieved the objectives at country, regional and global levels?
2. How effective and appropriate was the programme
approach? How effective was the MEL system and
framework?
3. With hindsight, how could it have been improved?
18. www.wateraid.org
Partnership
• What we expect: How well did the
partnership and management
arrangements work and how did they
develop over time? Please consider areas
such as monitoring, evaluation and
learning arrangements. If possible,
consider from a regional perspective.
19. www.wateraid.org
Advocacy
• What we expect: To what extent has GTF
contributed to WaterAid influencing
targets?
Evaluation Questions:
1.How has the programme helped implement successful advocacy strategies? Are there any lessons learned about measuring influence?
2.How has the programme contributed to
the overall in country advocacy strategy?
20. www.wateraid.org
Equity
• What we expect: Discussion of social differentiation (e.g. by gender, ethnicity, socio-economic group, disability, etc.) and the extent to which the programme had a positive impact (from an accountability perspective) on the more disadvantaged groups.
Evaluation Questions:
1. How did the programme actively promote gender equality?
2. What was the impact of the programme on children, youth
and the elderly?
3. What was the impact of the programme on ethnic minorities?
4. If the programme involved work with children, how were child
protection issues addressed?
5. How were the needs of excluded groups, including people with disabilities and people living with HIV/AIDS, addressed within the programme?
21. www.wateraid.org
Value for Money
• What we expect: Good value for money is the optimal use of
resources to achieve the intended outcome.
Evaluation Questions:
1. Has economy been achieved in the implementation of
programme activities?
2. Could the same inputs have been purchased for less money?
3. Were salaries and other expenditures appropriate to the
context?
4. What are the costs and benefits of this programme?
5. Is there an optimum balance between Economy, Efficiency
and Effectiveness? Overall, did the programme represent
good value for money?
22. www.wateraid.org
Efficiency
• What we expect: How far funding, personnel, regulatory,
administrative, time, other resources and procedures
contributed to or hindered the achievement of outputs.
Evaluation Questions:
1. Are there obvious links between significant expenditures and
key programme outputs? How well did the partnership and
management arrangements work and how did they develop
over time?
2. How well did the financial systems work?
3. Were the risks properly identified and well managed?
Note: for advice on measuring value for money in governance programmes, see DFID's Briefing Note (July 2011), Indicators and VFM in Governance Programming, available at: www.dfid.gov.uk
23. www.wateraid.org
Sustainability
• What we expect: Potential for the
continuation of the impact achieved and of
the delivery mechanisms following the
withdrawal of existing funding.
Evaluation Questions:
1.What are the prospects for the benefits of the
programme being sustained after the funding
stops? Did this match the intentions?
2.How have collaboration, networking and influencing of opinion supported sustainability?
24. www.wateraid.org
Innovation & Replicability
• What we expect: How replicable is the
process that introduced the changes/impact?
Refer especially to innovative aspects which
are replicable.
Evaluation Questions:
1.What aspects of the programme are
replicable elsewhere?
2.Under what circumstances and/or in what
contexts would the programme be replicable?
25. www.wateraid.org
Expected impact and change
• What we expect: Details of the broader economic, social, and political
consequences of the programme and how it contributed to the overall
objectives of the Governance and Transparency Fund (increased capability,
accountability and responsiveness) and to poverty reduction.
Evaluation Questions:
1. It is critical to demonstrate the progress in relation to the indicators included in the
GTF programme logframe. The focus is on accountability for the impact.
2. What was the programme’s overall impact and how does this compare with what was
expected? Please demonstrate from an accountability perspective if the perceived
impact was achieved and if not, why not.
3. Did the programme address the intended target group, and what was the actual coverage? Again, from an accountability perspective, was the intended coverage reached? If not, why not; if yes, how?
4. Who were the direct and indirect/wider beneficiaries of the programme? Again, the
importance here is to set out who these were for accountability purposes.
5. What difference has been made to the lives of those involved in the programme?
Describe the impact.
6. As you are aware, the Consultant is also conducting more detailed critical analysis on
Impact for learning purposes.
26. www.wateraid.org
Vertical Logic of Programme
Impact is the higher level
situation that the project
contributes towards
achieving
Outcome identifies
what will change and
who benefits during
the lifetime of the
project
Outputs are specific
deliverables
Human Resource and
financial inputs
LEARNING:
For the GTF Global
Consultants this
requires evidence of :
‘so what’?
ACCOUNTABILITY
For the GTF Global
Consultants this requires
evidence against
programme specific
objectives
28. www.wateraid.org
Why conduct the impact assessment
component?
• To learn and improve:
• To enable Country Programme staff, stakeholders in country, WaterAid staff and others to really understand what changed as a result of the programme, and to apply this to future plans
• To test and refine our understanding of how change
happens and how successful we have been in supporting
positive changes for our stakeholders:
• To what extent did we work with the right people? In the right way? How did this all link up?
• To what extent did the changes we expected to see along the way support
the long term changes we were aiming to influence?
• What does this tell us about the way we think we can influence change?
• What should we do differently next time?
29. www.wateraid.org
The Impact Assessment
– Focus on the "so what?" question:
• What's changed?
• For whom?
• How significant/lasting are these changes for different stakeholder groups?
• In what ways did the programme contribute?
– Expect the unexpected – we are looking for evidence of positive/negative, intended and unintended changes
– Prioritise analysis over gathering information – hence the need for open and probing questions
31. www.wateraid.org
Background and context – what we need
to know (Country Programme Theory of
Change):
• The local and national context, including key social,
political and environmental conditions and how they
have changed over the life time of the programme
• Key issues that the programme planned to address
• The target groups who would ultimately benefit from the programme, and how each would benefit
• The process or sequence of changes that would lead
to the desired long-term goal
• The assumptions that the programme made about the
anticipated process of change
• The other actors/factors who had the potential to influence the changes sought, either positively or negatively.
32. www.wateraid.org
Four Domains of Change
1. Changes in the ways in which CSOs function and
network, and their capacity to influence the design,
implementation and evaluation of effective WASH policies at
all levels
2. Changes in the ways that CSOs, including those
representing marginalised groups, are able to engage in
decision-making processes affecting the WASH sector.
3. Changes in the ways in which members of local
communities demand accountability and responsiveness
from governments and service providers in the WASH sector
4. Changes in the ways that Governments and service
providers are accountable to citizens and end users in the
WASH sector
33. www.wateraid.org
Each Domain is broken down further into
“areas of enquiry”
These are the key questions you need to explore across all Domains:
1. What has actually changed for each of the different stakeholder groups,
especially the poorest and most marginalized communities in relation to
WASH (positive, negative, intended and/or unintended changes)
2. How significant and/or sustainable are these changes for the different
target groups?
3. How do these changes compare with baselines and with the changes that were planned and expected?
4. How do they link together and/or influence each other?
5. To what extent did the GTF programme contribute to these changes?
How?
6. Who or what else might have contributed to these changes? How?
7. How confident are you in these findings (levels of evidence)?
34. www.wateraid.org
Areas of Enquiry Domain 1
Domain 1 Key Areas of Enquiry
Changes in the ways in which
CSOs function and network,
and their capacity to influence
the design, implementation
and evaluation of effective
WASH policies at all levels
• Ways in which networks have developed
and function over time
• Shifts in CSO capacity
• How this capacity change has influenced policy and practice at:
o local level
o national level
Note: we will provide further guidelines on how to assess these areas of enquiry in
the next week
35. www.wateraid.org
Areas of Enquiry Domain 2
Domain 2 Key Areas of Enquiry
Changes in the ways that
CSOs, including those
representing marginalized
groups, are able to engage
in decision-making
processes affecting the
WASH sector.
• Shift in awareness, knowledge and confidence of
marginalized groups
• Shifts in the ways that people have been able to
demand their rights
• The extent to which the voices of marginalized
people are making a difference to policy and practice
• Ways in which different CSO strategies have
influenced change (e.g. budget tracking, participation
in stakeholder reviews, etc…)
36. www.wateraid.org
Areas of Enquiry Domain 3
Domain 3 Key Areas of Enquiry
Changes in the ways in which
members of local communities
demand accountability and
responsiveness from governments
and service providers in the WASH
sector
• Levels of awareness of rights in local
communities
• Ways in which media coverage
supports understanding of rights
• Ways in which citizens are influencing
policy and practice over time
• Changes in community access to WASH
• Changes in community influence over
natural resources
37. www.wateraid.org
Areas of Enquiry Domain 4
Domain 4 Key Areas of Enquiry
Changes in the ways that
Governments and service providers
are accountable to citizens and end
users in the WASH sector
• Changing levels of governance,
transparency and compliance
• Changes in policy and regulation (e.g.
new policies, laws, standards, political
and institutional framework) – and
the consequences of these
• Changes in practice relating to WASH
(e.g. delivery of new services and
systems) and the consequences
38. www.wateraid.org
Methodology for both components
Restrict yourselves to using a few tried and tested tools. We suggest:
– Facilitated self-assessment: building on the MTR, which will support the evaluation component
• How to do this and who should be involved
• Note: we will be adding some more change questions this time
– Follow-up workshop to validate findings and focus on the impact assessment element
• How to do this and who should be involved
– Other in-depth interviews/FGDs with key informants as required
• This might enable a deeper understanding of, e.g., how changes affected particular target groups
39. www.wateraid.org
Guiding Principles for Methodology
• Create an atmosphere where informants
feel able to be honest and provide critical
feedback. Use an appreciative enquiry
approach
• Ensure a mix of both qualitative and
quantitative data is gathered
• For the impact assessment – ask open
and probing questions for a deeper
understanding of change
• Findings must be backed up with evidence
and be set against the original baseline
40. www.wateraid.org
Consideration of sample size
Questions:
• How many people to interview?
• What is a “good enough sample?”
Answers:
• Be pragmatic (you have limited time but need to be
representative).
• Plan with the in-country focal point:
– Include people/groups/interventions which represent:
• good/strong
• medium
• poor/weak
• Explain your sampling decisions in the methodology section of the report (with an indication of the level of rigour you believe this provides)
41. www.wateraid.org
The Report
We will provide more detailed guidance over the next week, but this is the guide.
35–40 pages to include:
• Executive Summaries x 2 - (4 pages in total)
• Contents + Abbreviations (1 page)
• Methodology + challenges and limitations (2 pages )
• Country Context and introduction to the programme (3 pages )
• Evaluation Report (10 pages) – findings and conclusions under the
following headings:
– Relevance, Effectiveness, Partnership, Advocacy, Equity, Value for Money, Efficiency, Sustainability, Innovation and Replicability, Expected Impact and Change.
• Impact Assessment Report (10 pages) – findings and conclusions under the
following headings:
– Changes under each Domain
– Overall analysis of impact for different target groups
– What difference the programme has made overall
• Overall conclusions and learning for Country Programmes, for the sector
and globally (4 pages)
• Annexes
42. www.wateraid.org
Other ways to present findings
• Opportunity for WaterAid to share findings and learning with a wide group of stakeholders
• Target groups:
– Country programme staff
– Networks in country
– Partners
– WaterAid donors
– Sector specialists
• Supplementary ways of presenting findings (optional)
– Case studies
– Video footage
– Photos
43. www.wateraid.org
Next steps
• Maureen and Catherine to send more
detailed guidelines by Friday April 26th
• In-country evaluation team to take stock of the outcomes of the webinar
• Prepare and send proposal to Catherine and Maureen, copied to Marta and Papa, by Friday April 26th, with a brief overview of your plan including:
– Time line
– Your methodology
– Key informants
– Sample size and rationale for this
44. www.wateraid.org
Support and assistance
• Guidance notes to follow:
– Some thematic guidance
– Self-assessment format to use
– Learning questions
– Report format
• Online forum: you will be able to post questions, debate issues and
findings with other consultants involved in the GTF final evaluation and
impact assessment.
– Please expect an email in your inbox with a password and user details from either Catherine or her colleague Erica Packington. Erica will manage access to the private forum.
– Please action the email immediately to guarantee access.
• Email contacts:
– Technical advice
• Catherine Currie catherine@iodparc.com
• Maureen O’Flynn maureen@oflynn.demon.co.uk
– Logistics:
• Papa Diouf PapaDiouf@wateraid.org
• Marta Barcelo MartaBarcelo@wateraid.org
Please copy everything to Marta
Papa to do the welcome and introductions. Explain also that this webinar will be followed up in the next few days with more detailed guidelines.
Papa to lead.
Maureen to lead on this slide, with Catherine adding in. Explain that all processes support each other but that they have a different focus and purpose; the questions we ask are different. Go through the set. Use the social housing example. Will share two brief examples of successful programmes which have had negative impacts (Tanzania and ???) as illustrations. Point out that we evaluate impact (and they will be doing this) to demonstrate the levels to which we achieved the impact we set out to achieve. But with assessment we go further – instead of starting with the programme logic and plans, we start with changes in governance and transparency and the people this affects. We explore what has changed for them (good and bad) and then we assess what, if anything, our programme was able to contribute. WE ARE NOT TALKING ABOUT ATTRIBUTION.
Catherine to lead on all of this component
– A consistent stance in the evaluation that does not assume attribution of results to the Governance and Transparency Fund, but rather takes a critical approach and examines alternative explanations;
– Support and advice: both the consultant in charge of the Impact Assessments and the consultant in charge of the Global Evaluation Report are available to support and advise individual national evaluation coordinators and consultants;
– Verification of evidence emerging through ongoing triangulation between the multiple data sources and methods employed;
– Step-by-step validation of evaluation results by national WaterAid teams (with peer review/discussions as appropriate);
– Quality assurance processes built in to each national evaluation (as well as the preparation of the final global evaluation report) – all should meet the DAC Evaluation Quality Standards, UNEG Standards, or comparable national or regional standards where these have been adopted;
– Prioritising the use of country systems to capitalise on existing data/literature, including academia, universities and civil society; and
– Using a set of agreed working definitions for key terms [and the WaterAid Style Guide] to avoid confusion and inconsistent treatment.
Maureen to lead on this component
Build on what was said earlier. In doing the evaluation, we have completed the accountability-to-the-donor section and answered many questions around relevance, effectiveness, etc. Now we really want to focus on the learning.
Build on what we said earlier. This is really much more of an investigation – really trying to find out what happened. I treat it as a murder mystery story: we know what's changed but we don't know how it happened. Your job is to find out.
As impact assessment is so focused on change and our ability to influence change successfully, we need to be very clear about how we thought change would come about in different countries and contexts. Each country programme should have developed its Theory of Change. We need to know about context, issues and key stakeholders. We especially need to understand the sequence of change that we thought would work (what short-term changes might lead to longer-term change) and what assumptions we made (e.g. more press coverage will serve to influence both the wider public and eventually policy makers). We need to be clear about other actors and factors who might either support or hinder progress in the areas we are working on.
This is based on the specific outcomes that GTF hoped to influence. It is worded in neutral language to allow you to explore negative and/or unintended changes as well as positive and expected changes. Example: a successful capacity-building programme – staff plan and manage well, programmes are focused and effective, staff are able to source funding, etc. All good and lots of ticks. But the impact could be that they are so highly trained that they all leave to get jobs for the UN. We have to ask the "so what?" question in order to understand what we should do differently next time.
These are key questions. You may have covered the positive and expected changes in the evaluation section, but this is an opportunity to really explore the "so what?" question and find out if there have been unexpected and/or negative changes too. Baseline question: it might be tricky to find the baseline, so you should build in questions that help you understand – e.g. ask how networks have developed and changed over time. The linking question is really important, as we want to know, for example, whether it is worth investing effort in capacity building – it is not an end in itself; it should lead to changes in policy and practice, and there might be better ways of making this happen. "To what extent did GTF contribute?" is a really important question: the change might have happened, but GTF might not have had anything to do with it. Example of girls' education in Ethiopia – figures doubled in four years, but it was because the big donors gave the Ethiopian government an ultimatum. Smaller organisations also working in advocacy in this area took credit for the change, but it wasn't really a result of their work. Remember – success has many parents but failure is an orphan! So try to find out who or what else might have been partly responsible for the changes that you see. Confidence is an important thing to think about: if you don't have concrete evidence for your claims, then the claims are weak. This is not always bad, but you need to be clear about your levels of confidence in reporting change.
Explain why we use areas of enquiry rather than indicators – to encourage answers that we don't expect; remember open and probing questions. We use words like shift, trends and levels. Where possible it is good to have checklists/scales to help you make sense of what has changed and for us to compare it across countries and programmes. We'll be sending checklists and ideas to support this. Will provide a handout on the characteristics of a functioning network. Shifts in capacity – will be using the 7S framework: aspirations; strategy; organisational skills; human resources; systems and infrastructure; organisational structure; and culture. Each section is broken down into several indicators. Need to be aware that some partners may have stronger capacity than WaterAid, so there may be two-way capacity building. The WaterAid advocacy scrapbook may be able to help with trends in the way that organisations have been able to influence policy and practice.
Awareness: note to self – look at the CAFOD voice and accountability tool; what has WaterAid got? See the sustainability framework. Need to look at all strategies and explore what difference they have made, if any.
Will find notes. Need something on media. Must read learning papers: Sustainability in Governance Programmes; Governance and Power Analysis Tools.