This document summarizes the key points from a workshop on project evaluations. The workshop covered:
1) An introduction to project evaluations and the project cycle.
2) Discussion of evaluation criteria like relevance, effectiveness, efficiency, impact and sustainability. Quantitative and qualitative indicators were also covered.
3) Methods for data collection, developing evaluation questions, and analyzing qualitative data. Key points on developing terms of reference for evaluations were also provided.
5. Use both quantitative and qualitative indicators
Compare using trends (increase), thresholds (min. 30%), targets (strategy by 12/Y1)
Quantitative - SMARTER
Specific / Simple (to understand, collect)
Measurable
Attainable/Available at acceptable costs
Relevant to project / stakeholders
Time-bound
Evaluate/Engaging
Reevaluate/Recordable
Qualitative - SPICED
Subjective
Participatory
Interpreted and communicable
Cross-checked and compared
Empowering
Diverse / disaggregated (by gender)
Example of a quantitative indicator: Min. 30 % of participants initiate a project aiming to address a local issue.
Example of a qualitative indicator: Reasons why participants have (not) implemented a project to address a local issue.
http://www.smarttoolkit.net/?q=node/391
http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
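The three comparison types above (trends, thresholds, targets) can be sketched as simple checks. This is a minimal illustrative sketch; the function names and sample values are invented, not from any real project.

```python
# Hypothetical sketch: checking a quantitative indicator using the three
# comparison types from the slide (trend, threshold, target).

def trend_met(series):
    """Trend comparison: has the indicator increased over time?"""
    return series[-1] > series[0]

def threshold_met(value, minimum):
    """Threshold comparison: e.g. 'min. 30 % of participants'."""
    return value >= minimum

def target_met(done_by_month, target_month):
    """Target comparison: e.g. 'strategy adopted by month 12 of year 1'."""
    return done_by_month is not None and done_by_month <= target_month

# Invented measurements: share of participants initiating a project
participation_share = [0.18, 0.24, 0.34]
print(trend_met(participation_share))                 # increasing over time
print(threshold_met(participation_share[-1], 0.30))   # meets min. 30 %
print(target_met(done_by_month=11, target_month=12))  # delivered before deadline
```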
6. Monitoring vs. evaluation vs. audit
Evaluation
• Assessment of project efficiency, effectiveness, impact, relevance and
sustainability for the purpose of learning and accountability to stakeholders
Monitoring
• Ongoing analysis of project progress towards achieving planned results with
the purpose of improving management decision making
Audit
• Assessment of (i) the legality and regularity of project expenditure and
income i.e. compliance with laws and regulations and with applicable
contractual rules and criteria; (ii) whether project funds have been used
efficiently and economically i.e. in accordance with sound financial
management; and (iii) whether project funds have been used effectively i.e.
for purposes intended.
• The focus is primarily on finances and financial management, with
effectiveness assessed in terms of project results.
http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
9. Evaluation standards and principles
• EC http://ec.europa.eu/europeaid/
– Impartiality and independence of the evaluation process from the
programming and implementation functions;
– Credibility of the evaluation, through use of appropriately skilled and
independent experts and the transparency of the evaluation process,
including wide dissemination of results;
– Participation of stakeholders in the evaluation process, to ensure
different perspectives and views are taken into account; and
– Usefulness of the evaluation findings and recommendations, through
timely presentation of relevant, clear and concise information to
decision makers.
• OECD DAC key norms, standards, criteria
http://www.oecd.org/development/evaluation/dcdndep/41612905.pdf
• UNDP Evaluation policy http://web.undp.org/evaluation/policy.htm
• IPDET Handbook on Evaluation Ethics, Politics, Standards, Principles
http://dmeforpeace.org/sites/default/files/M14_NA.pdf
• Patton, Michael Quinn.(2008) Utilization-Focused Evaluation: 4th edition
15. Evaluation Terms of Reference - example
Background (project, context)
- project name, identification
- history of the project, objectives, results, key activities, progress over
time (add logical framework if you wish – you get more tailored proposals)
- organisational, social and political context in which the evaluation occurs
- main stakeholders involved in the project incl. target groups, beneficiaries,
partners, donors
- focus and scope of the evaluation – which project components,
geographical area, time period, target groups etc.
Our project:
16. Evaluation Terms of Reference - example
The rationale and purpose of the evaluation
- Why the evaluation is being undertaken (what do you want to get out of it)
including for accountability, learning, improvement
- Why now
Use of outputs
- How it will benefit the different stakeholders
- To whom, when and how the findings will be reported (debriefing,
presentation, report, videos, posters – printed or on-line…)
Our project:
17. Methodology - sources of evaluation questions
• OECD/DAC evaluation criteria (similar to the EC/EuropeAid)
– using project logical framework
• Questions, concerns and values of stakeholders
• Previous research / evaluations
• Guidelines / Evaluation Tools such as Kirkpatrick Model
• Experts
Source: Road to Results
18. How to measure learning outcomes?
http://leanlearning.wikispaces.com/learning_analytics
20. OECD/DAC Evaluation Criteria
Relevance – are we doing the right things?
The appropriateness of project objectives to the problems that it was
supposed to address, and to the physical and policy environment within
which it operated. The extent to which the project is suited to the priorities
and policies of the target group, recipient and donor.
It should also include an assessment of the quality of project
preparation and design – i.e. the logic and completeness of the project
planning process, and the internal logic and coherence of the project design.
Potential evaluation questions:
• To what extent are the objectives of the programme still valid?
• Are the activities and outputs of the programme consistent with the
overall goal and the attainment of its objectives?
• Are the activities and outputs of the programme consistent with the
intended impacts and effects?
Source: OECD/DAC, http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
21. OECD/DAC Evaluation Criteria
Effectiveness – are we doing things right?
An assessment of the contribution made by results to achievement of the
Project Purpose, and how Assumptions have affected project achievements.
This should include specific assessment of the benefits accruing to target
groups, including women and men and identified vulnerable groups such as
children, the elderly and disabled.
Potential evaluation questions:
• To what extent were the objectives achieved / are likely to be achieved?
• What were the major factors influencing the achievement or non-achievement of the objectives?
Source: OECD/DAC, http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
22. OECD/DAC Evaluation Criteria
Efficiency – is the project worthwhile?
The extent to which project results have been achieved at reasonable cost, i.e.
how well inputs/means have been converted into activities, in terms of
quality, quantity and time, and the quality of the results achieved.
The project should use the least costly resources possible in order to achieve
the desired results.
This generally requires comparing alternative approaches to achieving the
same outputs, to see whether the most efficient process has been adopted.
Potential evaluation questions:
• Were activities cost-efficient?
• Were objectives achieved on time?
• Was the programme or project implemented in the most efficient way
compared to alternatives?
Source: OECD/DAC, http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
23. OECD/DAC Evaluation Criteria
Impact – what changes has the project achieved / contributed to?
The effect of the project on its wider environment, and its contribution to the
wider policy or sector objectives (as summarised in the project’s Overall
Objective).
The positive and negative changes produced by a project, directly or
indirectly, intended or unintended.
This involves the main impacts and effects resulting from the project on the
local social, economic, environmental and other development indicators. It
must also include the positive and negative impact of external factors.
Potential evaluation questions:
• What has happened as a result of the programme or project?
• What real difference has the activity made to the beneficiaries?
• How many people have been affected?
Source: OECD/DAC, http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
24. OECD/DAC Evaluation Criteria
Sustainability – will the changes (benefits for the target group) last?
An assessment of the likelihood that benefits produced by the project will
continue to flow after external funding has ended, with particular
reference to factors of ownership by beneficiaries, policy support, economic
and financial factors, socio-cultural aspects, gender equality, appropriate
technology, environmental aspects, and institutional and management
capacity.
Sustainability is concerned with measuring whether the benefits of a project
are likely to continue after donor funding has been withdrawn. Projects need
to be environmentally as well as financially (and socially) sustainable.
Potential evaluation questions:
• To what extent did the benefits of a programme or project continue after
donor funding ceased?
• What were the major factors which influenced the achievement or non-achievement of sustainability of the programme or project?
Source: OECD/DAC, http://www.europa.eu.int/comm/europeaid/qsm/index_en.htm
25. Evaluation Questions - Descriptive
• What "is"
• Describe project (inputs, activities, outputs) or process
• Simple: Who, what, where, when, how, how many …
• Often used to gather opinions from target groups.
Examples:
• What is the project objective from the perspectives of different
stakeholders?
• What were the reasons for joining the program?
• How many persons were reached?
• How was the project implemented?
Evaluation design to answer them:
One-shot, before-and-after, time series, (long-term) panel, case studies…
Source: Road to Results
26. Evaluation Questions - Normative
• Compare what "is" with what "should be" (target)
Examples:
• Do the project activities address the needs of stakeholders?
• To what extent has the project achieved the result/objective indicators?
• Have min. 5.000 persons been reached through the campaign?
• Has the number of schools involved in Global Learning increased?
(baseline!)
Evaluation design to answer them:
One-shot, before-and-after, time series, (long-term) panel, case studies…
Source: Road to Results
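A normative question reduces to comparing a measured value against a target (or a baseline). The sketch below is illustrative only; the function name and the measured figures are invented, while the 5 000-person target and the schools question come from the examples above.

```python
# Hypothetical sketch of a normative check: compare what "is" (measured)
# with what "should be" (target).

def normative_check(measured, target):
    """Return whether the target was met, and the gap to the target."""
    return measured >= target, measured - target

# "Have min. 5 000 persons been reached through the campaign?"
met, gap = normative_check(measured=6200, target=5000)  # 6200 is invented
print(met, gap)

# "Has the number of schools involved in Global Learning increased?"
# This needs a baseline! (invented figures below)
baseline_schools, current_schools = 40, 55
print(current_schools > baseline_schools)
```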
27. Evaluation Questions – Cause-and-Effect
• Determine what difference the project makes - what change has it brought
• Often refer to outcome, impact
• Compare indicators before and after, with and without the project (graph)
• Careful about attribution vs. contribution!
– Can you say that the project achieved this or has contributed to this?
– Are there any alternative explanations (external factors) for achievements?
Examples:
• As a result of the training, have teachers incorporated Global Learning in
their lesson plans?
Evaluation design to answer them:
Experimental (control group), quasi-experimental (comparison group),
non-experimental (causal tracing, case study, story harvesting, outcome
mapping…)
[Graph: indicator over time from T=0 to T=1, "with project" vs. "without
project", relative to the goal]
Source: Road to Results
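The "with vs. without, before vs. after" comparison can be sketched as a difference-in-differences calculation. This is a minimal sketch with invented figures; a real design needs a credible comparison group and the usual parallel-trends assumption.

```python
# Hypothetical sketch of the with/without comparison over T=0 and T=1:
# change in the project group minus change in the comparison group.

def diff_in_diff(with_t0, with_t1, without_t0, without_t1):
    """Difference-in-differences estimate of the project's effect."""
    return (with_t1 - with_t0) - (without_t1 - without_t0)

# e.g. % of teachers using Global Learning in their lesson plans (invented)
effect = diff_in_diff(with_t0=20, with_t1=45, without_t0=19, without_t1=25)
print(effect)  # 19 percentage points, under the design's assumptions
```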
28. Evaluation Terms of Reference – Evaluation Matrix
• Evaluation criteria and questions
• Our project – matrix columns: Question | Baseline data | Indicators |
Sources | Design, Data Collection Methods
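One row of such an evaluation matrix can be represented as a simple data structure. The field names mirror the matrix columns; the example row reuses the schools question from slide 26, while the baseline, indicator, source, and method entries are invented for illustration.

```python
# Hypothetical sketch of one evaluation-matrix row as a data structure.

from dataclasses import dataclass

@dataclass
class MatrixRow:
    question: str
    baseline_data: str
    indicators: str
    sources: str
    design_and_methods: str

row = MatrixRow(
    question="Has the number of schools involved in Global Learning increased?",
    baseline_data="No. of schools involved in year 0",
    indicators="No. of schools involved per year",
    sources="School registers, project monitoring data",
    design_and_methods="Before-and-after comparison; document review",
)
print(row.question)
```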
29. Tips for evaluation questions
• Avoid questions on multiple issues – separate these
– Has the methodology been developed and disseminated to min. 5.000
teachers?
• You can develop subquestions for a particular question on an issue
– What concerns do teachers have when introducing Global Learning at school?
(descriptive)
– Has the project addressed these concerns of the teachers? (normative)
– Has the number of teachers using Global Learning increased as a result of
the project? (cause-and-effect)
• Set a realistic number of questions!
Source: Road to Results
30. Select a few, most important questions…how?
Would the evaluation question… (score each candidate question Q1, Q2, Q3…
against these criteria)
• be in line with the evaluation purpose?
• be of interest to stakeholders?
• reduce present uncertainty?
• yield important information?
• be of continuing (not fleeting) interest?
• be critical to the evaluation's scope and comprehensiveness? (or nice to have)
• have an impact on the course of events?
• be answerable given the financial and human resources, time, methods, and
technology available?
• be reasonable to ask given the project cycle?
(Questions about impact, for example, are best answered after the project has
been fully operational for a few years)
Source: Road to Results, Kusters et al.: Making evaluations Matter
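The scoring exercise above can be sketched as a small program: mark each question True/False per criterion and rank by total. The criterion labels paraphrase the checklist; the questions and their scores are invented.

```python
# Hypothetical sketch: score candidate evaluation questions (Q1, Q2, Q3…)
# against the selection criteria and rank them.

CRITERIA = [
    "in line with purpose", "of interest to stakeholders",
    "reduces uncertainty", "yields important information",
    "continuing interest", "critical to scope",
    "impact on course of events", "answerable with resources",
    "fits project cycle",
]

def score(answers):
    """answers: one True/False per criterion; higher total = keep."""
    assert len(answers) == len(CRITERIA)
    return sum(answers)

questions = {  # invented scoring
    "Q1": [True] * 9,
    "Q2": [True, True, False, True, False, False, True, True, True],
    "Q3": [False, True, False, False, False, False, False, True, True],
}
ranked = sorted(questions, key=lambda q: score(questions[q]), reverse=True)
print(ranked)  # best-scoring questions first
```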
32. Evaluator
Inception phase (1-3 months):
• Evaluator selection
• Terms of Reference – objectives, scope, stakeholders, questions, budget,
schedule, outputs, use; timeline and resources
• Initial briefing and inception
• Desk study
Field research (1-3 months):
• Interviews, surveys, focus groups, case studies
• Preliminary findings & conclusions
Reporting phase (1-2 months):
• Draft evaluation report commented by all partners
• Final debriefing of all partners
• Final evaluation report
Communication with the Project Partners runs throughout all phases.
33. Internal or external evaluators?
Internal evaluators
₊ May have a better understanding of the project, context, policies
₊ Develop organisational capacities
₊ Higher ownership of recommendations by the organisation
₊ Usually cheaper
₋ May not be able to see alternative perspectives, solutions
₋ Influenced more by the implementing organisation (want to keep their jobs)
₋ May be less credible to stakeholders
₋ May be time-consuming
External evaluators
₊ May bring a new perspective or special (technical, evaluation) expertise
₊ More independent from the implementer – may facilitate better communication
between stakeholders (across hierarchies, in case of mistrust)
₊ Usually perceived as more credible
₋ May not be able to comprehend the project fully due to time/other constraints
₋ Usually more expensive
Or mixed – participatory.
! External evaluation is not necessarily independent! (Who pays the evaluator?
Who checks the quality?)
! Even external evaluation consumes time of the project team!
! It depends WHO the evaluator is.
34. Evaluation Terms of Reference - example
• Timetable
• Budget
• Human resources – responsibilities, expertise required
Our project:
35. Are we ready?
Evaluability
- Are we clear why we are doing the evaluation?
- Do we have an (updated) logical framework?
- Do we have sufficient (baseline, monitoring) data available?
- Do we have accessible reliable information sources?
- Do we have sufficient funds for an internal/external evaluation? Will the
evaluation be cost-effective, will it bring reasonable benefits vs. costs?
- Is it likely that it will be used to improve actions in future? Can stakeholders
influence the evaluation decisions? Will they accept and use the findings? Is
there a strong leadership to put the recommendations in practice?
- Are there no major factors hindering the evaluation? Are staff members or
other stakeholders overloaded due to other priorities? Are there any
tendencies that would affect impartiality?
Source: UNDP Handbook, Road to Results, Making evaluations matter
38. Data Analysis and Interpreting
• Needs to be clear before data collection
• Methodology incl. reliability and limits is a part of report
• Categorize and triangulate findings
• Use quotes, examples, graphs
• Distinguish between findings (evidence) and interpretation (conclusions)
• Do not generalize findings from 3 respondents to the whole sector!
• Have a short summary for those who cannot read the whole report
Source: Road to Results, evaluace.com
39. Reporting: as per the expected use by each stakeholder
[Diagram: reporting formats matched to stakeholders, e.g. a case study]
40. Reporting – example of quantitative data
[Chart: quantitative results of Project A]
Source: Inka Pibilova
41. Reporting – example of quantitative data
No. of women reached at each stage:
• Participation in awareness-raising events: 1033
• Checked up in mobile clinics: 3244
• Sent to oncocentres: 476
• Checked up in oncocentres: 303
• Diagnosed with cancer: 31
• Diagnosed with pre-cancer: 48
• Treated for cancer: 20
• Treated for pre-cancer: 2
• Cured: ?
Source: Naviga4: Prevention and early detection of women with cancer, Georgia, MFA CR
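When reporting such a cascade, stage-to-stage conversion rates often say more than raw counts. A minimal sketch using the slide's own figures (the stage names are paraphrased; the percentages are derived, not reported on the slide):

```python
# Sketch: stage-to-stage conversion through the screening cascade above.

stages = [
    ("Checked up in mobile clinics", 3244),
    ("Sent to oncocentres", 476),
    ("Checked up in oncocentres", 303),
]

# Each stage as a share of the previous one
for (name_a, a), (name_b, b) in zip(stages, stages[1:]):
    print(f"{name_b}: {b / a:.1%} of '{name_a}'")
```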
42. Reporting – example of qualitative data
Project A
Finding: Most volunteers expressed doubts about
the Programme, which partially impacted external
communication and advocacy. They believed the
donor should better clarify the program objectives
– if the programme is to primarily serve the
communities in the South or the young
professionals from Europe.
Source: Inka Pibilova
“I believe the development sector
needs well trained and well managed
professionals, not volunteers sent
with a weak/unclear mandate to 'do
something'. This may end up doing
more harm than good." Volunteer
43. Reporting – example of qualitative data
[Case illustration, Project A: chemistry teacher and mother, 26 years old;
cervical precancerosis; incorrect treatment and relapse; treatment too
expensive; radiation inaccessible]
Source: Inka Pibilova
44. Where to learn more?
• Examples of evaluation reports (see bottom of evaluation section) and
other tools at http://www.evaluace.com/ - or contact inka@evaluace.com
• Road to Results
https://openknowledge.worldbank.org/handle/10986/2699
• EPDET development evaluation training 31 August – 6 September 2014 in
Slovakia – check http://www.dww.cz/index.php?page=epdet
• www.Betterevaluation.org
• EC evaluation guidelines
http://ec.europa.eu/europeaid/evaluation/methodology/tools/too_en.htm
• UNDP Handbook (p. 194 - 200) http://web.undp.org/evaluation/handbook/
• OECD DAC key norms, standards, criteria
http://www.oecd.org/development/evaluation/dcdndep/41612905.pdf
• RISK, UK: How do we know it is working?
http://www.developmenteducationreview.com/issue11-review1
• The Most Significant Change Guide:
http://www.mande.co.uk/docs/MSCGuide.pdf