Monitoring and Evaluation for development and governmental organizations
By Guta Mengesha
Email: gutamengeshadinagde@gmail.com
MA in Project Management and Finance
Ethiopia, East Africa
Outline summary
• Key Terms
• Introduction
• M&E definitions
• Difference and similarities of M&E
• Goals of Monitoring
• Tools of Monitoring
• Evaluation Principles
• Tools for Evaluation
• Why Evaluation?
• Classification of Evaluation
• Evaluation planning
• Objective of Evaluation
• Steps in Evaluation
• M&E framework
• Planning tools
• Data quality Audit
• Indicators
• Baseline survey
• TOR
• Evaluation Report
• Evaluation outline
• Reference
KEY TERMS
 Indicators
 Baseline
 Benchmark
 Counterfactual
 Data quality
 Re-programming
 Schedule crashing
 Fast tracking
 Fire-up plan
 Scope creep
 Milestone
 Theory of change (ToC)
 Decomposing
 TOR, Statement of work
 M&E framework
KEY TERMS
Indicator: An indicator is a particular
characteristic or dimension used to
measure intended change for a given
result.
Baseline: The value of a performance
indicator that exists prior to
implementation of the program, project or
intervention.
Benchmark: An expected value or level of
achievement at a specified period; a
reference point or standard against which
performance or achievements can be
assessed.
Counterfactual: the situation that would
have existed over time without the
changes introduced by the intervention.
Schedule crashing: Adding additional
resources to the project to accelerate
progress against the schedule.
Fast Track: The technique of speeding up the
project schedule by altering the planned
schedule through doing work
simultaneously that would have ideally
been performed consecutively.
Fire-up plan: Accelerating a plan by
assigning additional resources.
Decomposing: A technique to separate or
break down project deliverables into
smaller elements, components or parts
ToC: Tool that outlines the strategic intent
of the organization by illustrating how the
change will take place (or flow) from
projects and activities all the way up to
the portfolio level of the organization
M&E framework: Tool that outlines the
indicators the program team will use to
measure a program’s performance against
its stated objectives and outcomes.
Scope creep: refers to gradual changes in
project scope that occur without a formal
scope change procedure. Scope creep is
considered negative since unapproved
changes in scope affect cost and schedule
but do not allow complementary revisions
to cost and schedule estimates
Re-programming: amending or modifying
the program budget and activities in
response to prevailing program realities.
Crashing: The technique of speeding up
the project schedule by using more
resources (i.e.: people, materials, or
equipment) than what was originally
planned
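The indicator, baseline and benchmark terms above combine arithmetically: progress is how far the current indicator value has moved from the baseline toward the benchmark. A minimal sketch in Python (the function name and the literacy-rate figures are illustrative, not from the slides):

```python
def progress_percent(baseline: float, target: float, current: float) -> float:
    """Percent of the planned change (baseline -> benchmark/target) achieved so far."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline) * 100

# e.g. literacy rate: baseline 40%, benchmark (target) 60%, measured now at 55%
print(progress_percent(40, 60, 55))  # 75.0 -> three quarters of the way there
```

A value above 100 would mean the benchmark has been exceeded; a negative value would mean the indicator has moved away from the target since the baseline survey.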
INTRODUCTION
Monitoring and Evaluation (M&E) is an integral part of the project/program/strategy management cycle.
The M&E system is a cross-cutting activity that spans the whole cycle.
Monitoring and Evaluation (M&E) enables us to check the bottom line of development work.
In development work, the term bottom line means whether we are making a difference in the problem
or not, while in business, the term refers to whether we are making a profit in doing the business.
In monitoring and evaluation, we do not look for a profit; rather, we want to see whether
we are making a difference from what we had earlier.
M&E is all about trying to ascertain if your planned activities are being implemented on track and if
these activities have brought change.
Monitoring enables managers to keep track of progress, to adjust operations in the light of experience,
and to formulate budgetary requests and justify any needed increase in expenditure. Evaluation assesses
other areas such as achievement of intended goals, cost-efficiency, effectiveness, impact and/or
sustainability, addresses issues of causality, and further helps managers adjust the design and
implementation of their projects or other interventions.
A baseline study is the first phase of a project evaluation.
What is Monitoring and Evaluation according to the Organisation for Economic Co-operation and
Development (OECD)?
MONITORING: is a continuous function that uses the systematic collection of data on specified
indicators to provide management and the main stakeholders of an ongoing development
intervention with indications of the extent of progress and achievement of objectives and
progress in the use of allocated funds (p. 27).
Monitoring focuses on progress made in terms of results: What did we deliver? What did we
achieve? Why did we (not) achieve certain results? How? Can we improve?
Its implementation is monitored in relation to activity schedules and expenditure of allocated funds, and
its progress and achievements in relation to its objectives. Have planned activities indeed been
implemented?
Monitoring tracks the actual performance against what was planned or expected by collecting
and analyzing data on the indicators according to pre-determined standards.
In broad terms, monitoring is carried out in order to track progress and performance as a basis
for decision-making at various steps in the process of an initiative or project (IFAD 2021)
What is Monitoring and Evaluation by OECD? (continued)
EVALUATION: is the systematic and objective assessment of an ongoing or completed project,
program, or policy, including its design, implementation, and results. The aim is to determine the
relevance and fulfillment of objectives, development efficiency, effectiveness, impact, and
sustainability. An evaluation should provide information that is credible and useful, enabling the
incorporation of lessons learned into the decision making process of both recipients and donors (p. 21).
Evaluation also refers to the process of determining the worth or significance of an activity, policy or
programme: “an assessment, as systematic and objective as possible, of a planned, on-going, or
completed development intervention” (glossary, OECD-DAC).
Evaluations are held at a certain point in time (as opposed to monitoring, which is continuous) and are
retrospective. In order to assure credibility, evaluations are conducted by independent (external)
experts. Evaluation involves the assessment of programs towards the achievement of results,
milestones, and impact of the outcomes based on the use of performance indicators.
It is the periodic assessment of the design, implementation, outcomes and impact of a development
intervention (OECD 2020). It is simply an in-depth assessment of whether activities brought about
change, and an early-warning system that a project is going the wrong way.
It studies the outcome of a project (changes in income, better housing quality, distribution of the
benefits between different groups, the cost-effectiveness of the projects as compared with other options,
etc.) to inform the design of future projects.
Another view (Thomas Winderi, PhD) simply elaborates:
Monitoring is like the dashboard of your car, which tells you
“How fast are you going?” “How much fuel is left?”
“Which door is left open?”
Monitoring is concerned with the performance of a project,
program, service or policy.
Monitoring is conducted typically by internal staff.
Monitoring is a continuous, non-stop process, even after
an activity.
Monitoring typically supports management of a project,
program, policy or service.
Evaluation is like an occasional check-up of your vehicle:
 It looks back to assess the overall value of a
project/program/service
 Is usually conducted by external specialists to ensure
unbiased judgment
 Is conducted as a one-off activity during and at the end of a
program
 Is more systematic, answering: “Is the program, policy or
service relevant?” (does it suit the priorities of the target
group); “Is it effective?” (does it achieve results); “Is it
efficient?” (does it achieve them at reasonable cost); “Does it
have impact?” (what real difference did it make); “Is it
sustainable?” (will the positive change continue once funding
is cut?)
Difference between monitoring and evaluation
Characteristics
• Monitoring: Continuous process.
• Evaluation: Periodic: at essential milestones, such as the mid-term of
program implementation; at the end or a substantial period after
program conclusion.
Objective
• Monitoring: Keeps track of changes from the baseline; oversight;
analyzes and documents progress.
• Evaluation: In-depth analysis; compares planned with actual achievements;
validates which results were and were not achieved.
Focus
• Monitoring: Focuses on inputs, activities, outputs,
implementation processes, continued relevance,
likely results at outcome level.
• Evaluation: Focuses on outputs relative to inputs; results relative to cost;
processes used to achieve results; overall relevance; impact; and sustainability.
Answer
• Monitoring: Answers what activities were implemented and the
results achieved.
• Evaluation: Answers why and how results were achieved; contributes to
building theories and models of change.
Use
• Monitoring: Alerts managers to problems and provides options
for corrective measures.
• Evaluation: Provides managers with strategy and policy options.
Benefit
• Monitoring: Self-assessment by program managers, supervisors,
community stakeholders, and donors.
• Evaluation: Internal and/or external analysis by program managers, supervisors,
community stakeholders, donors, and/or external evaluators.
Results-Based Monitoring and Evaluation:
the World Bank's ten steps to results monitoring
Monitoring:
• Clarifies program objectives
• Links activities and resources to the objectives
• Translates objectives into performance indicators and sets targets
• Routinely collects data on these indicators and compares actual results with targets
• Reports progress to managers and alerts them to problems
Evaluation:
• Analyzes why intended results were or were not achieved
• Assesses specific causal contributions of activities
• Examines the implementation process
• Explores unintended results
• Provides lessons, highlights significant accomplishments or
program potential, and offers recommendations for improvement
Goals of Monitoring
 To ensure that activities, inputs and outputs proceed according to plan
 To determine whether inputs are optimally utilized
 To ensure all activities are carried out by the right people and on time
 To provide a record of activities, inputs and outputs
 To warn of deviations from the plan
 To assist managers in decision making
 To be integrated into all stages of the project cycle
What makes monitoring and Evaluation similar
 Both activities require dedicated funds, trained personnel, monitoring and evaluation
tools, effective data collection and storage facilities, and time for effective inspection visits
in the field.
 Both are necessary management tools to inform decision-making and demonstrate
accountability.
In monitoring there are 5 areas to monitor: percentage completion of activities, hitting baseline dates, budget,
quality, and external dependencies. Monitoring involves comparing actual performance with plans to evaluate
the effectiveness of plans, identify weaknesses early on, and take corrective action if required. Example:
Budget-Actual Comparison Report
Project title: Sonan Program
Period covered: Jan 2021 - Dec 2021
Total budget: 12,000. Monitoring date: 30 June 2021

Code | Budget description | Annual budget | Budget to date | Actual to date | Variance to date | Variance % | Utilization % | Note
Income
AO | Program cost | 8,000 | 6,000 | 7,000 | 1,000 | 17% | 88% | Activity done
BO | Investment | 4,000 | 2,000 | 1,000 | -1,000 | -50% | 25% | Item not procured
     Total income | 12,000 | 8,000 | 8,000 | 0 | 0% | 67% |
Expenditure
OS | Operations | 10,000 | 5,000 | 5,500 | -500 | -10% | 55% | tolerable
OP | Program Management | 1,000 | 500 | 700 | -200 | -40% | 70% | staff not assigned
     Total expenditure | 11,000 | 5,500 | 6,200 | -700 | -13% | 56% |
     SURPLUS/(DEFICIT) | 1,000 | 2,500 | 1,800 | -700 | | |
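The variance and utilization columns above follow from small formulas: variance is the favourable difference between budget to date and actual (sign flipped between income and expenditure lines, as inferred from the figures shown), variance % divides that by the budget to date, and utilization % divides actual by the annual budget. A sketch reproducing two rows of the table; the function name and sign convention are inferences, not from the slides:

```python
def variance_report(annual: float, budget_to_date: float, actual: float,
                    is_income: bool) -> tuple:
    """Return (variance, variance %, utilization %) for one budget line."""
    # Favourable variances are positive: extra income, or under-spend on expenditure.
    variance = (actual - budget_to_date) if is_income else (budget_to_date - actual)
    variance_pct = round(variance / budget_to_date * 100)
    utilization_pct = round(actual / annual * 100)
    return variance, variance_pct, utilization_pct

# AO Program cost (income line): 1,000 favourable, 17%, 88% utilized
print(variance_report(8000, 6000, 7000, is_income=True))   # -> (1000, 17, 88)
# OS Operations (expenditure line): 500 overspend, -10%, 55% utilized
print(variance_report(10000, 5000, 5500, is_income=False)) # -> (-500, -10, 55)
```

The same three numbers drive the "Note" column decisions: a small negative variance may be tolerable, while a large one triggers re-programming or re-budgeting.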
PURPOSE OF MONITORING AND EVALUATION
1. Provide data for budget revision or re-programming
2. Suggest rescheduling if the project runs behind schedule
3. Re-budgeting a project: appropriating funds from one
head to another; avoiding expense under unnecessary
headings
4. Re-assigning staff (shifting from one area, or recruiting
temporary staff to meet the time schedule)
5. Provide constant feedback on the extent to which the
projects are achieving their goals;
6. Identify potential problems and their causes at an early stage
and suggest possible solutions to problems;
7. Monitor the efficiency with which the different components
of the project are being implemented and suggest
improvement.
8. Evaluate the extent to which the project can achieve its
general objectives; Provide guidelines for the planning of
future projects;
9. Improve project design and show the need for mid-course
corrections;
10. Show need for mid-course corrections
11. To promote learning: to identify lessons of general
applicability, and to learn how different approaches to
participation affect outcomes and impact.
12. For strategic management: provision of information to
inform the setting and adjustment of objectives and strategies.
13. To ensure accountability: to assess whether the program is
effective and appropriately executed, so as to be accountable
to the key agencies supporting the action and to confirm
that expenditure and results are as agreed.
14. For capacity building: building the capacity, self-reliance
and confidence of beneficiaries and implementing staff and
partners to effectively initiate and implement development
initiatives.
15. For operational management: provision of the information
needed to co-ordinate the human, financial and physical
resources committed to the project or programme, and to
improve performance. (Estrella and Gaventa 2020)
Indicators for monitoring: a project/program/strategy/service is usually monitored against whether it is
 Running on schedule
 Running within the planned cost
 Receiving adequate cost
Techniques/Tools of monitoring
1. First hand information
2. Formal report
3. Project status report
4. Project schedule chart
5. Project financial status report
6. Informal report
7. Graphic presentation
EVALUATION in the blink of an eye
The word evaluation has its origin in the Latin
word “valere”, referring to the value of a
particular thing, idea or action. Thus, it helps us
to understand the worth, quality, significance,
amount, degree or condition of any
intervention.
Purpose of Evaluation from two
perspectives
I. From a knowledge perspective: to
establish new knowledge about social
problems and the effectiveness of programs
or policies to alleviate them, and to help us
plan future work.
II. From an accountability perspective: to
make the best possible use of funds and be
accountable for the worth of a project,
program or policy. It helps to measure
accomplishment, and to avoid mistakes and
weaknesses. It verifies that the benefits reached
the people for whom the program was
meant, and observes the efficiency of the tools
and techniques employed.
Principles of Evaluation
1. Continuity
2. Inexpensive
3. Minimum hindrance to day to day work
4. Total participation
5. External evaluation
6. Agency or Program totality
7. Sharing
Methods of Evaluation (Tools and technique)
1. First-hand information: get information from a host of staff, line officers, field personnel, other
specialists, and the public associated with the project; direct observation; hearing about performance
and pitfalls
2. Formal/informal periodic reports: formal reports include the project status report, project
schedule chart and project financial status report; informal reports include anonymous
letters, press reports, complaints by beneficiaries and petitions, which may reveal the true state of
affairs yet can be biased and may contain malicious information
3. Graphic presentations: charts, graphs, pictures, illustrations and the like
4. Standing evaluation review committee: a host of experts who meet regularly at frequent
intervals to discuss problems and suggest remedial measures
5. Project profiles: prepared by an investigating team on a standardized unit
Methods of Evaluation…..
Project Review Meetings
The main function of project review meetings is to identify deviations from the project plan so
corrective action can be quickly taken.
During these meetings, participants focus on
(1) current problems with the work, schedule or costs, and how they should be resolved,
(2) anticipated problems, and
(3) opportunities to improve project performance.
Review meetings are the managerial equivalent to the “quality circle” (QC) groups used in
production environments.
Review meetings can be informal and scheduled weekly, or formal and scheduled whenever
needed or according to particular phases of the project.
Methods of Evaluation(Tools and technique)………….
Formal review
Among the most common formal reviews conducted during project definition and execution phases are
the following:
(1) Preliminary Design Review. The functional design is reviewed to determine whether the concept &
planned implementation fits the basic operational requirements.
(2) Critical Design Review. Details of the hardware & software design are reviewed to ensure that they
conform to the preliminary design specifications.
(3) Functional Readiness Review. For high-volume products or mass-produced goods, tests are
performed on the first, or early, items to evaluate the efficacy of the manufacturing process.
(4) Product Readiness Review. Manufactured products are compared to specifications and requirements
to ensure that the controlling design documentation produces items that meet requirements
Formal critical reviews serve several purposes:
 minimization of risk,
 identification of uncertainties,
 assurances of technical integrity, and
 assessment of alternative design and engineering approaches.
N.B. Formal reviews can be a precondition for continuing the project (as in the phased project planning approach).
Methods of Evaluation(Tools and technique)……..
Informal review
These are held frequently and regularly, and involve a small number of people.
They are also referred to as “peer reviews” because the people involved are usually members of the project team.
These reviews mainly focus on project status, special problems, emerging issues, and the performance of the project
with regard to requirements, budgets, & schedules.
Selection of meeting participants depends on the phase of the project and issues at hand so that only the appropriate project
team members, customer representatives, functional or line managers, and PMs are chosen.
Before these meetings, status reports and forecasts of time and cost-to-complete are updated.
Charts and Tables
Charts & tables are the most expeditious way for displaying cost, schedule, and work
performance info.
Their advantages include:
 reducing large amounts of complex information into simple, comprehensible formats
 clarifying information on project progress, performance, and predictions
The problem with use of charts & tables is that they neither reveal the underlying causes of problems nor
suggest opportunities.
There are also oral and written reports.
Why Evaluation?
1. Top management and the customer want to know how the project is progressing, and
project personnel need to be kept abreast of project status and work changes
2. To improve performance & uncover extant or potential problems so they can be corrected
3. To improve accountability (reward for success and responsibility for failure, from the top-most
level to the lower levels)
4. For generating knowledge
5. Serves the purpose of summarizing project status to keep stakeholders informed
6. For decision making: policy makers, planners and financiers need it for economically sound
decisions and to judge the merit of an intervention
7. Once the project is completed, evaluation’s purpose is to summarize and assess the
outcome. (Source: Coach Alexander)
CLASSIFICATION OF EVALUATION
I. Based on aim of Evaluation
A. Formative (interim) evaluation: is undertaken to
improve the strategy, design, and performance or
way of functioning of an on-going program/project.
It is conducted during the program development stage
and throughout the project cycle, and it provides
information for corrective action. (These include
process evaluation, ex-ante evaluation and project
appraisal.)
B. Summative evaluation: on the other hand, is
undertaken to make an overall judgment about the
effectiveness of a completed project that is no
longer functioning, often to ensure accountability.
It focuses on outcome and impact and occurs after the
project is completed. (These include outcome evaluation,
impact evaluation and ex-post evaluation, after 2-5 years.)
Example: construction of a residential house in order to
have a better standard of life or more disposable income.
A process evaluation checks whether the construction
activities (purchase of stone, sand, bars, aggregate,
labor, cement, nails, iron sheets) were done to the right
standard, on time, with the required quality and with
materials well used. An outcome evaluation looks at the
change in life: having more income if the house is rented
out, or no longer paying rent and so having more money.
An impact evaluation measures the long-range better
livelihood due to the extra money: better health,
education and nutrition.
CLASSIFICATION OF EVALUATION…..
Formative evaluation is designed to pilot
the project as it progresses.
It asks the questions “What is happening?” and “How
is the project proceeding?”
Summative evaluation is designed to
appraise the project after completion.
It addresses the questions “What happened?” and
“What were the results?”
Project evaluation must incorporate three
performance criteria simultaneously (cost,
schedule, and technical performance), and it must
account for the impact that changes in any one
work area will have on other related areas.
Classification of Evaluation….
II. Based on agency conducting (Who is Evaluating)
A. Participatory approach: a broad concept focusing on the involvement of primary and
other stakeholders in an undertaking such as program planning, design, implementation,
monitoring, and evaluation. It is a process of individual and collective learning and
capacity development through which people become more aware and conscious of their
strengths and weaknesses, their wider social realities, and their visions and perspectives of
development outcomes. It is also called multi-vocal evaluation.
B. Conventional evaluation: aims at making a judgment on the program for accountability
purposes rather than empowering program stakeholders. It strives for the scientific
objectivity of monitoring and evaluation findings, thereby distancing the external
evaluators from stakeholders. It tends to emphasize the information needs of program
funding agencies and policymakers rather than program implementers and the people
affected by the program.
III. Based on the ways of doing an evaluation
1. Self-evaluation: this involves an organization or project holding up a mirror to itself and
assessing how it is doing, as a way of learning and improving practice.
2. Internal evaluation: This is intended to involve as many people with a direct stake in the
work as possible. This may mean project staff and beneficiaries are working together on the
evaluation. If an outsider is called in, he or she is to act as a facilitator of the process, but not
as an evaluator.
3. Rapid participatory appraisal: this is a qualitative way of doing evaluations. It is semi-
structured and carried out by an interdisciplinary team over a short time. It is used as a
starting point for understanding a local situation and is a quick, cheap, and useful way to
gather information. It involves the use of secondary data review, direct observation, semi-
structured interviews, key informants, group discussions, games, diagrams, maps, and
calendars.
4. External evaluation: this is an evaluation done by a carefully chosen outsider or outside
team with adequate experience and expertise.
5. Interactive evaluation: this involves a very active interaction between an outside evaluator
or evaluation team and the personnel in an organization or project being evaluated.
IV. Based on timing when they are carried out
1.Ex-ante evaluation: a forward-looking assessment of
the likely future effects of new initiatives and support
such as policies, programmes and strategies. It takes
place prior to the implementation of an initiative
2.Midterm evaluation or Formative evaluation intends
to improve performance, most often conducted during
the implementation phase of projects or programmes
3.Final or terminal evaluations or Summative
evaluation is conducted at the end of an initiative (or a
phase of that initiative) to determine the extent to
which anticipated outcomes were produced
4.Ex-post evaluation: usually conducted two years or
more after completion. Its purpose is to study how well
the initiative (programme or project) served its aims, to
assess the sustainability of results and impacts and to draw
conclusions for similar initiatives in the future. (NUDAF,
UNDP M&E handbook)
V. Based on the use of Evaluation
I. Democratic Evaluation: using evaluation
to facilitate conversation,
II. Utilization Evaluation: designing an
evaluation that has an intended use by
intended users
III. Developmental Evaluation conducting an
evaluation in accordance with a
developmental innovative project to
provide feedback and support decision
making in the process of the work
How to select an Evaluation type?
You can select the evaluation type based on
• The objectives and priorities of your project
• The purpose of the project evaluation
• The nature of the project (i.e., whether it is
process-oriented or outcome-oriented)
• The time frame for conducting the evaluation
(i.e., during or after the project)
• How, and by whom, the results will be used
• The time frame and budget for completing
the evaluation
Evaluation Planning
Five components in evaluation planning:
1) Have an M&E framework; it can be a logframe or a ToC
2) Assign roles and responsibilities (RACI diagram)
3) List all indicators (process, output, outcome, impact); use an indicator reference sheet to
standardize them for all participants
4) Budget: a very important aspect that cannot be over-emphasized, as money is required to
do things. To sustain the system we need to pump in the required resources. List all
the activities and take care of each, such as HR (often under-funded in projects, when the
cost of living is higher than salaries nowadays) and fees to consultants, and never compromise quality
5) Activity plan: monthly, quarterly or, if yearly, have a full-fledged plan to meet the target, and be
realistic for you and your team.
Template of M& E Plan
Logo or Project Name
• Executive summary: written later
• Acronyms
• Glossary of terms
• Introduction:
1. Background
1.1. Purpose/objective of the M&E plan
1.2. Overview of the project
1.3. Logical framework
1.4. List of indicators
2. M&E framework
3. Data flow
4. Evaluation
5. Appendix
5.1. Indicator reference sheet
5.2. Budget
BUDGET TEMPLATE
Core Objectives/key Criteria/ of Evaluation
1. Relevance
2. Efficiency
3. Effectiveness
4. Impact
5. Sustainability
6. Causality
7. Alternative strategy
Let us see them one by one:
1.Relevance:
It examines the appropriateness of results to national
needs and the priorities of target groups: the extent to
which the intervention is suited to the priorities and
policies of the target group, partner country and donor.
Is the project warranted?
It also covers the appropriateness of project objectives to
the problems intended to be addressed, and to the physical
and policy environment within which the project operates.
2.Efficiency:
Efficiency tells you whether the input into the work is
appropriate in terms of the output. It assesses the results
obtained with the expenditure incurred and the
resources used by the program during a given time.
The analysis focuses on the relationship between the
quantity, quality, and timeliness of inputs, including
personnel, consultants, travel, training, equipment, and
miscellaneous costs, and the quantity, quality, and
timeliness of the outputs produced and delivered. Were
we cost-effective, achieving maximum results with
minimum resources?
Whether project outputs have been achieved at reasonable
cost, i.e. how well inputs have been used in activities and
converted into outputs.
Core Objectives of Monitoring and Evaluation….
3.Effectiveness:
Effectiveness is a measure of the extent to which a project
(or development program) achieves its specific objectives.
Looks at actual magnitude.
How well the outputs contributed to the achievement of
project purpose and the overall goal(s), and how well
assumed external conditions contributed to project
achievements. If, for example, we conducted an
intervention study to improve agricultural production:
did income increase, and by how much?
4.Impact:
The effect of the project on its wider environment, and its
contribution to the wider policy, sector, and to country
wide development strategy.
The positive and negative change produced by an
intervention, direct or indirect, intended or unintended.
Possible questions:
What has happened as a result of the project/program?
What real difference did the activity make to beneficiaries?
How many people have been affected?
5.Sustainability:
Sustainability refers to the durability of program results
after the termination of the technical cooperation
channeled through the program.
The likelihood that benefits produced by the project
continue to flow after external funding has ended.
6.Causality:
An assessment of causality examines the factors that have
affected the program results.
7.Alternative strategy:
Program evaluation may find significant unforeseen
positive or negative results of program activities.
Once identified, appropriate action can be taken to
enhance or mitigate them for a more significant overall
impact. (Source: IFAD)
STEPS IN EVALUATION
Step-1:Desk Review: It involves reading all the relevant documents like
a) Project document
b) Review the implementation plan
c) Review Budget
d) Project report
Step-2:Data collection tools
i. Develop or contextualize data collection tools that align to the project/Program
ii. Test data collection tools
iii. Outline the list of people to be interviewed, primarily those interviewed face-to-face
Step-3:Training of the enumerators
Train the enumerators on the use of the tools, especially if data need to be collected from the field and the
enumerators will serve as data collection agents.
Step-4:Pilot-test the tools
Piloting helps ensure the tools capture the necessary information and shows where they need improvement.
Step-5:Launch data collection
a. Interview people from the field
b. Interview key informant
c. Collect secondary data from renowned sources
STEPS IN EVALUATION…………
Step-6:Data entry and analysis:
Enter the data, analyze them, and generate the required
tables using Excel, SPSS, Stata or any other software.
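Step 6 can also be done in Python alongside the Excel/SPSS/Stata options mentioned above. A minimal sketch, with invented records and field names:

```python
# Illustrative data entry and analysis: frequency and summary tables of the
# kind an evaluation report needs. Records are invented for illustration.
from collections import Counter
from statistics import mean

records = [
    {"district": "A", "income_change": 120},
    {"district": "A", "income_change": 80},
    {"district": "B", "income_change": 40},
]

respondents_per_district = Counter(r["district"] for r in records)
average_income_change = mean(r["income_change"] for r in records)

print(dict(respondents_per_district))  # {'A': 2, 'B': 1}
print(average_income_change)           # 80
```

The same pattern (group, count, average) scales up to real survey datasets once they are entered and cleaned.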
Step-7:Generate report:
 Formulate recommendations for improvement.
 Ensure the report speaks to the project objectives;
bring the key issues forward in the executive summary
and clarify them in the body. In the conclusion, based
on the assessment, also state whether the project
achieved its objectives or not.
Hit the nail on the head; do not beat around the bush.
Step-8:Circulate the first draft report:
Make your first draft a basic report to allow management
to give comments, and allow them to criticize it. However,
do not change the findings or the actual data; refine only
the interpretation.
Step-9:Circulate to stakeholders for public comment
 To government
 Like-minded organizations
 Research institutions and universities
Step-10:Improve the report further
Make the necessary changes and re-submit the final
version
Note: Be concise, specific and objective about fees
Be on time in submitting the report
Do not change the findings and be ethical
Explain whether the project has achieved its objectives or not
Follow the TOR as a guiding point
Results-based evaluation example by the World Bank
Example of monitoring by the World Bank
Monitoring and Evaluation framework
 It is a table that outlines the indicators to be used for a project and how these indicators will be
collected, analyzed and reported on. It also explains who will be responsible for the Monitoring
and Evaluation activities (UNDP Handbook on M&E)
 It is a diagram that increases the understanding of project goals, objectives, outputs, outcomes and impact.
 It defines how a project will trigger different levels of change, from activities to impact
 It articulates the external and internal elements that could affect a project's success.
 It helps in accountability, assessment and decision making
 It outlines the indicators the program team will use to measure a program's performance against its
stated objectives and outcomes
 It is the first stage in developing the plan for how the progress of a program will be quantified,
monitored and evaluated during scheduled intervals throughout the program lifecycle
 It is a communication and planning tool
 It serves as a "map" toward the program goal, meaning the impact
 It links resources to activities
 It guides the selection of indicators and the subsequent plan
How to develop a Monitoring and Evaluation framework
o It begins with an understanding of planning tools such as the log frame, ToC and Results framework
o Illustrate the relation between input, process, output, outcome and goal
o Link the program to the desired impact (UNDP M&E Handbook)
Tools in the M&E cycle
A tool is an instrument used to get a job or task done.
Qualities of these tools:
A. Reliability: tools must be consistent and adequate for carrying out the
M&E function
B. Objectivity: the tools must not be biased or opinionated
C. Adequacy: the tool must thoroughly capture the data and
information required to fulfill M&E requirements
D. Usability: the tools should be easy to handle and not complicated to use
Tools we use in M&E
There are many tools in general, but:
I. Planning tools: used in the planning process of the M&E cycle
II. Data collection tools: used to collect data while carrying out M&E
functions
III. Auxiliary tools: crosscutting tools that can be used in both planning and data
collection.
IV. M&E guidelines, policies and handbooks: more contextualized to your organization's
system (as in the UN). If such resources exist, refer to them during planning and other activities as your
Bible.
V. M&E software: such as Horizon 5; it makes it easy to record, analyze and report the data.
(Source: Coach Alexander)
The six main components of a project M&E system
1. Measurable objectives for the project and its components.
2. Structured indicators covering: inputs, process, outputs, outcomes, impact, and exogenous
factors.
3. Data collection mechanisms capable of monitoring progress over time, including baselines
and a means to compare progress and achievements against targets.
4. Building on baselines and data collection with an evaluation framework and methodology
capable of establishing causation (i.e. capable of attributing observed change to given
interventions or other factors).
5. Clear mechanisms for reporting and use of M&E results in decision-making.
6. Sustainable organisational arrangements for data collection, management, analysis, and
reporting. (FAO 2020)
Types of M&E frameworks (Planning Tools)
There are mainly four, and at the least one can be used (Log frame, ToC, Results framework & GOPP).
1. LOGICAL FRAMEWORK (LOG FRAME): is a systematic and analytical planning process used
for results-based planning of a project (or program) and for the associated M&E system.
It takes a narrower and more practical look at the relationship between inputs and results in a
project/program.
For example, let us assume the Ethiopian Renaissance Dam on the Abay River for power and a reservoir.
Input: human resources, cement, machines, stone, reinforcement bars, sand, gravel
Process: laborers, engineers, mixing cement, production of RCC, installation of turbines,
generators
Output: dam constructed
Outcome: generation of power
Impact: economic empowerment of the community;
increased earnings for the country
 A log frame is a tool used to communicate the program logic, facilitate planning, and act as
the foundation for the monitoring and evaluation processes. It is also a visual representation
of how a program aligns to an organization’s strategy or program Theory of Change. The
way it works is by creating clear linkages between the successful implementation of
program activities (projects) and the realization of programmatic outcomes and goals.
 It complements the Results Framework (RF) in a Country Development Cooperation
Strategy (CDCS) by carrying the development hypothesis through from the overall program
to the supporting projects and their associated activities, in the form of the project
hierarchy (USAID manual)
 The logframe describes causality.
 It is the tool that must be used as the basis for designing projects.
 Unlike the ToC, it is to the point and focuses only on the one specific pathway through which a
project creates its intended change.
Key element in diagram( Source: USAID)
Key elements of Logic model
The key elements of the Log Frame Matrix include the narrative summary, the indicators and their data sources, and the
assumptions .
1. The narrative summary: identifies the hierarchy of results in the project hypothesis, from lowest level result to highest level
result, as well as the activities and other resources
Activities: work done by implementers
Inputs: the resources, including the project activities, that the project expends in order to produce outputs
Outputs: what is produced as a result of the inputs
Outcome: the key result to be achieved by the project
Goal/impact: a higher-level result to which the project, along with others, will contribute
2. Indicators: measure a particular dimension or characteristic of a result in the Log Frame and are the basis for observing
progress toward that result. They are the basis for M&E and have several dimensions, such as quality, quantity, time, location
and behavior.
3. Data sources: specify exactly where the indicator data will come from, and when they will be collected.
4. Means of verification: tools or means to obtain the information required by the indicators. These include project reports,
field-verification photos, videos, ad-hoc studies and pre-/post-tests.
5. Assumptions: the most critical factors that could affect achievement of the project's planned results and have
implications. Assumptions describe necessary internal and external conditions. Assumptions can also be risks, handled through
risk assessment, monitoring, high/medium/low risk rating, risk management (counteract/re-design at the activity level) or
abandonment (a killer assumption) (NORAD 2021)
Example: Livelihood project for women in Jimma, Ethiopia (log frame) (Source: Author)

GOAL
• Description: Improved livelihood and resilient women
• Indicators: Sustainable livelihood, good wellbeing, transformed lifestyle
• Verification: Women's business site, donor visit
• Assumptions: Forecasted budget will be secured and all partners are cooperative

OUTCOME
• Description: Capacity developed, income generation, self-reliant women, skilled women, reduced vulnerability
• Indicators: % lifestyle change, % dietary improvement, % women-owned businesses
• Verification: Testimonies, survey data, FGD with women, church member visit
• Assumptions: New businesses will succeed, revenue will grow, women will continue business without support

OUTPUT
• Description: Empowered women, understanding of local opportunities, motivation to embark on new income-generating jobs
• Indicators: Ventures established, % women who started a business, improved income, % with bank accounts
• Verification: Photos, videos, reports, attendance, per-diem sheets
• Assumptions: Trade office helps in acquiring business licenses; technical school and local government support

ACTIVITIES
• Description: Identifying business priorities, organizing women, securing a market place, starting the priority businesses
• Indicators: Quantity of items procured, services given, number of women who completed training, skills gained, items delivered (GRN)
• Verification: Invoices, committee approvals, field verification
• Assumptions: Local leaders are supportive; women are committed to their goal

INPUT
• Description: Training, counseling, pens, notebooks, materials, purchased services, stakeholders
• Indicators: Activities per day, applying the training, actual expenses
• Verification: Invoices, committee recommendations
• Assumptions: Women are willing to participate; materials are available in the local market
TYPES OF PLANNING TOOLS IN THE M&E CYCLE……………...
2.THEORY OF CHANGE (ToC): is a methodology for planning, participation and evaluation that is used in
companies, philanthropy, not-for-profit, research and government sectors to promote social change. It defines long-
term goals and then maps backward to identify necessary preconditions.
It shows a bigger picture of all underlying processes and the possible pathways leading to long-term behavioral change
at the individual, institutional or community level.
Theory of Change (ToC) is a tool that outlines the strategic intent of the organization by illustrating how the change
will take place (or flow) from projects and activities all the way up to the portfolio level of the organization. In
essence, a ToC describes how the organization will realize the change it would like to see in the world. (PMD Pro
Guide 2021)
It visualizes all possible evidence and assumptions linked to those changes.
It also provides the blueprint or pathway by mapping out long-term goals and linking them to existing preconditions,
besides specifying the causal link for each precondition. It also lists the basic assumptions about achieving a specific
set of outcomes, underpinning the importance of the context, which in turn helps in agreeing on the logical
narrative for the intervention (Kultar, Dharmedara & Varun).
Diagram of ToC
A Theory of Change is both a Framework and a Process:
It is a Framework - ToC enables organizations to visualize how to focus their energy on achieving their overall
outcomes, goals, and vision.
It is a Process – It allows organizations to identify milestones and conditions that must occur if a program is to
achieve its pathway to change
A Theory of Change is not:
• An absolute truth about how change will or must happen.
• A definitive approach intended to eliminate the uncertainty that will always exist in complex and emerging
social processes.
• A substitute for a logical framework as a rigid planning tool.
A theory of change defines all building blocks required to bring about a given long-term goal. This set of
connected building blocks — interchangeably referred to as outcomes, results, accomplishments, or preconditions
— is depicted on a map known as a pathway of change/change framework, which is a graphic representation of
the change process.(Source PM4NGO 2022)
A Theory of Change (ToC) is a detailed map of the work ahead that provides a path (or paths) for organizations
and programs. This path will include a variety of components that will assist the Program Manager and other
stakeholders in linking program activities to overall objectives
ToC example: the long-term outcome is long-term employment of domestic violence
survivors at a livable wage (Page 1)
Page 2
Assumptions
A. There are jobs available for women.
B. Jobs pay a livable wage and provide job security
C. Psychological empowerment
D. Women learn a job and compete
E. Minimum literacy to admit
F. Attend child care
G. Commitment in program
Intervention
1. Outreach campaign
2. Screening
3. Set up counseling sessions
4. Lead group sessions
5. Help provide for short-term crises, such as
housing evictions or court appearances
6. Provide one-on-one counseling
7. Develop curricula in electrical, plumbing,
carpentry and building maintenance
8. Conduct classes
9. Curricula and experiential learning situations
developed
10. Identify potential employers
11. Create employer database
12. Match women to internships
13. Help women secure permanent jobs
(source PMD Pro 2022)
3.RESULTS FRAMEWORK: is a planning, communication and management tool that emphasizes results to
provide clarity around your key project objectives. It was introduced in the mid-1990s by USAID as a new approach
to monitoring its programs throughout the agency.
It emphasizes results to provide clarity around the key project objectives, and outlines how each of the intermediate
results, outputs and outcomes relates to, and facilitates the achievement of, each objective.
The Results Framework is more output-oriented, as it focuses on the "things that would be on
the ground" after completion of the project. These are basically the results that we want to achieve,
and the underlying assumption is that achievement of these results will lead to achievement
of the envisaged objective.
4.ZOPP (Zielorientierte Projektplanung) or GOPP (Goal Oriented Project Planning): is
an adapted form of LFA that is suitable for the development sector. It also uses the same logical
approach of LFA, but it is more flexible in accommodating the qualitative and subjective nature
of issues inherent in the development sector
Results framework by USAID CDCS: Goal, Development Objectives (DOs), Intermediate Results
(IRs), sub-IRs
Monitoring & Evaluation (M&E) Framework Example (risks and assumptions may be added or kept separate) (Source: tools4dev)
Columns: INDICATOR | DEFINITION (how is it calculated?) | BASELINE (current value) | TARGET | DATA SOURCE (how will it be measured?) | FREQUENCY | RESPONSIBLE (who will measure it?) | REPORTING (where will it be reported?)

Goal
• Indicator: Percentage of Grade 6 primary students continuing on to high school
• Definition: Number of students who start the first day of Grade 7, divided by the total number of Grade 6 students in the previous year, multiplied by 100
• Baseline: 50% | Target: 60%
• Data source: Primary and high school enrolment records
• Frequency: Annual | Responsible: Program manager | Reporting: Annual enrolment report

Outcome
• Indicator: Reading proficiency among children in Grade 6
• Definition: Sum of all reading proficiency test scores for all students in Grade 6, divided by the total number of students in Grade 6
• Baseline: average score 47 | Target: average score 57
• Data source: Reading proficiency tests using the national assessment tool
• Frequency: Every 6 months | Responsible: Teachers | Reporting: 6-monthly teacher reports

Output
• Indicator: Number of students who completed a summer reading camp
• Definition: Total number of students who were present on both the first and last day of the summer reading camp
• Baseline: 0 | Target: 500
• Data source: Summer camp attendance records
• Frequency: End of every camp | Responsible: Teachers | Reporting: Camp review report

Output
• Indicator: Number of parents of children in Grade 6 who helped their children read at home in the last week
• Definition: Total number of parents who answered "yes" to the question "Did you help your child read at home any time in the last week?"
• Baseline: 0 | Target: 500
• Data source: Survey of parents
• Frequency: End of every camp | Responsible: Program officer | Reporting: Survey report
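The Goal indicator above can be written as an explicit formula so it is calculated the same way every time. A minimal sketch; the student counts are invented, while the 50%/60% figures are the example's baseline and target:

```python
# Goal indicator: % of Grade 6 students continuing on to high school.

def continuation_rate(grade7_starters, grade6_previous_year):
    """Grade 7 starters divided by last year's Grade 6 total, times 100."""
    return grade7_starters / grade6_previous_year * 100

# Baseline 50% (e.g. 100 of 200 students); target is 60%.
baseline = continuation_rate(100, 200)
target_met = continuation_rate(130, 200) >= 60
print(baseline, target_met)  # 50.0 True
```

Encoding the definition once, in a formula or a spreadsheet cell, prevents the indicator being calculated differently at different reporting times.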
II. Data collection tools
Data definition
An individual fact, statistic or piece of information, frequently numerical. In a more technical sense, data
are a collection of qualitative or quantitative variables concerning one or more things, where a datum
is a single value of a single variable.
Data is factual information used as a basis for reasoning, discussion or calculation.
It must be recorded; if it is not written down, it is not data.
Examples: price and cost, weight, employee name, product name
Data quality definition
• How well the information collected represents the program activities
• Refers to the worth/accuracy of the information collected
• Data that reflect true performance
• Focuses on ensuring that the data management process is of a high standard
The tools we use to get data are qualitative and quantitative:
a) Qualitative data collection tools: focus group discussions (FGD), where you go to the field to talk to a group of 9
to 10 individuals together, not one on one.
b) Quantitative data collection tools: interviews, questionnaires, surveys (in the field to collect data), observation (you
don't talk to anyone).
Measuring Quality Data
The data is only as good as its quality. The quality of data is measured against six
criteria:
1. Validity – data which clearly, directly and adequately represents the result that was
intended to be measured
2. Reliability – if the process were repeated over and over again it would yield the
same result
3. Integrity – data have been protected from deliberate bias or manipulation for
political/personal reasons
4. Precision – data are accurate with sufficient detail
5. Timeliness – data are current and information is available on time
6. Confidentiality – clients are assured that their data will be maintained according to
national and/or international standards for data
Good quality data can be recognized by checking the FACTS
F – Formatted Properly
A – Accurate
C – Complete
T – Timely
S – Segmented properly
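The FACTS checklist above can be partly automated. The sketch below applies illustrative Complete, Formatted-properly, Accurate (plausibility) and Timely checks to one record; the field names and rules are assumptions, not a standard schema:

```python
# Illustrative FACTS-style checks on a single data record.
from datetime import date

REQUIRED = {"site", "indicator", "value", "report_date"}

def facts_issues(record, today=date(2023, 1, 31), max_age_days=30):
    issues = []
    missing = REQUIRED - record.keys()
    if missing:                                   # Complete
        issues.append(f"incomplete: {sorted(missing)}")
    value = record.get("value")
    if not isinstance(value, (int, float)):       # Formatted properly
        issues.append("value is not numeric")
    elif value < 0:                               # Accurate (plausibility)
        issues.append("negative value")
    report_date = record.get("report_date")
    if report_date and (today - report_date).days > max_age_days:  # Timely
        issues.append("stale report")
    return issues

good = {"site": "S1", "indicator": "trained", "value": 42,
        "report_date": date(2023, 1, 20)}
bad = {"site": "S1", "indicator": "trained", "value": "forty-two",
       "report_date": date(2022, 6, 1)}
print(facts_issues(good))  # []
print(facts_issues(bad))   # ['value is not numeric', 'stale report']
```

Integrity, Confidentiality and Segmentation need process controls rather than code, which is why the checklist is broader than any script.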
Dimensions of data quality (standards)
A data quality audit (DQA) is an activity conducted to:
(1) verify the quality of reported data for key indicators at a selected site
(2) assess the ability of the data management system to collect and report quality data.
Dimension: How it is measured
Accuracy: How well does the piece of information reflect reality? Is it a true reflection of the situation on the ground?
Completeness: Does it fulfill your expectation of what's comprehensive, with no gaps?
Consistency: Does information stored in one place match relevant data stored elsewhere? Were similar tools used?
Timeliness: Is your information available when you need it? It should not take too long to obtain.
Validity: Is the information in a specific format, does it follow business rules, is it in a usable format?
Uniqueness: Is this the only instance in which this information appears in the database? Is it original, from the source?
What are the reasons why we conduct a data quality audit?
1) It builds confidence among stakeholders
2) It ensures that data are free of errors
3) It promotes good decision making and corrective action by management
4) It promotes efficiency and effectiveness in the implementation of activities
5) It is good M&E practice
Who will do a DQA?
a) Any M&E staff
b) Any project member
c) An independent professional
Steps in conducting a DQA
Step 1: Preparation: select the site, request documents, review documents, prepare for the actual site visit
Step 2: Assessment: assess the M&E unit and data management system; look at their integrity in
assessing the data collection and reporting system
Step 3: Consolidation and reporting: draft the report (findings, conclusions and recommendations)
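One common DQA assessment step is to recount source documents at a site and compare the result with the value reported upward. A sketch of that comparison; the figures and the 5% tolerance are invented for illustration:

```python
# DQA accuracy check: verification factor = recounted / reported.
# ~1.0 means accurate; below 1.0 suggests over-reporting.

def verification_factor(recounted, reported):
    return recounted / reported

reported = {"clients_served": 210, "sessions_held": 50}
recounted = {"clients_served": 180, "sessions_held": 50}

for indicator in reported:
    factor = verification_factor(recounted[indicator], reported[indicator])
    flag = "OK" if 0.95 <= factor <= 1.05 else "CHECK"
    print(indicator, round(factor, 3), flag)
# clients_served 0.857 CHECK
# sessions_held 1.0 OK
```

Indicators flagged CHECK would feed the findings and recommendations in the Step 3 report.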
INDICATORS
Indicators provide parameters against which to assess project performance and achievement in
terms of quantity (how many/how much?), time (when?), target group (who?) and quality (how
good?). Indicators can be quantitative, (number of people, number of ha, % of adoption), semi-
quantitative (scale, ranking), or qualitative (perceptions, opinions, categories).( Rioux 2021)
An indicator is a qualitative or quantitative measure of program performance that is used to
demonstrate changes and which details whether the program results are being or have been
achieved.(UNFPA)
For indicators to be useful for monitoring and evaluating program results, it is therefore
important to identify indicators that are direct, objective, practical and adequate, and to regularly
update them.
Indicator is a variable whose value changes from the baseline level (at the time the program
began) to a new value after the program and its activities have made their impact felt. (H.
Ultimate)
How to write a monitoring and evaluation (M&E) framework
• The first step in writing an M&E framework is to decide which indicators you will use to
measure the success of your program
• You need to choose indicators for each level of your program – outputs, outcomes and goal
Here is an example of some indicators for the goal, outcome and output of an education
program.
DEFINING INDICATOR
Once you have chosen your indicators you need to write a definition for each one. The
definition describes exactly how the indicator is calculated. If you don’t have definitions there
is a serious risk that indicators might be calculated differently at different times, which means
the results can’t be compared.
Here is an example of how one indicator in the education program is defined:
Measure the baseline and set the target
Before you start your program you need to measure the starting value of each indicator – this
is called the “baseline”. In the education example above that means you would need to
measure the current percentage of Grade 6 students continuing on to Grade 7 (before you
start your program).
Once you know the baseline you need to set a target for improvement. Before you set the
target it’s important to do some research on what a realistic target actually is. Many people
set targets that are unachievable, without realising it.
Indicator results are used to assess whether the program is working or not, so it’s very
important that decision makers and stakeholders (not just the donor) have access to them as
soon as possible.
Finally, decide who will be responsible for measuring each indicator.
 Output indicators are often measured by field staff or program managers
 Outcome and goal indicators may be measured by evaluation consultants or even national
agencies
 Progress reports (Source: tools4dev)
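Once a baseline and a target are set, progress can be expressed as the share of the distance travelled from baseline to target. A hedged sketch with invented numbers, using the education example's 50% baseline and 60% target:

```python
# Progress toward target: 0% = no change from baseline, 100% = target reached.

def progress_to_target(baseline, target, current):
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline) * 100

# Baseline 50% continuation, target 60%, current 55%: halfway there.
print(progress_to_target(50, 60, 55))  # 50.0
```

This framing also works for indicators that should decrease (e.g. dropout rates), since the sign of the baseline-to-target distance cancels.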
Baseline Survey Plan should address each of the
following items:
Background and purpose of baseline study
• Description of program design and target beneficiaries
• Objective of study including list of baseline
indicators drawn from logical framework
• Review of existing data sources
Data collection methods
• Defined units of study (communities, households,
individuals, etc.)
• Proposed primary data collection methods
• Sampling description
Survey design
• Survey questionnaire and/or topical outline
• Arrangements for pre-testing
Guidelines for Fieldwork
• Composition of assessment team
• Training to be provided to enumerators
• Timetable for fieldwork
• Arrangements for supervision/coordination in the field
Data analysis procedures
• Arrangements for data entry and processing
(including data cleaning)
• Proposed framework for analysis
• Proposed data tables, indicator calculations and
criteria for data disaggregation
• Training required for data management and analysis
Reporting and feedback
• Proposed format of baseline study report
• Arrangement for presentation(Source: WFP 2003)
INDICATORS…..
An indicator is also a measurement. It measures the value of the change in meaningful units
that can be compared to past and future units. This is usually expressed as a percentage or
a number.
Finally, an indicator focuses on a single aspect of a program or project. This aspect may be an
input, an output or an overarching objective, but it should be narrowly defined in a way that
captures this one aspect as precisely as possible.
In the context of M&E, an indicator is said to be a quantitative standard of measurement or
an instrument which gives us information (UNAIDS, 2010). Indicators help to capture data and
provide information to monitor performance, measure achievement, determine accountability
and improve the effectiveness of projects/ programmes.
Keeping these points in view, we emphasize that good indicators should possess the following
characteristics:
 Relevant to the program.
 Relevant to the national standards.
 Feasible to construct.
 Easy to interpret.
 Enable tracking of change over time.
The way the indicators are developed varies by the organizations/project and objectives therein. DOPA
criteria encapsulate the most important requirements of useful indicators. In DOPA, the letters D, O, P,
and A stand as follows:
D: for Direct, meaning that an indicator must be able to measure the intended change directly and closely;
O: for Objective. This means that your indicator must be unambiguous with a clear operational
definition of every term stated in the objective;
P: for Practical, which means that an indicator must be practical in terms of data collection, budget and
timelines to be helpful in decision-making;
A: for Adequate, which says that an adequate number of indicators must be considered so that they
can adequately capture the progress made towards the desired output.
Types of Program Performance Indicators
An indicator can be classified as;
1. Input indicator
2. Process/performance indicator
3. Output indicator
4. Outcome indicator
5. Impact indicator
6. Exogenous Indicator
1. Input indicator
Input indicators are quantified and time-bound statements of
resources to be provided. Information on these indicators comes
from accounting and management records. Input indicators are
often left out of discussions of project monitoring though they
are part of the management information system.
They are quantified and time-bound statements of the resources
financed by the project, and are usually monitored through routine
accounting and management records. Input indicators are used
mainly by the managers closest to the tasks of implementation
and are consulted frequently, as often as daily or weekly.
Here are a few examples of this indicator:
 Vehicle operating costs for the crop extension service
 Appointment of staff
 Provision of building
 Home care supplies purchased per month
2. Process/Performance indicator
Performance indicators measure what happens during
implementation. They monitor the activities completed during
implementation, often specified as milestones or the completion
of sub-contracted tasks, as set out in time-scaled work schedules.
The procurement process is a typical example.
Often they are tabulated as a set of contracted completions or
milestone events taken from an activity plan. Examples are:
• Date by which building site clearance must be completed
• Latest date for delivery of fertilizer to farm stores
• Proportion of VCT clients returning to collect their COVID-
19 test results
Types of Program Performance Indicators…..
3. Output Indicator
Output indicators show the immediate physical and financial outputs
or outcomes of the project: physical quantities, organizational
strengthening, and the initial flow of services.
They monitor the production of goods and the delivery of services by the
project, and are often evaluated and reported with the use of
performance measures based on cost or operational ratios.
Examples include:
o Cost per kilometer of road construction.
o Crop yield per acre of land.
o The ratio of textbooks to students.
o Time taken to process a credit application
4. Outcome Indicator
Are specific to a project’s purpose and the logical chain of cause and
effect that underlies its design. Often achievement of outcomes will
depend at least in part on the actions of beneficiaries in responding
to project outputs, and indicators will depend on data collected from
beneficiaries, for example:
o perceptions of improved reliability of irrigation supply,
o proportion of farmers who have tried a new variety of seed and
intend to use it again.
o percentage of women satisfied with the maternity health care
they receive
5. Impact indicators
Impact indicators refer to the medium- or long-term developmental
change to which the project is expected to contribute.
For example:
• (Health) Incidence of low birth weight
• (Education) Continuation rates from primary to secondary
education, by sex
6. Exogenous indicators
Exogenous indicators cover factors outside the control of the
project that might nevertheless affect its outcome, including risks
and performance.
Indicators can also be classified based on the following characteristics:
1. Quantitative indicators: designed to provide hard data that permit rigorous
statistical analysis, typically expressed as a:
 Number
 Percent
 Rate
 Ratio/proportion
2. Qualitative indicators: provide insights into changes in organizational processes,
attitudes, beliefs, and the behavior of individuals, typically phrased as:
 Compliance with
 Quality of
 Extent of
 Level of
3. Efficiency indicators: tell us whether we are getting the desired output for our investment, e.g.
cost per unit of service delivered (per client served, student, patient, etc.)
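The quantitative and efficiency forms above are straightforward to compute from routine monitoring data. A minimal sketch, using hypothetical clinic figures:

```python
# Hypothetical clinic monitoring data
clients_served = 420       # number: a simple count
clients_target = 500       # planned number of clients for the period
budget_spent_usd = 21_000  # total spending for the period

# Percent: achievement against the planned target
percent_of_target = 100 * clients_served / clients_target

# Efficiency indicator: cost per unit of service delivered
cost_per_client = budget_spent_usd / clients_served

print(clients_served, f"{percent_of_target:.0f}%", f"${cost_per_client:.2f}")
# 420 84% $50.00
```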
4.Economic development indicators
 Average annual household income
 Earned income level
 Per capita income
 Percent of people below the poverty line
 The growth rate of business
5. Social development indicators
 Death rate
 Life expectancy at birth
 Infant mortality rate
 Literacy rate
 Percent of dwellings with safe water
 Student-teacher ratio
 School enrollment rate
6. Political/organizational development indicators
 Number of community organizations
 Types of organized sports
 Participation in the youth groups
 Participation in public meetings
Common Problems in Specifying Indicators
These are:
1) The indicators are irrelevant and do not correspond to the output level;
2) The indicators do not include an objective standard against which achievement can be
assessed;
3) The indicators constructed are without reference to the baseline;
4) The indicators are numerous and redundant with little consideration of time, human
resources and cost required to collect data for the construction of the indicators;
5) The indicators are unrealistic and sometimes very difficult to conceptualize and measure;
6) The indicators are not representative of the universe. (Source: UNFPA)
UNFPA also puts forward some practical suggestions. These include, among others: involving the
stakeholders in the process of selecting the indicators, procuring baseline data, involving all
those who are partners in the program, and following the program design.
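One common way to construct an indicator with reference to its baseline (problem 3 above) is to report progress as a percentage of the distance from the baseline to the target. A minimal sketch; the literacy-rate figures are hypothetical:

```python
def percent_of_target_achieved(baseline, current, target):
    """Progress from baseline toward target, as a percentage.

    100% means the target has been reached; 0% means no change
    from the baseline value.
    """
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return 100 * (current - baseline) / (target - baseline)

# Hypothetical indicator: literacy rate with baseline 40%, target 60%,
# and a mid-term measurement of 52%
print(f"{percent_of_target_achieved(40, 52, 60):.0f}%")  # 60%
```

Framing achievement this way keeps the baseline explicit, rather than reporting the current value in isolation.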
TERMS OF REFERENCE (TOR)
Like other research proposals, evaluation proposals are prepared in response to a request for bid or
request for proposal (RFP).
An RFP is a formal document issued by a sponsor to solicit services from evaluators. Proposals are
prepared following the Terms of Reference (TOR) provided by the sponsor and included in the RFP,
and it is almost mandatory for the bidder to follow the TOR when preparing the proposal.
The main sections of a TOR for an evaluation usually include:
1. Background
2. Objectives
3. Evaluation scope and focus
4. Time frame and deliverables
5. Methodology
6. Information sources
7. Evaluation team composition
8. Logistical support
9. Involvement of key stakeholders
1. Background
This section provides information on the history and current status of the program/project being
evaluated, including how it works (its objectives, strategies, and management process), its duration
and budget, and important stakeholders such as donors, partners, and implementing organizations.
2. Objectives
This section states what the project or the organization wants to achieve through the evaluation.
3. Evaluation scope and focus
In consultation with the stakeholders, identify the significant evaluation objectives and questions in the
TOR: the validity of the design, effectiveness, efficiency, impact, sustainability, factors affecting performance, etc.
4. Time frame and deliverables
Indicate the duration of the assignment and its different phases, from the inception report to the final report.
5. Methodology
Here you may indicate a preferred approach to data collection or a specific evaluation design.
6. Information sources
List the information sources to be used by the evaluation, such as monitoring, review, evaluation, and
other reports.
7. Evaluation team composition
Decide on the number of team members and specify each member’s profile.
8. Logistical support
This includes the time frame, costs, team composition requirements, and the like.
9. Involvement of key stakeholders
Specify the involvement of stakeholders such as internal staff, program partners, and donors who
will make use of the evaluation results.
COMPONENTS OF AN EVALUATION REPORT
Whether you are monitoring or evaluating, at some point (or points) there will be a reporting process,
and it follows the stage of analyzing information.
You will report to different stakeholders (board, management team, staff, beneficiaries, and donors) in
different ways: sometimes in written form, sometimes verbally and, increasingly, using tools such
as PowerPoint presentations, slides, and videos.
A written evaluation report may be prepared in line with the following format; these are the
components of an evaluation report:
1. Executive summary
2. Preface
3. Contents page
4. Introduction
5. Findings
6. Conclusions
7. Recommendations
8. Appendices
1. The executive summary is intended for time-constrained readers, but it must be attractive
enough to make people curious to read the entire report.
2. The preface is where you courteously thank people and make broad comments about the
process and findings.
3. The introductory section deals with the background of the project, the need for the
evaluation, and the entire activity in a nutshell.
4. The findings section accommodates the results that have emerged concerning efficiency,
effectiveness, and impact.
5. The conclusions you draw should follow from your findings.
6. The recommendations should address weaknesses, stating what needs to be done to
strengthen the programs being evaluated.
7. The appendices should include your Terms of Reference (TOR), any questionnaire used in the
evaluation, and any other reference documents that could not be accommodated in the text.
Example from UNICEF Guide “Program Manager’s
Planning, Monitoring, and Evaluation Toolkit (2016):
Evaluation Report: Suggested Outline
Title page
Name of project/program or theme being evaluated
Country of project/program or theme
Name of the organization to which the report is submitted.
Names and affiliations of the evaluators.
Table of contents
Identify the chapters and major sections along with the
page numbers.
List of tables, graphs, and charts by page numbers.
Acknowledgments
Identify those who contributed to the evaluation.
List of acronyms/ abbreviations
For example:
TOR: Terms of reference
NGO: Non-Governmental Organization
Executive summary
Summarize essential information on the subject being
evaluated; the purpose and objectives of the evaluation;
the methods applied; significant limitations; and the most
important findings, conclusions, and recommendations in
priority order. The executive summary covers the complete
evaluation, from its design and intended users to the
evaluation focus, key results, and their implications for
programme application.
Introduction
Describe the project/program/theme being evaluated. This
includes the problems that the interventions are
addressing; the aims, strategies, scope, and cost of the
response; its key stakeholders and their roles in
implementing the intervention.
Summarize the evaluation purpose, objectives, and key
questions. Explain the rationale for the selection/non-
selection of evaluation criteria.
Describe the methodology employed to conduct the
evaluation and its limitations, if any.
List who was involved in conducting the evaluation and
what their roles were.
Describe the structure of the evaluation report.
Findings and conclusions
State findings based on the evidence derived from the
information collected. Assess the degree to which the
intervention design is applying results-based management
principles. In providing a critical assessment of
performance, analyze the linkages between inputs,
activities, outputs, outcomes and, where possible, impact.
Summarize the achievement of results in quantitative and
qualitative terms. Analyze factors that affected
performance as well as unintended effects, both positive
and negative. Discuss the relative contributions of
stakeholders to the achievement of the results.
Conclusions should be substantiated by the findings and
be consistent with the data collected. They must relate to
the evaluation objectives and provide answers to the
evaluation questions. They should also include a
discussion of the reasons for successes and failures,
especially the constraints and enabling factors.
Lessons learned
Based on the evaluation findings and on overall experience
in other contexts, provide, wherever possible, lessons
learned that may be applicable in other situations as
well. Include both positive and negative lessons.
Recommendations
Formulate relevant, specific, and realistic
recommendations that are based on the evidence gathered,
conclusions made, and lessons learned. Discuss their
anticipated implications. Consult key stakeholders when
developing the recommendations.
Provide suggested timelines and cost estimates (where
relevant) for implementation.
Annexes
Attach a TOR (for the evaluation).
List persons interviewed, sites visited.
List documents reviewed (reports, publications).
Append the data collection instruments (e.g., copies of
questionnaires, surveys, etc.).
Reference:
 Organisation for Economic Co-operation and Development (OECD), 2021
 Monitoring and Evaluation: A Step-by-Step Guide, World Bank handbook
 Thomas Winderl, "What Is the Difference between Monitoring and Evaluation?"
 Estrella and Gaventa (2020), Monitoring and Evaluation
 Coach Alexander, Monitoring and Evaluation Framework, 2022
 Handbook on Monitoring and Evaluation, International Fund for Agricultural
Development (IFAD)
 Monitoring and Evaluation handbook, UNDP
 Monitoring and Evaluation handbook, UNICEF
 Monitoring and Evaluation handbook, FAO
 Project management manual, USAID
 Kultar Dharmedara & Varun, Theory of Change and PMD Pro, 2020
 World Food Programme, A Baseline Survey, 2013
 Program indicators: UNFPA, UNAIDS, UN Women Ultimate H book, and www.tools4dev
By Guta Mengesha

More Related Content

Similar to Monitoring and Evaluation for development and governmental organizations.pdf

evaluation of deped proj,prog and activi
evaluation of deped proj,prog and activievaluation of deped proj,prog and activi
evaluation of deped proj,prog and activi
Mei Miraflor
 
Introduction to M&E- WG1&2.ppt
Introduction to M&E- WG1&2.pptIntroduction to M&E- WG1&2.ppt
Introduction to M&E- WG1&2.ppt
MdFarhanShahriar3
 
COURSEWORK.pdf
COURSEWORK.pdfCOURSEWORK.pdf
COURSEWORK.pdf
mohamedsaidomar6
 
A2011214171642 1
A2011214171642 1A2011214171642 1
A2011214171642 1
Sreejith Rajan P
 
Project monitoring and control & planning for monitoring
Project monitoring and control & planning for monitoringProject monitoring and control & planning for monitoring
Project monitoring and control & planning for monitoring
Sandeep Kumar
 
Importance of Monitoring and Evaluation to Decentralization
Importance of Monitoring and Evaluation to DecentralizationImportance of Monitoring and Evaluation to Decentralization
Importance of Monitoring and Evaluation to Decentralization
Issam Yousif 2000+
 
Project Monitoring and Evaluation
Project Monitoring and Evaluation Project Monitoring and Evaluation
Project Monitoring and Evaluation
Central University of Karnataka Kalaburagi
 
Monitoring & Evaluation Framework - Fiinovation
Monitoring & Evaluation Framework - FiinovationMonitoring & Evaluation Framework - Fiinovation
Monitoring & Evaluation Framework - Fiinovation
Fiinovation | Innovative Financial Advisors Pvt.Ltd
 
M & E Fundamentals.
M & E Fundamentals.M & E Fundamentals.
M & E Fundamentals.
PrestonAssociates
 
Presentation on M&E, Presented by Sushanta Kumar Sarker
Presentation on M&E, Presented by Sushanta Kumar SarkerPresentation on M&E, Presented by Sushanta Kumar Sarker
Presentation on M&E, Presented by Sushanta Kumar Sarker
Sushanta Kumar Sarker
 
Presentation on M&E, presented by Sushanta kumar sarker, Bangladesh
Presentation on M&E, presented by Sushanta kumar sarker, BangladeshPresentation on M&E, presented by Sushanta kumar sarker, Bangladesh
Presentation on M&E, presented by Sushanta kumar sarker, Bangladesh
Sushanta Kumar Sarker
 
Day 1
Day 1Day 1
Research, Monitoring and Evaluation, in Public Health
Research, Monitoring and Evaluation, in Public HealthResearch, Monitoring and Evaluation, in Public Health
Research, Monitoring and Evaluation, in Public Health
aghedogodday
 
Group Two Presentation tr.pptx at mountains of the moon university
Group Two Presentation tr.pptx at mountains of the moon universityGroup Two Presentation tr.pptx at mountains of the moon university
Group Two Presentation tr.pptx at mountains of the moon university
wyclifkugonza
 
Implementation and Evaluation
Implementation and EvaluationImplementation and Evaluation
Implementation and Evaluation
Jo Balucanag - Bitonio
 
M&E.ppt
M&E.pptM&E.ppt
M&E.ppt
selam49
 
UNIT - I - ppe.pptx
UNIT - I - ppe.pptxUNIT - I - ppe.pptx
UNIT - I - ppe.pptx
michael191017
 
Monitoring and Evaluation of Health Services
Monitoring and Evaluation of Health ServicesMonitoring and Evaluation of Health Services
Monitoring and Evaluation of Health Services
Nayyar Kazmi
 
Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]
skzarif
 
M&e notes unit 2(7)
M&e notes unit 2(7)M&e notes unit 2(7)
M&e notes unit 2(7)
Abraham Ncunge
 

Similar to Monitoring and Evaluation for development and governmental organizations.pdf (20)

evaluation of deped proj,prog and activi
evaluation of deped proj,prog and activievaluation of deped proj,prog and activi
evaluation of deped proj,prog and activi
 
Introduction to M&E- WG1&2.ppt
Introduction to M&E- WG1&2.pptIntroduction to M&E- WG1&2.ppt
Introduction to M&E- WG1&2.ppt
 
COURSEWORK.pdf
COURSEWORK.pdfCOURSEWORK.pdf
COURSEWORK.pdf
 
A2011214171642 1
A2011214171642 1A2011214171642 1
A2011214171642 1
 
Project monitoring and control & planning for monitoring
Project monitoring and control & planning for monitoringProject monitoring and control & planning for monitoring
Project monitoring and control & planning for monitoring
 
Importance of Monitoring and Evaluation to Decentralization
Importance of Monitoring and Evaluation to DecentralizationImportance of Monitoring and Evaluation to Decentralization
Importance of Monitoring and Evaluation to Decentralization
 
Project Monitoring and Evaluation
Project Monitoring and Evaluation Project Monitoring and Evaluation
Project Monitoring and Evaluation
 
Monitoring & Evaluation Framework - Fiinovation
Monitoring & Evaluation Framework - FiinovationMonitoring & Evaluation Framework - Fiinovation
Monitoring & Evaluation Framework - Fiinovation
 
M & E Fundamentals.
M & E Fundamentals.M & E Fundamentals.
M & E Fundamentals.
 
Presentation on M&E, Presented by Sushanta Kumar Sarker
Presentation on M&E, Presented by Sushanta Kumar SarkerPresentation on M&E, Presented by Sushanta Kumar Sarker
Presentation on M&E, Presented by Sushanta Kumar Sarker
 
Presentation on M&E, presented by Sushanta kumar sarker, Bangladesh
Presentation on M&E, presented by Sushanta kumar sarker, BangladeshPresentation on M&E, presented by Sushanta kumar sarker, Bangladesh
Presentation on M&E, presented by Sushanta kumar sarker, Bangladesh
 
Day 1
Day 1Day 1
Day 1
 
Research, Monitoring and Evaluation, in Public Health
Research, Monitoring and Evaluation, in Public HealthResearch, Monitoring and Evaluation, in Public Health
Research, Monitoring and Evaluation, in Public Health
 
Group Two Presentation tr.pptx at mountains of the moon university
Group Two Presentation tr.pptx at mountains of the moon universityGroup Two Presentation tr.pptx at mountains of the moon university
Group Two Presentation tr.pptx at mountains of the moon university
 
Implementation and Evaluation
Implementation and EvaluationImplementation and Evaluation
Implementation and Evaluation
 
M&E.ppt
M&E.pptM&E.ppt
M&E.ppt
 
UNIT - I - ppe.pptx
UNIT - I - ppe.pptxUNIT - I - ppe.pptx
UNIT - I - ppe.pptx
 
Monitoring and Evaluation of Health Services
Monitoring and Evaluation of Health ServicesMonitoring and Evaluation of Health Services
Monitoring and Evaluation of Health Services
 
Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]Monitoring & evaluation presentation[1]
Monitoring & evaluation presentation[1]
 
M&e notes unit 2(7)
M&e notes unit 2(7)M&e notes unit 2(7)
M&e notes unit 2(7)
 

Recently uploaded

Domestic investment in pension schemes - a good thing?
Domestic investment in pension schemes - a good thing?Domestic investment in pension schemes - a good thing?
Domestic investment in pension schemes - a good thing?
Henry Tapper
 
Northeastern University degree offer diploma Transcript
Northeastern University degree offer diploma TranscriptNortheastern University degree offer diploma Transcript
Northeastern University degree offer diploma Transcript
oywfdy
 
Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...
Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...
Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...
sharonblush
 
Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...
Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...
Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...
maigasapphire
 
Monthly Market Risk Update: July 2024 [SlideShare]
Monthly Market Risk Update: July 2024 [SlideShare]Monthly Market Risk Update: July 2024 [SlideShare]
Monthly Market Risk Update: July 2024 [SlideShare]
Commonwealth
 
How do I sell my Hamster kombat currency?
How do I sell my Hamster kombat currency?How do I sell my Hamster kombat currency?
How do I sell my Hamster kombat currency?
CRYPTO SPACE 🪙
 
AI.It's simple to believe artificial intelligence (AI) has the solution to ev...
AI.It's simple to believe artificial intelligence (AI) has the solution to ev...AI.It's simple to believe artificial intelligence (AI) has the solution to ev...
AI.It's simple to believe artificial intelligence (AI) has the solution to ev...
Thailand Appreciative Inquiry Network
 
Hamster kombat withdrawal date - what you need to know!
Hamster kombat withdrawal date - what you need to know!Hamster kombat withdrawal date - what you need to know!
Hamster kombat withdrawal date - what you need to know!
CRYPTO SPACE 🪙
 
How can I withdraw my hamster tokens to real money in India.
How can I withdraw my hamster tokens to real money in India.How can I withdraw my hamster tokens to real money in India.
How can I withdraw my hamster tokens to real money in India.
CRYPTO SPACE 🪙
 
20240710 Calibre Mining 2024 Investor Presentation.pdf
20240710 Calibre Mining 2024 Investor Presentation.pdf20240710 Calibre Mining 2024 Investor Presentation.pdf
20240710 Calibre Mining 2024 Investor Presentation.pdf
Adnet Communications
 
学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<
学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<
学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<
amzhoxvzidbke
 
how do I sell my hamster coins on OKX exchange.
how do I sell my hamster coins on OKX exchange.how do I sell my hamster coins on OKX exchange.
how do I sell my hamster coins on OKX exchange.
CRYPTO SPACE 🪙
 
How do I cash out hamster kombat tokens?
How do I cash out hamster kombat tokens?How do I cash out hamster kombat tokens?
How do I cash out hamster kombat tokens?
CRYPTO SPACE 🪙
 
how to sell hamster kombat tokens for USD.
how to sell hamster kombat tokens for USD.how to sell hamster kombat tokens for USD.
how to sell hamster kombat tokens for USD.
CRYPTO SPACE 🪙
 
What website can I sell my hamster kombat tokens.
What website can I sell my hamster kombat tokens.What website can I sell my hamster kombat tokens.
What website can I sell my hamster kombat tokens.
CRYPTO SPACE 🪙
 
Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!
Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!
Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!
TRADERS SPACE 💱
 
how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...
how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...
how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...
CRYPTO SPACE 🪙
 
how to make money from hamster kombat: beginners guide.
how to make money from hamster kombat: beginners guide.how to make money from hamster kombat: beginners guide.
how to make money from hamster kombat: beginners guide.
CRYPTO SPACE 🪙
 
Can I sell my hamster kombat tokens Now! (latest update - 2024)
Can I sell my hamster kombat tokens Now! (latest update - 2024)Can I sell my hamster kombat tokens Now! (latest update - 2024)
Can I sell my hamster kombat tokens Now! (latest update - 2024)
CRYPTO SPACE 🪙
 
Girls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in City
Girls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in CityGirls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in City
Girls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in City
margaretblush
 

Recently uploaded (20)

Domestic investment in pension schemes - a good thing?
Domestic investment in pension schemes - a good thing?Domestic investment in pension schemes - a good thing?
Domestic investment in pension schemes - a good thing?
 
Northeastern University degree offer diploma Transcript
Northeastern University degree offer diploma TranscriptNortheastern University degree offer diploma Transcript
Northeastern University degree offer diploma Transcript
 
Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...
Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...
Most Girls Call Navi Mumbai 9930245274 Provide Best And Top Girl Service And ...
 
Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...
Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...
Girls Call Marine lines 9910780858 Provide Best And Top Girl Service And No1 ...
 
Monthly Market Risk Update: July 2024 [SlideShare]
Monthly Market Risk Update: July 2024 [SlideShare]Monthly Market Risk Update: July 2024 [SlideShare]
Monthly Market Risk Update: July 2024 [SlideShare]
 
How do I sell my Hamster kombat currency?
How do I sell my Hamster kombat currency?How do I sell my Hamster kombat currency?
How do I sell my Hamster kombat currency?
 
AI.It's simple to believe artificial intelligence (AI) has the solution to ev...
AI.It's simple to believe artificial intelligence (AI) has the solution to ev...AI.It's simple to believe artificial intelligence (AI) has the solution to ev...
AI.It's simple to believe artificial intelligence (AI) has the solution to ev...
 
Hamster kombat withdrawal date - what you need to know!
Hamster kombat withdrawal date - what you need to know!Hamster kombat withdrawal date - what you need to know!
Hamster kombat withdrawal date - what you need to know!
 
How can I withdraw my hamster tokens to real money in India.
How can I withdraw my hamster tokens to real money in India.How can I withdraw my hamster tokens to real money in India.
How can I withdraw my hamster tokens to real money in India.
 
20240710 Calibre Mining 2024 Investor Presentation.pdf
20240710 Calibre Mining 2024 Investor Presentation.pdf20240710 Calibre Mining 2024 Investor Presentation.pdf
20240710 Calibre Mining 2024 Investor Presentation.pdf
 
学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<
学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<
学生妹丝袜【网芷:ht28.co】校服做爱下载app观看>>>[网趾:ht28.co】]<<<
 
how do I sell my hamster coins on OKX exchange.
how do I sell my hamster coins on OKX exchange.how do I sell my hamster coins on OKX exchange.
how do I sell my hamster coins on OKX exchange.
 
How do I cash out hamster kombat tokens?
How do I cash out hamster kombat tokens?How do I cash out hamster kombat tokens?
How do I cash out hamster kombat tokens?
 
how to sell hamster kombat tokens for USD.
how to sell hamster kombat tokens for USD.how to sell hamster kombat tokens for USD.
how to sell hamster kombat tokens for USD.
 
What website can I sell my hamster kombat tokens.
What website can I sell my hamster kombat tokens.What website can I sell my hamster kombat tokens.
What website can I sell my hamster kombat tokens.
 
Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!
Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!
Perfect trading ebook - is it a legitimate forex trading Ebook or SCAM!
 
how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...
how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...
how to increase profit as an hamster Miner - earn over 100,000,000+ token's p...
 
how to make money from hamster kombat: beginners guide.
how to make money from hamster kombat: beginners guide.how to make money from hamster kombat: beginners guide.
how to make money from hamster kombat: beginners guide.
 
Can I sell my hamster kombat tokens Now! (latest update - 2024)
Can I sell my hamster kombat tokens Now! (latest update - 2024)Can I sell my hamster kombat tokens Now! (latest update - 2024)
Can I sell my hamster kombat tokens Now! (latest update - 2024)
 
Girls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in City
Girls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in CityGirls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in City
Girls Call DN Nagar 9910780858 Provide Best And Top Girl Service And No1 in City
 

Monitoring and Evaluation for development and governmental organizations.pdf

  • 1. Monitoring and Evaluation for development and governmental organizations By Guta Mengesha Email: gutamengeshadinagde@gmail.com MA in Project Management and Finance Ethiopia, East Africa By Guta Mengesha
  • 2. Monitoring and Evaluation for development and governmental organizations Outline summary • Key Terms • Introduction • M& E definitions • Difference and similarities of M&E • Goals of Monitoring • Tools of Monitoring • Evaluation Principles • Tools for Evaluation • Why Evaluation? • Classification of Evaluation • Evaluation planning • Objective of Evaluation • Steps in Evaluation • M&E framework • Planning tools • Data quality Audit • Indicators • Baseline survey • TOR • Evolution Report • Evaluation outline • Reference By Guta Mengesha
  • 3. Monitoring and Evaluation for development and governmental organizations KEY TERMS  Indicators  Baseline  Benchmark  Counterfactual  Data quality  Re-programing  Schedule crashing  Fast tracking  Fire-up plan  Scope creep  Milestone  Theory of change (ToC)  Decomposing  TOR, Statement of work  M&E framework By Guta Mengesha
  • 4. Monitoring and Evaluation for development and governmental organizations KEY TERMS Indicator: An indicator is a particular characteristic or dimension used to measure intended change for a given result. Baseline: The value of a performance indicator that exists prior to implementation of the program, project or intervention. Benchmark: What you hope to achieve by the end or are expected values or levels of achievement at specified periods and Reference point or standard against which performance or achievements can be assessed. Counterfactual :the situation that would have existed over time without the changes introduced by the intervention Schedule cursing: Adding additional resources to the project to accelerate the progress of the schedule Fast Track: The technique of speeding up the project schedule by altering the planned schedule through doing work simultaneously that would have ideally been performed consecutively. Fire-up plan: Accelerating a plan by assigning additional resource Decomposing: A technique to separate or break down project deliverables into smaller elements, components or parts ToC: Tool that outlines the strategic intent of the organization by illustrating how the change will take place (or flow) from projects and activities all the way up to the portfolio level of the organization M&E framework:Tool that outlines the indicators the program team will use to measure a program’s performance against its stated objectives and outcomes. Scope creep: refers to gradual changes in project scope that occur without a formal scope change procedure. Scope creep is considered negative since unapproved changes in scope affect cost and schedule but do not allow complementary revisions to cost and schedule estimates Reprograming:a condition of amending or modifying program budget and activities in response to prevalent program truth or in response to program reality. 
Crashing: The technique of speeding up the project schedule by using more resources (i.e.: people, materials, or equipment) than what was originally planned By Guta Mengesha
  • 5. Monitoring and Evaluation for Development and Governmental organizations INTRODUCTION Monitoring and Evaluation (M & E) is integral part of project/program/strategy/ management cycle. ME system a crosscutting activity that Monitoring and Evaluation (M & E) enable us to check the bottom line of development work. In development work, the term bottom line means whether we are making a difference in the problem or not, while in business, the terms refer to whether we are making a profit or not in doing the business. In monitoring and evaluation, we do not look for a profit; rather, we want to see whether we are making a difference from what we had earlier. M & E is all about trying to ascertain if your planned activities are being implemented on track and if these activities have brought change. Monitoring enables managers to keep track of progress, to adjust operations to take account of experience and to formulate budgetary requests and justify any needed increase in expenditure and evaluation assess other areas such as achievement of intended goals, cost-efficiency, effectiveness, impact and / or sustainability and address issues of causality and further it help to make adjustmenet to design and implementation of their project or other interventions. A baseline study is the first phase of a project evaluation. By Guta Mengesha
  • 6. What is Monitoring and Evaluation according to the Organisation for Economic Co-operation and Development (OECD)?
MONITORING is a continuous function that uses the systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing development intervention with indications of the extent of progress, achievement of objectives, and progress in the use of allocated funds (p. 27). Monitoring focuses on progress made in terms of results: What did we deliver? What did we achieve? Why did we (not) achieve certain results? How? Can we improve? Implementation is monitored in relation to activity schedules and expenditure of allocated funds, and progress and achievements in relation to objectives. Have planned activities indeed been implemented? Monitoring tracks actual performance against what was planned or expected by collecting and analyzing data on the indicators according to pre-determined standards. In broad terms, monitoring is carried out in order to track progress and performance as a basis for decision-making at various steps in the process of an initiative or project (IFAD 2021).
  • 7. What is Monitoring and Evaluation…
EVALUATION is the systematic and objective assessment of an ongoing or completed project, program, or policy, including its design, implementation, and results. The aim is to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact, and sustainability. An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and donors (p. 21). Evaluation also refers to the process of determining the worth or significance of an activity, policy or programme: "an assessment, as systematic and objective as possible, of a planned, on-going, or completed development intervention" (OECD-DAC glossary). Evaluations are held at a certain point in time (as opposed to monitoring, which is continuous) and are retrospective. In order to assure credibility, evaluations are conducted by independent (external) experts. Evaluation involves assessing a program's achievement of results and milestones, and the impact of its outcomes, based on the use of performance indicators. It is the periodic assessment of the design, implementation, outcomes and impact of a development intervention (OECD 2020). Put simply, it is an in-depth assessment of whether activities brought a change, and an early warning system when a project goes the wrong way. It studies the outcome of a project (changes in income, better housing quality, distribution of benefits between different groups, cost-effectiveness of the project compared with other options, etc.) to inform the design of future projects.
  • 8. Another view (Thomas Winderi, PhD) simply compares monitoring to the dashboard of your car, which tells you "How fast are you going?", "How much fuel is left?", "Which door is open?"
 Monitoring is concerned with the performance of a project, program, service or policy.
 Monitoring is typically conducted by internal staff.
 Monitoring is a continuous, non-stop process, even after an activity.
 Monitoring typically supports management of the project, program, policy or service.
Evaluation is like an occasional check-up of your vehicle:
 It looks back to assess the overall value of a project/program/service.
 Evaluations are usually conducted by external specialists to ensure unbiased judgment.
 It is a one-off activity conducted during and at the end of a program.
 It is more systematic, answering: "Is the program, policy or service relevant?" (suited to the priorities of the target group); "Is it effective?" (does it achieve results); "Is it efficient?" (achieved at reasonable cost); "Does it have impact?" (what real difference was brought); "Is it sustainable?" (will the positive change continue once funding is cut?).
  • 9. Difference between monitoring and evaluation
Characteristics: monitoring is a continuous process; evaluation is periodic, carried out at essential milestones (such as the mid-term of program implementation), at the end, or a substantial period after program conclusion.
Objective: monitoring keeps track of changes from the baseline, provides oversight, and analyzes and documents progress; evaluation is an in-depth analysis that compares planned with actual achievements and validates which results were and were not achieved.
Focus: monitoring focuses on inputs, activities, outputs, implementation processes, continued relevance, and likely results at outcome level; evaluation focuses on outputs relative to inputs, results relative to cost, the processes used to achieve results, overall relevance, impact, and sustainability.
Answers: monitoring answers what activities were implemented and what results were achieved; evaluation answers why and how results were achieved, and contributes to building theories and models of change.
Use: monitoring alerts managers to problems and provides options for corrective measures; evaluation provides managers with strategy and policy options.
Conducted as: monitoring is self-assessment by program managers, supervisors, community stakeholders, and donors; evaluation is internal and/or external analysis by program managers, supervisors, community stakeholders, donors, and/or external evaluators.
  • 10. Results-Based Monitoring and Evaluation (cf. the World Bank's ten steps to a results-based M&E system): complementary roles of monitoring and evaluation
Monitoring: clarifies program objectives; links activities and resources to the objectives; translates objectives into performance indicators and sets targets; routinely collects data on these indicators and compares actual results with targets; reports progress to managers and alerts them to problems.
Evaluation: analyzes why intended results were or were not achieved; assesses specific causal contributions of activities; examines the implementation process; explores unintended results; provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement.
  • 11. Goals of Monitoring
 To ensure that activities, inputs and outputs proceed according to plan
 To determine whether inputs are optimally utilized
 To ensure all activities are carried out by the right people and on time
 To provide a record of activities, inputs and outputs
 To warn of deviations from the plan
 To assist managers in decision making
 To be integrated into all stages of the project cycle
What makes monitoring and evaluation similar?
 Both activities require dedicated funds, trained personnel, monitoring and evaluation tools, effective data collection and storage facilities, and time for effective inspection visits in the field.
 Both are necessary management tools to inform decision-making and demonstrate accountability.
  • 12. In monitoring there are five areas to monitor: percentage completion of activities, hitting baseline dates, budget, quality, and external dependencies. Monitoring involves comparing actual performance with plans to evaluate the effectiveness of plans, identify weaknesses early on and take corrective action if required.
Example: Budget–Actual Comparison Report
Project title: Sonan Program; period covered: Jan 2021–Dec 2021; monitoring on 30 June 2021
Income
AO Program cost: annual budget 8,000; budget to date 6,000; actual to date 7,000; variance to date 1,000 (17%); utilization 88%; note: activity done
BO Investment: annual budget 4,000; budget to date 2,000; actual to date 1,000; variance to date -1,000 (-50%); utilization 25%; note: item not procured
Total income: annual budget 12,000; budget to date 8,000; actual to date 8,000; variance 0 (0%); utilization 67%
Expenditure
OS Operations: annual budget 10,000; budget to date 5,000; actual to date 5,500; variance to date -500 (-10%); utilization 55%; note: tolerable
OP Program management: annual budget 1,000; budget to date 500; actual to date 700; variance to date -200 (-40%); utilization 70%; note: staff not assigned
Total expenditure: annual budget 11,000; budget to date 5,500; actual to date 6,200; variance -700 (-13%); utilization 56%
Surplus/(deficit): annual budget 1,000; budget to date 2,500; actual to date 1,800; variance -700
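The variance and utilization columns in a report like the one above are simple arithmetic. A Python sketch reproducing the figures, assuming (as the table suggests) that variance is actual minus budget-to-date for income lines and budget-to-date minus actual for expenditure lines, so a negative value is always unfavourable:

```python
def report_line(code, annual, to_date, actual, is_income):
    """Compute variance, variance % and utilization % for one budget line."""
    var = (actual - to_date) if is_income else (to_date - actual)
    return {
        "code": code,
        "variance": var,
        "variance_pct": round(100 * var / to_date),
        "utilization_pct": round(100 * actual / annual),
    }

lines = [
    ("AO", 8_000, 6_000, 7_000, True),    # Program cost (income)
    ("BO", 4_000, 2_000, 1_000, True),    # Investment (income)
    ("OS", 10_000, 5_000, 5_500, False),  # Operations (expenditure)
    ("OP", 1_000, 500, 700, False),       # Program management (expenditure)
]
for line in lines:
    print(report_line(*line))
```

Running this reproduces the per-line figures shown in the example report (e.g., AO: variance 1,000, 17%, utilization 88%).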
  • 13. PURPOSE OF MONITORING AND EVALUATION
1. Provide data for budget revision or re-programming.
2. Suggest rescheduling if the project runs behind schedule.
3. Re-budget a project: appropriating funds from one heading to another and avoiding expense under unnecessary headings.
4. Re-assign staff (shifting staff from one area or recruiting temporary staff to meet the time schedule).
5. Provide constant feedback on the extent to which projects are achieving their goals.
6. Identify potential problems and their causes at an early stage and suggest possible solutions.
7. Monitor the efficiency with which the different components of the project are being implemented and suggest improvements.
8. Evaluate the extent to which the project can achieve its general objectives, and provide guidelines for the planning of future projects.
9. Improve project design and show the need for mid-course corrections.
10. Promote learning: identify lessons of general applicability, and learn how different approaches to participation affect outcomes and impact.
11. Support strategic management: provide information to inform the setting and adjustment of objectives and strategies.
12. Ensure accountability: assess whether the program is effectively and appropriately executed, and be accountable to the key agencies supporting the action on agreed expenditure and results.
13. Build capacity: build the capacity, self-reliance and confidence of beneficiaries, implementing staff and partners to effectively initiate and implement development initiatives.
14. Support operational management: provide the information needed to coordinate the human, financial and physical resources committed to the project or programme, and to improve performance. (Estrella and Gaventa 2020)
  • 14. Indicators for monitoring
A project/program/strategy/service is usually monitored against whether it is:
 Running on schedule
 Running within the planned cost
 Receiving adequate funds
Techniques/tools of monitoring
1. First-hand information
2. Formal reports
3. Project status reports
4. Project schedule charts
5. Project financial status reports
6. Informal reports
7. Graphic presentations
  • 15. EVALUATION in the blink of an eye
 The word evaluation has its origin in the Latin word "valere", relating to the value or worth of a particular thing, idea or action. Thus, it helps us understand the worth, quality, significance, amount, degree or condition of any intervention.
Purpose of evaluation from two perspectives
I. From a knowledge perspective: to establish new knowledge about social problems and the effectiveness of programs or policies designed to alleviate them, and to help us plan future work.
II. From an accountability perspective: to make the best possible use of funds by being accountable for the worth of a project, program or policy. It helps to measure accomplishments so as to avoid mistakes and weaknesses, to verify that the benefits reached the people for whom the program was meant, and to observe the efficiency of the tools and techniques employed.
Principles of evaluation
1. Continuity
2. Inexpensiveness
3. Minimum hindrance to day-to-day work
4. Total participation
5. External evaluation
6. Agency or program totality
7. Sharing
  • 16. Methods of evaluation (tools and techniques)
1. First-hand information: information gathered from staff, line officers, field personnel, other specialists, and the public associated with the project, through direct observation and hearing about performance and pitfalls.
2. Formal/informal periodic reports: formal reports include the project status report, project schedule chart, and project financial status report; informal reports include anonymous letters, press reports, complaints by beneficiaries, and petitions, which may reveal the true nature of things yet can be biased and may contain malicious information.
3. Graphic presentations: charts, graphs, pictures, illustrations and the like.
4. Standing evaluation review committee: a host of experts who meet regularly at frequent intervals to discuss problems and suggest remedial measures.
5. Project profiles: prepared by an investigating team on a standardized unit.
  • 17. Methods of evaluation…
Project review meetings
The main function of project review meetings is to identify deviations from the project plan so corrective action can be taken quickly. During these meetings, participants focus on (1) current problems with the work, schedule or costs, and how they should be resolved, (2) anticipated problems, and (3) opportunities to improve project performance. Review meetings are the managerial equivalent of the "quality circle" (QC) groups used in production environments. Review meetings can be informal and scheduled weekly, or formal and scheduled whenever needed or according to particular phases of the project.
  • 18. Methods of evaluation (tools and techniques)…
Formal reviews
Among the most common formal reviews conducted during the project definition and execution phases are the following:
(1) Preliminary design review: the functional design is reviewed to determine whether the concept and planned implementation fit the basic operational requirements.
(2) Critical design review: details of the hardware and software design are reviewed to ensure that they conform to the preliminary design specifications.
(3) Functional readiness review: for high-volume or mass-produced goods, tests are performed on the first, or early, items to evaluate the efficacy of the manufacturing process.
(4) Product readiness review: manufactured products are compared to specifications and requirements to ensure that the controlling design documentation produces items that meet requirements.
Formal critical reviews serve several purposes:
 minimization of risk,
 identification of uncertainties,
 assurance of technical integrity, and
 assessment of alternative design and engineering approaches.
N.B. Formal reviews can be a precondition for continuing the project (as in the phased project planning approach).
  • 19. Methods of evaluation (tools and techniques)…
Informal reviews
Informal reviews are held frequently and regularly, and involve a small number of people. They are also referred to as "peer reviews" because the people involved are usually members of the project team. These reviews mainly focus on project status, special problems, emerging issues, and the performance of the project with regard to requirements, budgets, and schedules. Selection of meeting participants depends on the phase of the project and the issues at hand, so that only the appropriate project team members, customer representatives, functional or line managers, and project managers are chosen. Before these meetings, status reports and forecasts of time and cost-to-complete are updated.
Charts and tables
Charts and tables are the most expeditious way of displaying cost, schedule, and work performance information. Their advantages include:
 reducing large amounts of complex information into simple, comprehensible formats;
 clarifying information on project progress, performance, and predictions.
The problem with charts and tables is that they reveal neither the underlying causes of problems nor opportunities. There are also oral and written reports.
  • 20. Why evaluation?
1. Top management and the customer want to know how the project is progressing, and project personnel need to be kept abreast of project status and work changes.
2. To improve performance and uncover extant or potential problems so they can be corrected.
3. To improve accountability (reward for success and responsibility for failure, from the top-most level to the lowest).
4. To generate knowledge.
5. To summarize project status and keep stakeholders informed.
6. To support decision making, giving policymakers, planners and financiers the basis for economically sound decisions and for judging the merit of the intervention.
7. Once the project is completed, evaluation's purpose is to summarize and assess the outcome. (Source: Coach Alexander)
  • 21. CLASSIFICATION OF EVALUATION
I. Based on the aim of evaluation
A. Formative (interim) evaluation: undertaken to improve the strategy, design, performance or way of functioning of an ongoing program/project. It is conducted during the program development stage and throughout the project cycle, and provides information for corrective action. (Includes process evaluation, ex-ante evaluation and project appraisal.)
B. Summative evaluation: undertaken to make an overall judgment about the effectiveness of a completed project that is no longer functioning, often to ensure accountability. It focuses on outcome and impact and occurs after the project is completed. (Includes outcome evaluation, impact evaluation and ex-post evaluation, conducted after 2–5 years.)
Example: construction of a residential house in order to have a better standard of living or more disposable income. Evaluating whether the construction activities (purchase of stone, sand, bars, aggregate, labor, cement, nails, iron sheets) were done to the right standard, on time, with the required quality and with materials well used is process evaluation. An outcome evaluation looks at the change in life, such as having more income by renting the house out, or no longer paying rent and so keeping more money. An impact evaluation measures the long-range improvement in livelihood due to the extra money: better health, education and nutrition.
  • 22. CLASSIFICATION OF EVALUATION…
Formative evaluation is designed to guide the project as it progresses. It asks the questions "What is happening?" and "How is the project proceeding?" Summative evaluation is designed to appraise the project after completion. It addresses the questions "What happened?" and "What were the results?" Project evaluation must incorporate three performance criteria simultaneously (cost, schedule, and technical performance), and it must account for the impact that changes in any one work area will have on other related areas.
  • 23. Classification of evaluation…
II. Based on the agency conducting it (who is evaluating)
A. Participatory approach: a broad concept focusing on the involvement of primary and other stakeholders in an undertaking such as program planning, design, implementation, monitoring, and evaluation. It is a process of individual and collective learning and capacity development through which people become more aware and conscious of their strengths and weaknesses, their wider social realities, and their visions and perspectives of development outcomes. It is also called multi-vocal evaluation.
B. Conventional evaluation: aims at making a judgment on the program for accountability purposes rather than empowering program stakeholders. It strives for scientific objectivity of monitoring and evaluation findings, thereby distancing the external evaluators from stakeholders. It tends to emphasize the information needs of program funding agencies and policymakers rather than those of program implementers and people affected by the program.
  • 24. III. Based on the way of doing an evaluation
1. Self-evaluation: this involves an organization or project holding up a mirror to itself and assessing how it is doing, as a way of learning and improving practice.
2. Internal evaluation: this is intended to involve as many people with a direct stake in the work as possible. This may mean project staff and beneficiaries working together on the evaluation. If an outsider is called in, he or she is to act as a facilitator of the process, but not as an evaluator.
3. Rapid participatory appraisal: a qualitative way of doing evaluations. It is semi-structured and carried out by an interdisciplinary team over a short time. It is used as a starting point for understanding a local situation and is a quick, cheap, and useful way to gather information. It involves secondary data review, direct observation, semi-structured interviews, key informants, group discussions, games, diagrams, maps, and calendars.
4. External evaluation: an evaluation done by a carefully chosen outsider or outside team with adequate experience and expertise.
5. Interactive evaluation: involves very active interaction between an outside evaluator or evaluation team and the personnel of the organization or project being evaluated.
  • 25. IV. Based on the timing at which they are carried out
1. Ex-ante evaluation: a forward-looking assessment of the likely future effects of new initiatives and support such as policies, programmes and strategies. It takes place prior to the implementation of an initiative.
2. Midterm (formative) evaluation: intended to improve performance, most often conducted during the implementation phase of projects or programmes.
3. Final or terminal (summative) evaluation: conducted at the end of an initiative (or a phase of that initiative) to determine the extent to which anticipated outcomes were produced.
4. Ex-post evaluation: usually conducted two years or more after completion. Its purpose is to study how well the initiative (programme or project) served its aims, to assess the sustainability of results and impacts, and to draw conclusions for similar initiatives in the future. (NUDAF, UNDP M&E handbook)
  • 26. V. Based on the use of the evaluation
I. Democratic evaluation: using evaluation to facilitate conversation.
II. Utilization-focused evaluation: designing an evaluation that has an intended use by intended users.
III. Developmental evaluation: conducting an evaluation alongside a developmental, innovative project to provide feedback and support decision making in the process of the work.
How to select an evaluation type
You can select the evaluation type based on:
• The objectives and priorities of your project
• The purpose of the project evaluation
• The nature of the project (i.e., whether it is process-oriented or outcome-oriented)
• The time frame for conducting the evaluation (i.e., during or after the project)
• How, and by whom, the results will be used
• The time frame and budget for completing the evaluation
  • 27. Evaluation planning
Five components of evaluation planning:
1) Have an M&E framework; it can be a log frame or a theory of change (ToC).
2) Assign roles and responsibilities, for example with a RACI diagram.
3) List all indicators (process, output, outcome, impact) and use an indicator reference sheet to standardize them for all participants.
4) Budget: this is a very important aspect that cannot be over-emphasized, as money is required to get things done. To sustain the system we need to pump in the required resources. List all the activities and budget for each, including HR (often under-funded in projects, at a time when the cost of living outpaces salaries) and fees to consultants, and never compromise quality.
5) Activity plan: monthly, quarterly or, if yearly, a full-fledged plan to meet the target; be realistic with yourself and your team.
  • 28. Template of an M&E plan
Logo or project name
• Executive summary (written last)
• Acronyms
• Glossary of terms
• Introduction: 1. Background; 1.1 Purpose/objective of the M&E plan; 1.2 Overview of the project; 1.3 Logical framework; 1.4 List of indicators
2. M&E framework
3. Data flow
4. Evaluation
5. Appendix: 5.1 Indicator reference sheet; 5.2 Budget
  • 29. BUDGET TEMPLATE
  • 30. Core objectives (key criteria) of evaluation
1. Relevance
2. Efficiency
3. Effectiveness
4. Impact
5. Sustainability
6. Causality
7. Alternative strategy
Let us see them one by one.
1. Relevance: examines the appropriateness of results to national needs and the priorities of target groups; the extent to which the intervention is suited to the priorities and policies of the target group, partner country and donor. Is the project worthwhile? It covers the appropriateness of project objectives to the problems intended to be addressed, and to the physical and policy environment within which the project operates.
2. Efficiency: tells you whether the input into the work is appropriate in terms of the output. It assesses the results obtained against the expenditure incurred and the resources used by the program during a given time. The analysis focuses on the relationship between the quantity, quality, and timeliness of inputs (including personnel, consultants, travel, training, equipment, and miscellaneous costs) and the quantity, quality, and timeliness of the outputs produced and delivered. Were we cost-effective, achieving maximum results with minimum resources? Have project outputs been achieved at reasonable cost, i.e., how well have inputs been used in activities and converted into outputs?
  • 31. Core objectives of monitoring and evaluation…
3. Effectiveness: a measure of the extent to which a project (or development program) achieves its specific objectives; it looks at actual magnitude. How well did the outputs contribute to the achievement of the project purpose and the overall goal(s), and how well did assumed external conditions contribute to project achievements? If, for example, we conducted an intervention to improve agricultural production: did incomes increase, and by how much?
4. Impact: the effect of the project on its wider environment, and its contribution to wider policy, sector and country-wide development strategy; the positive and negative change produced by an intervention, direct or indirect, intended or unintended. Possible questions: What has happened as a result of the project/program? What real difference did the activity make to beneficiaries? How many people have been affected?
5. Sustainability: refers to the durability of program results after the termination of the technical cooperation channeled through the program; the likelihood that benefits produced by the project continue to flow after external funding has ended.
6. Causality: an assessment of causality examines the factors that have affected the program results.
7. Alternative strategy: program evaluation may find significant unforeseen positive or negative results of program activities. Once identified, appropriate action can be taken to enhance or mitigate them for a greater overall impact. (Source: IFAD)
  • 32. STEPS IN EVALUATION
Step 1. Desk review: read all the relevant documents, such as a) the project document, b) the implementation plan, c) the budget, and d) project reports.
Step 2. Data collection tools:
i. Develop or contextualize data collection tools that align with the project/program.
ii. Test the data collection tools.
iii. Outline the list of people to be interviewed, primarily those interviewed face to face.
Step 3. Training of enumerators: train the enumerators on the use of the tools, especially if data needs to be collected from the field and they will be engaged as data collection agents.
Step 4. Pilot-test the tools: this helps ensure the tools capture the necessary information; for improvement, those administering the tools need to be well oriented.
Step 5. Launch data collection:
a. Interview people in the field.
b. Interview key informants.
c. Collect secondary data from renowned sources.
  • 33. STEPS IN EVALUATION…
Step 6. Data entry and analysis: enter the data, analyze it and generate the required tables using Excel, SPSS, Stata or other software.
Step 7. Generate the report:
 Formulate recommendations for improvement.
 Ensure it speaks to the project objectives; bring the key issues forward in the executive summary and clarify them in the body. In the conclusion, based on the assessment, state whether the project achieved its objective or not. Hit the nail on the head; do not beat around the bush.
Step 8. Circulate the first draft report: give management the basic first draft to comment on and allow them to criticize it. However, do not change the findings or the actual data; change only the interpretation.
Step 9. Circulate to stakeholders for public comment: to government, like-minded organizations, research institutions and universities.
Step 10. Improve the report further: make the necessary changes and re-submit the final version.
Note: be concise, specific and objective about fees; be on time in submitting the report; do not change the findings, and be ethical; explain whether the project has achieved its objective or not; follow the TOR as a guiding point.
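Step 6 can be carried out in Excel, SPSS or Stata as noted. As a minimal, tool-neutral illustration, the same kind of summary table can be produced with Python's standard library (the survey values below are invented for the example):

```python
import statistics

# Hypothetical household monthly incomes from a baseline and an endline survey.
baseline = [120, 150, 130, 110, 160, 140]
endline = [180, 175, 190, 165, 200, 170]

base_mean = statistics.mean(baseline)
end_mean = statistics.mean(endline)
change_pct = round(100 * (end_mean - base_mean) / base_mean, 1)

print(f"baseline mean: {base_mean}")  # 135
print(f"endline mean: {end_mean}")    # 180
print(f"change: {change_pct}%")       # 33.3%
```

The same computation scales to real datasets by reading the responses from a file instead of hard-coding them.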
  • 34. Results-based evaluation example by the World Bank
  • 35. Example of monitoring by the World Bank
  • 36. Monitoring and evaluation framework
 A table that outlines the indicators to be used for the project and how these indicators will be collected, analyzed and reported on. It also explains who will be responsible for the monitoring and evaluation activities (UNDP M&E handbook).
 A diagram that increases understanding of the project goal, objectives, outputs, outcomes and impact.
 It defines how the project will trigger different levels of change, from activities to impact.
 It articulates the external and internal elements that could affect a project's success.
 It helps in accountability, assessment and decision making.
 It outlines the indicators the program team will use to measure a program's performance against its stated objectives and outcomes.
 It is the first stage in developing the plan for how the progress of a program will be quantified, monitored and evaluated at scheduled intervals throughout the program lifecycle.
 It is a communication and planning tool.
 It serves as a "map" to the program goal, meaning the impact.
 It links resources and activities.
 It guides the selection of indicators and is a subset of the plan.
  • 37. How to develop a monitoring and evaluation framework
o It begins with an understanding of planning tools such as the log frame, ToC and results framework.
o Illustrate the relationships between inputs, processes, outputs, outcomes and the goal.
o Link the program to the desired impact. (UNDP M&E handbook)
Tools in the M&E cycle
A tool is an instrument used to get a job or task done. Qualities of these tools:
A. Reliability: tools must be consistent and adequate for carrying out the M&E function.
B. Objectivity: the tools must not be biased or opinionated.
C. Adequacy: the tools must thoroughly capture the data and information required to fulfill M&E requirements.
D. Usability: the tools should be easy to handle and not complicated to use.
  • 38. Monitoring and Evaluation for Development and Governmental organizations Tools we use in M&E There are many tools in general, but: I. Planning tools: used in the planning process of the M&E cycle II. Data collection tools: used to collect data while carrying out M&E functions III. Auxiliary tools: crosscutting tools that can be used in both planning and data collection IV. M&E guidelines, policies and handbooks: contextualized to your organization's system (for example the UN); if such resources exist, refer to them throughout planning and other activities as your primary reference V. M&E software: such as Horizon 5, which makes it easy to record, analyze and report data (Source: Coach Alexander) By Guta Mengesha
  • 39. Monitoring and Evaluation for Development and Governmental organizations The six main components of a project M&E system 1. Measurable objectives for the project and its components. 2. Structured indicators covering inputs, processes, outputs, outcomes, impact, and exogenous factors. 3. Data collection mechanisms capable of monitoring progress over time, including baselines and a means to compare progress and achievements against targets. 4. Building on baselines and data collection with an evaluation framework and methodology capable of establishing causation (i.e., capable of attributing observed change to given interventions or other factors). 5. Clear mechanisms for reporting and use of M&E results in decision-making. 6. Sustainable organisational arrangements for data collection, management, analysis and reporting. (FAO 2020) By Guta Mengesha
  • 40. Monitoring and Evaluation for Development and Governmental organizations Types of M&E frameworks (planning tools) There are mainly four, though the list can go on: the log frame, ToC, Results framework and GOPP. 1. LOGICAL FRAMEWORK (LOG FRAME): a systematic and analytical planning process used for results-based planning of a project (or program) and for the associated M&E system. It takes a narrower and more practical look at the relationship between inputs and results in a project/program. For example, assume the Grand Ethiopian Renaissance Dam on the Abay (Blue Nile) River for power generation: Input: human resources, cement, machines, stone, reinforcement bars, sand, gravel Process: laborers and engineers mixing cement, producing RCC, installing turbines and generators Output: dam constructed Outcome: generation of power Impact: economic empowerment of the community and increased earnings for the country By Guta Mengesha
  • 41. Monitoring and Evaluation for development and governmental organizations  A log frame is a tool used to communicate the program logic, facilitate planning, and act as the foundation for the monitoring and evaluation processes. It is also a visual representation of how a program aligns with an organization's strategy or program Theory of Change. It works by creating clear linkages between the successful implementation of program activities (projects) and the realization of programmatic outcomes and goals.  It complements the Results Framework (RF) in a Country Development Cooperation Strategy (CDCS) by carrying the development hypothesis through from the overall program to the supporting projects and their associated activities, in the form of the project hierarchy (USAID manual)  A log frame describes causality  It is the tool that must be used as the basis for designing projects  Unlike a ToC, it is to the point and focuses only on the one specific pathway through which a project creates an intended change By Guta Mengesha
  • 42. Monitoring and Evaluation for development and governmental organizations Key elements in a diagram (Source: USAID) By Guta Mengesha
  • 43. Monitoring and Evaluation for development and governmental organizations Key elements of the logic model The key elements of the Log Frame Matrix include the narrative summary, the indicators and their data sources, and the assumptions. 1. The narrative summary: identifies the hierarchy of results in the project hypothesis, from the lowest-level result to the highest, as well as the activities and other resources. Activities: work done by implementers. Inputs: the resources the project expends in order to carry out activities and produce outputs. Outputs: what is produced as a result of the activities and inputs. Outcome: the key result to be achieved by the project. Goal/impact: a higher-level result to which the project, along with others, will contribute. 2. Indicators: measure a particular dimension or characteristic of a result in the log frame and are the basis for observing progress toward that result. They are the basis for M&E and cover several dimensions such as quality, quantity, time, location and behavior. 3. Data sources: specify exactly where the indicator data will come from and when it will be collected. 4. Means of verification: tools or means to obtain the information required by the indicators, including project reports, field-verification photos and videos, ad hoc studies, and pre/post tests. 5. Assumptions: the most critical factors that could affect achievement of the project's planned results. Assumptions describe necessary internal and external conditions. Assumptions can also be treated as risks: assess each risk, monitor it, rate it high/medium/low, and manage it by counteracting or re-designing (at the activity level) or abandoning the project (a "killer assumption"). (NORAD 2021) By Guta Mengesha
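The matrix elements above can be sketched as a simple data structure. This is an illustrative Python sketch, not part of any standard; the class and field names are the author's analogy to the Log Frame columns, and the example row loosely echoes the women's livelihood example used elsewhere in this deck.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str        # what is measured, e.g. a percentage of a target group
    baseline: float  # value before the intervention
    target: float    # value hoped for by the end

@dataclass
class LogframeRow:
    level: str            # "Goal", "Outcome", "Output", "Activity" or "Input"
    narrative: str        # narrative summary of the result at this level
    indicators: list      # Indicator objects measuring this result
    data_sources: list    # means of verification / where data comes from
    assumptions: list     # external conditions that must hold

# Hypothetical outcome row for a women's livelihood project
outcome = LogframeRow(
    level="Outcome",
    narrative="Women generate independent income",
    indicators=[Indicator("% of trained women who own a business",
                          baseline=0.0, target=60.0)],
    data_sources=["survey data", "FGD with women"],
    assumptions=["New businesses succeed", "Revenue grows"],
)

print(outcome.level, "-", outcome.indicators[0].name)
```

A structure like this makes the "one indicator, one data source, one assumption set per result" discipline of the matrix explicit, which is the point of the Log Frame.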
  • 44. Monitoring and Evaluation for development and governmental organizations Example: Livelihood project for women in Jimma, Ethiopia — log frame (Source: Author)
GOAL — Description: improved livelihoods and resilient women. Indicators: sustainable livelihoods, good wellbeing, transformed lifestyle. Verification: women's business sites, donor visits. Assumptions: the forecasted budget will be secured and all partners are cooperative.
OUTCOME — Description: capacity developed, income generation, self-reliant and skilled women, reduced vulnerability. Indicators: % lifestyle change, % dietary improvement, % women-owned businesses. Verification: testimonies, survey data, FGDs with women, church member visits. Assumptions: new businesses will succeed, revenue will grow, women will continue their businesses without support.
OUTPUT — Description: empowered women, understanding of local opportunities, motivation to embark on new income-generating jobs, ventures established. Indicators: % of women who started a business, improved income, % with bank accounts. Verification: photos, videos, reports, attendance and per-diem sheets. Assumptions: the trade office helps in acquiring business licenses, with technical-school and local-government support.
ACTIVITIES — Description: identifying business priorities, organizing women, securing a market place, starting a priority business. Indicators: quantity of items procured, services given, number of women who completed training, skills gained, items delivered (GRN). Verification: invoices, committee approvals, field verification. Assumptions: local leaders are supportive and women are committed to their goals.
INPUT — Description: training, counseling, pens, notebooks, materials, purchased services, stakeholders. Indicators: activities per day, applied training, actual expenses. Verification: invoices, committee recommendations. Assumptions: women are willing to participate and materials are available in the local market. By Guta Mengesha
  • 45. Monitoring and Evaluation for development and governmental organizations TYPES OF PLANNING TOOLS IN THE M&E CYCLE……………... 2. THEORY OF CHANGE (ToC): a methodology for planning, participation and evaluation that is used in companies, philanthropy, not-for-profit, research and government sectors to promote social change. It defines long-term goals and then maps backward to identify necessary preconditions. It shows a bigger picture of all the underlying processes and possible pathways leading to long-term behavioral change at the individual, institutional or community level. A Theory of Change (ToC) is a tool that outlines the strategic intent of the organization by illustrating how change will take place (or flow) from projects and activities all the way up to the portfolio level of the organization. In essence, a ToC describes how the organization will realize the change it would like to see in the world. (PMD Pro Guide 2021) It visualizes all possible evidence and assumptions linked to those changes. It also provides the blueprint or pathway by mapping out long-term goals and linking them to existing preconditions, and specifies the causal link for each precondition. It also lists the basic assumptions about achieving a specific set of outcomes, underpinning the importance of the context, which in turn helps in agreeing on the logical narrative for the intervention (Kultar, Dharmendra & Varun). By Guta Mengesha
  • 46. Monitoring and Evaluation for development and governmental organizations Diagram of ToC By Guta Mengesha
  • 47. Monitoring and Evaluation for development and governmental organizations A Theory of Change is both a framework and a process: It is a framework – a ToC enables organizations to visualize how to focus their energy on achieving their overall outcomes, goals and vision. It is a process – it allows organizations to identify the milestones and conditions that must occur if a program is to achieve its pathway to change. A Theory of Change is not: • An absolute truth about how change will or must happen. • A definitive approach intended to eliminate the uncertainty that will always exist in complex and emerging social processes. • A substitute for a logical framework as a rigid planning tool. A theory of change defines all the building blocks required to bring about a given long-term goal. This set of connected building blocks (interchangeably referred to as outcomes, results, accomplishments or preconditions) is depicted on a map known as a pathway of change or change framework, which is a graphic representation of the change process. (Source: PM4NGOs 2022) A Theory of Change (ToC) is a detailed map of the work ahead that provides a path (or paths) for organizations and programs. This path includes a variety of components that assist the program manager and other stakeholders in linking program activities to overall objectives By Guta Mengesha
  • 48. Monitoring and Evaluation for development and governmental organizations ToC example in which the long-term outcome is the long-term employment of domestic violence survivors at a livable wage (Page 1) By Guta Mengesha
  • 49. Monitoring and Evaluation for development and governmental organizations Page 2 By Guta Mengesha
  • 50. Monitoring and Evaluation for development and governmental organizations Assumptions A. There are jobs available for women B. Jobs pay a livable wage and provide job security C. Psychological empowerment is needed D. Women can learn a job and compete E. Minimum literacy is required for admission F. Child care is provided for G. Commitment to the program Interventions 1. Outreach campaign 2. Screening 3. Set up counseling sessions 4. Lead group sessions 5. Help provide for short-term crises, such as housing evictions or court appearances 6. Provide one-on-one counseling 7. Develop curricula in electrical work, plumbing, carpentry and building maintenance 8. Conduct classes 9. Develop curricula and experiential learning situations 10. Identify potential employers 11. Create an employer database 12. Match women to internships 13. Help women secure permanent jobs (Source: PMD Pro 2022) By Guta Mengesha
  • 51. Monitoring and Evaluation for development and governmental organizations 3. RESULTS FRAMEWORK: a planning, communication and management tool that emphasizes results in order to provide clarity around your key project objectives. It was introduced in the mid-1990s by USAID as a new approach to monitoring its programs throughout the agency. It outlines, for each objective, the intermediate results, outputs and outcomes that relate to and facilitate the achievement of that objective. By Guta Mengesha
  • 52. Monitoring and Evaluation for development and governmental organizations A Results Framework is more output oriented, as it focuses on the "things that will be on the ground" after completion of the project. These are the results we want to achieve, and the underlying assumption is that achievement of these results will lead to achievement of the envisaged objective. 4. ZOPP (Zielorientierte Projektplanung) or GOPP (Goal-Oriented Project Planning): an adapted form of LFA suited to the development sector. It uses the same logical approach as LFA, but is more flexible in accommodating the qualitative and subjective nature of issues inherent in the development sector By Guta Mengesha
  • 53. Monitoring and Evaluation for development and governmental organizations Results framework by USAID: CDCS Goal, Development Objectives (DOs), Intermediate Results (IRs), sub-IRs By Guta Mengesha
  • 54. Monitoring and Evaluation for development and governmental organizations Monitoring & Evaluation (M&E) Framework Example (risks and assumptions may be added or kept separate) (Source: tools4dev)
Columns: INDICATOR (definition: how is it calculated?), BASELINE (current value), TARGET (target value), DATA SOURCE (how will it be measured?), FREQUENCY (how often will it be measured?), RESPONSIBLE (who will measure it?), REPORTING (where will it be reported?)
Goal — Percentage of Grade 6 primary students continuing on to high school. Calculation: number of students who start the first day of Grade 7 divided by the total number of Grade 6 students in the previous year, multiplied by 100. Baseline: 50%. Target: 60%. Data source: primary and high school enrolment records. Frequency: annual. Responsible: program manager. Reporting: annual enrolment report.
Outcome — Reading proficiency among children in Grade 6. Calculation: sum of all reading proficiency test scores for all students in Grade 6 divided by the total number of students in Grade 6. Baseline: average score 47. Target: average score 57. Data source: reading proficiency tests using the national assessment tool. Frequency: every 6 months. Responsible: teachers. Reporting: 6-monthly teacher reports.
Output — Number of students who completed a summer reading camp. Calculation: total number of students who were present on both the first and last day of the summer reading camp. Baseline: 0. Target: 500. Data source: summer camp attendance records. Frequency: end of every camp. Responsible: teachers. Reporting: camp review report.
Output — Number of parents of children in Grade 6 who helped their children read at home in the last week. Calculation: total number of parents who answered "yes" to the question "Did you help your child read at home any time in the last week?" Baseline: 0. Target: 500. Data source: survey of parents. Frequency: end of every camp. Responsible: program officer. Reporting: survey report. By Guta Mengesha
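The "how is it calculated" column of such a framework translates directly into code. A minimal Python sketch of the goal indicator from the example (the enrolment figures fed in are hypothetical):

```python
def continuation_rate(grade7_starters: int, grade6_previous_year: int) -> float:
    """Goal indicator: % of Grade 6 students continuing on to high school,
    calculated exactly as the framework defines it."""
    return grade7_starters / grade6_previous_year * 100

BASELINE = 50.0  # current value stated in the framework
TARGET = 60.0    # target value stated in the framework

# Hypothetical enrolment records for one reporting year
rate = continuation_rate(grade7_starters=330, grade6_previous_year=600)
print(f"Continuation rate: {rate:.1f}% (baseline {BASELINE}%, target {TARGET}%)")
```

Encoding the definition this way guarantees the indicator is calculated identically every reporting period, which is the purpose of the definition column.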
  • 55. Monitoring and Evaluation for development and governmental organizations II. Data collection tools Data definition An individual fact, statistic or piece of information, frequently numerical. In a more technical sense, data are a collection of qualitative or quantitative variables concerning one or more things, where a datum is a single value of a single variable. Data is factual information used as a basis for reasoning, discussion or calculation. It must be recorded; if it is not written down, it is not data. Examples: price and cost, weight, employee name, product name. Data quality definition • How well the information collected represents the program activities • The worth/accuracy of the information collected • Data that reflect true performance • A focus on ensuring that the data management process is of a high standard The tools we use to get data are qualitative and quantitative. a) Qualitative data collection tools: focus group discussions (FGDs), where you go to the field to talk to a group of 9 to 10 individuals together, not one on one. b) Quantitative data collection tools: interviews, questionnaires, surveys (in the field to collect data), and observation (where you do not talk to anyone) By Guta Mengesha
  • 56. Monitoring and Evaluation for development and governmental organizations Measuring data quality Data is only as good as its quality. The quality of data is measured against six criteria: 1. Validity – the data clearly, directly and adequately represent the result that was intended to be measured 2. Reliability – if the process were repeated over and over again it would yield the same result 3. Integrity – the data have been protected from deliberate bias or manipulation for political or personal reasons 4. Precision – the data are accurate, with sufficient detail 5. Timeliness – the data are current and the information is available on time 6. Confidentiality – clients are assured that their data will be maintained according to national and/or international standards Good quality data can be recognized by checking the FACTS: F – Formatted properly A – Accurate C – Complete T – Timely S – Segmented properly By Guta Mengesha
  • 57. Monitoring and Evaluation for development and governmental organizations Dimensions of data quality (standards) Data quality audit (DQA): an activity conducted to (1) verify the quality of reported data for key indicators at a selected site and (2) assess the ability of the data management system to collect and report quality data. Dimensions and how they are measured: Accuracy – How well does the piece of information reflect reality? Is it a true reflection of the situation on the ground? Completeness – Does it fulfill your expectation of what is comprehensive, with no gaps? Consistency – Does information stored in one place match relevant data stored elsewhere? Were similar tools used? Timeliness – Is your information available when you need it, without long delays? Validity – Is the information in the specified format and does it follow business rules, or is it in an unusable format? Uniqueness – Is this the only instance in which this information appears in the database? Is it original, from the source? By Guta Mengesha
  • 58. Monitoring and Evaluation for development and governmental organizations Why do we conduct a data quality audit? 1) It builds confidence among stakeholders 2) It ensures that data are free of errors 3) It promotes good decision making and corrective action by management 4) It promotes efficiency and effectiveness in the implementation of activities 5) It is good M&E practice Who can do a DQA? a) Any M&E staff member b) Any project member c) An independent professional Steps in conducting a DQA Step 1: Preparation – select the site, request documents, review documents, prepare for the actual site visit Step 2: Assessment – assess the M&E unit and the data management system, and look at their integrity in assessing the data collection and reporting system Step 3: Consolidation and reporting – draft the report (findings, conclusions and recommendations) By Guta Mengesha
  • 59. Monitoring and Evaluation for development and governmental organizations INDICATORS Indicators provide parameters against which to assess project performance and achievement in terms of quantity (how many/how much?), time (when?), target group (who?) and quality (how good?). Indicators can be quantitative (number of people, number of ha, % of adoption), semi-quantitative (scale, ranking), or qualitative (perceptions, opinions, categories). (Rioux 2021) An indicator is a qualitative or quantitative measure of program performance that is used to demonstrate changes and which details whether the program results are being or have been achieved. (UNFPA) For indicators to be useful for monitoring and evaluating program results, it is important to identify indicators that are direct, objective, practical and adequate, and to regularly update them. An indicator is a variable whose value changes from the baseline level (at the time the program began) to a new value after the program and its activities have made their impact felt. (H. Ultimate) By Guta Mengesha
  • 60. Monitoring and Evaluation for development and governmental organizations How to write a monitoring and evaluation (M&E) framework • The first step in writing an M&E framework is to decide which indicators you will use to measure the success of your program • You need to choose indicators for each level of your program – outputs, outcomes and goal Here is an example of some indicators for the goal, outcome and output of an education program. By Guta Mengesha
  • 61. Monitoring and Evaluation for development and governmental organizations DEFINING INDICATOR Once you have chosen your indicators you need to write a definition for each one. The definition describes exactly how the indicator is calculated. If you don’t have definitions there is a serious risk that indicators might be calculated differently at different times, which means the results can’t be compared. Here is an example of how one indicator in the education program is defined: By Guta Mengesha
  • 62. Monitoring and Evaluation for development and governmental organizations Measure the baseline and set the target Before you start your program you need to measure the starting value of each indicator – this is called the "baseline". In the education example above, that means you would need to measure the current percentage of Grade 6 students continuing on to Grade 7 (before you start your program). Once you know the baseline you need to set a target for improvement. Before you set the target it's important to do some research on what a realistic target actually is; many people set unachievable targets without realising it. Indicator results are used to assess whether the program is working or not, so it's very important that decision makers and stakeholders (not just the donor) have access to them as soon as possible. Finally, decide who will be responsible for measuring each indicator:  Output indicators are often measured by field staff or program managers  Outcome and goal indicators may be measured by evaluation consultants or even national agencies, and reported through progress reports (Source: tools4dev) By Guta Mengesha
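Once the baseline and target are known, progress can be expressed as the share of the planned improvement achieved so far. A small illustrative calculation (the mid-term value of 54% is hypothetical):

```python
def percent_of_target_achieved(baseline: float, current: float, target: float) -> float:
    """Share of the planned improvement achieved so far (illustrative sketch).
    100% means the target has been reached; 0% means no change from baseline."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline) * 100

# Education example: baseline 50%, target 60%, hypothetical mid-term measurement 54%
progress = percent_of_target_achieved(baseline=50.0, current=54.0, target=60.0)
print(f"{progress:.0f}% of the way to target")
```

Expressing progress relative to the baseline, rather than as a raw value, is what makes an unrealistic target visible early: a program stuck at a low share of its planned improvement at mid-term prompts re-planning.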
  • 63. Monitoring and Evaluation for development and governmental organizations A baseline survey plan should address each of the following items: Background and purpose of the baseline study • Description of the program design and target beneficiaries • Objectives of the study, including the list of baseline indicators drawn from the logical framework • Review of existing data sources Data collection methods • Defined units of study (communities, households, individuals, etc.) • Proposed primary data collection methods • Sampling description Survey design • Survey questionnaire and/or topical outline • Arrangements for pre-testing Guidelines for fieldwork • Composition of the assessment team • Training to be provided to enumerators • Timetable for fieldwork • Arrangements for supervision/coordination in the field Data analysis procedures • Arrangements for data entry and processing (including data cleaning) • Proposed framework for analysis • Proposed data tables, indicator calculations and criteria for data disaggregation • Training required for data management and analysis Reporting and feedback • Proposed format of the baseline study report • Arrangements for presentation (Source: WFP 2003) By Guta Mengesha
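The "sampling description" step above can be as simple as a reproducible random draw from the sampling frame. A minimal illustrative sketch: the household IDs and sample size are made up, and a real baseline would follow the plan's sampling design (for example stratified or cluster sampling) rather than a plain random draw.

```python
import random

def draw_sample(households, sample_size, seed=42):
    """Simple random sample of households for a baseline survey (illustrative).
    A fixed seed makes the draw reproducible, so the sample can be audited."""
    rng = random.Random(seed)
    return rng.sample(households, sample_size)

# Hypothetical sampling frame of 500 households
households = [f"HH-{i:03d}" for i in range(1, 501)]
sample = draw_sample(households, sample_size=50)
print(len(sample), "households selected, e.g.", sample[:3])
```

Recording the seed alongside the sampling frame means anyone reviewing the baseline (for example during a DQA) can regenerate the exact same sample and verify it was drawn as described.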
  • 64. Monitoring and Evaluation for development and governmental organizations INDICATORS….. An indicator is also a measurement. It measures the value of the change in meaningful units that can be compared to past and future units. This is usually expressed as a percentage or a number. Finally, an indicator focuses on a single aspect of a program or project. This aspect may be an input, an output or an overarching objective, but it should be narrowly defined in a way that captures this one aspect as precisely as possible. In the context of M&E, an indicator is said to be a quantitative standard of measurement or an instrument which gives us information (UNAIDS, 2010). Indicators help to capture data and provide information to monitor performance, measure achievement, determine accountability and improve the effectiveness of projects/ programmes. By Guta Mengesha
  • 65. Monitoring and Evaluation for development and governmental organizations Keeping these points in view, we emphasize that good indicators should possess the following characteristics:  Relevant to the program.  Relevant to the national standards.  Feasible to construct.  Easy to interpret.  Enable tracking of change over time. The way indicators are developed varies by organization/project and the objectives therein. The DOPA criteria encapsulate the most important requirements of useful indicators. In DOPA, the letters D, O, P and A stand as follows: D: for Direct, meaning that an indicator must measure the intended change directly and closely; O: for Objective, meaning that your indicator must be unambiguous, with a clear operational definition of every term stated in the objective; P: for Practical, which means that an indicator must be practical in terms of data collection, budget and timelines in order to be helpful in decision-making; A: for Adequate, which says that an adequate number of indicators must be considered so that they capture the progress made toward the desired output. By Guta Mengesha
  • 66. Monitoring and Evaluation for development and governmental organizations Types of program performance indicators An indicator can be classified as: 1. Input indicator 2. Process/performance indicator 3. Output indicator 4. Outcome indicator 5. Impact indicator 6. Exogenous indicator 1. Input indicators Input indicators are quantified and time-bound statements of the resources to be provided and financed by the project. Information on these indicators comes from routine accounting and management records. Input indicators are often left out of discussions of project monitoring, even though they are part of the management information system. They are used mainly by the managers closest to the tasks of implementation and are consulted frequently, as often as daily or weekly. A few examples of this indicator:  Vehicle operating costs for the crop extension service  Appointment of staff  Provision of buildings  Home care supplies purchased per month 2. Process/performance indicators Performance indicators measure what happens during implementation. They monitor the activities completed during implementation and are often specified as milestones or completions of sub-contracted tasks, as set out in time-scaled work schedules; a common example is tracking the procurement process. Often they are tabulated as a set of contracted completions or milestone events taken from an activity plan. Examples: • Date by which building site clearance must be completed • Latest date for delivery of fertilizer to farm stores • Proportion of VCT clients returning to collect their COVID-19 test results By Guta Mengesha
  • 67. Monitoring and Evaluation for development and governmental organizations Types of program performance indicators….. 3. Output indicators Output indicators show the immediate physical and financial outputs of the project: physical quantities, organizational strengthening, and the initial flow of services. They monitor the production of goods and delivery of services by the project, and are often evaluated and reported using performance measures based on cost or operational ratios. Examples include: o Cost per kilometer of road construction o Crop yield per acre of land o The ratio of textbooks to students o Time taken to process a credit application 4. Outcome indicators Outcome indicators are specific to a project's purpose and the logical chain of cause and effect that underlies its design. Achievement of outcomes will often depend, at least in part, on the actions of beneficiaries in responding to project outputs, and the indicators will depend on data collected from beneficiaries, for example: o Perceptions of improved reliability of irrigation supply o Proportion of farmers who have tried a new variety of seed and intend to use it again o Percentage of women satisfied with the maternity health care they receive 5. Impact indicators Impact indicators refer to the medium- or long-term developmental change to which the project is expected to contribute. For example: • (Health) incidence of low birth weight • (Education) continuation rates from primary to secondary education, by sex 6. Exogenous indicators Exogenous indicators cover factors outside the control of the project that might nevertheless affect its outcome, including risks and performance. By Guta Mengesha
  • 68. Monitoring and Evaluation for development and governmental organizations Types of program performance indicators….. Indicators can also be classified based on the following characteristics: 1. Quantitative indicators: designed to provide hard data that permit rigorous statistical analyses, e.g.  Number  Percent  Rate  Ratio/proportion 2. Qualitative indicators: provide insights into changes in organizational processes, attitudes, beliefs and the behavior of individuals, e.g.  Compliance with  Quality of  Extent of  Level of 3. Efficiency indicators: tell us whether we are getting the desired output for our investment, e.g. cost per unit (per client served, per student, per patient, etc.) By Guta Mengesha
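The quantitative and efficiency indicator types above reduce to a few standard formulas. An illustrative sketch with made-up figures:

```python
def percent(part: float, whole: float) -> float:
    """Quantitative indicator expressed as a percent of a target group."""
    return part / whole * 100

def rate_per_1000(events: int, population: int) -> float:
    """Quantitative indicator expressed as a rate, e.g. deaths per 1,000."""
    return events / population * 1000

def cost_per_unit(total_cost: float, units: int) -> float:
    """Efficiency indicator: cost per client served, per student, etc."""
    return total_cost / units

# Hypothetical figures for illustration only
print(f"Percent of women who started a business: {percent(120, 200):.0f}%")
print(f"Infant mortality rate: {rate_per_1000(35, 1000):.0f} per 1,000 live births")
print(f"Cost per student trained: {cost_per_unit(25_000, 500):.2f}")
```

Keeping these formulas explicit avoids the common reporting trap of mixing up a percent (part of a whole x 100) with a rate (events per fixed population base), which are not interchangeable.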
  • 69. Monitoring and Evaluation for development and governmental organizations Types of program performance indicators….. 4. Economic development indicators  Average annual household income  Earned income level  Per capita income  Percent of people below the poverty line  Growth rate of businesses 5. Social development indicators  Death rate  Life expectancy at birth  Infant mortality rate  Literacy rate  Percent of dwellings with safe water  Student-teacher ratio  School enrolment rate 6. Political/organizational development indicators  Number of community organizations  Types of organized sports  Participation in youth groups  Participation in public meetings By Guta Mengesha
  • 70. Monitoring and Evaluation for development and governmental organizations Common problems in specifying indicators These are: 1) The indicators are irrelevant and do not correspond to the output level; 2) The indicators do not include an objective standard against which achievement can be assessed; 3) The indicators are constructed without reference to the baseline; 4) The indicators are numerous and redundant, with little consideration of the time, human resources and cost required to collect the data for their construction; 5) The indicators are unrealistic and sometimes very difficult to conceptualize and measure; 6) The indicators are not representative of the universe. (Source: UNFPA) UNFPA puts forward some suggestions that can be practiced. These include, among others, involving the stakeholders in the process of selecting the indicators, procuring baseline data, involving all those who are partners in the program, and following the program design. By Guta Mengesha
  • 71. Monitoring and Evaluation for development and governmental organizations TERMS OF REFERENCE (TOR) Like other research proposals, evaluation proposals are prepared in response to a request for bid or request for proposal (RFP), a formal document issued by a sponsor to solicit services from evaluators. Proposals are prepared according to the Terms of Reference (TOR) provided by the sponsor and included in the RFP, and it is almost mandatory for the bidder to follow this TOR in preparing the proposal. The main sections in the Terms of Reference (TOR) for an evaluation process usually include, among others: 1. Background 2. Objectives 3. Evaluation scope and focus 4. Time frame and deliverables 5. Methodology 6. Information sources 7. Evaluation team composition 8. Logistical support 9. Involvement of key stakeholders By Guta Mengesha
  • 72. Monitoring and Evaluation for development and governmental organizations TERMS OF REFERENCE (TOR)….. 1. Background This section provides information on the history and current status of the program/project being evaluated, including how it works (its objectives, strategies and management process), its duration and budget, and important stakeholders such as donors, partners and implementing organizations. 2. Objectives This section states what the project or the organization wants to achieve from the evaluation. 3. Evaluation scope and focus In consultation with the stakeholders, identify the significant evaluation objectives and questions in the TOR: the validity of the design, effectiveness, efficiency, impact, sustainability, factors affecting performance, etc. 4. Time frame and deliverables Here you indicate the duration of the assignment and its different phases, from the inception report to the final report. By Guta Mengesha
• 73. TERMS OF REFERENCE (TOR), continued
5. Methodology: Here you might prefer a particular data-collection approach or a specific evaluation design.
6. Information sources: List the information to be used by the evaluation, such as monitoring, review, evaluation, and other reports.
7. Evaluation team composition: Decide on the number of team members and specify the members' profiles.
8. Logistical support: This includes the time frame, costs, team-composition requirements, and the like.
9. Involvement of key stakeholders: Specify the involvement of stakeholders such as internal staff, program partners, and donors who will make use of the evaluation results.
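The nine TOR sections above can be kept as a reusable skeleton, so every evaluation TOR starts from the same structure. The section titles come from the slides; the one-line prompts and the rendering function are illustrative assumptions.

```python
# The nine TOR sections from the slides, with short drafting prompts.
TOR_SECTIONS = [
    ("Background", "History, current status, objectives, budget, stakeholders."),
    ("Objectives", "What the organization wants to achieve from the evaluation."),
    ("Evaluation scope and focus", "Key questions: design, effectiveness, efficiency, impact, sustainability."),
    ("Time frame and deliverables", "Duration and phases, from inception report to final report."),
    ("Methodology", "Preferred data-collection approach or evaluation design."),
    ("Information sources", "Monitoring, review, evaluation, and other reports."),
    ("Evaluation team composition", "Number of members and required profiles."),
    ("Logistical support", "Time frame, costs, team-composition requirements."),
    ("Involvement of key stakeholders", "Staff, partners, and donors who will use the results."),
]

def tor_skeleton() -> str:
    """Render the sections as a numbered plain-text TOR outline."""
    lines = ["TERMS OF REFERENCE"]
    for i, (title, prompt) in enumerate(TOR_SECTIONS, start=1):
        lines.append(f"{i}. {title}: {prompt}")
    return "\n".join(lines)

print(tor_skeleton())
```

A shared skeleton like this also makes it easy to check a submitted proposal against the TOR section by section.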
• 74. COMPONENTS OF AN EVALUATION REPORT
Whether you are monitoring or evaluating, at some point (or points) there will be a reporting process. This reporting process follows the stages of analyzing information. You will report to different stakeholders (board, management team, staff, beneficiaries, and donors) in different ways: sometimes in written form, sometimes verbally, and, increasingly, using tools such as PowerPoint presentations, slides, and videos. A written evaluation report may be prepared in line with the following format; these are the components of an evaluation report:
1. Executive summary
2. Preface
3. Contents page
4. Introduction
5. Findings
6. Conclusions
7. Recommendations
8. Appendices
• 75. COMPONENTS OF AN EVALUATION REPORT, continued
1. The executive summary is intended for time-constrained readers but must be engaging enough to make people curious to read the entire report.
2. The preface is where you courteously thank people and make broad comments about the process and findings.
3. The introductory section deals with the background of the project, the need for the evaluation, and the entire activity in a nutshell.
4. The findings section accommodates the results that have emerged about efficiency, effectiveness, and impact.
5. The conclusions you draw follow from your findings.
6. The recommendations address weaknesses, followed by what needs to be done to strengthen the programs being evaluated.
7. Appendices: Include your Terms of Reference (TOR), the questionnaire used in the evaluation, and any other reference documents that you could not accommodate in the text.
• 76. Example from the UNICEF guide "Program Manager's Planning, Monitoring, and Evaluation Toolkit" (2016): Evaluation Report, Suggested Outline
Title page: Name of the project/program or theme being evaluated; country of the project/program or theme; name of the organization to which the report is submitted; names and affiliations of the evaluators.
Table of contents: Identify the chapters and major sections along with the page numbers. List tables, graphs, and charts by page number.
Acknowledgments: Identify those who contributed to the evaluation.
List of acronyms/abbreviations: For example, TOR: Terms of Reference; NGO: Non-Governmental Organization.
Executive summary: Summarize essential information on the subject being evaluated; the purpose and objectives of the evaluation; the methods applied and significant limitations; and the most important findings, conclusions, and recommendations in priority order. It provides a summary of the complete evaluation, from its design and intended users through the evaluation focus, key results, and their implications for program application.
Introduction: Describe the project/program/theme being evaluated, including the problems the interventions address; the aims, strategies, scope, and cost of the response; and its key stakeholders and their roles in implementing the intervention. Summarize the evaluation purpose, objectives, and key questions. Explain the rationale for the selection or non-selection of evaluation criteria. Describe the methodology employed to conduct the evaluation and its limitations, if any.
• 77. UNICEF example, continued
Introduction (continued): List who was involved in conducting the evaluation and what their roles were. Describe the structure of the evaluation report.
Findings and conclusions: State findings based on evidence derived from the information collected. Assess the degree to which the intervention design applies results-based management principles. In providing a critical assessment of performance, analyze the linkages between inputs, activities, outputs, outcomes, and, if possible, impact. Summarize the achievement of results in quantitative and qualitative terms. Analyze the factors that affected performance, as well as unintended effects, both positive and negative. Discuss the relative contributions of stakeholders to the achievement of the results. Conclusions should be substantiated by the findings and be consistent with the data collected. They must relate to the evaluation objectives and provide answers to the evaluation questions. They should also include a discussion of the reasons for successes and failures, especially the constraints and enabling factors.
Lessons learned: Based on the evaluation findings and the overall experience, provide lessons learned that may be applicable in other contexts and situations as well. Include both positive and negative lessons.
Recommendations: Formulate relevant, specific, and realistic recommendations based on the evidence gathered, the conclusions made, and the lessons learned. Discuss their anticipated implications. Consult key stakeholders when developing the recommendations. Provide suggested timelines and cost estimates (where relevant) for implementation.
Annexes: Attach the TOR for the evaluation. List persons interviewed and sites visited. List documents reviewed (reports, publications). Append the data collection instruments (e.g., copies of questionnaires, surveys, etc.).
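A suggested outline like this one can double as a completeness check for a draft report before submission. The section list below is taken from the UNICEF outline above; the checking logic is an illustrative sketch, not part of the toolkit.

```python
# Required sections from the suggested outline above.
REQUIRED_SECTIONS = [
    "Title page", "Table of contents", "Acknowledgments",
    "List of acronyms/abbreviations", "Executive summary", "Introduction",
    "Findings and conclusions", "Lessons learned",
    "Recommendations", "Annexes",
]

def missing_sections(report_headings: list[str]) -> list[str]:
    """Return the required sections absent from a draft report's headings
    (case-insensitive match)."""
    present = {h.strip().lower() for h in report_headings}
    return [s for s in REQUIRED_SECTIONS if s.lower() not in present]

draft = ["Title page", "Executive summary", "Findings and conclusions"]
print(missing_sections(draft))  # the seven sections the draft still lacks
```

Running this against each draft keeps late-stage reviews focused on content rather than on hunting for missing sections.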
• 78. References
 Organisation for Economic Co-operation and Development (OECD), 2021
 World Bank, Monitoring and Evaluation step-by-step guide (handbook)
 Thomas Winderi, "What Is the Difference between Monitoring and Evaluation?"
 Estrella and Gaventa (2020), Monitoring and Evaluation
 Coach Alexander, Monitoring and Evaluation Framework (2022)
 International Fund for Agricultural Development (IFAD), Handbook on Monitoring and Evaluation
 UNDP, Monitoring and Evaluation handbook
 UNICEF, Monitoring and Evaluation handbook
 FAO, Monitoring and Evaluation handbook
 USAID, Project Management Manual
 Kultar Dharmedara & Varun, Theory of Change and PMD Pro (2020)
 World Food Programme, A Baseline Survey (2013)
 Program Indicators by UNFPA, UNAIDS, and UN Women; Ultimate Handbook; www.tools4dev