1. De-technicalize language.

ROADMAP TO MONITORING

- What are we trying to change? Problem analysis.
- Where do we want to get to? Goals, objectives.
- How are we going to get there? Strategy, activities.
- What do we expect to happen along the way? RESULTS: output, outcome, impact.
- How do we know we are on the right road? Indicators, baseline, targets.
2. Understand the concepts behind the terms.

ACTIVITIES: Have the activities taken place?

RESULTS:
- OUTPUT: The very first result of an activity. Organizations have direct control over this result.
- OUTCOME: What happened next? Change of behavior in participants. Organizations have less control over this result.
- IMPACT: So what? Change at population/societal level. Organizations have very little control, if any.
3. Know the difference.

Monitoring vs. Evaluation:
- Monitoring is an ongoing process; evaluation is an event that occurs periodically.
- Monitoring is recordkeeping and tracking activities; evaluation is analyzing results.
- Monitoring observes trends; evaluation assesses impact.
- Monitoring is mainly descriptive, recording inputs, outputs, and activities (e.g. How many children received supplementary school feeding?); evaluation is more analytical and examines processes (e.g. Did implementing school feeding successfully increase attendance levels?).
- Monitoring allows us to make adjustments or corrective actions in a project; evaluation informs future programming for all stakeholders.
- In monitoring, data collection is part of day-to-day management and activities; in evaluation, additional, special data may be collected using research methodologies.
- Both are objective and systematic.
M&E: What’s the Difference? EXERCISE

We monitor…
1. A child’s height and weight.
2. The number of weekly visits to chronically ill people.
3. The number of families planting improved crop varieties.
4. The number of people trained in human rights.

We evaluate…
1. Whether children are growing at a normal rate.
2. The effectiveness of home-based care.
3. An increase/decrease in food security.
4. Whether reports of human rights abuses have increased or decreased and why.
4. Make conscientious methodology decisions.

Keep expectations realistic.

[Chart: data-collection methods plotted by cost and complexity: special or ’point’ studies, specific sample surveys, focus groups, existing records (e.g. household lists), observation, routine statistics, and key informant interviews.]
5. Keep your eye on the prize.
Remember:
You do not fatten a calf by weighing it.
~English proverb
Good luck in your M&E efforts!
Editor's Notes
Keywords: monitoring, evaluation, M&E, Jennifer Lentfer, how-matters.org, international aid, philanthropy, organizational development, metrics, evidence
These are abstract concepts - but understandable.
Aim to build partners’ capacity to measure their own progress in a more meaningful way.
M&E is about testing assumptions – this should be the new definition of building “evidence”
Monitoring is a process that systematically observes events and activities related to our work. When we monitor, we gather information regularly to check our progress.
Evaluation, on the other hand, is the assessment of a program’s relevance, efficiency, effectiveness, and impact on the target population and beneficiaries. We evaluate periodically.
This slide shows the trade-offs in cost and complexity among different methods of data collection.
Can also think of M&E as regular/routine (part of everyday duties such as beneficiary records) or special/periodic (which require additional time, resources, planning such as household surveys)
Consider the level of financial/human resources available.
The effort expended should match the improvement in decision-making.
It’s not about the indicators. It’s about reflection, learning, and adaptation based on new knowledge.