PREPARED AND PRESENTED BY
DR ZABRON KENGERA
GEOGRAPHY DEPARTMENT
UNIVERSITY OF DAR ES SALAAM
 The status of M&E globally, and in Tanzania in
particular
 The Paris Declaration (2005) emphasises the
need for managing for change/results
 The need for results-oriented reporting and
assessments
 However, many countries, especially in sub-
Saharan Africa, lack the necessary capacity
to monitor development progress and
use findings to improve the performance of
various sectoral interventions
 A study by the World Bank (2006) noted
institutional and individual gaps in M&E
capacity
 Many countries have no demand for
M&E
 Those who have it have not been able to
develop comprehensive and systematic
M&E
 Still, the focus has been more on
individuals, with a lack of integration at the
organizational level
 Thus, comprehensive training is needed to
fill the gap in M&E capacity
 Limited teaching of project management as
an academic and professional discipline
 In Tanzania, PPM looks like a new discipline
and profession with no advocates
 No ownership of the field; it is treated more as
a tool than a profession.
 Difficulties in mainstreaming its teaching
 As a result, we lack both PPM and
M&E experts.
 Some organizations/individuals think M&E is
an unnecessary and expensive exercise
 Some believe it is something they can simply
avoid
 While some equate Monitoring with
Supervision, others think Project Evaluation is
merely a post-mortem of the project.
 Other managers are not even aware of the
relevance of, and need for, M&E in their
organizations.
Poor project implementation
Misuse and sometimes duplication of
resources
Lack of accountability
Implementation for its own sake
Poor project impact
Irrelevance and poor acceptability of
projects.
 Lack of trust and support from other
development stakeholders, including
donors and project beneficiaries
• Monitoring is the regular and ongoing
collection and analysis of information on the
progress of the project and what difference
the project is making.
OR
• Monitoring is the routine collection and
analysis of information to track progress
against set plans and check compliance to
established standards.
• It helps the project team to keep focused
and energised and to ensure that they are
on track in achieving their objectives.
 It reminds project officials whether
they are carrying out their chosen
strategies and actions effectively.
 It identifies areas that need to be adapted or
changed.
 It helps identify trends and patterns, adapt
strategies and inform decisions for
project/programme management.
Questions (present continuous tense)
 Are the inputs (finance, materials and
personnel) available in the right amount and
at the right time?
 Are activities leading to the expected
outputs?
 Are they implemented as proposed?
 Are there factors which stall the progress?
 Are outputs leading to the outcomes?
 How do beneficiaries feel about the work?
 What is causing delays or unexpected results?
 Are we doing the right thing for the right
beneficiaries (approach and methodology)?
 Is there anything happening that should
lead management to modify the operation’s
implementation plan?
 Are activities being implemented on schedule
and within budget?
 Results monitoring
 Tracks effects and impacts.
 This is where monitoring merges with
evaluation to determine if the
project/programme is on target towards its
intended results (outputs, outcomes, impact)
 Whether there may be any unintended
impact (positive or negative).
 Process (activity) monitoring
Tracks the use of inputs and resources, the
progress of activities and the delivery of
outputs.
• It examines how activities are delivered –
the efficiency in time and resources.
• It is often conducted in conjunction with
compliance monitoring and feeds into the
evaluation of impact.
 For example, a water and sanitation project
may monitor that targeted households
receive septic systems according to
schedule.
 Compliance monitoring-ensures
compliance with donor regulations and
expected results, grant and contract
requirements, local governmental
regulations and laws, and ethical standards.
For example, a shelter project may monitor
that shelters adhere to agreed national and
international safety standards
 Context (situation) monitoring
Tracks the setting in which the
project/programme operates, especially as it
affects identified risks and assumptions, but
also any unexpected considerations that may
arise.
• It includes the field as well as the larger
political, institutional, funding, and policy
context that affect the project/programme.
 For example, a project in a conflict-prone
area may monitor potential fighting that
could not only affect project success but
endanger project staff and volunteers.
 Beneficiary monitoring- tracks beneficiary
perceptions of a project/programme. It
includes beneficiary satisfaction or
complaints with the project/programme,
including their participation, treatment,
access to resources and their overall
experience of change.
 Financial monitoring- accounts for costs by
input and activity within predefined
categories of expenditure. It is often
conducted in conjunction with compliance
and process monitoring
 Organizational monitoring- tracks the
sustainability, institutional development and
capacity building in the project/programme
and with its partners.
 It is often done in conjunction with the
monitoring processes of the larger
implementing organization
 Systematic collection and analysis
 Good design- guided by a clear purpose,
methodology, type of information and
choice of indicators
 Monitoring needs to be timely, so
information can be readily used to inform
project/programme implementation.
 Focus on the results- not monitoring for
its own sake; look at whether the project
is yielding the intended results.
 Regular visits
 Aim to inform decisions and improve project
performance (relevant and useful)
 Assess the relevance and performance
 Participation of key stakeholders to increase
ownership and usefulness of the information
1. Progress reports- Obtain and analyze project
documents to get information on progress.
2. Workplans- The extent to which they have been
implemented and how they reflect the
project objectives and activities.
3. Participation (Stakeholder meetings)- Getting
feedback from partners and beneficiaries on
project progress.
4. Field Visits- Validation, triangulation and
first-hand information on the conditions and
trends of change following the introduced
intervention. They supplement day-to-day
monitoring done by administrators.
5. Indicators and data collection instruments, e.g.
questionnaires and checklists
 An internal activity, normally done by
project staff for internal use
 Is an essential part of good day-to-day
management practice
 Is concerned with verifying that project
activities are being undertaken, services are
being delivered, and the project is leading to
the desired behavior changes described in
the project proposal.
 Focuses more on inputs, activities and
outputs
 Externally oriented, normally done by
external consultants and experts.
 Is an essential activity in a longer-term
dynamic learning process
 Focuses more on in-depth information on
outcomes and impacts
 Normally challenges the design
 Periodic, and uses more of the past tense.
 Learning and sharing of information with
other stakeholders
 Improves and informs future
projects/plans (post-mortem)
 Relies on more detailed data (from surveys or
studies), in addition to that collected through
the monitoring system, to understand the
project in greater depth.
 Assesses higher level outcomes and impact
and may verify some of the findings from the
monitoring.
 It explores both anticipated and
unanticipated results.
 Evaluation reports/studies may form a
foundation, a starting point (baseline
information), for monitoring changes
immediately after project implementation
 Through the results of periodic evaluations,
monitoring tools and strategies can be
refined and improved further.
 M & E are two different management
tools that are closely related, interactive
and mutually supportive
 Overlaps- Process based evaluation is
equated to Monitoring
 Through routine tracking of project
progress, monitoring can provide
quantitative and qualitative data useful
for designing and implementing project
evaluation exercises
 Evaluation is an assessment, as systematic and
objective as possible, of an ongoing or
completed project, programme or policy, its
design, implementation and results.
 While Monitoring aims at tracking changes in
program results over time, Evaluation seeks
to understand specifically why these changes
occur
 The aim is to determine the impact, relevance,
efficiency, effectiveness, perceptions of the
beneficiaries, participation and sustainability
of the project activities and outcomes.
 It therefore provides key stakeholders with
information that is credible and useful,
enabling the incorporation of lessons learned
into the decision-making process
 Evaluations involve identifying and reflecting
upon the effects of what has been done, and
judging their worth.
 Their findings allow project/programme
managers, beneficiaries, partners, donors
and other project/programme stakeholders
to learn from the experience and improve
future interventions.
Evaluation is closely related to
monitoring but includes taking a
more in-depth look at the
outcomes or impact of a piece of
work and the extent to which the
stated objectives have been
achieved at a particular point in
time (e.g. at the mid-point in a
project and after its completion).
Evaluation helps the project team to assess:
 Whether they are working on the right issue,
 Their selected goals and objectives, the
strategies and the underlying assumptions,
 How efficient the project work has
been and whether they have used the
project resources wisely (Chapman, 2006).
 Learning from project failures and
successes can provide valuable insights into
what works and what does not.
 M&E should also include analysing how
changes in the external context might have
influenced the project.
 Like any research activity, evaluations should
be guided by the following principles/ethics
 Objectivity- evaluation should reveal and convey
technically adequate information about the
features that determine the worth or merit of the
program being evaluated.
 Systematic enquiry, analysis and reporting
 Useful (no evaluation is good unless its results are
used)
 Cost effective- efficient use of resources; looks at
the availability of relevant and focused
information.
 Respectful- respect the culture of the
respondents
 Consideration of the respondents: develop
interest in their responses and views
 Garbage in, garbage out- explain the purpose
and the procedures you have followed
(methodology)
 Ensure that an evaluation will be conducted
legally, ethically, and with due regard for
the welfare of those involved as well as
those affected by its results.
 Collective responsibility
 Protect your respondents (anonymity)
 Ensure that an evaluation will be realistic
and practical
 Evaluation responds to the concerns of one
interest group more than another
 Bias in selecting respondents
 Respondents are influenced by
politics or personal differences (lack of
objectivity)
 The evaluator conducts an evaluation when
he/she lacks sufficient skills or experience.
 Impact
 What changes did the project bring about?
 Were there any unplanned or unintended
changes?
 Effectiveness
 Were the operation’s objectives achieved?
 Did the outputs lead to the intended
outcomes?
 Effectiveness: Goal attainment. The extent to
which the project has been able to attain the
intended objectives (e.g. percentage of
population reached, number of, say, terraces
constructed).
 There is no universally accepted standard
percentage, but there are views that it should
be at least 70%. The remainder can be explained
by possible externalities (political, physical and
other factors beyond the control of the
project management).
 We normally don’t expect such a perfect
project model. There are, however, several
cases where projects have over-performed.
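The goal-attainment idea above can be sketched as a small calculation; the indicator names, figures and the 70% cut-off below are illustrative assumptions for demonstration, not data from any real project or a fixed standard.

```python
# Illustrative sketch: computing goal attainment for a few indicators.
# All indicator names, achieved/target figures and the threshold are
# hypothetical examples.

def attainment(achieved, target):
    """Percentage of the target actually achieved."""
    return 100.0 * achieved / target

indicators = {
    "population reached": (8_400, 10_000),   # (achieved, target)
    "terraces constructed": (150, 200),
}

THRESHOLD = 70.0  # one commonly cited, but not universal, cut-off

for name, (achieved, target) in indicators.items():
    pct = attainment(achieved, target)
    status = "attained" if pct >= THRESHOLD else "below threshold"
    print(f"{name}: {pct:.0f}% ({status})")
```

Attainment below the threshold would then prompt the evaluator to look for externalities explaining the shortfall, as the slide notes.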
 Efficiency
 Were stocks of items available on time
and in the right quantities and quality?
 Were activities implemented on
schedule and within budget?
 Were outputs delivered economically?
 Feasibility and Practicability:
 Environment/physical factors, social-
cultural and economic factors,
technology, resources required and
operational costs
 Relevance and legitimacy
 Were the operation’s objectives consistent with
beneficiaries’ needs and the organizational policies?
 Participation- tells a lot about the extent of
benefits, ownership, relevance and acceptability
 The level and extent of community participation
 The stage of participation and the nature of the
people involved
 Sustainability:
 Are the benefits likely to be maintained for an
extended period after assistance ends?
 By local people and local organizations: capacity
building, participation, relevance, practicability,
alternative funding
 Policy issues: context and policy implication
and environment
 Leadership and organizations: Institutional,
local capacity, qualified personnel
 Financial and economic: Enough money, the
economic status of the beneficiaries
 Technological factors: At the time of the
project implementation and during
evaluations
 Socio-cultural factors: Did cultural factors
inform the project formulation and
implementation?
1. Timing of the evaluation(stage in the
PCM)
 Ex-ante evaluation- conducted before project
implementation. It includes assessments of the
project proposal as part of project appraisal
through EIA, CBA, SEA and CV
 Formative evaluations (process/activity)-
occur during project/programme
implementation to improve performance and
assess compliance (observing schedule, target
group, budget and deviations).
 Midterm evaluations are formative in
purpose and occur midway through
implementation.
 Summative evaluations- occur at the end of
project/programme implementation to assess
effectiveness and impact
 Final evaluations are summative in purpose
and are conducted (often externally) at the
completion of project/ programme
implementation to assess how well the
project/programme achieved its intended
objectives.
 Ex-post evaluations are conducted some
time after implementation to assess
long-term impact and sustainability.
• Can we include a baseline survey in this
category?
• Baseline information sets the foundation of
both monitoring and evaluation
• It informs the evaluation team of the
situation and conditions of the population and
geographical areas prior to the intervention
being evaluated.
• We should therefore distinguish a baseline
from situational analysis and socio-economic
surveys
 Internal or self-evaluations- are conducted
by those responsible for implementing a
project/programme.
 External or independent evaluations- are
conducted by evaluator(s) outside of the
implementing team, lending it a degree of
objectivity and often technical expertise.
-These tend to focus on accountability.
 What is your opinion? Would you recommend
internal or external evaluation, and why?
 What is the basis of your arguments?
o Objectivity and bias
o Cost effectiveness
o External vs internal influence and control
o Relevance (to whom)
 Participatory evaluations are conducted
with the beneficiaries and other key
stakeholders, and can be empowering,
building their capacity, ownership and
support
 Joint evaluations are conducted
collaboratively by more than one
implementing partner to build consensus at
different levels, credibility and joint support
 Meta-evaluations are used to assess the
evaluation process itself.
 Thematic evaluations focus on one theme,
such as gender or environment, typically
across a number of projects, programmes or
the whole organization
 Cluster/sector evaluations focus on a set of
related activities, projects or programmes,
typically across sites and implemented by
multiple organizations
 Impact evaluations focus on the effect of a
project/programme, rather than on its
management and delivery
1. Strategic level:
 Looks at long-term objective of the
project,
 external environment and
 Resource allocation
2. Tactical (management)
 Efficiency and resource use
3. Operational level
 Implementation
 Time schedule and budget
 Track progress of the project.
 Identify problems/mistakes early, and
remedy or avoid them
 Improve project performance
 Identify opportunities for future project
implementation
 Help decision-making and generate
insights and learning on which strategies
and approaches are more or less
effective in different contexts and
circumstances
 Clarify performance, costs and time
relationships
 Track and take account of the changing
context, especially the attitudes of
stakeholders.
 Respond timely and adequately to the
project threats
 Provide information to stakeholders
 Ensure effective and efficient use of
resources.
 Ensure accountability to various
stakeholders.
 Keep focused on the broader picture and
long-term goals to ensure you are
working towards these, rather than just
focusing on activities
 Generate useful evidence and data to
help support and strengthen your project
 Learning and knowledge sharing
 Provide feedback to stakeholders
 Demonstrate to internal and external
stakeholders the impacts of the project
on socio-economic wellbeing
 Promote and celebrate our work
Of course, there are a lot of commonalities
between evaluation and research,
especially in systematic investigation.
 Fundamental differences
 Research generally is conducted to
produce knowledge that is generalizable
across different programs
 Research seeks to prove; evaluation seeks
to improve (Michael Patton, 2002)
 Evaluations are conducted to generate
findings that are intended for use by the
specific programs under which evaluations
are conducted.
 In research, the questions for investigation
are researcher-derived.
 Questions for evaluation, on the other
hand, are derived from the program itself or
its stakeholders
 In research, the role of a researcher is more
defined and clear.
 Research is conducted in a more controlled
environment as compared with evaluation
which may need to analyze the context in
which the project is being implemented
 Evaluators, on the other hand, may often
have role conflicts as they may be a part
of the program in which the evaluation is
being conducted.
 Results of research benefit a smaller
audience compared with
evaluation results
 Published vs often not published
 Although research and evaluation are
very similar and both use systematic
methods, their intent and purposes
differ.
 As Patton says, “Research seeks to prove,
evaluation seeks to improve…”
 Evaluation involves looking closely at the
operations of a program or program
initiative and, with the understanding
gained from this examination, making
recommendations for improving the
program.
 The intention of doing evaluation is to help
improve programs, not to blame,
criticize or seek to eliminate something.
 Can we compare and contrast the following?
 How are they related and what are their
differences? They certainly have different
roles in managing the project
 Project Monitoring and Evaluation
 Research
 Project supervision
 Project auditing
 Project tracking
 Quality control
 Quality and availability of M&E experts
 Perceptions from employers (seen as a waste of
resources and time-consuming)
 The demand for M&E is still low
 M&E training is still at an infancy stage.
 M&E seen as a surplus activity (something we
can do without)
 M&E experts are seen as auditors (bringing
trouble to project managers)
 Monitoring and evaluation is an ongoing
activity.
 It exists in all stages of the PCM
 Good M&E should start at the design and
planning stage of a programme or project.
 Of course, M&E alone cannot solve
weak strategies and project design.
 However, it is very hard to carry out M&E
without good planning
 Identify the key issues and root of the
problems you want to address (your
situation or problem analysis)
 Identify the key changes that you want
 Develop effective strategies to get what
you want
 Design ways to monitor their progress
 Determine what resources and knowledge
are required
 Ensure that your work is cost effective.
1. Initial Assessments- Initial needs
Assessments, stakeholder analysis
2. Planning- Project design(LFA), M&E plan,
Baseline study(project start)
3. Implementation- Process and Midterm
and end line(final evaluation)
4. Post-implementation- Impact study or
evaluation for either dissemination, use
of lessons, or possible longitudinal
evaluation/longitudinal studies.
 Like other stages of the project cycle, M&E
passes through the following stages
 Plan/decision
 Implementation
 Reflect/learn/decide/adjust
 Implement
 Monitor
 Reflect and adjust
 Implement
 Evaluate/learn/decide
What does this suggest?:
• M & E is an endless process
• It is not a one-time thing
• Planning for M & E should start at the very
beginning of the project
• There are different types, forms and objectives
for M & E
• M and E information may differ according to the
stage and objective.
 An evaluation plan typically includes descriptions of
the following:
 Purpose of the program (brief description of the project)
 The objectives of the evaluation
 Evaluation questions/issues
 Data collection plans
 Data analysis plans
 Dissemination and reporting activities
 Other evaluation products
 Timeline and budget
 Staff responsible for each evaluation activity
 What the project is going to achieve, from
the overall goal to specific objectives and
outputs
 What activities will be carried out to
achieve its outputs and purpose
 What resources (inputs) are required
 How the progress and ultimate success of
the project will be measured and verified
(including qualitative and quantitative
indicators)
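The elements listed above can be sketched as a simple data structure; every project detail below is a hypothetical placeholder for a made-up water-supply project, used only to show how the pieces fit together.

```python
# A minimal sketch of a logical framework as a nested structure.
# All project details below are hypothetical placeholders.

logframe = {
    "goal": "Improved household health in the target district",
    "objectives": ["Increase access to safe drinking water"],
    "outputs": ["20 boreholes drilled and functioning"],
    "activities": ["Site surveys", "Drilling", "Community training"],
    "inputs": ["Drilling rig", "Engineers", "Training budget"],
    "indicators": {
        # qualitative and quantitative measures of progress
        "output": "Number of functioning boreholes",
        "outcome": "% of households using a safe water source",
    },
}

# Quick consistency check: every level of the results chain is present.
required = ["goal", "objectives", "outputs", "activities", "inputs", "indicators"]
missing = [k for k in required if k not in logframe]
print("missing elements:", missing or "none")
```

Laying the framework out this way makes it easy to check that each output has an activity behind it and an indicator in front of it before implementation starts.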
 Understanding the system approach
 Different views of an M&E system
 Levels of an M&E system
 A good M&E system
 Components of an M&E system
 Comparing private and public sector
 Limitations/problems of M&E system in the
public sector.
 Systems approach
A system is a group of interconnected
and interrelated components that form a
whole (Senge, 1990)
 From a system thinking approach-
interrelated parts/elements.
 Example of systems: Ecological systems,
body systems, car systems and in our case
M&E system.
 What is common to all systems is that no
part/element will work independently.
 An M&E system has critical parts/
constituents/components which individually
would not function effectively unless they
are put together.
 In very simplified language, an M&E system is
meant to capture, manage, analyze, report,
use and share information.
 Narrow view:
 M&E as a series of data collection tools
designed to gather information and summarize
progress against a pre-defined set of
objectives and indicators.
 Wide scope:
 The M&E system covers elements such as
people, processes, baseline studies,
reporting, learning mechanisms and data
storage.
 International
 Country
 Project level
 More than a list of ‘indicators’!
 Clarity on who is going to use the information for
what kinds of decision making
 System is ‘usable’ for the level of decision
making
 The system is ‘operational’ – clear on who
is to collect and report what data by
when and to whom
 Sustainability- budget, human resources,
commitment and incentives to implement.
There are no standardized and universally
agreed components of the M&E system.
A well-functioning M&E system may
comprise the following components.
Components/elements of the M&E system
The status and Comparison of an M&E
system in Private and Public sector
Limitations/problems of M&E system in
Tanzania.
Good M&E system
M&E plan- role, components and features
 There might be no universally
accepted and standardized view of what
exactly constitutes an M&E system.
 There could also be differences in wording
and terminology for some of the agreed
components of the system.
 Mixed grill- components and
elements? They may have different
connotations and interpretations
 Some have gone further to suggest even
the number of elements/components
comprising an M&E system
 Many writers seem to suggest that there are
12 components/elements of a functional
M&E system
 This raises a practical question of
how they arrived at this figure.
 There are observations, for example, that
some of the components included in the
M&E system are also included in the M&E
plan.
 The fact that an M&E plan is one of the M&E
system elements makes it a bit difficult to
suggest the number and exact elements of
the system
But generally a functional M&E system may
require the following elements.
 Organizational structure with an M&E unit
which coordinates all M&E (funding, internal
organs or outsourcing)
 People- the human element (human capacity):
adequate staff with the necessary technical
know-how and experience. An M&E unit may
consist of an M&E manager, officers, a
statistician, IT and field officers
 M&E plan- A plan which shows the tactics
of how to operationalize the system
 Costed M&E work plan- a more detailed
plan to make the M&E plan operational, with
a budget for each activity.
 M&E framework/Logical Framework (at
project level) which outlines the
objectives, outputs, activities
and inputs. This may also help to
develop indicators and risk management
 Data base/Data storage - Tracking system
including how data is stored and retrieved at
different level
 Indicators and Targets
 Baseline information- to act as a yardstick
on assessing the progress towards success
 Supporting processes- including training
people, supervision, information flows
between different people, review of the
information, reporting of mistakes and
failures.
 Communication, advocacy and culture for
M&E- policies and strategies which promote
M&E continuity in an organization
 Mechanisms for data capturing and analysis at
different levels (surveys and surveillance)
 Participation: Who participates in the M&E
process, how and why
 Reporting system- who reports to whom
including different reports generated at
different level
 Data use, dissemination and learning: Use
the M&E information for decision making and
develop mechanisms for information sharing.
 M&E as a discipline and profession is still at an
infancy stage,
for reasons we discussed earlier
 Discussions with a few experts and
practitioners in this field seem to suggest
the following:
 Comparatively, M&E systems are more
visible in the private sector/NGOs than in the
public sector.
 More financial control, with a few other
elements of monitoring and evaluation.
 The nature of employment is more
performance based as compared to
indefinite employments and managerial
post acquisition in the government
sector.
 Yet the M&E system is still not
effective, as it lacks many important elements.
 Government officials are not ready to
be measured.
 The poor functioning of OPRAS can serve as a
good case to illustrate this argument.
 OPRAS in many cases has been frustrated,
and it seems to be a bit impractical.
 The government was trying to introduce
OPEN PERFORMANCE MANAGEMENT-
one’s salary could be raised based on one’s
performance.
 Many filled in these forms because it could
raise their salaries.
 Lack of capacity and political will- very
little budget for capacity development unless
there is donor funding.
 It is only recently that the government has
required every ministry to have M&E units
within the Planning and Policy departments.
The majority of staff are economists, probably
with very little or no expertise in monitoring
and evaluation
 )
 Lack of motivation- no resources, or resources
not coming on time, therefore no need
to measure (no enabling environment)
 Long processes of getting resources:
funds are not reliable, so it is difficult to
implement and measure according to
plans/less control of the process
 So many contingencies, unplanned events,
political interference and competing
priorities (dengue and other disasters
which may need to be attended to)
 Laxity- for several reasons, including a
corrupt environment and lack of a
performance-based culture.
 Many M&E systems are donor-forced
 Lack of tools in many daily operations
 Lack of common tools for assessments-
causes different understandings of the
same results
 Data-related problems- accessibility,
quality and cleaning
 Technological factors – Data base
 It is a tool and strategy for operationalizing
the M&E system.
 While the M&E system is normally
instituted at organizational level, an M&E
plan normally works at project level.
 Used as a guide to what you should
evaluate, what information you need and
who you are evaluating for
 The plan outlines the key evaluation
questions
 Depending on its detail, the evaluation
plan can be useful in identifying the people
responsible for different evaluation tasks
at different stages of the project
 The plan needs to be well articulated so
that it can be implemented by anybody at
any time.
 Normally prepared at the beginning of the
project
 To allow the project staff to plan ahead
of time
 Information needs: What, who and why
 Information source: Methods, frequency,
location of data to be collected
 Responsibility for MER- who is
responsible
 What indicators should be used to
measure and monitor each stage of the
project
 How should the information be collected
 How to measure quality and effectiveness
 How and when to communicate the findings
i) Identify your evaluation audience: who
are you evaluating for and for what
purpose?
ii) Define the evaluation questions
-Process- how well was the project
designed and implemented?
-Outputs- expected goods and services
-Outcome- to what extent did it meet the
overall needs; how valuable are the
outcomes?
-Learning- what worked well and what did
not?
iii) Identify the monitoring questions (more
specific questions derived from the
evaluation questions)
iv) Identify the indicators and data sources
- What information (indicators) do you need
to answer the monitoring and evaluation
questions, and where do you get this
information (data sources)?
v) Identify who is responsible for data
collection, and the timeline
vi) Identify who will evaluate the data and
how it will be reported
- Optional, although highly recommended
vii) Review the M&E plan
- Highlight the data sources
- Reorder the plan in several ways,
including by data sources, data collection
timeline and framework
- Select/prioritize some questions for
budgeting purposes
- Reduce collection of unused information
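Steps i)–vii) above can be sketched as a minimal plan structure: each evaluation question expands into a monitoring question tied to an indicator, a data source, a responsible person and a timeline. All questions, indicators, sources and roles below are hypothetical examples, not taken from the slides.

```python
# Sketch of an M&E plan as rows linking evaluation questions to
# monitoring questions, indicators, data sources, responsibility and
# timeline. Every entry is a hypothetical example.

plan = [
    {
        "evaluation_question": "Did outputs lead to the intended outcomes?",
        "monitoring_question": "Are trained farmers applying the new methods?",
        "indicator": "% of trained farmers using at least one method",
        "data_source": "Quarterly field-visit checklist",
        "responsible": "Field officer",
        "timeline": "Quarterly",
    },
    {
        "evaluation_question": "Was the project implemented as designed?",
        "monitoring_question": "Are activities on schedule and within budget?",
        "indicator": "% of planned activities completed on time",
        "data_source": "Progress reports and workplans",
        "responsible": "M&E officer",
        "timeline": "Monthly",
    },
]

# Step vii) review: list the distinct data sources to spot gaps or
# unused information before budgeting.
sources = sorted({row["data_source"] for row in plan})
for s in sources:
    print(s)
```

Keeping the plan in tabular form like this makes the review step mechanical: any row without a data source or responsible person is immediately visible.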
 User friendly
 Implementable
 Monitor the use of the project outputs
 Monitor the effectiveness of the project
 Monitor the production/process of the
project outputs
 Assess the project inputs
 Assess the effectiveness and relevance of
the project outputs
 Assess the extent to which the observable
impacts can be attributed to the project
• Appreciate and demonstrate how logic
models and theories can guide
evaluation and project design.
• Discuss the relevance of at least two
common logic models (Project
Conceptual Models and the Logical
Framework) and the Theory of Change.
• Understand the major variables and
components of these models/theories.
• Understand the differences and
complementarities between logic
models and theories of change.
 A logic is a set of rules or relationships that
govern behaviour.
 It explains the relationships between elements
and between an element and the whole.
 The term logic model is used as a
generic label for ways of displaying an
intervention and how that intervention will
bring about the desired change.
 Like a road map, a logic model shows the route
travelled (or steps taken) to reach a certain
destination
 They are narrative or graphical
depictions of real-life processes that
communicate the underlying assumptions
upon which an activity is expected to lead
to a specific result.
 In program or project management, logic
models demonstrate how an
intervention (a project, a program, a
policy, a strategy) is likely to contribute
to possible or actual impacts.
 They describe linkages among program
resources, activities, outputs, and audiences,
and highlight different orders of outcomes
related to a specific problem or situation.
 They express the thinking behind an initiative's
plan.
 They also explain why the program ought to
work, and why it can succeed where other
attempts have failed.
 They normally provide direction and clarity
by presenting the big picture of change along
with certain important milestones.
 Initially used mainly for program evaluation,
but increasingly used for project planning
and implementation.
 Used during different stages of the program,
for both new and existing programs.
 During planning, a logic model can help
to clarify program strategy, develop
M&E systems, develop evidence for
measuring change (indicators), identify
appropriate outcome targets, and set priorities
for allocating resources and timelines.
During implementation:
 They help to describe, modify or enhance
the program
 Provide an inventory of what you have and
what you need to operate the program or
initiative, develop a management plan
 Make mid-course adjustments
During evaluation to:
 Reveal information needs and provide a
framework for interpreting results
 Document accomplishments
 Provide evidence about the program
 Identify differences between the ideal
program and its real operation
 Frame questions about attribution (of cause
and effect) and contribution (of initiative
components to the outcomes)
 Tell the story of the program or initiative
 Program logic/ logic model/
intervention logic
 Program theory/theory of
change/model of change,
 Causal model/results chain/ causal
chain/chain of causation
 Road map/ conceptual map/
pathways map,
 Mental model, blueprint for change,
framework for action/ program
framework
 Program hypothesis/ theoretical
Monitoring and evaluation principles and theories
  • 8. • Monitoring is the regular and ongoing collection and analysis of information on the progress of the project and what difference the project is making. OR • Monitoring is the routine collection and analysis of information to track progress against set plans and check compliance to established standards. • It helps the project team to keep focused and energised and to ensure that they are on track in achieving their objectives.
  • 9.  It reminds project officials whether they are carrying out their chosen strategies and actions effectively.  It identifies areas that need to be adapted or changed.  It helps identify trends and patterns, adapt strategies and inform decisions for project/programme management.
  • 10. Questions (present continuous tense)  Are the inputs (finance, materials and personnel) available in the right amounts and at the right time?  Are activities leading to the expected outputs?  Are they implemented as proposed?  Are there factors which stall the progress?  Are outputs leading to the outcomes?
  • 11.  How do beneficiaries feel about the work?  What is causing delays or unexpected results?  Are we doing the right thing for the right beneficiaries (approach and methodology)?  Is there anything happening that should lead management to modify the operation’s implementation plan?  Are activities being implemented on schedule and within budget?
  • 12.  Results monitoring  Tracks effects and impacts.  This is where monitoring merges with evaluation to determine if the project/programme is on target towards its intended results (outputs, outcomes, impact)  Whether there may be any unintended impact (positive or negative).
  • 13.  Process (activity) monitoring Tracks the use of inputs and resources, the progress of activities and the delivery of outputs. • It examines how activities are delivered – the efficiency in time and resources. • It is often conducted in conjunction with compliance monitoring and feeds into the evaluation of impact.
  • 14.  For example, a water and sanitation project may monitor that targeted households receive septic systems according to schedule.  Compliance monitoring - ensures compliance with donor regulations and expected results, grant and contract requirements, local governmental regulations and laws, and ethical standards. For example, a shelter project may monitor that shelters adhere to agreed national and international safety standards.
  • 15.  Context (situation) monitoring Tracks the setting in which the project/programme operates, especially as it affects identified risks and assumptions, but also any unexpected considerations that may arise. • It includes the field as well as the larger political, institutional, funding, and policy context that affect the project/programme.
  • 16.  For example, a project in a conflict-prone area may monitor potential fighting that could not only affect project success but endanger project staff and volunteers.  Beneficiary monitoring- tracks beneficiary perceptions of a project/programme. It includes beneficiary satisfaction or complaints with the project/programme, including their participation, treatment, access to resources and their overall experience of change.
  • 17.  Financial monitoring - accounts for costs by input and activity within predefined categories of expenditure. It is often conducted in conjunction with compliance and process monitoring.  Organizational monitoring - tracks the sustainability, institutional development and capacity building in the project/programme and with its partners.
  • 18.  It is often done in conjunction with the monitoring processes of the larger, implementing organization
  • 19.  Systematic collection and analysis  Good design - guided by a clear purpose, the methodology, the type of information needed and the choice of indicators  Monitoring needs to be timely, so information can be readily used to inform project/programme implementation.  Focus on the results - not monitoring for its own sake; look at whether the project is yielding the intended results.
  • 20.  Regular visits  Aim to inform decisions and improve project performance (relevant and useful)  Assess the relevance and performance  Participation of key stakeholders to increase ownership and usefulness of the information
  • 21. 1. Progress reports - Obtain and analyze project documents to get information on progress. 2. Workplans - The extent to which they have been implemented and how they reflect the project objectives and activities. 3. Participation (stakeholder meetings) - Getting feedback from partners and beneficiaries on the project's progress. 4. Field visits - Validation, triangulation and first-hand information on the conditions and trends of change following the introduced intervention; they supplement day-to-day monitoring done by administrators. 5. Indicators and data collection instruments, e.g. questionnaires and checklists
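Method 5 above, indicators tracked with data collection instruments, is typically operationalised by comparing collected values against targets. A hedged sketch in Python, with the indicator names and the on-track threshold invented for illustration:

```python
def progress_status(actual, target, on_track=0.9):
    """Classify an indicator's progress against its target:
    'on track' if at least 90% of target (an assumed threshold),
    'behind' otherwise."""
    if target == 0:
        raise ValueError("target must be non-zero")
    return "on track" if actual / target >= on_track else "behind"

# Hypothetical monitoring data: indicator -> (actual, target)
data = {
    "households reached": (450, 500),
    "staff trained": (12, 30),
}

# A simple status report, the kind a progress report or field
# visit would be built around.
report = {name: progress_status(a, t) for name, (a, t) in data.items()}
print(report)
```

The threshold is a management choice, not a rule; the value of the exercise is that lagging indicators surface early enough for mid-course adjustment.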
  • 22.  An internal activity normally done by project staff for internal use  Is an essential part of good day-to-day management practice  Is concerned with verifying that project activities are being undertaken, services are being delivered, and the project is leading to the desired behavior changes described in the project proposal  Focuses more on inputs, activities and outputs
  • 23.  Externally oriented, normally done by external consultants and experts  Is an essential activity in a longer-term, dynamic learning process  Focuses more on in-depth information, outcomes and impacts  Normally challenges the design  Periodic, and uses more of the past tense  Learning and sharing of information with other stakeholders  Improves and informs future projects/plans (post-mortem)
  • 24.  Relies on more detailed data (from surveys or studies), in addition to that collected through the monitoring system, to understand the project in greater depth.  Assesses higher-level outcomes and impact, and may verify some of the findings from the monitoring.  It explores both anticipated and unanticipated results.
  • 25.  Evaluation reports/studies may form a foundation, a starting point (baseline information), for monitoring changes immediately after project implementation  Through the results of periodic evaluations, monitoring tools and strategies can be refined and improved further.
  • 26.  M & E are two different management tools that are closely related, interactive and mutually supportive  Overlaps- Process based evaluation is equated to Monitoring  Through routine tracking of project progress, monitoring can provide quantitative and qualitative data useful for designing and implementing project evaluation exercises
  • 27.  Evaluation is an assessment, as systematic and objective as possible, of an ongoing or completed project, programme or policy, its design, implementation and results  While monitoring aims at tracking changes in program results over time, evaluation seeks to understand specifically why these changes occur  The aim is to determine the impact, relevance, efficiency, effectiveness, perceptions of the beneficiaries, participation and sustainability of the project activities and outcomes
  • 28.  It therefore provides information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process by the key stakeholders
  • 29.  Evaluations involve identifying and reflecting upon the effects of what has been done, and judging their worth.  Their findings allow project/programme managers, beneficiaries, partners, donors and other project/programme stakeholders to learn from the experience and improve future interventions.
  • 30. Evaluation is closely related to monitoring but includes taking a more in-depth look at the outcomes or impact of a piece of work and the extent to which the stated objectives have been achieved at a particular point in time (e.g. at the mid-point in a project and after its completion).
  • 31. Evaluation helps the project team to assess:  whether they are working on the right issue,  whether their selected goals and objectives, strategies and underlying assumptions are appropriate, and  how efficient the project work has been and whether they have used the project resources wisely (Chapman, 2006).
  • 32.  Learning from the project's failures and successes can provide valuable insights into what works and what does not  M&E should also include analysing how changes in the external context might have influenced the project
  • 33.  Like any research activity, evaluations should be guided by the following principles/ethics  Objectivity- Evaluation should reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated  Systematic enquiry, analysis and reporting  Usefulness (no evaluation is good unless its results are used)  Cost-effectiveness- efficient use of resources; looks at the availability of relevant and focused information  Respectfulness- Respect the culture of the respondents  Consideration of the respondents: Develop interest in their responses and views: Don’t
  • 34.  Garbage in, garbage out - Explain the purpose and the procedures you have followed (methodology)  Ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved as well as those affected by its results  Collective responsibility  Protect your respondents (anonymity)  Ensure that an evaluation will be realistic and practical
  • 35.  Evaluation responds to the concerns of one interest group more than another  Bias in the selection of respondents  Respondents are influenced by politics or personal differences (lacks objectivity)  The evaluator conducts an evaluation when he/she lacks sufficient skills or experience
  • 36.  Impact  What changes did the project bring about?  Were there any unplanned or unintended changes?  Effectiveness  Were the operation’s objectives achieved?  Did the outputs lead to the intended outcomes?
  • 37.  Effectiveness: Goal attainment. The extent to which the project has been able to attain the intended objectives (percentage of population reached, number of, say, terraces constructed)  There is no universally accepted standard for the percentage, but there are views that it should be at least 70%. The remainder can be explained by possible externalities (political, physical and other factors beyond the control of the project management)  We normally don’t expect such a perfect project model. There are, however, several cases where projects have over-performed
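The goal-attainment arithmetic on this slide can be sketched in a few lines of code. This is only an illustration: the 70% cut-off comes from the slide, while the function names and the 85-of-100-terraces example are hypothetical.

```python
def goal_attainment(achieved, target):
    """Return the percentage of the planned target that was actually achieved."""
    if target <= 0:
        raise ValueError("target must be positive")
    return 100.0 * achieved / target

def is_effective(achieved, target, threshold=70.0):
    """Judge effectiveness against a threshold; 70% is one suggested cut-off."""
    return goal_attainment(achieved, target) >= threshold

# Example: 85 of a planned 100 terraces were constructed
print(goal_attainment(85, 100))  # 85.0
print(is_effective(85, 100))     # True
```

Note that values over 100% are possible, which matches the slide's remark that some projects over-perform.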
  • 38.  Efficiency  Were stocks of items available on time and in the right quantities and quality?  Were activities implemented on schedule and within budget?  Were outputs delivered economically?  Feasibility and practicability:  Environmental/physical factors, socio-cultural and economic factors, technology, resources required and operational costs
  • 39.  Relevance and legitimacy  Were the operation’s objectives consistent with beneficiaries’ needs and organizational policies?  Participation- tells a lot about the extent of benefits, ownership, relevancy and acceptability  The level and extent of community participation  The stage of participation and the nature of people involved  Sustainability:  Are the benefits likely to be maintained for an extended period after assistance ends?  By local people, local organizations, capacity building, participation, relevancy, practicability, alternative funding
  • 40.  Policy issues: context, policy implications and environment  Leadership and organizations: institutional and local capacity, qualified personnel  Financial and economic: enough money, the economic status of the beneficiaries  Technological factors: at the time of project implementation and during evaluations  Socio-cultural factors: did cultural factors inform the project formulation and implementation?
  • 41. 1. Timing of the evaluation (stage in the PCM)  Ex-ante evaluation- conducted before project implementation. It includes assessments of the project proposal as part of project appraisal through EIA, CBA, SEA and CV  Formative evaluations (process/activity)- occur during project/programme implementation to improve performance and assess compliance (observe schedule, target group, budget and deviations)  Midterm evaluations are formative in purpose and occur midway through implementation  Summative evaluations- occur at the end of project/programme implementation to assess effectiveness and impact
  • 42.  Final evaluations are summative in purpose and are conducted (often externally) at the completion of project/programme implementation to assess how well the project/programme achieved its intended objectives  Ex-post evaluations are conducted some time after implementation to assess long-term impact and sustainability
  • 43. • Can we include the baseline survey in this category? • Baseline information sets the foundation of both monitoring and evaluation • It informs the evaluation team of the situation and conditions of the population and geographical areas prior to the intervention being evaluated • We should therefore distinguish a baseline from a situational analysis and a socio-economic survey
  • 44.  Internal or self-evaluations- are conducted by those responsible for implementing a project/programme  External or independent evaluations- are conducted by evaluator(s) outside of the implementing team, lending a degree of objectivity and often technical expertise. These tend to focus on accountability
  • 45.  What is your opinion? Would you recommend an internal or an external evaluation, and why?  What is the basis of your arguments: o Objectivity and bias o Cost-effectiveness o External vs internal influence and control o Relevancy (to whom)
  • 46.  Participatory evaluations are conducted with the beneficiaries and other key stakeholders, and can be empowering, building their capacity, ownership and support  Joint evaluations are conducted collaboratively by more than one implementing partner to build consensus at different levels, credibility and joint support
  • 47.  Meta-evaluations are used to assess the evaluation process itself  Thematic evaluations focus on one theme, such as gender or environment, typically across a number of projects, programmes or the whole organization  Cluster/sector evaluations focus on a set of related activities, projects or programmes, typically across sites and implemented by multiple organizations
  • 48.  Impact evaluations focus on the effect of a project/programme, rather than on its management and delivery
  • 49. 1. Strategic level:  Looks at the long-term objectives of the project,  the external environment and  resource allocation 2. Tactical (management)  Efficiency and resource use 3. Operational level  Implementation  Time schedule and budget
  • 50.  Track progress of the project.  Identify problems/mistakes earlier and remedy and avoid them  Improve project performance  Identify opportunities for future project implementation  Help decision-making and generate insights and learning on which strategies and approaches are more or less effective in different contexts and circumstances
  • 51.  Clarify performance, costs and time relationships  Track and take account of the changing context, especially the attitudes of stakeholders.  Respond timely and adequately to the project threats  Provide information to stakeholders
  • 52.  Ensure effective and efficient use of resources.  Ensure accountability to various stakeholders.  Keep focused on the broader picture and long-term goals to ensure you are working towards these, rather than just focusing on activities  Generate useful evidence and data to help support and strengthen your project
  • 53.  Learning and knowledge sharing  Provide feedback to stakeholders  Demonstrate to internal and external stakeholders the impacts of the project on socio-economic wellbeing  Promote and celebrate our work
  • 54. Of course, there are a lot of commonalities between evaluation and research, especially in systematic investigation.  Fundamental differences  Research generally is conducted to produce knowledge that is generalizable across different programs  Research seeks to prove; evaluation seeks to improve (Michael Patton, 2002)
  • 55.  Evaluations are conducted to generate findings that are intended for use by the specific programs under which the evaluations are conducted  In research, the questions for investigation are researcher-derived  Questions for evaluation, on the other hand, are derived from the program itself or its stakeholders  In research, the role of a researcher is more defined and clear  Research is conducted in a more controlled environment as compared with evaluation, which may need to analyze the context in which the project is being implemented
  • 56.  Evaluators, on the other hand, may often have role conflicts as they may be a part of the program in which the evaluation is being conducted  Evaluation results benefit a smaller, program-specific audience compared with research results  Research is usually published; evaluations often are not
  • 57.  Although research and evaluation are very similar and both use systematic methods, their intent and purposes differ  As Patton says, “Research seeks to prove, evaluation seeks to improve…”  Evaluation involves looking closely at the operations of programs or program initiatives and, with the understanding gained from this examination, making recommendations for improving the program  The intention of doing evaluation is to help improve programs, not to blame, criticize or seek to eliminate something
  • 58.  Can we compare and contrast the following:  How are they related and what are their differences? They certainly have different roles in managing the project  Project monitoring and evaluation  Research  Project supervision  Project auditing  Project tracking  Quality control
  • 59.  Quality and availability of M&E experts  Perceptions from employers (seen as a waste of resources and time-consuming)  The demand for M&E is still low  M&E training is still at an infancy stage  M&E as a surplus activity (something we can do without)  M&E experts are being seen as auditors (bringing trouble to project managers)
  • 60.  Monitoring and evaluation is an ongoing activity  It exists in all stages of the PCM  Good M&E should start at the design and planning stage of a programme or project  Of course, M&E alone cannot fix weak strategies and project design  However, it is very hard to carry out M&E without good planning
  • 61.  Identify the key issues and root of the problems you want to address (your situation or problem analysis)  Identify the key changes that you want  Develop effective strategies to get what you want  Design ways to monitor their progress  Determine what resources and knowledge are required  Ensure that your work is cost effective.
  • 62. 1. Initial assessments- initial needs assessments, stakeholder analysis 2. Planning- project design (LFA), M&E plan, baseline study (project start) 3. Implementation- process, midterm and end-line (final) evaluation 4. Post-implementation- impact study or evaluation for dissemination, use of lessons and possible longitudinal studies.
  • 64.  Like other stages of the project cycle, M&E passes through the following stages  Plan/decision  Implementation  Reflect/learn/decide/adjust  Implement  Monitor  Reflect and adjust  Implement  Evaluate/learn/decide
  • 65. What does this suggest?: • M&E is an endless process • It is not a one-time thing • Planning for M&E should start at the very beginning of the project • There are different types, forms and objectives of M&E • M&E information may differ according to the stage and objective.
  • 66.  An evaluation plan typically includes descriptions of the following:  Purpose of the program (brief description of the project)  The evaluation objectives  Evaluation questions/issues  Data collection plans  Data analysis plans  Dissemination and reporting activities  Other evaluation products  Timeline and budget  Staff responsible for each evaluation activity
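As one hedged illustration, the components listed above could be captured in a simple structure and checked for completeness before the plan is approved. The field names and sample values are assumptions for the sketch, not a standard schema.

```python
# Hypothetical evaluation plan with one entry per component from the slide
evaluation_plan = {
    "program_purpose": "Brief description of the project",
    "evaluation_objectives": ["Assess effectiveness", "Assess relevance"],
    "evaluation_questions": ["Were the operation's objectives achieved?"],
    "data_collection": {"methods": ["survey", "field visits"], "frequency": "quarterly"},
    "data_analysis": "Descriptive statistics and thematic analysis",
    "dissemination_and_reporting": ["Stakeholder workshop", "Final report"],
    "other_products": [],
    "timeline_and_budget": {"months": 12, "budget": None},  # still to be costed
    "responsible_staff": {"data_collection": "M&E officer"},
}

# Simple completeness check: every listed component must be present
required = ["program_purpose", "evaluation_objectives", "evaluation_questions",
            "data_collection", "data_analysis", "dissemination_and_reporting",
            "other_products", "timeline_and_budget", "responsible_staff"]
missing = [key for key in required if key not in evaluation_plan]
print(missing)  # []
```

A check like this is trivial, but it mirrors how M&E software or a review checklist would flag incomplete plans.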
  • 67.  What the project is going to achieve, from the overall goal to specific objectives and outputs  What activities will be carried out to achieve its outputs and purpose  What resources (inputs) are required  How the progress and ultimate success of the project will be measured and verified (including qualitative and quantitative indicators)
  • 69.  Understanding the systems approach  Different views of an M&E system  Levels of an M&E system  A good M&E system  Components of an M&E system  Comparing the private and public sector  Limitations/problems of the M&E system in the public sector
  • 70.  Systems approach: A system is a group of interconnected and interrelated components forming a whole (Senge, 1990)  From a systems-thinking approach- interrelated parts/elements  Examples of systems: ecological systems, body systems, car systems and, in our case, the M&E system
  • 71.  What is common to all systems is that no part/element will work independently  An M&E system has critical parts/constituents/components which individually would not function effectively unless they are put together  In very simplified language, an M&E system is meant to capture, manage, analyze, report, use and share information
  • 72.  Narrow view:  M&E as a series of data collection tools designed to gather information and summarize progress against a pre-defined set of objectives and indicators  Wide scope:  The M&E system covers elements such as people, processes, baseline studies, reporting, learning mechanisms and data storage
  • 74.  More than a list of ‘indicators’!  Clarity on who is going to use the information for what kinds of decision making  The system is ‘usable’ for the level of decision making  The system is ‘operational’ – clear on who is to collect and report what data, by when and to whom  Sustainability- budget, human resources, commitment and incentives to implement
  • 75. There are no standardized and universally agreed components of an M&E system. A well-functioning M&E system may comprise the following components:  Components/elements of the M&E system  The status and comparison of an M&E system in the private and public sector  Limitations/problems of the M&E system in Tanzania  A good M&E system  The M&E plan- role, components and features
  • 76.  There might be no universally accepted and standardized definition of what exactly constitutes an M&E system  There can also be differences in the wording and terminology of some of the agreed components of the system  Mixed grill- components and elements? They may have different connotations and interpretations  Some have gone further to suggest even the number of elements/components comprising an M&E system
  • 77.  Many writers seem to suggest that there are 12 components/elements of a functional M&E system  This raises a practical question of how they arrived at this figure  There are observations, for example, that some of the components included in the M&E system are also included in the M&E plan  The fact that an M&E plan is one of the M&E system elements makes it a bit difficult to pin down the number and exact elements of the system
  • 79. But generally a functional M&E system may require the following elements.  Organizational structure with an M&E unit which coordinates all M&E work, whether done by internal organs or through outsourcing  People- the human element (human capacity): adequate staff with the necessary technical know-how and experience. An M&E unit may consist of an M&E manager, officers, a statistician, IT and field officers?
  • 80.  M&E plan- a plan which shows the tactics of how to operationalize the system  Costed M&E work plan- a more detailed plan to make the M&E plan operational, with a budget for each activity  M&E framework/logical framework (at project level), which outlines the objectives, outcomes, outputs, activities and inputs. This may also help to develop indicators and risk management
  • 81.  Database/data storage - tracking system, including how data is stored and retrieved at different levels  Indicators and targets  Baseline information- to act as a yardstick in assessing progress towards success
  • 82.  Supporting processes- including training people, supervision, information flows between different people, review of the information, and reporting of mistakes and failures  Communication, advocacy and a culture for M&E- policies and strategies which promote M&E continuity in an organization
  • 83.  Mechanisms for data capturing and analysis at different levels (survey and surveillance)  Participation: who participates in the M&E process, how and why  Reporting system- who reports to whom, including the different reports generated at different levels  Data use, dissemination and learning: use the M&E information for decision making and develop mechanisms for information sharing
  • 84.  M&E as a discipline and profession is still at an infancy stage  For reasons we discussed earlier  Discussions with a few experts and practitioners in this field seem to suggest the following:
  • 85.  Comparatively, M&E systems are more visible in the private/NGO sector than in the public sector  More financial control, with a few other elements of monitoring and evaluation  The nature of employment is more performance-based as compared to the indefinite employment and managerial post acquisition in the government sector  Yet still the M&E system is not effective as it lacks a lot of important elements
  • 86.  Government officials are not ready to be measured  The poor functioning of OPRAS can serve as a good case to illustrate this argument  OPRAS has in many cases been frustrated, and it seems to be a bit impractical  The government was trying to introduce OPEN PERFORMANCE MANAGEMENT- one’s salary could be raised based on one’s performance  Many filled in these forms only because doing so could raise their salaries
  • 87.  Lack of capacity and political will- very little budget for capacity development unless there is donor funding  It is only recently that the government has demanded every ministry to have an M&E unit within the planning and policy departments. The majority of staff are economists, probably with very little or no expertise in monitoring and evaluation
  • 88.  Lack of motivation- no resources, or resources not coming on time, therefore no need to measure (no enabling environment)  Long processes of getting resources: funds are not reliable, so it is difficult to implement and measure according to plans/less control of the process  So many contingencies/unplanned events/political interference and competing priorities (dengue and other disasters which may need to be attended to using the
  • 89.  Laxity – for several reasons, including a corrupt environment and a lack of performance-based culture  Many M&E systems are donor-forced  Lack of tools in many daily operations  Lack of common tools for assessments- causes different understandings of the same results  Data-related problems- accessibility, quality and cleaning  Technological factors – databases
  • 90.  It is a tool and strategy for operationalizing the M&E system  While the M&E system is normally instituted at the organizational level, an M&E plan normally works at the project level  Used as a guide to what you should evaluate, what information you need and who you are evaluating for  The plan outlines the key evaluation questions
  • 91.  Depending on its detail, the evaluation plan can be useful in identifying the people responsible for different evaluation tasks at different stages of the project  The plan needs to be well articulated so that it can be implemented by anybody at any time  Normally prepared at the beginning of the project  To allow the project staff to plan ahead of time
  • 92.  Information needs: what, who and why  Information source: methods, frequency, location of data to be collected  Responsibility for MER- who is responsible  What indicators should be used to measure and monitor each stage of the project  How the information should be collected  How to measure quality and effectiveness  How and when to communicate the
  • 93. i) Identify your evaluation audience: who are you evaluating for, and for what purpose? ii) Define the evaluation questions -Process- how well was the project designed and implemented -Outputs- expected goods and services -Outcome- to what extent did it meet the overall needs; how valuable are the outcomes -Learning- what worked well and what did not
  • 94. iii) Identify the monitoring questions (more specific questions deduced from the evaluation questions) iv) Identify the indicators and data sources - What information (indicators) do you need to answer the monitoring and evaluation questions, and where do you get this information (data sources)? v) Identify who is responsible for data collection and the timeline
  • 95. vi) Identify who will evaluate the data and how it will be reported - Optional, although highly recommended vii) Review the M&E plan - Highlight the data sources - Reorder the plan in several ways, including by data sources, data collection timeline and framework - Select/prioritize some questions for budgeting purposes - Reduce collection of unused information
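The steps above (questions, indicators, data sources, responsibilities, review) amount to filling in a plan matrix. A minimal sketch of such a matrix follows; the column names and all row values are hypothetical, not a prescribed format.

```python
# Each row links an evaluation question to a monitoring question,
# an indicator, a data source, a responsible person and a timeline
mne_plan = [
    {"evaluation_question": "Were outputs delivered as designed?",
     "monitoring_question": "How many terraces were constructed each month?",
     "indicator": "Number of terraces constructed",
     "data_source": "Progress reports",
     "responsible": "Field officer",
     "timeline": "Monthly"},
    {"evaluation_question": "Did the project meet beneficiary needs?",
     "monitoring_question": "How satisfied are beneficiaries with the service?",
     "indicator": "Beneficiary satisfaction score",
     "data_source": "Stakeholder meetings",
     "responsible": None,  # gap to be flagged during the step vii review
     "timeline": "Quarterly"},
]

# Step vii review: highlight the data sources and find unassigned tasks
data_sources = sorted({row["data_source"] for row in mne_plan})
unassigned = [row["indicator"] for row in mne_plan if not row["responsible"]]
print(data_sources)  # ['Progress reports', 'Stakeholder meetings']
print(unassigned)    # ['Beneficiary satisfaction score']
```

Even a spreadsheet version of this matrix supports the same review: grouping by data source to cut duplicate collection, and spotting rows with no responsible person or timeline.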
  • 96.  User friendly  Implementable  Monitors the use of the project outputs  Monitors the effectiveness of the project  Monitors the production/process of the project outputs  Assesses the project inputs  Assesses the effectiveness and relevancy of the project outputs  Assesses the extent to which the observable impacts can be attributed to the project
  • 98. • Appreciate and demonstrate how logic models and theories can guide evaluation and project design. • Discuss the relevancy of at least two common logic models (the project conceptual model and the logical framework) and the theory of change. • Understand the major variables and components of these models/theories. • Understand the differences and complementarities between logic models and the theory of change.
  • 99.  A logic is a set of rules or relationships that govern behaviour  It explains the relationship between elements and between an element and the whole  The term logic model is used as a generic label to depict ways of displaying an intervention and how such an intervention will bring the desired change  Like a road map, a logic model shows the route travelled (or steps taken) to reach a certain destination
  • 100.  They are narrative or graphical depictions of processes in real life that communicate the underlying assumptions upon which an activity is expected to lead to a specific result  In program or project management, logic models demonstrate how an intervention (a project, a program, a policy, a strategy) is likely to contribute to possible or actual impacts
  • 101.  They describe linkages among program resources, activities, outputs and audiences, and highlight different orders of outcomes related to a specific problem or situation  They express the thinking behind an initiative's plan  They also explain why the program ought to work, and why it can succeed where other attempts have failed
  • 102.  They normally provide direction and clarity by presenting the big picture of change along with certain important milestones  Initially used mainly for program evaluation, but increasingly used for project planning and implementation  Used during different stages of the program, for new and existing programs  During planning, a logic model can help to clarify program strategy, develop M&E systems, develop evidence for measuring change (indicators), identify appropriate outcome targets, and set priorities for allocating resources and timelines
  • 103. During implementation:  They help to describe, modify or enhance the program  Provide an inventory of what you have and what you need to operate the program or initiative, develop a management plan  Make mid-course adjustments
  • 104. During evaluation to:  Reveal information needs and provide a framework for interpreting results  Document accomplishments  Provide evidence about the program  Identify differences between the ideal program and its real operation  Frame questions about attribution (of cause and effect) and contribution (of initiative components to the outcomes)  Tell the story of the program or initiative
  • 105.  Program logic/ logic model/ intervention logic  Program theory/theory of change/model of change,  Causal model/results chain/ causal chain/chain of causation  Road map/ conceptual map/ pathways map,  Mental model, blueprint for change, framework for action/ program framework  Program hypothesis/ theoretical