Importance of Monitoring and Evaluation
Lecture Overview 
 Monitoring and Evaluation 
 How to Build an M&E System
What is M, what is E, why and how to monitor 
MONITORING & EVALUATION
What is Monitoring 
 Ongoing process that generates information to inform decisions 
about the program while it is being implemented. 
 Routine collection and analysis of information to track progress 
against set plans and check compliance with established standards 
 Helps identify trends and patterns, adapt strategies, and inform 
decisions 
 Key words: 
• Continuous – ongoing, frequent in nature 
• Collecting and analyzing information – to measure progress 
towards goals 
• Comparing results – assessing the performance of a 
program/project
Why is Monitoring Important? 
 Evidence of how much has been or has NOT been achieved 
• Quantitative: numbers, percentage 
• Qualitative: narrative or observation 
 Examination of trends 
 Highlight problems 
 Early warning signs 
 Corrective actions 
 Evaluate effectiveness of management action 
 Determine achievement of results
What is Evaluation 
 Evaluation is an assessment of an intervention to determine its 
relevance, efficiency, effectiveness, impact, and sustainability. The 
intention is to provide information that is credible and useful, 
enabling incorporation of lessons learned into decision making 
processes. 
 Key Words: 
• Assessment – of the value of an event or action 
• Relevance 
• Efficiency 
• Effectiveness 
• Impact 
• Sustainability 
• Lessons learned
What is Evaluation? 
Evaluation comprises Program Evaluation and Impact Evaluation.
What is M and what is E? 
Monitoring 
• Measures progress towards goals, but doesn't tell us the extent to 
which results are achieved, or the impact 
• Continuous, frequent 
• Has to take place during the intervention 
Evaluation 
• Measures whether progress towards the goal is caused by the 
intervention – causality 
• Infrequent, time bound 
• Can evaluate an ongoing or a completed intervention
Monitoring and Evaluation 
M&E combines Monitoring with Evaluation (Program Evaluation and 
Impact Evaluation).
Components of Program Evaluation 
 Needs assessment: What are the characteristics of the target 
population? What are the risks and opportunities? What programs 
are most suitable? 
 Program theory assessment: What is the logical chain connecting 
our program to the desired results? 
 Monitoring and process evaluation: Is the program being rolled out 
as planned? Is there high uptake among clients? What do they 
think of it? 
 Impact evaluation: What was the impact and the magnitude of the 
program? 
 Cost effectiveness: Given the magnitude of impact and cost, how 
efficient is the program? 
Are your questions connected to decision-making?
Program Evaluation 
(Part of Evaluation, alongside Impact Evaluation)
Who is this Evaluation For? 
 Academics 
 Donors 
• Their Constituents 
 Politicians / policymakers 
 Technocrats 
 Implementers 
 Proponents, Skeptics 
 Beneficiaries
How can Impact Evaluation Help Us? 
 Answers the following questions 
• What works best, why and when? 
• How can we scale up what works? 
 There is surprisingly little hard evidence on what works 
 With better evidence, we can do more with a given budget 
 If people knew money was going to programmes that worked, it could 
help increase the pot for anti-poverty programmes
Programs and their Evaluations: Where do we Start? 
Intervention 
 Start with a problem 
 Verify that the problem actually 
exists 
 Generate a theory of why the 
problem exists 
 Design the program 
 Think about whether the 
solution is cost effective 
Program Evaluation 
 Start with a question 
 Verify the question hasn’t been 
answered 
 State a hypothesis 
 Design the evaluation 
 Determine whether the value 
of the answer is worth the cost 
of the evaluation
Life Cycle of a Program 
(illustrated with a volunteer-run remedial reading program) 
1. Theory of Change / Needs Assessment 
2. Designing the program to implement – e.g., distributing reading 
materials and training volunteers 
3. Baseline Evaluation 
4. Background preparation, logistics, roll-out of the program 
5. Monitoring implementation – process evaluation; progress towards 
target: 
• Reading materials delivered 
• Volunteers trained 
• Target children are reached 
• Classes are run, volunteers show up 
• Attendance in classes 
• Entire district is covered 
6. Endline Evaluation 
7. Reporting findings – impact and process evaluation findings 
8. Planning for continuous improvement: 
• Refresher training of teachers 
• Tracking the target children, convincing parents to send their child 
• Incentives for volunteers to run classes daily and efficiently 
(motivation) 
• Efforts made for children to attend regularly 
• Improve coverage 
9. Using the findings to improve the program model and delivery – 
change or improvement
Program Theory – a Snapshot 
Inputs → Activities → Outputs → Outcomes → Impacts 
• Implementation: inputs, activities, outputs 
• Results: outcomes, impacts
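The split between implementation and results can be sketched as data. This is a hypothetical illustration (the names `CHAIN` and `monitoring_focus` are ours, not from the lecture) of how the levels of the results chain map to the two kinds of monitoring:

```python
# Hypothetical sketch: the results chain as data, split the way the
# slide splits it into implementation and results.
CHAIN = ["inputs", "activities", "outputs", "outcomes", "impacts"]
IMPLEMENTATION = CHAIN[:3]   # tracked by implementation monitoring
RESULTS = CHAIN[3:]          # tracked by results monitoring

def monitoring_focus(level: str) -> str:
    """Say whether a chain level is monitored as implementation or results."""
    if level in IMPLEMENTATION:
        return "implementation monitoring"
    if level in RESULTS:
        return "results monitoring"
    raise ValueError(f"unknown level: {level!r}")
```

For example, `monitoring_focus("outputs")` falls under implementation monitoring, while `monitoring_focus("impacts")` falls under results monitoring.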
With a focus on measuring both implementation and results 
HOW TO BUILD AN M&E SYSTEM
Methods of Monitoring 
 First-hand information 
 Citizen reporting 
 Surveys 
 Formal reports by project/programme staff 
• Project status report 
• Project schedule chart 
• Project financial status report 
 Informal reports 
 Graphic presentations
Monitoring: Questions 
 Is the intervention implemented as designed? Does the program 
perform? 
(Inputs → Outputs → Outcomes, checked against implementation 
plans and targets) 
 Are intervention money, staff, and other inputs available and put to 
use as planned? Are inputs used effectively? 
 Are the services being delivered as planned? 
 Is the intervention reaching the right population and target 
numbers? 
 Is the target population satisfied with the services? Are they utilizing the 
services? 
 What is the intensity of the treatment?
Implementing Monitoring 
 Develop a monitoring plan 
• How should implementation be carried out? What is going to be 
changed? 
• Are the staff’s incentives aligned with the project? Can they be 
incentivized to follow the implementation protocol? 
• How will you train staff? How will they interact with beneficiaries 
or other stakeholders? 
• What supplies or tools can you give your staff to make following 
the implementation design easier? 
• What can you do to monitor? (Field visits, tracking forms, 
administrative data, etc.) 
• Intensity of monitoring (frequency, resources required,…)?
Ten Steps to a Results-based Monitoring and Evaluation 
System 
1. Conducting a readiness and needs assessment 
2. Agreeing on outcomes to monitor and evaluate 
3. Selecting key indicators to monitor outcomes 
4. Gathering baseline data on indicators 
5. Planning for improvement – selecting realistic targets 
6. Monitoring for results 
7. Using evaluation information 
8. Reporting findings 
9. Using findings 
10. Sustaining the M&E system within the organization
Step 1: Conducting a needs and readiness assessment 
 What are the current systems that exist? 
 What is the need for monitoring and evaluation? 
 Who will benefit from this system? 
 At what levels will the data be used? 
 Do we have the organizational willingness and capacity to establish the 
M&E system? 
 Who has the skills to design and build the M&E system? Who will 
manage it? 
 What are the barriers to implementing the M&E system on the ground 
(e.g., a resource crunch)? 
 How will you overcome these barriers? 
 Will there be pilot programs that can be evaluated within the M&E 
system? 
Decision point: do we go ahead?
Step 2: Agreeing on outcomes to monitor and evaluate 
 What are we trying to achieve? What is the vision that our M&E system 
will help us achieve? 
 Are there national or sectoral goals (e.g., a commitment to achieving the 
MDGs)? 
 Is there political or donor-driven interest in particular goals? 
 In other words, what are our outcomes? Improving coverage, learning 
outcomes… broader than focusing merely on inputs and activities
Step 3: Selecting key indicators to monitor outcomes 
 Identify WHAT needs to be measured so that we know we have 
achieved our results 
 Avoid broad-based results; assess indicators for feasibility, time, 
cost, and relevance 
 Indicator development is a core activity in building an M&E system 
and drives all subsequent data collection, analysis, and reporting 
 Arriving at indicators will take some time 
 Identify plans for data collection, analysis, and reporting 
PILOT! PILOT! PILOT!
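One way to make indicator development concrete is to write each indicator down as a structured record. The sketch below is hypothetical (the `Indicator` fields are assumptions chosen to mirror the checklist above, not an official schema):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical indicator record: each field mirrors a question the M&E
# team must answer when defining the indicator.
@dataclass
class Indicator:
    name: str                 # WHAT gets measured
    unit: str                 # e.g. "% of children"
    data_source: str          # plan for data collection
    frequency: str            # how often it is collected and reported
    baseline: Optional[float] = None  # filled in at step 4
    target: Optional[float] = None    # set at step 5

# Illustrative example for a remedial reading program.
reading = Indicator(
    name="Grade 3 children who can read a grade 1 text",
    unit="%",
    data_source="annual learning survey",
    frequency="yearly",
)
```

Leaving `baseline` and `target` empty at this stage reflects the sequence of the ten steps: the indicator is defined first, then measured, then given a target.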
Step 4: Gathering baseline data on indicators 
 Where are we today? 
 What is the performance of each indicator today? 
 Sources of baseline information: primary or secondary data 
 Data types: qualitative or quantitative 
 Data collection instruments
Step 5: Planning for improvement – selecting realistic targets 
 Targets – quantifiable levels of the indicators 
 Sequential, feasible, and measurable targets 
 If we reach our sequential set of targets, then we will reach our 
outcomes! 
 Time bound – universal enrolment by 2015 (outcome: better economic 
opportunities), every child immunized by 2013 (outcome: reduction in 
infant mortality), etc. 
 Available funding and resources need to be taken into account 
Target 1 → Target 2 → Target 3 → Outcomes
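The idea of sequential targets can be sketched in a few lines. This is an assumed illustration (the function name and the example coverage figures are ours, chosen to echo the immunization example above):

```python
# Minimal sketch: check an indicator's latest value against a sequence
# of time-bound targets and list those already reached.
def targets_met(value, targets):
    """Return labels of all sequential targets the current value reaches."""
    return [label for label, threshold in targets if value >= threshold]

# Illustrative immunization-coverage targets leading to the outcome.
targets = [
    ("Target 1: 60% coverage", 60.0),
    ("Target 2: 80% coverage", 80.0),
    ("Target 3: 100% coverage", 100.0),
]
# With coverage currently at 85%, the first two targets are met.
```

A report built this way shows not just whether the outcome has been reached, but how far along the sequence of targets the program currently is.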
Step 6: Monitoring for implementation and results 
Inputs → Activities → Outputs → Outcomes → Impacts 
 Implementation monitoring (inputs, activities, outputs) – e.g., 
provision of materials; training of volunteers; usage of material; 
number of volunteers teaching 
 Results monitoring (outcomes, impacts) – e.g., change in the 
percentage of children who cannot read; change in teacher 
attendance
Step 7: Using evaluation information 
Monitoring does not provide information on attribution and causality. 
Information from evaluation can be useful to: 
 Determine whether the right things are being done 
 Select among competing strategies by comparing results – are there 
better ways of doing things? 
 Build consensus on scale-up 
 Investigate why something did not work – scope for in-depth 
analysis 
 Evaluate costs relative to benefits and help allocate limited 
resources
Steps 8–10: Reporting findings, using results, and 
sustaining the M&E system 
 Reporting findings: what findings are reported to whom, in what format, 
and at what intervals. A good M&E system should provide early warning to 
detect problems or inconsistencies, as well as be a vehicle for 
demonstrating the value of an intervention – so do not hide poor results. 
 Using results: recognize both internal and external uses of your results 
 Sustaining the M&E system: some ways of doing this are generating 
demand, assigning responsibilities, increasing capacity, and gathering 
trustworthy data.
THANK YOU

 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
 
How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...How libraries can support authors with open access requirements for UKRI fund...
How libraries can support authors with open access requirements for UKRI fund...
 
Operation Blue Star - Saka Neela Tara
Operation Blue Star   -  Saka Neela TaraOperation Blue Star   -  Saka Neela Tara
Operation Blue Star - Saka Neela Tara
 
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup   New Member Orientation and Q&A (May 2024).pdfWelcome to TechSoup   New Member Orientation and Q&A (May 2024).pdf
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf
 
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdfESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 
Home assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdfHome assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdf
 
The Art Pastor's Guide to Sabbath | Steve Thomason
The Art Pastor's Guide to Sabbath | Steve ThomasonThe Art Pastor's Guide to Sabbath | Steve Thomason
The Art Pastor's Guide to Sabbath | Steve Thomason
 
Supporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptxSupporting (UKRI) OA monographs at Salford.pptx
Supporting (UKRI) OA monographs at Salford.pptx
 
Instructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptxInstructions for Submissions thorugh G- Classroom.pptx
Instructions for Submissions thorugh G- Classroom.pptx
 
How to Make a Field invisible in Odoo 17
How to Make a Field invisible in Odoo 17How to Make a Field invisible in Odoo 17
How to Make a Field invisible in Odoo 17
 
The Challenger.pdf DNHS Official Publication
The Challenger.pdf DNHS Official PublicationThe Challenger.pdf DNHS Official Publication
The Challenger.pdf DNHS Official Publication
 

Importance of M&E

  • 1. Importance of Monitoring and Evaluation
  • 2. Lecture Overview: Monitoring and Evaluation; How to Build an M&E System
  • 3. MONITORING & EVALUATION – what is M, what is E, and why and how to monitor
  • 4. What is Monitoring
    - An ongoing process that generates information to inform decisions about the program while it is being implemented.
    - The routine collection and analysis of information to track progress against set plans and check compliance with established standards.
    - Helps identify trends and patterns, adapt strategies, and inform decisions.
    - Key words:
      • Continuous – ongoing, frequent in nature
      • Collecting and analyzing information – to measure progress towards goals
      • Comparing results – assessing the performance of a program/project
  • 5. Why is Monitoring Important?
    - Evidence of how much has or has NOT been achieved
      • Quantitative: numbers, percentages
      • Qualitative: narrative or observation
    - Examination of trends
    - Highlighting problems
    - Early warning signs
    - Corrective actions
    - Evaluating the effectiveness of management action
    - Determining the achievement of results
  • 6. What is Evaluation
    - Evaluation is an assessment of an intervention to determine its relevance, efficiency, effectiveness, impact, and sustainability. The intention is to provide credible and useful information, enabling the incorporation of lessons learned into decision-making processes.
    - Key words:
      • Assessment – of the value of an event or action
      • Relevance
      • Efficiency
      • Effectiveness
      • Impact
      • Sustainability
      • Lessons learned
  • 7. What is Evaluation? (Diagram: Evaluation branches into Program Evaluation and Impact Evaluation.)
  • 8. What is M and what is E?
    - Monitoring: measures progress towards goals, but does not tell us the extent to which results were achieved, or the impact; continuous and frequent; has to take place during the intervention.
    - Evaluation: measures whether progress towards the goal is caused by the intervention (causality); infrequent and time-bound; can evaluate an ongoing or completed intervention.
  • 9. Monitoring and Evaluation (Diagram: Monitoring alongside Evaluation, which branches into Program Evaluation and Impact Evaluation.)
  • 10. Components of Program Evaluation
    - Needs assessment: What are the characteristics of the target population? What are the risks and opportunities? What programs are most suitable?
    - Program theory assessment: What is the logical chain connecting our program to the desired results?
    - Monitoring and process evaluation: Is the program being rolled out as planned? Is there high uptake among clients? What do they think of it?
    - Impact evaluation: What was the impact and the magnitude of the program?
    - Cost effectiveness: Given the magnitude of impact and the cost, how efficient is the program?
    - Are your questions connected to decision-making?
  • 11. (Diagram repeated: Evaluation branches into Programme Evaluation and Impact Evaluation.)
  • 12. Who is this Evaluation For?
    - Academics
    - Donors (and their constituents)
    - Politicians / policymakers
    - Technocrats
    - Implementers
    - Proponents and skeptics
    - Beneficiaries
  • 13. How can Impact Evaluation Help Us?
    - Answers the questions: What works best, why, and when? How can we scale up what works?
    - There is surprisingly little hard evidence on what works.
    - Better evidence lets us do more with a given budget.
    - If people knew money was going to programmes that worked, it could help increase the pot for anti-poverty programmes.
  • 14. Programs and their Evaluations: Where do we Start?
    - Intervention: start with a problem; verify that the problem actually exists; generate a theory of why the problem exists; design the program; think about whether the solution is cost effective.
    - Program evaluation: start with a question; verify that the question hasn't been answered; state a hypothesis; design the evaluation; determine whether the value of the answer is worth the cost of the evaluation.
  • 15. Life Cycle of a Program (from baseline evaluation to endline evaluation)
    - Theory of change / needs assessment → designing the program → background preparation, logistics, and roll-out → monitoring implementation (process evaluation, progress towards targets) → planning for continuous improvement → reporting findings (impact and process-evaluation findings) → using the findings to improve the program model and delivery.
    - Example (distributing reading materials and training volunteers): reading materials delivered; volunteers trained; target children reached; classes run and volunteers show up; attendance in classes; entire district covered; refresher training of teachers; tracking the target children and convincing parents to send their child; incentives for volunteers to run classes daily and efficiently (motivation); efforts made for children to attend regularly; improved coverage.
  • 16. Program Theory – a Snapshot: Inputs → Activities → Outputs (implementation), leading to Outcomes → Impacts (results).
  • 17. HOW TO BUILD AN M&E SYSTEM – with a focus on measuring both implementation and results.
  • 18. Methods of Monitoring
    - First-hand information
    - Citizens' reporting
    - Surveys
    - Formal reports by project/programme staff: project status report; project schedule chart; project financial status report
    - Informal reports
    - Graphic presentations
  • 19. Monitoring: Questions (inputs → outputs → outcomes, checked against implementation plans and targets)
    - Is the intervention implemented as designed? Does the program perform?
    - Are money, staff, and other inputs available and put to use as planned? Are inputs used effectively?
    - Are the services being delivered as planned?
    - Is the intervention reaching the right population and the target numbers?
    - Is the target population satisfied with the services? Are they utilizing the services?
    - What is the intensity of the treatment?
  • 20. Implementing Monitoring – develop a monitoring plan
    - How should implementation be carried out? What is going to be changed?
    - Are the staff's incentives aligned with the project? Can they be incentivized to follow the implementation protocol?
    - How will you train staff? How will they interact with beneficiaries or other stakeholders?
    - What supplies or tools can you give your staff to make following the implementation design easier?
    - What can you do to monitor (field visits, tracking forms, administrative data, etc.)?
    - What intensity of monitoring (frequency, resources required, …)?
  • 21. Ten Steps to a Results-Based Monitoring and Evaluation System
    1. Conducting a readiness and needs assessment
    2. Agreeing on outcomes to monitor and evaluate
    3. Selecting key indicators to monitor outcomes
    4. Gathering baseline data on indicators
    5. Planning for improvement – selecting realistic targets
    6. Monitoring for results
    7. Using evaluation information
    8. Reporting findings
    9. Using findings
    10. Sustaining the M&E system within the organization
  • 22. Step 1: Conducting a readiness and needs assessment
    - What systems currently exist? What is the need for monitoring and evaluation?
    - Who will benefit from this system? At what levels will the data be used?
    - Do we have the organizational willingness and capacity to establish the M&E system?
    - Who has the skills to design and build the M&E system? Who will manage it?
    - What are the barriers to implementing the M&E system on the ground (e.g. a resource crunch), and how will we overcome them?
    - Will there be pilot programs that can be evaluated within the M&E system?
    - In short: DO WE GO AHEAD?
  • 23. Step 2: Agreeing on outcomes to monitor and evaluate
    - What are we trying to achieve? What is the vision that our M&E system will help us achieve?
    - Are there national or sectoral goals (e.g. a commitment to achieving the MDGs)?
    - Is there political or donor-driven interest in the goals?
    - In other words, what are our outcomes (improving coverage, learning outcomes, …)? This is broader than focusing merely on inputs and activities.
  • 24. Step 3: Selecting key indicators to monitor outcomes
    - Identify WHAT needs to be measured so that we know we have achieved our results.
    - Avoid broad-based results; assess indicators for feasibility, time, cost, and relevance.
    - Indicator development is a core activity in building an M&E system and drives all subsequent data collection, analysis, and reporting.
    - Arriving at indicators will take some time.
    - Identify plans for data collection, analysis, and reporting. PILOT! PILOT! PILOT!
  • 25. Step 4: Gathering baseline data on indicators
    - Where are we today? What is the performance of the indicators today?
    - Sources of baseline information: primary or secondary data
    - Data types: qualitative or quantitative
    - Data collection instruments
  • 26. Step 5: Planning for improvement – selecting realistic targets
    - Targets are quantifiable levels of the indicators.
    - Set sequential, feasible, and measurable targets: if we reach our sequential set of targets (Target 1 → Target 2 → Target 3), then we will reach our outcomes.
    - Make targets time-bound – e.g. universal enrolment by 2015 (outcome: better economic opportunities), or every child immunized by 2013 (outcome: reduction in infant mortality).
    - Take available funding and resources into account.
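The indicator-and-target bookkeeping behind steps 3–5 can be sketched in code. This is a minimal illustration, not part of the lecture; the indicator names and figures below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float   # value measured today (step 4: baseline data)
    target: float     # realistic level to reach (step 5: targets)
    current: float    # latest monitoring value

    def progress(self) -> float:
        """Share of the baseline-to-target gap closed so far."""
        gap = self.target - self.baseline
        if gap == 0:
            return 1.0  # target already met at baseline
        return (self.current - self.baseline) / gap

# Hypothetical indicators for a reading program
indicators = [
    Indicator("enrolment rate (%)", baseline=62.0, target=100.0, current=81.0),
    Indicator("teacher attendance (%)", baseline=70.0, target=90.0, current=75.0),
]

for ind in indicators:
    # prints e.g. "enrolment rate (%): 50% of the way to target"
    print(f"{ind.name}: {ind.progress():.0%} of the way to target")
```

Each indicator carries its baseline (step 4) and target (step 5); `progress()` reports how much of the baseline-to-target gap the latest monitoring value has closed, which is the comparison a results-monitoring report would surface.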
  • 27. Step 6: Monitoring for implementation and results
    - Implementation monitoring covers inputs, activities, and outputs – e.g. provision of materials, training of volunteers, usage of the material, number of volunteers teaching.
    - Results monitoring covers outcomes and impacts – e.g. change in the percentage of children who cannot read, change in teacher attendance.
  • 28. Step 7: Using evaluation information
    - Monitoring does not provide information on attribution and causality. Information from evaluation can be useful because it:
      • Helps determine whether the right things are being done
      • Helps select among competing strategies by comparing results – are there better ways of doing things?
      • Helps build consensus on scale-up
      • Investigates why something did not work – scope for in-depth analysis
      • Evaluates costs relative to benefits and helps allocate limited resources
  • 29. Steps 8–10: Reporting findings, using results, and sustaining the M&E system
    - Reporting findings: decide what findings are reported to whom, in what format, and at what intervals. A good M&E system should provide an early warning system to detect problems or inconsistencies, as well as being a vehicle for demonstrating the value of an intervention – so do not hide poor results.
    - Using results: recognize both internal and external uses of your results.
    - Sustaining the M&E system: some ways of doing this are generating demand, assigning responsibilities, increasing capacity, and gathering trustworthy data.

Editor's Notes

  1. Evaluation is an assessment of a program on its RELEVANCE – is the program suited to the country context, and will it work for the people? (e.g. the water is already pure, but I hand out chlorine tablets); EFFICIENCY – how economically are resources being used? (e.g. the cost exceeds the benefit); EFFECTIVENESS – how much effect did it have; were the objectives achieved or not?; IMPACT – long-term effects, intended or unintended; SUSTAINABILITY – will the program's benefits continue?
  2. First let's narrow down our definition of evaluation. Evaluation is a very big term and could mean many things. In general, we'll be talking about program evaluation – not the type of evaluation that's more administrative in nature (performance evaluations, audits, etc.), unless those are part of a new policy or program that we wish to evaluate. "Programs" is still a general term and could include policies or, more generally, "interventions". What distinguishes impact evaluation? What makes "randomized evaluation" distinct? Where does monitoring fit in? The quiz you took at the beginning – that's part of an evaluation. You'll take one at the end as well, and you'll also give us some course feedback, something we'll look at after this whole course is done and use to make design and implementation changes to the course. But it's not something we consider part of the course design itself; it's not really meant to serve as a pedagogical device. The clickers – that's more part of monitoring. They're specifically designed as part of the pedagogy: they give us instant feedback based on which we make mid-course adjustments, corrections, etc. Part of the pedagogy is that we have a specific decision tree that the implementers (in this case, lecturers) use based on the results of the survey.
  3. Hindi glosses: progress – vikaas; extent – had. Building an evaluation system allows for: a more in-depth study of results-based outcomes and impacts; bringing in other data sources than just extant indicators; addressing factors that are too difficult or expensive to continuously monitor; and (perhaps most important) tackling the issue of why and how the trends being tracked with monitoring data are moving in the directions they are.
  4. Same framing as note 2: we are talking about program evaluation, not administrative evaluation, and about where impact evaluation and monitoring fit in.
  5. Same framing as note 2, for programme evaluation.
  6. Who is your audience, and what questions are they asking? Academics: we have quite a few academics in the audience. Beneficiaries: this may be slightly different from "who are your stakeholders?". The audience affects the type of evaluation you do, but also the questions you seek to answer.
  7. This question is larger than a question of aid: aid accounts for less than 10% of development spending, and governments have their own budgets and their own programmes.
  8. Before thinking about evaluations, we should think about what it is that we're evaluating; here I'll generically call it an "intervention". You'd be surprised how many policies are implemented that address non-existent problems. One of our evaluations in India is of a policy called continuous and comprehensive evaluation. Now let's ask a very specific question…
  9. Ignore blue boxes
  10. If this is our program theory (Hindi: anumaan, parikalpna – assumption, hypothesis), then it has two parts: the first is where the program is being implemented, and the second is where the results have started to show.
  11. How do we build an M&E system that measures both a program's implementation and its results?
  12. How the World Bank says we can do it – a handbook for development practitioners.
  13. Give examples
  14. Outcome (Hindi: parinaam). MDG example – the eight international development goals that all UN member states committed to achieve by 2015, e.g. eradicate extreme hunger and poverty; achieve universal primary education.
  15. Hindi glosses: indicator – suchak, nideshak; relevant – uchit.
  16. Hindi gloss: instruments – jariya (means).
  17. Hindi glosses: measure – naap; feasible – sambhav. Another example – I want higher literacy levels: 1) 50% of children in school; 2) 100% of children in school; 3) teacher attendance increases; 4) children's scores improve.
  18. Implementation monitoring – data collected on inputs, activities, and immediate outputs; information on administrative, implementation, and management issues. Results monitoring – sets indicators for outcomes and collects data on them; systematic reporting on progress towards outcomes.
  19. Evaluation – establishing causality (Hindi: kaaranta sthapit karna); it can be expensive, but is very useful in certain cases.