Role of board in Monitoring and
Evaluation
Professor Benon C Basheka, PhD, FCIPS
Uganda Technology and Management University
LEARNING OUTCOMES
By the end of the training, board members should be able to:-
1. Describe the importance of monitoring and evaluation in
organizational effectiveness
2. Apply M and E knowledge in the day-to-day activities of their work in
aBi Trust
3. Relate M and E requirements to the DCED Standard
4. Oversee the establishment of an effective M and E system for the
organization
TRAINING METHODOLOGY
1. Presentations
2. Individual experiences
3. Discussions
Session coverage
The training will generally cover three units:-
1. General Introduction and Role of Boards of Trustees (with relevance to M and E)
2. Requirements for establishing an M and E system (and role of the board)
3. The Evaluation Process and the Role of the Board
UNIT 1: GENERAL INTRODUCTION AND ROLE OF
THE BOARD IN M & E
INTRODUCTION
• Monitoring and evaluation are increasingly being emphasized in all endeavors
• The need to determine which interventions are working or not, and why, now
compels organizations to invest in building sound monitoring and evaluation
systems
• In almost all organizations where boards exist, boards generally have oversight
functions: overseeing all activities of the organization (M and E inclusive)
• The range of things to which evaluation applies is wide and includes projects,
programmes, processes, systems, etc.
Monitoring/Evaluation is to be seen:-
• As a profession with a set of standards
• As a field of practice (monitoring and evaluation units and the staff therein)
• As a discipline of study, an academic field now said to be in its adolescent
stage (taught in universities and other higher educational institutions)
Organizations have advisory or policy organs
1. Board of governors
2. Board of managers
3. Board of regents
4. Board of trustees, and
5. Board of visitors
Different Modes of Board Governance
1. Advisory Board Model
2. Patron Model
3. Co-operative Model
4. Management Team Model
5. Policy Board Model [aBi Trust?]
aBi Trust Governance (organogram)
• Founders Committee
• Board of Trustees
• Management Staff (including the M and E unit)
Typical duties of boards of directors include:
1. Governing the organization by establishing broad policies and objectives
2. Selecting, appointing, supporting and reviewing the performance of the chief
executive
3. Ensuring the availability of adequate financial resources
4. Approving annual budgets
5. Approving annual plans
6. Approving and reviewing reports on management activities
Working Methodology of Boards
• In larger organizations, the structure of the board and its committees usually
mirrors the structure of the organization's administration.
• Just as there are staff responsible for human resources, fund-raising, finance,
planning, and programs, the board creates committees with responsibility for
these areas.
• One of these board committee areas will be monitoring and evaluation
• Even the performance of the board or its committees needs to be measured
Strategically
1. Mobilize resources
2. Look for networks
3. Design and approve policy
4. Provide oversight
5. Participate in evaluation
Boards' Strength lies in:-
1. Composition
2. Expertise
3. Experience
4. Qualifications
5. Networks
CONTEXTUALLY
• The Board of Trustees of aBi Trust has a four-fold mandate:
1. To protect the Trust’s assets over time and ensure survival and the
prosperity of the Trust in a transparent, accountable and responsible
manner
2. To guide the Trust in fulfilling its Vision, mission and objectives
3. To give strategic direction to aBi Trust Management
4. To protect the Trust’s interests.
What is the implication?
1. The mandates and functions of the board cannot be efficiently and effectively
achieved without sound knowledge of M and E
2. M and E supplies knowledge for oversight and decision making
3. M and E supplies the board with necessary tools and methodologies
4. M and E provides the right attitude and mindset for involvement
5. M and E shapes the decision making to support building required systems
Is Evaluation new?
 Evaluation is as old as the world itself and has moved side by side the journey of human
civilization
 The scriptures tell us in Genesis 1:31 that when God created the earth, the light in the
darkness, the firmament in the midst of the waters, the plants, the animals, and
finally man, at the end of the sixth day…God saw everything that he had made, and behold, it
was very good
 He used criteria unknown to us, and this enabled him to make an assessment on whose
findings he could take a fundamental decision: not to scrap what he had done
— Michael Q. Patton
God’s archangel came then, asking, “God, how do you know that what you have
created is ‘very good’? What are your criteria? On what data do you base your
judgment? Just what results were you expecting to attain? And aren’t you a little close
to the situation to make a fair and unbiased evaluation?”
God thought about these questions all that day, and God’s rest was greatly disturbed.
On the eighth day God said, “Lucifer, go to hell.”
The second example
• From the philosophical works of Socrates, Plato and Aristotle to the mathematical
methodologies of Pythagoras and Euclid, the ideas of the ancient Greeks shaped many
institutions and contributions to many fields including evaluation.
• Existing scholarly accounts inform us that the Delphic oracle of the ninth to the third
centuries BC was the first central intelligence database of the ancient world, an
interdisciplinary think tank of approximately 90 priests, deemed the best educated experts of
antiquity.
• They collected and evaluated information and advised ordinary people and leaders, among them
Alexander the Great. Major project management already existed in the fourth century BC.
In ancient Greece
• The practices of the 12 gods, called Olympians because they were stationed on Olympus, the
highest mountain in Greece, shed some light on how evaluation was done.
• The council of the Olympian gods and goddesses made collective decisions with input
from an expert panel, which consisted of Zeus (the president of the gods), Athena (the
goddess of wisdom), Hermes (the god of information and commerce), and any other god
whose area of expertise would be pertinent to the subject in question.
Their working methodology
• Meetings were problem-oriented participatory sessions, characterized by intense discussions and
searches for the best solution.
• The gods' decisions were persuasively communicated to mortals and powerfully implemented with
follow-up reports (Theofanides 1999).
• The Olympian style of management and decision making is illustrated in the steps below:-
1. Identify the problem or theme for action. Collect all relevant information and data, through the intelligence
work by Hermes, the god of informatics.
2. Search for solutions via dialogue with all participants. Discuss step 1 with all concerned parties and
propose alternative solutions.
3. Select the best problem solution or action theme, mainly by conferring with the concerned party or parties
4. Announce the decision of the gods to all mortals concerned through the god of informatics, Hermes. Send
Peitho, the goddess of persuasion, to illuminate the best solution in step 3 as the decision of the gods of
Olympus
5. Use lightning and thunderbolts to implement the Olympian decisions in step 4 to achieve the desired
goals identified in steps 1 and 3.
6. Implement all decisions, supervised by Hermes, the god of informatics, who announces to the Olympian
gods the results of the action taken in step 5.
In contemporary times….
• Monitoring and evaluation (M&E) is an essential part of any program, project, policy and is often used in all kinds of
contexts.
• Development partners increasingly expect their partners to have sound M and E systems
• Monitoring and evaluation can tell us :-
• Whether a program, policy or project is making a difference and for whom;
• It can identify program areas that are on target or aspects of a program that need to be adjusted or replaced.
• Information gained from M&E can lead to better decisions about program investments.
• It can demonstrate to program implementers and funders that their investments are paying off
• It is a tool for ensuring accountability to other stakeholders
Monitoring and evaluation can:
• Help identify problems and their causes;
• Suggest possible solutions to problems;
• Raise questions about assumptions and strategy;
• Push you to reflect on where you are going and how you are getting there;
• Provide you with information and insight;
• Encourage you to act on the information and insight;
• Increase the likelihood that you will make a positive development difference.
We conduct evaluations….
 To help people make better decisions and achieve better outcomes
 To provide better services (public and private)
• By:
 Comparing policy options objectively and rigorously
 Calculating empirically the size of likely impacts
 Calculating empirically the diversity/variance of impacts
 Getting more precise estimates of risk and bias
 Establishing a cumulative evidence base for decision making
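For instance, "calculating empirically the size of likely impacts" and their variance typically means estimating the difference in mean outcomes between a treated group and a comparison group, together with a standard error for its precision. A conventional formulation (a sketch in standard notation, not taken from these slides):

```latex
\hat{\Delta} = \bar{Y}_T - \bar{Y}_C, \qquad
\mathrm{SE}(\hat{\Delta}) = \sqrt{\frac{s_T^2}{n_T} + \frac{s_C^2}{n_C}}
```

Here \bar{Y}_T and \bar{Y}_C are the group mean outcomes, s_T^2 and s_C^2 the sample variances, and n_T and n_C the group sizes; a smaller standard error means a more precise estimate of the likely impact.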
An evaluation answers questions such as….
1. Does it work?
2. How well does it work?
3. Does it do what we want it to?
4. Does it work for the reasons we think it does?
5. Is it cost effective?
6. Are the benefits worth it?
7. What are the unintended consequences?
Challenges facing developing countries in M and E
Masuku, Ngengeezi and Ijeoma EOC (2015: 5) report the
following:-
1. Designing M and E
2. Context Challenges
3. Cooperation and coordination
4. Institutional challenges
….
5. Lack of stakeholder involvement
6. Compliance
7. Linking planning, budget, priorities and M and E
8. Lack of integration with other strategic approaches
Other Challenges include:-
1. Capacity challenges
2. Poor coordination
3. Lack of legislative structures
4. Locus and focus problems
5. Elite capture vs stakeholder involvement
6. Absence of theory of change
7. Lack of evidence and truth
Types of Evaluation
• Evaluation Type by Level
1) Project-level evaluation
2) Program-level evaluation
3) Sector program evaluation
4) Thematic evaluation
5) Policy-level evaluation
• Evaluation Types by Stages of Project Cycle
1) Ex-ante Evaluation
2) Mid-term evaluation
3) Terminal evaluation
4) Impact evaluation
5) Ex-post evaluation
Types of evaluation based on the results chain
1. Context evaluation
2. Input evaluation
3. Process evaluation
4. Output evaluation
5. Outcome evaluation
6. Impact evaluation
Professionalization of evaluation
• By 2010, there were more than 65 national
and regional evaluation organizations
throughout the world, most in developing
countries
• Although specialized training programs have
existed for several decades, graduate
degree programs in evaluation have
emerged only recently
– Australasia
– Africa
– Canada
– Central America
– Europe (not every country)
– Japan
– Malaysia
– United Kingdom
Professional Standards
• Utility
• Feasibility
• Propriety
• Accuracy
• Evaluation Accountability
DISCIPLINES FROM WHICH EVALUATION BORROWS?
•‘Social Research Methods’
•Sociology
•Economics
•Statistics
•Development studies
•Public Administration
•Social Anthropology
•Education
•Project Management
•Management
•Engineering
•Policy Analysis
•History
Structure of an evaluation-Commissioner’s perspective
1. Concept Paper
2. RFQ/Proposal
3. Evaluation of EOI/Proposal
4. Contract negotiation
5. Providing contacts and support
6. Quality control
7. Providing information
8. Approval of the report
9. Discussion of results
10. Discussion of consequences
11. Managing implementation of recommendations
Structure of an evaluation-Evaluator’s
Perspective
1. EOI/Proposal
2. Contract Negotiation
3. Planning workshop meeting
4. Clarifying organizational questions
5. Inception report
6. Data collection, analysis and interpretation
7. Continuous coordination and exchange of information among parties
8. Draft evaluation report
9. Final evaluation report
10. Closing workshop
11. Follow-up
Policy Maker’s Perspective???
1. Emerging questions
2. Directives on how to address the questions
3. Participation as respondents
4. Participation in workshops discussing results
5. Utilization of evaluation results
6. Change in policy as a result of the evaluation
Tasks
1. As board members, what comes to mind when you hear or read
about monitoring and evaluation?
2. What came to mind when you were told you were going to undergo
training in M and E?
Principles of Evaluation
In every evaluation, certain questions clearly have to be addressed:-
1. What is to be evaluated (evaluands)
2. Why should the evaluation be conducted (purpose)
3. What criteria should be applied
4. Who should conduct the evaluation
5. When should the evaluation be conducted
6. How should the evaluation be conducted
1. What do we evaluate?
•The things to be evaluated (evaluands) nowadays range from:-
•Laws
•Products
•Services
•Organizations
•People
•Processes
•Social state of affairs of any kind (Stockman & Meyer 2013:67)
• On the political Global level:
1. Global Goals (the Millennium Development Goals)
2. International Conventions (the Geneva Conventions)
3. DCED Standards?
Millennium Development Goals
1) Eradicate extreme poverty and hunger
2) Achieve universal primary education
3) Promote gender equality and empower women
4) Reduce child mortality
5) Improve maternal health
6) Combat HIV/AIDS, malaria, and other diseases
7) Ensure environmental sustainability
8) Develop a global partnership for development
• On the political Africa level
1. African political federation
2. African Union peace and military initiatives
3. African participation in ICC
4. African Governance Mechanisms (APRM)
5. New Partnership for African Development (NEPAD)
• On political regional level (East African Community level)
1. East African political federation
2. East African Customs union
3. Northern Corridor interventions
4. Oil pipeline interventions
5. Standard-gauge railway interventions
On the political country level
1. National Development Plans
2. Vision 2040
3. The NRM Manifesto (2011-2016)
4. Strategies ( NAADS interventions, Basket funding strategies)
• The Programme Level
• Things become a bit more concrete when we move on to the programme level
• Interventions take place and are followed up by Monitoring & Evaluation.
• The idea here is to assess how programmes work.
• The following are the most common examples of evaluation objects:
• Programmes
• Projects
• Single Measures/Activities/Outputs
• Competences/Resources/Inputs
• The System Level (Board plays a leading role)
• At the system level, things again become more abstract and less easy to handle.
• Typical objects are:
• Structures/Systems
• Networks
• Organisations
• Institutions
• Rules/Norms/Curricula/Agreements
• The Performance Level
 The question shifts to the way a policy/ Programme/system/intervention/etc. evolves.
 Nowadays there is a much greater focus on the performance aspect of programmes (and also on the
systemic view) than in former times.
 The main objects of performance evaluations are:
1. Results/Impacts
2. Performances/Processes
3. Management/Governance
4. Operations/Actions
• The Individual Level
• The assessment focuses on either group processes or individual behaviour and the attitudes
behind them.
• So the main objects are:
1) Interaction/Group Behaviour
2) Communication/Exchange
3) Individual Behaviour
4) Cognitive Processes/Attitudes
2. Why evaluation?
1. Providing gainful insights
2. Exercising control
3. Initiation of development and learning processes
4. Legitimization of measures, projects or programmes implemented
5. Accountability roles
6. Observe implementation processes
7. Assessing the feasibility of a programme (programme development phase: formative)
8. Supporting managers in management (during implementation phases)
3. What assessment criteria?
• The Development Assistance Committee (DAC) of the Organisation for Economic
Co-operation and Development (OECD) has developed criteria towards which many
national organizations orient themselves:
A. Relevance
B. Effectiveness
C. Efficiency
D. Impact
E. Sustainability
Note
• Where standards such as the DAC criteria exist, they are usually stated directly by the client
• Sometimes it is left to the evaluator to determine the criteria, as he or she is
considered an expert who ought to know which criteria best fit what is to
be evaluated; this is knowledge- or experience-based
• It is rare for the criteria to be set by the target group
4. Who should conduct the evaluation?
• Evaluation is best thought of as a team effort.
• Although one person heads an evaluation team and has primary responsibility
for the project, this individual will need assistance from others on the staff.
• An evaluation team will work together on the following tasks:
1. Determining the focus and design of the evaluation.
2. Developing the evaluation plan, performance indicators, and data collection instruments.
3. Collecting, analyzing, and interpreting data.
4. Preparing the report on evaluation findings.
Options on who should conduct the evaluation
1) Hiring an outside evaluator (option 1).
2) Using an in-house evaluation team supported by an outside consultant and
program staff (option 2).
3) Using an in-house evaluation team supported by program staff (option 3).
Note:-
•Evaluators are diverse:
•They might be economists concerned with efficiency and costs;
•Management consultants interested in the smooth running of the organization;
•Policy analysts with a commitment to public sector reforms and transparency;
•Scientists concerned to establish truth, generate new knowledge and confirm or
disconfirm hypotheses.
5. When should the evaluation be
conducted?
1. Before the intervention
2. During the implementation
3. Mid-way through the implementation process
4. After the implementation
6. How will the evaluation be conducted?
1. Scientific-oriented approaches
2. Management-oriented approaches
3. Participant-oriented approaches
4. Qualitative-oriented approaches
Session 2: Building Monitoring and Evaluation
systems
Board establishes M and E System
• One may define an M&E system as a collection of people, procedures, data
and technology that interact to provide timely information for authorized
decision-makers
• M and E systems are systems used to monitor and evaluate a project,
program or organization to see if it is on track to achieve its overall outcomes
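To make this definition concrete, here is a minimal sketch in Python of how the "people, procedures, data and technology" elements might interact to give decision-makers timely information. All names and the 90-day freshness rule are hypothetical illustrations, not a description of aBi Trust's actual system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class IndicatorRecord:
    """One routine data point captured by the M&E unit (hypothetical schema)."""
    indicator: str    # what is measured, e.g. "farmers trained"
    value: float      # observed value for the reporting period
    target: float     # planned value for the same period
    period_end: date  # end of the reporting period covered
    source: str       # who collected it (the "people" element)

def timely_for_board(records: list[IndicatorRecord],
                     as_of: date, max_age_days: int = 90) -> list[IndicatorRecord]:
    """Keep only records recent enough to inform an authorized decision-maker."""
    return [r for r in records if (as_of - r.period_end).days <= max_age_days]
```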
If you have a project for example….
A good M/E system will:
1. Monitor the use of project inputs
2. Monitor the effectiveness of the project implementation process
3. Monitor the production of project outputs
4. Assess project impacts on the target communities
5. Assess the effectiveness of project outputs in producing the
intended short-term and long-term impacts
6. Assess the extent to which these impacts can be attributed to the
effects of the project
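Point 6, attribution, is the hardest of these tasks: it asks how the observed outcome compares with the counterfactual outcome that would have occurred without the project. In conventional evaluation notation (a sketch, not taken from these slides):

```latex
\text{Impact} = Y_{\text{with project}} - Y_{\text{without project (counterfactual)}}
```

Because the counterfactual is never observed directly, it has to be estimated, for example from a comparison group of similar non-participants.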
Why build an M and E system?
1) Supports planning activities at the sectoral and program level
2) Provides information for a more efficient allocation of public funds
3) Facilitates program management
4) Helps re-designing and improving programs
5) Promotes transparency and accountability
6) Enriches policy discussion by incorporating rigorous evidence
An M and E system has 12 key features
1. Organizational Structures with M&E Functions
 The M&E unit’s main purpose is to coordinate all the M&E functions.
 Some organizations prefer to outsource such services.
 The M&E unit should have its roles defined and supported by the
organization’s hierarchy, and other units within the organization should be
aligned to support the M&E functions.
…
2. Human Capacity for M&E
 Effective M&E implementation requires not only adequate staff but also staff with the
necessary M&E technical know-how and experience.
 It is necessary to build a human resource that can run the M&E function by hiring employees
who have adequate knowledge and experience in M&E implementation
 Ensure that the M&E capacity of these employees is continuously developed through
training and other capacity building initiatives so that they keep up with current and
emerging trends in the field
….
3. Partnerships for Planning, Coordinating and Managing the M&E System
 A prerequisite for successful M&E systems whether at organizational or national
levels is the existence of M&E partnerships.
 Partnerships for M&E systems complement the organization’s M&E efforts in the
M&E process, and they act as a source of verification for whether M&E functions
align with intended objectives.
 Partnerships also serve auditing purposes where line ministries, technical
working groups, communities and other stakeholders are able to compare M&E
outputs with reported outputs.
….
4. M&E frameworks/Logical Framework
 The M&E framework outlines the objectives, inputs, outputs and outcomes of the
intended project and the indicators that will be used to measure all these.
 It also outlines the assumptions that the M&E system will adopt.
 The M&E framework is essential as it links the objectives with the process and
enables the M&E expert to know what to measure and how to measure it.
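As one illustration of how such a framework "links the objectives with the process", the sketch below models a logframe row in Python. The field names and the example row are hypothetical, not aBi Trust's actual framework:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """How one statement in the results chain will be measured."""
    name: str
    baseline: float
    target: float

@dataclass
class LogframeRow:
    """One level of the results chain, with its indicators and assumptions."""
    level: str                  # "input" | "output" | "outcome" | "impact"
    statement: str              # what the project intends at this level
    indicators: list[Indicator] = field(default_factory=list)
    assumptions: list[str] = field(default_factory=list)

# Hypothetical output-level row, for illustration only
row = LogframeRow(
    level="output",
    statement="Farmers trained in post-harvest handling",
    indicators=[Indicator("farmers trained", baseline=0, target=500)],
    assumptions=["Qualified trainers are available in all districts"],
)
```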
…
5. M&E Work Plan and costs
 Closely related to the M&E frameworks is the M&E Work plan and costs.
 While the framework outlines objectives, inputs, outputs and outcomes of the
intended project, the work plan outlines how the resources that have been
allocated for the M&E functions will be used to achieve the goals of M&E.
 The work plan shows how personnel, time, materials and money will be used to
achieve the set M&E functions
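To illustrate how a work plan ties resources to M&E functions, here is a minimal costing sketch; the activity names, responsibilities and figures are hypothetical:

```python
# Hypothetical M&E work-plan lines: activity -> (responsible person, cost)
work_plan = {
    "baseline survey":         ("M&E officer", 12_000.0),
    "routine data collection": ("field staff", 8_000.0),
    "mid-term evaluation":     ("external consultant", 15_000.0),
    "dissemination workshop":  ("M&E officer", 5_000.0),
}

total_cost = sum(cost for _, cost in work_plan.values())
print(f"Total M&E budget: {total_cost:,.0f}")  # Total M&E budget: 40,000
```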
….
6. Communication, Advocacy and Culture for M&E
 This refers to the presence of policies and strategies within the organization to promote
M&E functions.
 Without continuous communication and advocacy initiatives within the organization to
promote M&E, it is difficult to entrench the M&E culture within the organization.
 Such communication and strategies need to be supported by the organization’s hierarchy.
 The existence of an organizational M&E policy, together with the continuous sharing of
M&E system outputs through communication channels, are some of the ways of improving
communication, advocacy and culture for M&E
….
7. Routine Programme Monitoring
 M&E consists of two major aspects: monitoring and evaluation.
 This component emphasizes the importance of monitoring.
 Monitoring refers to the continuous and routine data collection that takes place during
project implementation.
 Data need to be collected and reported on a continuous basis to show whether the
project activities are driving towards meeting the set objectives.
 Routine data gathering and analysis also need to be integrated into the program
activities.
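A minimal sketch of what this routine data feeds into: comparing each indicator's cumulative value against its period target so that off-track activities surface early. The 80% threshold and the figures are hypothetical illustrations:

```python
def progress(value: float, target: float) -> float:
    """Percent of the period target achieved so far."""
    return 100.0 * value / target if target else 0.0

def off_track(value: float, target: float, threshold: float = 80.0) -> bool:
    """Flag an indicator for management/board attention if below threshold% of target."""
    return progress(value, target) < threshold

# e.g. 350 farmers trained against a period target of 500 -> 70% achieved, flagged
assert off_track(350, 500)
```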
….
8. Surveys and Surveillance
 This mainly concerns national-level M&E plans and entails how frequently
relevant national surveys are conducted in the country.
 National surveys and surveillance need to be conducted frequently and used to
evaluate progress of related projects.
 For example, for HIV and AIDS national M&E plans, there need to be HIV-related
surveys carried out at least biannually and used to measure HIV indicators at the
national level.
…
9. National and Sub-national databases
 The data world is gradually becoming open source.
 More and more entities are seeking data that are relevant for their purposes.
 The need for M&E systems to make data available can therefore not be over-emphasized.
 This implies that M&E systems need to develop strategies of submitting relevant,
reliable and valid data to national and sub-national databases.
…
10. Supportive Supervision and Data Auditing
 Every M&E system needs a plan for supervision and data auditing.
 Supportive supervision implies that an individual or organization is able to supervise
regularly the M&E processes in such a way that the supervisor offers suggestions on ways
of improvement.
 Data auditing implies that the data is subjected to verification to ensure its reliability and
validity.
 Supportive supervision is important since it ensures the M&E process is run efficiently,
while data auditing is crucial since all project decisions are based on the data collected.
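As a small illustration of what data auditing can look like in practice, the sketch below runs two common verification checks, completeness and range, over one monitoring record. The required fields and the checks themselves are hypothetical examples, not a prescribed audit standard:

```python
def audit_record(record: dict) -> list[str]:
    """Return audit findings for one monitoring record (illustrative checks only)."""
    findings = []
    # Completeness: every required field must be present and non-empty
    for field_name in ("indicator", "value", "period_end", "source"):
        if record.get(field_name) in (None, ""):
            findings.append(f"missing field: {field_name}")
    # Range: a negative count cannot be a valid observation
    value = record.get("value")
    if isinstance(value, (int, float)) and value < 0:
        findings.append("value out of range (negative)")
    return findings

# A record with no source and a negative value yields two findings
example = {"indicator": "farmers trained", "value": -5,
           "period_end": "2015-12-31", "source": ""}
assert audit_record(example) == ["missing field: source",
                                 "value out of range (negative)"]
```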
…
11. Evaluation and Research
 This component covers two related aspects: research and evaluation.
 Evaluation of projects is done at specific times, most often mid-term and at the end of the
project.
 Evaluation is an important component of M&E as it establishes whether the project has
met the desired objectives.
 It usually provides for organizational learning and sharing of successes with other
stakeholders.
….
12. Data Dissemination and use
 The information that is gathered during the project implementation phase needs
to be used to inform future activities, either to reinforce the implemented
strategy or to change it.
 Additionally, the results of both monitoring and evaluation need to be
shared with relevant stakeholders for accountability purposes.
 Organizations must therefore ensure that there is an information dissemination
plan either in the M&E plan, Work plan or both.
Session three: The Evaluation Process and Role of
the Board
The evaluation manager’s role, in consultation with the steering committee, is to:
• Clarify policy/programme objectives and intended outcomes
• Clarify the intended evaluation purpose, users and uses
• Develop relevant evaluation questions
• Select the evaluation approach and methods
• Identify data sources and collection and analysis procedures
• Identify the necessary resources and governance arrangements
• Prepare the TOR; commission (and possibly tender) the evaluation
The evaluator’s role is to conduct and manage the evaluation.
Our main focus in this session.
The evaluation process – an overview
START: Terms of Reference
1. Evaluation Assessment (2-3 months)
2. Contracting (3-4 months)
3. Field Work/Analysis (6-8 months)
4. Report & Recommendations (2-4 months)
5. Management Responses (1 month)
6. Executive Approval; Internet Posting (2-3 months)
7. Implementing Change/Follow Up
Large evaluations typically take 12-18 months to complete. Some phases may overlap.
Cycle of conducting and managing evaluations
Standards: Utility, Feasibility, Propriety, Accuracy
Steps:
1. Engage stakeholders
2. Describe the program
3. Focus the evaluation design
4. Gather credible evidence
5. Justify conclusions
6. Use and share lessons learned
Steps of the Evaluation process
Step 1 - Engage Stakeholders
Who are the stakeholders? Those involved in program operations, those
affected by the program operations, and users of evaluation results.
Step 2 - Describe the Program
 What are the goals and specific aims of the program?
 What problem or need is it designed to address?
 What are the measurable objectives? What are the strategies to achieve the objectives?
 What are the expected effects?
 What are the resources and activities?
 How is the program supposed to work?
Step 3 - Focus the evaluation design
What do you want to know? Consider the purpose, uses, questions, methods,
roles, budgets, deliverables, etc. An evaluation cannot answer all questions
for all stakeholders.
Step 4 - Gather credible evidence
 Data collected must address the evaluation questions
 Evidence must be believable, trustworthy and relevant
 Consider information scope, sources, quality, logistics, methodology & data collection
 Who is studied and when?
Step 5 - “Justify” Conclusions
Consider the data you have:
• Analysis and synthesis: determine findings
• Interpretation: what do findings mean?
• Judgments: what is the value of findings based on accepted standards?
• Recommendations: what claims can be made? What are the limitations of your design?
Step 6 - Use and share results
 Share lessons learned with stakeholders!
 Provide feedback, offer briefings, disseminate findings
 Implement evaluation recommendations
 Develop a new/revised implementation plan in partnership with stakeholders
Dealing with lack of baseline data
 Several options (not mutually exclusive):
• Reconstructing baseline data ex post: the recall method (more later)
• Use key informants and triangulate (mostly qualitative)
• Reconstruct a baseline “scenario” with secondary data (not always practical
given the absence and quality of baseline studies)
• Single difference with econometric techniques: some practical obstacles
(workload, time constraints, availability of trained specialists)
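The "single difference" option above estimates impact from endline data alone by comparing participants with non-participants, sketched here in standard notation (not taken from these slides):

```latex
\hat{\Delta}_{\text{SD}} = \bar{Y}^{\text{endline}}_{\text{participants}} - \bar{Y}^{\text{endline}}_{\text{non-participants}}
```

Without a baseline, this estimate is biased whenever the two groups already differed before the intervention, which is why the slide notes the need for econometric techniques and trained specialists.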
Best practices in managing evaluations
 Identify the implementation logic and theory of change
 Allow for the inception report phase
 Deal with missing baseline data and other gaps
 Gather data
 Examine the effort using various criteria
 Draw conclusions and recommendations
 Conduct reporting
 Ensure quality
 Feedback on the evaluation
 Management response
 Disseminate findings
 Feedback and lessons learnt
Cartoon caption: “If I had known too much information would make it complicated, I wouldn’t have asked for it!!!”
Evaluation budget
• Value for money is a key concern
• Underfunding is as wasteful as over-funding
• Strike a balance between cost and quality
• Quality is ultimately more important, but so is relevance for purpose
• Make sure all aspects are adequately funded, including consultation
with stakeholders, reporting and dissemination
• Ensure the evaluation design is appropriate to the budget as well as the aims of
the programme
Conclusions
1) Monitoring and evaluation is now a condition for collaboration between development
partners and governments
2) The board’s oversight role is well performed only where there are functioning M and E systems
3) International standards need to be integrated into local M and E systems
4) M and E units need to be adequately staffed and their capacity enhanced
5) The M and E budget needs to be supported by the board
More Related Content

Similar to Role of Board in Monitoring Evaluation

Process assumptions-values-n-beliefs-of-od
Process assumptions-values-n-beliefs-of-odProcess assumptions-values-n-beliefs-of-od
Process assumptions-values-n-beliefs-of-odaileenv21
 
FGS 2015 - Strategic Planning for Society Leaders
FGS 2015 - Strategic Planning for Society LeadersFGS 2015 - Strategic Planning for Society Leaders
FGS 2015 - Strategic Planning for Society LeadersGenealogyMedia.com
 
An effective way to change organizations
An effective way to change organizationsAn effective way to change organizations
An effective way to change organizationsLeszek Soltysik
 
Management & leadership leprosy 7th july
Management & leadership leprosy 7th julyManagement & leadership leprosy 7th july
Management & leadership leprosy 7th julyThurein Naywinaung
 
Vicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact EvaluationVicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact EvaluationJosh Chandler
 
programme evaluation by priyadarshinee pradhan
programme evaluation by priyadarshinee pradhanprogramme evaluation by priyadarshinee pradhan
programme evaluation by priyadarshinee pradhanPriya Das
 
Monitoring And Evaluation Of Knowledge Management Elb
Monitoring And Evaluation Of Knowledge Management   ElbMonitoring And Evaluation Of Knowledge Management   Elb
Monitoring And Evaluation Of Knowledge Management ElbEwen Le Borgne
 
Organization Development
Organization DevelopmentOrganization Development
Organization DevelopmentSalman Hameed
 
Humidtropics Systems Conference Facilitator's Presentation by Jürgen Hagmann
Humidtropics Systems Conference Facilitator's Presentation by Jürgen HagmannHumidtropics Systems Conference Facilitator's Presentation by Jürgen Hagmann
Humidtropics Systems Conference Facilitator's Presentation by Jürgen HagmannHumidtropics, a CGIAR Research Program
 
Business Communication: Chap 3 -face to face meeting
Business Communication: Chap 3  -face to face meetingBusiness Communication: Chap 3  -face to face meeting
Business Communication: Chap 3 -face to face meetingBrenda Rachel Marie
 
Academic Personal Branding - Softskill_&_Research.pptx
Academic Personal Branding - Softskill_&_Research.pptxAcademic Personal Branding - Softskill_&_Research.pptx
Academic Personal Branding - Softskill_&_Research.pptxDR. Ram Kumar Pathak
 
cupdf.com_chapter-1-definitions-and-foundations-of-od.ppt
cupdf.com_chapter-1-definitions-and-foundations-of-od.pptcupdf.com_chapter-1-definitions-and-foundations-of-od.ppt
cupdf.com_chapter-1-definitions-and-foundations-of-od.pptMuskanMere
 
Educational evaluation. ed8 chapter 6
Educational evaluation. ed8 chapter 6Educational evaluation. ed8 chapter 6
Educational evaluation. ed8 chapter 6Eddie Abug
 
ID introduction instructional design for health professionals
ID introduction  instructional design for health professionalsID introduction  instructional design for health professionals
ID introduction instructional design for health professionalsVaikunthan Rajaratnam
 
Transplanning: Transition and Planning
Transplanning: Transition and PlanningTransplanning: Transition and Planning
Transplanning: Transition and PlanningAIESEC Medina
 
INtroduction to Organizational development
INtroduction to Organizational developmentINtroduction to Organizational development
INtroduction to Organizational developmentDrNajmonnisa
 
Chapter 6 exercises in assessment 2
Chapter 6 exercises in assessment 2Chapter 6 exercises in assessment 2
Chapter 6 exercises in assessment 2Geneiva Anne Casas
 
Comp10 unit9 lecture_slides
Comp10 unit9 lecture_slidesComp10 unit9 lecture_slides
Comp10 unit9 lecture_slidesCMDLMS
 
Essentials of Building a culture of feedback - pulse survey
Essentials of Building a culture of feedback - pulse surveyEssentials of Building a culture of feedback - pulse survey
Essentials of Building a culture of feedback - pulse surveyXoxoday
 

Similar to Role of Board in Monitoring Evaluation (20)

Process assumptions-values-n-beliefs-of-od
Process assumptions-values-n-beliefs-of-odProcess assumptions-values-n-beliefs-of-od
Process assumptions-values-n-beliefs-of-od
 
FGS 2015 - Strategic Planning for Society Leaders
FGS 2015 - Strategic Planning for Society LeadersFGS 2015 - Strategic Planning for Society Leaders
FGS 2015 - Strategic Planning for Society Leaders
 
An effective way to change organizations
An effective way to change organizationsAn effective way to change organizations
An effective way to change organizations
 
Management & leadership leprosy 7th july
Management & leadership leprosy 7th julyManagement & leadership leprosy 7th july
Management & leadership leprosy 7th july
 
Vicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact EvaluationVicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact Evaluation
 
programme evaluation by priyadarshinee pradhan
programme evaluation by priyadarshinee pradhanprogramme evaluation by priyadarshinee pradhan
programme evaluation by priyadarshinee pradhan
 
Monitoring And Evaluation Of Knowledge Management Elb
Monitoring And Evaluation Of Knowledge Management   ElbMonitoring And Evaluation Of Knowledge Management   Elb
Monitoring And Evaluation Of Knowledge Management Elb
 
Organization Development
Organization DevelopmentOrganization Development
Organization Development
 
Humidtropics Systems Conference Facilitator's Presentation by Jürgen Hagmann
Humidtropics Systems Conference Facilitator's Presentation by Jürgen HagmannHumidtropics Systems Conference Facilitator's Presentation by Jürgen Hagmann
Humidtropics Systems Conference Facilitator's Presentation by Jürgen Hagmann
 
Business Communication: Chap 3 -face to face meeting
Business Communication: Chap 3  -face to face meetingBusiness Communication: Chap 3  -face to face meeting
Business Communication: Chap 3 -face to face meeting
 
Change Calling: VAP intro
Change Calling:  VAP introChange Calling:  VAP intro
Change Calling: VAP intro
 
Academic Personal Branding - Softskill_&_Research.pptx
Academic Personal Branding - Softskill_&_Research.pptxAcademic Personal Branding - Softskill_&_Research.pptx
Academic Personal Branding - Softskill_&_Research.pptx
 
cupdf.com_chapter-1-definitions-and-foundations-of-od.ppt
cupdf.com_chapter-1-definitions-and-foundations-of-od.pptcupdf.com_chapter-1-definitions-and-foundations-of-od.ppt
cupdf.com_chapter-1-definitions-and-foundations-of-od.ppt
 
Educational evaluation. ed8 chapter 6
Educational evaluation. ed8 chapter 6Educational evaluation. ed8 chapter 6
Educational evaluation. ed8 chapter 6
 
ID introduction instructional design for health professionals
ID introduction  instructional design for health professionalsID introduction  instructional design for health professionals
ID introduction instructional design for health professionals
 
Transplanning: Transition and Planning
Transplanning: Transition and PlanningTransplanning: Transition and Planning
Transplanning: Transition and Planning
 
INtroduction to Organizational development
INtroduction to Organizational developmentINtroduction to Organizational development
INtroduction to Organizational development
 
Chapter 6 exercises in assessment 2
Chapter 6 exercises in assessment 2Chapter 6 exercises in assessment 2
Chapter 6 exercises in assessment 2
 
Comp10 unit9 lecture_slides
Comp10 unit9 lecture_slidesComp10 unit9 lecture_slides
Comp10 unit9 lecture_slides
 
Essentials of Building a culture of feedback - pulse survey
Essentials of Building a culture of feedback - pulse surveyEssentials of Building a culture of feedback - pulse survey
Essentials of Building a culture of feedback - pulse survey
 

More from Mwiza Helen

Gender dynamics in public policy management in uganda & south africa
Gender dynamics in public policy management in uganda & south africaGender dynamics in public policy management in uganda & south africa
Gender dynamics in public policy management in uganda & south africaMwiza Helen
 
Utamu postgraduate educational model 1
Utamu postgraduate educational model 1Utamu postgraduate educational model 1
Utamu postgraduate educational model 1Mwiza Helen
 
Role of Board in Monitoring Evaluation
Role of Board in Monitoring EvaluationRole of Board in Monitoring Evaluation
Role of Board in Monitoring EvaluationMwiza Helen
 
Role of Board in Monitoring Evaluation
Role of Board in Monitoring EvaluationRole of Board in Monitoring Evaluation
Role of Board in Monitoring EvaluationMwiza Helen
 
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION Mwiza Helen
 
Supervision of graduate studnets
Supervision of graduate studnetsSupervision of graduate studnets
Supervision of graduate studnetsMwiza Helen
 
Procurement 2015
Procurement 2015Procurement 2015
Procurement 2015Mwiza Helen
 
Comesa innovation-awards-2015
Comesa innovation-awards-2015Comesa innovation-awards-2015
Comesa innovation-awards-2015Mwiza Helen
 
Best practices-in-elearning
Best practices-in-elearningBest practices-in-elearning
Best practices-in-elearningMwiza Helen
 
Best practices-in-elearning
Best practices-in-elearningBest practices-in-elearning
Best practices-in-elearningMwiza Helen
 
Mo fa effective use of ict tools-05-01-2015
Mo fa effective use of ict tools-05-01-2015Mo fa effective use of ict tools-05-01-2015
Mo fa effective use of ict tools-05-01-2015Mwiza Helen
 
Best practices on corporate governance of higher education 1....
Best practices on corporate governance of higher education 1....Best practices on corporate governance of higher education 1....
Best practices on corporate governance of higher education 1....Mwiza Helen
 
Management and leadership notu-1.....
Management and leadership notu-1.....Management and leadership notu-1.....
Management and leadership notu-1.....Mwiza Helen
 
Management and leadership notu
Management and leadership notuManagement and leadership notu
Management and leadership notuMwiza Helen
 
Management and leadership notu
Management and leadership notuManagement and leadership notu
Management and leadership notuMwiza Helen
 
Management and leadership notu
Management and leadership notuManagement and leadership notu
Management and leadership notuMwiza Helen
 
Management and leadership NOTU
Management and leadership NOTUManagement and leadership NOTU
Management and leadership NOTUMwiza Helen
 
The role of procurement in the realisation of
The role of procurement in the realisation ofThe role of procurement in the realisation of
The role of procurement in the realisation ofMwiza Helen
 
The role of procurement in the realisation of
The role of procurement in the realisation ofThe role of procurement in the realisation of
The role of procurement in the realisation ofMwiza Helen
 
Public service transformation in africa key note speech
Public service transformation in africa key note speechPublic service transformation in africa key note speech
Public service transformation in africa key note speechMwiza Helen
 

More from Mwiza Helen (20)

Gender dynamics in public policy management in uganda & south africa
Gender dynamics in public policy management in uganda & south africaGender dynamics in public policy management in uganda & south africa
Gender dynamics in public policy management in uganda & south africa
 
Utamu postgraduate educational model 1
Utamu postgraduate educational model 1Utamu postgraduate educational model 1
Utamu postgraduate educational model 1
 
Role of Board in Monitoring Evaluation
Role of Board in Monitoring EvaluationRole of Board in Monitoring Evaluation
Role of Board in Monitoring Evaluation
 
Role of Board in Monitoring Evaluation
Role of Board in Monitoring EvaluationRole of Board in Monitoring Evaluation
Role of Board in Monitoring Evaluation
 
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
 
Supervision of graduate studnets
Supervision of graduate studnetsSupervision of graduate studnets
Supervision of graduate studnets
 
Procurement 2015
Procurement 2015Procurement 2015
Procurement 2015
 
Comesa innovation-awards-2015
Comesa innovation-awards-2015Comesa innovation-awards-2015
Comesa innovation-awards-2015
 
Best practices-in-elearning
Best practices-in-elearningBest practices-in-elearning
Best practices-in-elearning
 
Best practices-in-elearning
Best practices-in-elearningBest practices-in-elearning
Best practices-in-elearning
 
Mo fa effective use of ict tools-05-01-2015
Mo fa effective use of ict tools-05-01-2015Mo fa effective use of ict tools-05-01-2015
Mo fa effective use of ict tools-05-01-2015
 
Best practices on corporate governance of higher education 1....
Best practices on corporate governance of higher education 1....Best practices on corporate governance of higher education 1....
Best practices on corporate governance of higher education 1....
 
Management and leadership notu-1.....
Management and leadership notu-1.....Management and leadership notu-1.....
Management and leadership notu-1.....
 
Management and leadership notu
Management and leadership notuManagement and leadership notu
Management and leadership notu
 
Management and leadership notu
Management and leadership notuManagement and leadership notu
Management and leadership notu
 
Management and leadership notu
Management and leadership notuManagement and leadership notu
Management and leadership notu
 
Management and leadership NOTU
Management and leadership NOTUManagement and leadership NOTU
Management and leadership NOTU
 
The role of procurement in the realisation of
The role of procurement in the realisation ofThe role of procurement in the realisation of
The role of procurement in the realisation of
 
The role of procurement in the realisation of
The role of procurement in the realisation ofThe role of procurement in the realisation of
The role of procurement in the realisation of
 
Public service transformation in africa key note speech
Public service transformation in africa key note speechPublic service transformation in africa key note speech
Public service transformation in africa key note speech
 

Recently uploaded

PSYCHIATRIC History collection FORMAT.pptx
PSYCHIATRIC   History collection FORMAT.pptxPSYCHIATRIC   History collection FORMAT.pptx
PSYCHIATRIC History collection FORMAT.pptxPoojaSen20
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docxPoojaSen20
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 

Recently uploaded (20)

PSYCHIATRIC History collection FORMAT.pptx
PSYCHIATRIC   History collection FORMAT.pptxPSYCHIATRIC   History collection FORMAT.pptx
PSYCHIATRIC History collection FORMAT.pptx
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docx
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 

Role of Board in Monitoring Evaluation

  • 1. Role of board in Monitoring and Evaluation Professor Benon C Basheka, PhD, FCIPS Uganda Technology and Management University
  • 2. LEARNING OUTCOMES By the end of the training, board members should be able to:- 1. Describe the importance of monitoring and evaluation in organizational effectiveness 2. Apply M and E knowledge in the day today activities of their work in ABiTRust 3. Relate M and E requirements to the DEC standards 4. Oversees establishment of an effective M and E system for the organization
  • 3. TRAINING METHODOLOGY 1. Presentations 2. Individual experiences 3. Discussions
  • 4. Session coverage The training will generally cover three units:- 1. General Introduction and Role of Boards of Trustees (with relevance to M and E) 2. Requirements for establishing an M and E system (and role of the board) 3. The Evaluation Process and Roles the board
  • 5. UNIT 1: GENERAL INTRODUCTION AND ROLE OF THE BOARD IN M & E
  • 6. INTRODUCTION • Monitoring and evaluation are increasingly being emphasized in all endeavors • The need to determine what interventions are working or not and why now necessitate organizations to invest in building sound monitoring and evaluation systems • In almost all organizations where boards exist, boards generally have oversight functions-overseeing all activities of the organization (M and E Inclusive) • The range of things where evaluation applies is limited but can include projects, programmes, processes, systems etc
  • 7. Monitoring/Evaluation is to be seen as:- • As a profession with a set of standards • As a field of practice (Monitoring and evaluation units and staff therein) • As a discipline of study which is an academic field now declared to be in adolescent stages (universities and other higher educational institutions
  • 8. Organizations have advisory or policy organs 1. Board of governors 2. Board of managers 3. Board of regents 4. Board of trustees, and 5. Board of visitors
  • 9. Different Modes of Board Governance 1. Advisory Board Model 2. Patron Model 3. Co-operative Model 4. Management Team Model 5. Policy Board Model [AbiTrust?]
  • 10. Abi-TRUST Governance Founders Committee Management Staff M and E unit Board of Trustees
  • 11. Typical duties of boards of directors include: 1. Governing the organization by establishing broad policies and objectives 2. Selecting, appointing, supporting and reviewing the performance of the chief executive 3. Ensuring the availability of adequate financial resources 4. Approving annual budgets 5. Approving annual plans 6. Approving and reviewing Reports for management activities
  • 12. Working Methodology of Boards • In larger organizations, the structure of the board and its committees usually mirrors the structure of the organization's administration. • Just as there are staff responsible for human resources, fund-raising, finance, planning, and programs, the board creates committees with responsibility for these areas. • One of the areas of board committee will be monitoring and evaluation • Even the performance of the board or its committees needs to be measured
  • 13. Strategically 1. Mobilize resources 2. Look for networks 3. Design and approve policy 4. Provide oversight 5. Participate in evaluation
  • 14. Boards Strength lies on :- 1. Composition 2. Expertise 3. Experience 4. Qualifications 5. Networks
  • 15. CONTEXTUALLY • The Board of Trustees of aBi Trust has a four-fold mandate: 1. To protect the Trust’s assets over time and ensure survival and the prosperity of the Trust in a transparent, accountable and responsible manner 2. To guide the Trust in fulfilling its Vision, mission and objectives 3. To give strategic direction to aBi Trust Management 4. To protect the Trust’s interests.
  • 16. What is the implication? 1. The mandates and functions of the board cannot be efficiently and effectively achieved without sound knowledge of M and E 2. M and E supplies knowledge for oversight and decision making 3. M and E supplies the board with necessary tools and methodologies 4. M and E provides the right attitude and mindset for involvement 5. M and E shapes the decision making to support building required systems
  • 17. Is Evaluation new?  Evaluation is as old as the world itself and has moved side by side the journey of human civilization  The scriptures tell us under genesis 1:31 that when God created the earth, the light in the darkness, the firmament in the midst in the midst of the waters, the plants, the animals, and finally man, at the end of the fifth day’…God saw everything that he had made, and behold, it was very good  He used some criteria unknown to us and this enabled him to make an assessment on whose findings he was able to make a fundamental decision of not scrapping what he had done
  • 18. God’s archangel came then, asking, “God, how do you know that what you have created is ‘very good’? What are your criteria? On what data do you base your judgment? Just what results were you expecting to attain? And aren’t you a little close to the situation to make a fair and unbiased evaluation?” God thought about these questions all that day, and God’s rest was greatly disturbed. On the eighth day God said, “Lucifer, go to hell.” — Michael Q. Patton
  • 19. The second example • From the philosophical works of Socrates, Plato and Aristotle to the mathematical methodologies of Pythagoras and Euclid, the ideas of the ancient Greeks shaped many institutions and contributed to many fields, including evaluation. • Existing scholarly accounts inform us that the Delphic oracle of the ninth to the third centuries BC was the first central intelligence database of the ancient world, an interdisciplinary think tank of approximately 90 priests, deemed the best-educated experts of antiquity. • They collected and evaluated information and advised ordinary people and leaders, among them Alexander the Great. Major project management already existed in the fourth century BC.
  • 20. In ancient Greece, • the practices of the 12 gods, called Olympians because they were stationed on Olympus, the highest mountain in Greece, shed some light on how evaluation was done. • The council of the Olympian gods and goddesses made collective decisions with input from an expert panel, which consisted of Zeus (the president of the gods), Athena (the goddess of wisdom), Hermes (the god of information and commerce), and any other god whose area of expertise was pertinent to the subject in question.
  • 21. In their working methodology • Meetings were problem-oriented participatory sessions, characterized by intense discussions and searches for the best solution. • The gods' decisions were persuasively communicated to mortals and powerfully implemented, with follow-up reports (Theofanides 1999). • The Olympian style of management and decision making is illustrated in the steps below:-
  • 22. 1. Identify the problem or theme for action. Collect all relevant information and data through the intelligence work of Hermes, the god of informatics. 2. Search for solutions via dialogue with all participants. Discuss step 1 with all concerned parties and propose alternative solutions. 3. Select the best problem solution or action theme, mainly by conferring with the concerned party or parties. 4. Announce the decision of the gods to all mortals concerned through the god of informatics, Hermes. Send Peitho, the goddess of persuasion, to illuminate the best solution in step 3 as the decision of the gods of Olympus. 5. Use lightning and thunderbolts to implement the Olympian decisions in step 4 to achieve the desired goals identified in steps 1 and 3. 6. Implement all decisions, supervised by Hermes, the god of informatics, who announces to the Olympian gods the results of the action taken in step 5.
  • 23. In contemporary times…. • Monitoring and evaluation (M&E) is an essential part of any program, project or policy and is used in all kinds of contexts. • Development partners increasingly expect their partners to have sound M and E systems • Monitoring and evaluation can tell us:- • Whether a program, policy or project is making a difference, and for whom; • It can identify program areas that are on target, or aspects of a program that need to be adjusted or replaced. • Information gained from M&E can lead to better decisions about program investments. • It can demonstrate to program implementers and funders that their investments are paying off • It is a tool for ensuring accountability to other stakeholders
  • 24. Monitoring and evaluation can: • Help identify problems and their causes; • Suggest possible solutions to problems; • Raise questions about assumptions and strategy; • Push you to reflect on where you are going and how you are getting there; • Provide you with information and insight; • Encourage you to act on the information and insight; • Increase the likelihood that you will make a positive development difference.
  • 25. We conduct evaluations….  To help people make better decisions and achieve better outcomes  To provide better services (public and private) • By:  Comparing policy options objectively and rigorously  Calculating empirically the size of likely impacts  Calculating empirically the diversity/variance of impacts  Getting more precise estimates of risk and bias  Establishing a cumulative evidence base for decision making
  • 26. An evaluation answers questions such as…. 1. Does it work? 2. How well does it work? 3. Does it do what we want it to? 4. Does it work for the reasons we think it does? 5. Is it cost effective? 6. Are the benefits worth it? 7. What are the unintended consequences?
  • 27. Challenges facing developing countries in M and E Masuku and Ijeoma (2015: 5) report the following:- 1. Designing M and E 2. Context challenges 3. Cooperation and coordination 4. Institutional challenges
  • 28. …. 5. Lack of stakeholder involvement 6. Compliance 7. Linking planning, budget, priorities and M and E 8. Lack of integration with other strategic approaches
  • 29. Other Challenges include:- 1. Capacity challenges 2. Poor coordination 3. Lack of legislative structures 4. Locus and focus problems 5. Elite capture vs stakeholder involvement 6. Absence of theory of change 7. Lack of evidence and truth
  • 30. Types of Evaluation • Evaluation Type by Level 1) Project-level evaluation 2) Program-level evaluation 3) Sector program evaluation 4) Thematic evaluation 5) Policy-level evaluation • Evaluation Types by Stages of Project Cycle 1) Ex-ante Evaluation 2) Mid-term evaluation 3) Terminal evaluation 4) Impact evaluation 5) Ex-post evaluation
  • 31. Types of evaluation based on the results chain 1. Context evaluation 2. Input evaluation 3. Process evaluation 4. Output evaluation 5. Outcome evaluation 6. Impact evaluation
  • 32. Professionalization of evaluation • By 2010, there were more than 65 national and regional evaluation organizations throughout the world, most in developing countries • Although specialized training programs have existed for several decades, graduate degree programs in evaluation have emerged only recently – Australasia – Africa – Canada – Central America – Europe (not every country) – Japan – Malaysia – United Kingdom
  • 33. Professional Standards • Utility • Feasibility • Propriety • Accuracy • Evaluation Accountability
  • 34. DISCIPLINES FROM WHICH EVALUATION BORROWS? •‘Social Research Methods’ •Sociology •Economics •Statistics •Development studies •Public Administration •Social Anthropology •Education •Project Management •Management •Engineering •Policy Analysis •History
  • 35. Structure of an evaluation-Commissioner’s perspective 1. Concept Paper 2. RFQ/Proposal 3. Evaluation of EOI/Proposal 4. Contract negotiation 5. Providing contacts and support 6. Quality control 7. Providing information 8. Approval of the report 9. Discussion of results 10. Discussion of consequences 11. Managing implementation of recommendations
  • 36. Structure of an evaluation-Evaluator’s Perspective 1. EOI/Proposal 2. Contract Negotiation 3. Planning workshop meeting 4. Clarifying organizational questions 5. Inception report 6. Data collection, analysis and interpretation 7. Continuous coordination and exchange of information among parties 8. Draft evaluation report 9. Final evaluation report 10. Closing workshop 11. Follow-up
  • 37. The Policy Maker’s Perspective? 1. Emerging questions 2. Directives on how to address the questions 3. Participation as respondents 4. Participation in workshops discussing results 5. Utilization of evaluation results 6. Change in policy as a result of the evaluation
  • 38. Tasks 1. As board members, what comes to your mind when you hear or read about monitoring and evaluation? 2. What came to your mind when you were told you were going to undertake training in M and E?
  • 39. Principles of Evaluation In every evaluation, it is clear certain questions have to be addressed:- 1. What is to be evaluated (evaluands) 2. Why should the evaluation be conducted (purpose) 3. What criteria should be applied 4. Who should conduct the evaluation 5. When should the evaluation be conducted 6. How should the evaluation be conducted
  • 40. 1. What do we evaluate? •The things to be evaluated (evaluands) nowadays range from:- •Laws •Products •Services •Organizations •People •Processes •Social states of affairs of any kind (Stockmann & Meyer 2013:67)
  • 41. • On the global political level: 1. Global Goals (the Millennium Development Goals) 2. International Conventions (the Geneva Conventions) 3. OECD DAC Standards?
  • 42. Millennium Development Goals 1) Eradicate extreme poverty and hunger 2) Achieve universal primary education 3) Promote gender equality and empower women 4) Reduce child mortality 5) Improve maternal health 6) Combat HIV/AIDS, malaria, and other diseases 7) Ensure environmental sustainability 8) Develop a global partnership for development
  • 43. • On the African political level 1. African political federation 2. African Union peace and military initiatives 3. African participation in the ICC 4. African governance mechanisms (the APRM) 5. New Partnership for Africa’s Development (NEPAD)
  • 44. • On the regional political level (East African Community level) 1. East African political federation 2. East African Customs Union 3. Northern Corridor interventions 4. Oil pipeline interventions 5. Standard-gauge railway interventions
  • 45. On the country political level 1. National Development Plans 2. Vision 2040 3. The NRM Manifesto (2011-2016) 4. Strategies (NAADS interventions, basket funding strategies)
  • 46. • The Programme Level • Things become a bit more concrete when we move on to the programme level • Interventions take place and are followed up by Monitoring & Evaluation. • The idea here is to assess how programmes work. • The following are the most common examples of evaluation objects: • Programmes • Projects • Single Measures/Activities/Outputs • Competences/Resources/Inputs
  • 47. • The System Level (Board plays a leading role) • At the system level, things again become more abstract and less easy to handle. • Typical objects are: • Structures/Systems • Networks • Organisations • Institutions • Rules/Norms/Curricula/Agreements
  • 48. • The Performance Level  The question shifts to the way a policy/programme/system/intervention etc. evolves.  Nowadays there is a much greater focus on the performance aspect of programmes (and also on the systemic view) than in former times.  The main objects of performance evaluations are: 1. Results/Impacts 2. Performances/Processes 3. Management/Governance 4. Operations/Actions
  • 49. • The Individual Level • The assessment focuses on either group processes or individual behaviour and the attitudes behind them. • So the main objects are: 1) Interaction/Group Behaviour 2) Communication/Exchange 3) Individual Behaviour 4) Cognitive Processes/Attitudes
  • 50. 2. Why evaluation? 1. Providing gainful insights 2. Exercising control 3. Initiation of development and learning processes 4. Legitimization of measures, projects or programmes implemented 5. Accountability roles 6. Observing implementation processes 7. Assessing the feasibility of a programme (programme development phase: formative) 8. Supporting managers in management (during implementation phases)
  • 51. 3. What assessment criteria? • The Development Assistance Committee (DAC) of the Organisation for Economic Co-operation and Development (OECD) has developed criteria towards which many national organizations orient themselves: A. Relevance B. Effectiveness C. Efficiency D. Impact E. Sustainability
  • 52. Note • If there are standards like those of the DAC, they are directly stated by the client • Sometimes it is left to the evaluator to determine the criteria, as he or she is considered an expert who ought to know which criteria best fit what is to be evaluated; this is knowledge- or experience-based • It is rare for the criteria to be set by the target group
  • 53. 4. Who should conduct the evaluation? • Evaluation is best thought of as a team effort. • Although one person heads an evaluation team and has primary responsibility for the project, this individual will need assistance from other staff. • An evaluation team will work together on the following tasks: 1. Determining the focus and design of the evaluation. 2. Developing the evaluation plan, performance indicators, and data collection instruments. 3. Collecting, analyzing, and interpreting data. 4. Preparing the report on evaluation findings.
  • 54. Options on who to conduct evaluation 1) Hiring an outside evaluator (option 1). 2) Using an in-house evaluation team supported by an outside consultant and program staff (option 2). 3) Using an in-house evaluation team supported by program staff (option 3).
  • 55. Note:- •Evaluators are diverse: •They might be economists concerned with efficiency and costs; •Management consultants interested in the smooth running of the organization; •Policy analysts with a commitment to public sector reforms and transparency; •Scientists concerned to establish truth, generate new knowledge and confirm or disconfirm hypotheses.
  • 56. 5. When should the evaluation be conducted? 1. Before the intervention 2. During implementation 3. Midway through the implementation process 4. After implementation
  • 57. 6. How will the evaluation be conducted? 1. Scientific-oriented approaches 2. Management-oriented approaches 3. Participant-oriented approaches 4. Qualitative-oriented approaches
  • 58. Session 2: Building Monitoring and Evaluation systems
  • 59. Board establishes M and E System • One may define an M&E system as a collection of people, procedures, data and technology that interact to provide timely information for authorized decision-makers • M and E systems are systems used to monitor and evaluate a project, program or organization to see if it is on track to achieve its overall outcomes
  • 60. If you have a project for example….
  • 61. A good M/E system will 1. Monitor the use of project inputs 2. Monitor the effectiveness of the project implementation process 3. Monitor the production of project outputs 4. Assess project impacts on the target communities 5. Assess the effectiveness of project outputs in producing the intended short-term and long-term impacts 6. Assess the extent to which these impacts can be attributed to the effects of the project
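In data terms, such a system boils down to indicators tracked against targets at each point of the results chain. Below is a minimal sketch in Python of how a project's indicator records might be modelled; the class, field names, and figures are hypothetical illustrations, not a prescribed design.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One measurable indicator on the results chain."""
    name: str
    level: str                 # e.g. "input", "output", "outcome", "impact"
    target: float
    actuals: dict = field(default_factory=dict)   # reporting period -> value

    def record(self, period: str, value: float) -> None:
        """Store a monitored value for a reporting period."""
        self.actuals[period] = value

    def on_track(self, period: str) -> bool:
        """Crude check: has the target been reached by this period?"""
        return self.actuals.get(period, 0.0) >= self.target

# Hypothetical output indicator for an agribusiness training project
trained = Indicator(name="farmers trained", level="output", target=500)
trained.record("2016-Q1", 180)
trained.record("2016-Q2", 530)
print(trained.on_track("2016-Q2"))   # True: 530 >= the target of 500
```

A real M&E system would add the people and procedures around such records: who collects each value, how it is verified, and who reviews the on/off-track flags.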
  • 62. Why build an M and E system? 1) Supports planning activities at the sectoral and program level 2) Provides information for a more efficient allocation of public funds 3) Facilitates program management 4) Helps re-design and improve programs 5) Promotes transparency and accountability 6) Enriches policy discussion by incorporating rigorous evidence
  • 63. M and E systems have 12 key features 1. Organizational Structures with M&E Functions  The M&E unit’s main purpose is to coordinate all the M&E functions.  Some organizations prefer to outsource such services.  The M&E unit should have its roles defined, those roles should be supported by the organization’s hierarchy, and other units within the organization should be aligned to support the M&E functions.
  • 64. … 2. Human Capacity for M&E  Effective M&E implementation requires not only adequate staff but staff with the necessary M&E technical know-how and experience.  It is necessary to have a human resource base that can run the M&E function by hiring employees who have adequate knowledge and experience in M&E implementation  Ensure that the M&E capacity of these employees is continuously developed through training and other capacity-building initiatives, so that they keep up with current and emerging trends in the field
  • 65. …. 3. Partnerships for Planning, Coordinating and Managing the M&E System  A prerequisite for successful M&E systems, whether at organizational or national levels, is the existence of M&E partnerships.  Partnerships complement the organization’s M&E efforts and act as a source of verification that M&E functions align with the intended objectives.  Partnerships also serve auditing purposes, where line ministries, technical working groups, communities and other stakeholders are able to compare M&E outputs with reported outputs.
  • 66. …. 4. M&E Frameworks/Logical Framework  The M&E framework outlines the objectives, inputs, outputs and outcomes of the intended project, and the indicators that will be used to measure all of these.  It also outlines the assumptions that the M&E system will adopt.  The M&E framework is essential as it links the objectives with the process and enables the M&E expert to know what to measure and how to measure it.
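To make the framework concrete, a logframe can be represented as a small table linking each result level to its indicators, means of verification, and assumptions. The sketch below is a hypothetical illustration; the field names and example content are assumptions for demonstration, not a standard schema.

```python
# Minimal logical-framework rows: one entry per result level.
# All statements, indicators, and assumptions here are invented examples.
logframe = [
    {
        "level": "outcome",
        "statement": "Smallholder incomes increase in target districts",
        "indicators": ["% change in median household income"],
        "means_of_verification": ["annual household survey"],
        "assumptions": ["produce prices remain broadly stable"],
    },
    {
        "level": "output",
        "statement": "Farmers adopt improved post-harvest handling",
        "indicators": ["# of farmers applying improved practices"],
        "means_of_verification": ["field monitoring visits"],
        "assumptions": ["trained farmers remain in the programme area"],
    },
]

# Walking the rows shows what to measure at each level and how.
for row in logframe:
    print(f'{row["level"]}: {row["statement"]} -> measured by {row["indicators"][0]}')
```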
  • 67. … 5. M&E Work Plan and Costs  Closely related to the M&E framework is the M&E work plan and its costs.  While the framework outlines the objectives, inputs, outputs and outcomes of the intended project, the work plan outlines how the resources that have been allocated for the M&E functions will be used to achieve the goals of M&E.  The work plan shows how personnel, time, materials and money will be used to achieve the set M&E functions
  • 68. …. 6. Communication, Advocacy and Culture for M&E  This refers to the presence of policies and strategies within the organization to promote M&E functions.  Without continuous communication and advocacy initiatives within the organization to promote M&E, it is difficult to entrench an M&E culture within the organization.  Such communication and strategies need to be supported by the organization’s hierarchy.  The existence of an organizational M&E policy, together with the continuous sharing of the M&E system’s outputs through communication channels, are some of the ways of improving communication, advocacy and culture for M&E
  • 69. …. 7. Routine Programme Monitoring  M&E consists of two major aspects: monitoring and evaluation.  This component emphasizes the importance of monitoring.  Monitoring refers to the continuous and routine data collection that takes place during project implementation.  Data need to be collected and reported on a continuous basis to show whether the project activities are driving towards meeting the set objectives.  Data collection and analysis also need to be integrated into the program activities as a routine.
  • 70. …. 8. Surveys and Surveillance  This mainly concerns national-level M&E plans and entails how frequently relevant national surveys are conducted in the country.  National surveys and surveillance need to be conducted frequently and used to evaluate the progress of related projects.  For example, for HIV and AIDS national M&E plans, HIV-related surveys need to be carried out at least bi-annually and used to measure HIV indicators at the national level.
  • 71. … 9. National and Sub-national Databases  The data world is gradually becoming open source.  More and more entities are seeking data relevant for their purposes.  The need for M&E systems to make data available can therefore not be over-emphasized.  This implies that M&E systems need to develop strategies for submitting relevant, reliable and valid data to national and sub-national databases.
  • 72. … 10. Supportive Supervision and Data Auditing  Every M&E system needs a plan for supervision and data auditing.  Supportive supervision implies that an individual or organization regularly supervises the M&E processes in such a way that the supervisor offers suggestions on ways of improvement.  Data auditing implies that the data are subjected to verification to ensure their reliability and validity.  Supportive supervision is important since it ensures the M&E process is run efficiently, while data auditing is crucial since all project decisions are based on the data collected.
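Part of data auditing lends itself to simple automated validity checks before figures enter the system. The function below is a hedged sketch; the required fields and the plausibility rule are invented for illustration and would be replaced by an organization's own data-quality rules.

```python
def audit_record(record: dict) -> list:
    """Return audit flags for one monitoring record.
    Required fields and the plausibility rule are illustrative only."""
    flags = []
    # Completeness check: each record should say what, when, how much, and from where.
    for field_name in ("indicator", "period", "value", "source"):
        if not record.get(field_name):
            flags.append(f"missing {field_name}")
    # Plausibility check: counts of people or items cannot be negative.
    value = record.get("value")
    if isinstance(value, (int, float)) and value < 0:
        flags.append("negative value is implausible")
    return flags

# Hypothetical record with two problems: no source, negative value
print(audit_record({"indicator": "farmers trained", "period": "2016-Q1", "value": -20}))
# -> ['missing source', 'negative value is implausible']
```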
  • 73. … 11. Evaluation and Research  One aspect of M&E is research and the other is evaluation.  Evaluation of projects is done at specific times, most often mid-term and at the end of the project.  Evaluation is an important component of M&E as it establishes whether the project has met the desired objectives.  It usually provides for organizational learning and the sharing of successes with other stakeholders.
  • 74. …. 12. Data Dissemination and Use  The information that is gathered during the project implementation phase needs to be used to inform future activities, either to reinforce the implemented strategy or to change it.  Additionally, the results of both monitoring and evaluation outputs need to be shared with relevant stakeholders for accountability purposes.  Organizations must therefore ensure that there is an information dissemination plan, either in the M&E plan, the work plan, or both.
  • 75. Session three: The Evaluation Process and Role of the Board
  • 76. The evaluation process – an overview • The evaluation manager’s role (in consultation with the steering committee): clarify policy/programme objectives and intended outcomes; clarify the intended evaluation purpose, users and uses; develop relevant evaluation questions; select the evaluation approach and methods; identify data sources and collection and analysis procedures; identify the necessary resources and governance arrangements; prepare the TOR; commission (and possibly tender) the evaluation. • The evaluator’s role: conduct and manage the evaluation (our main focus in this session).
  • 77. Cycle of conducting and managing evaluations: Terms of Reference (START) → Evaluation Assessment (2-3 months) → Contracting (3-4 months) → Field Work/Analysis (6-8 months) → Report & Recommendations (2-4 months) → Management Responses (1 month) → Executive Approval; Internet Posting (2-3 months) → Implementing Change/Follow-Up. Large evaluations typically take 12-18 months to complete; some phases may overlap.
  • 78. Steps of the evaluation process • Steps: 1. Engage stakeholders 2. Describe the program 3. Focus the evaluation design 4. Gather credible evidence 5. Justify conclusions 6. Use and share lessons learned • Standards applied throughout: Utility, Feasibility, Propriety, Accuracy
  • 79. Step 1- Engage Stakeholders Who are the stakeholders? Those involved in program operations, those affected by the program operations, and users of evaluation results
  • 80. Step 2 - Describe the Program  What are the goals and specific aims of the program?  What problem or need is it designed to address?  What are the measurable objectives? What are the strategies to achieve the objectives?  What are the expected effects?  What are the resources and activities?  How is the program supposed to work?
  • 81. Step 3 - Focus the evaluation design What do you want to know? Consider the purpose, uses, questions, methods, roles, budgets, deliverables etc. An evaluation cannot answer all questions for all stakeholders.
  • 82. Step 4 - Gather credible evidence  Data collected must address the evaluation questions  Evidence must be believable, trustworthy and relevant  Information scope, sources, quality, logistics, methodology & data collection  Who is studied, and when?
  • 83. Step 5 - “Justify” Conclusions Consider the data you have: • Analysis and synthesis - determine findings. • Interpretation - what do findings mean? • Judgments - what is the value of findings based on accepted standards? • Recommendations - what claims can be made? what are the limitations of your design?
  • 84. Step 6 - Use and share results  Share lessons learned with stakeholders!  Provide feedback, offer briefings, disseminate findings  Implement evaluation recommendations  Develop a new/revised implementation plan in partnership with stakeholders
  • 85. Dealing with lack of baseline data  Several options (not mutually exclusive): Reconstructing baseline data ex post: recall method (more later) Use key informants and triangulate (mostly qualitative) Reconstruct a baseline “scenario” with secondary data (not always practical given the absence and quality of baseline studies) Single difference with econometric techniques: some practical obstacles (workload, time constraints, availability of trained specialists)
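To illustrate the last option, a single-difference estimate compares the post-intervention outcome of participants with that of a comparison group, standing in for the missing baseline. The figures below are invented for demonstration; a real analysis would need proper sampling, covariate controls, and standard errors.

```python
from statistics import mean

# Hypothetical post-intervention outcomes (e.g. maize yield in kg per acre)
treatment_group = [820, 910, 780, 875, 940]   # project participants
comparison_group = [700, 760, 690, 745, 720]  # similar non-participants

# Single difference: the mean outcome gap after the intervention.
# Without a baseline this attributes the whole gap to the project, which
# is only credible if the two groups were comparable to begin with.
single_difference = mean(treatment_group) - mean(comparison_group)
print(f"Estimated effect (single difference): {single_difference:.1f} kg/acre")
```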
  • 86. Best practices in managing evaluations  Identify the implementation logic and theory of change  Allow for the inception report phase  Deal with missing baseline data and other gaps  Gather data  Examine the effort using various criteria  Draw conclusions and recommendations  Conduct reporting  Ensure quality  Feedback on the evaluation  Management response  Dissemination of findings  Feedback and lessons learnt “If I had known too much information would make it complicated, I wouldn’t have asked for it!!!”
  • 87. Evaluation budget • Value for money is a key concern • Underfunding is as wasteful as over-funding • Balance between cost and quality • Quality is ultimately more important • But so is relevance for purpose • Make sure all aspects are adequately funded, including consultation with stakeholders, reporting and dissemination • Ensure the evaluation design is appropriate to the budget as well as to the aims of the programme
  • 88. Conclusions 1) Monitoring and evaluation is now a condition for collaboration between development partners and government 2) The board’s oversight role is well performed only if there are functioning M and E systems 3) International standards need to be integrated into local M and E systems 4) M and E units need to be adequately staffed and their capacity enhanced 5) The M and E budget needs to be supported by the board