This document outlines the key aspects of conducting a participatory evaluation. Participatory evaluations involve stakeholders throughout the evaluation process, from planning to acting on results. Their main characteristics are a focus on participant needs, inclusion of diverse views, and an emphasis on learning. Rapid appraisal methods are used to gather empirical data, and facilitators guide the participants, who conduct the evaluation themselves. The chief benefit is that participants are empowered to improve program performance; disadvantages can include a perceived lack of objectivity. The steps outlined cover deciding on the level of participation, preparing the scope of work, team planning, data collection, analysis, and creating an action plan.
Presentation: Training on Results-Based Management (RBM) for M&E Staff, by Fida Karim 🇵🇰
Planning, monitoring, evaluation, and reporting together for development results: what is Results-Based Management (RBM)?
Logical Framework Approach (LFA)
Planning for results
Monitoring for results
Evaluating for results
Enhancing the use of knowledge from monitoring and evaluation
Participatory Monitoring and Evaluation: background, concepts and principles, goals of PM&E, the PM&E process, stakeholder analysis, PM&E framework, plan, worksheet, and a case study using PM&E
Monitoring and evaluation is a vital component that determines the effectiveness of a corporation's assistance by establishing clear links between past, present, and future initiatives and results. The process helps improve programme performance and achieve desired results. It provides opportunities for fine-tuning, re-orienting, and planning the programme effectively; without it, the programme's success and impact cannot be measured even if the approach is right.
Self-Assessment of Organizational Capacity in Monitoring & Evaluation, by MEASURE Evaluation
A presentation that captures self-assessments of two teams of Ethiopian health officers (most of whom have M&E responsibilities): those from the SNNP Regional Health Bureau and those from the Sidama Zonal Health Department.
Promoting a culture of monitoring and evaluation in educational institutions: how to develop an M&E system, ground M&E planning in the Logical Framework Approach, and use the Logframe as a reference for M&E.
During this session we will:
*Review importance of monitoring and evaluation
*Share overview of grant model evaluation plan
*Review methodologies used in previous evaluations
*Share plans for future evaluation methodologies
Building a Results-Based Monitoring and Evaluation System (建立面向结果的监测与评价系统), by Dadang Solihin
Shanghai International Program for Development Evaluation Training, Asia-Pacific Finance and Development Center, 200 Panlong Road, Shanghai, October 10, 2008
Provides insights into the results-based planning process, including preparation of the results matrix, which helps manage scarce resources to achieve better results.
A Good Program Can Improve Educational Outcomes.pdf, uploaded by noblex1
We hope this guide helps practitioners and others strengthen programs designed to increase academic achievement, ultimately broadening access to higher education for youth and adults.
We believe that evaluation is a critical part of program design and is necessary for ongoing program improvement. Evaluation requires collecting reliable, current and compelling information to empower stakeholders to make better decisions about programs and organizational practices that directly affect students. A good evaluation is an effective way of gathering information that strengthens programs, identifies problems, and assesses the extent of change over time. A sound evaluation that prompts program improvement is also a positive sign to funders and other stakeholders, and can help to sustain their commitment to your program.
Theories of change are conceptual maps that show how and why program activities will achieve short-term, interim, and long-term outcomes. The underlying assumptions that promote, support, and sustain a program often seem self-evident to program planners. Consequently, they spend too little time clarifying those assumptions for implementers and participants. Explicit theories of change provoke continuous reflection and shared ownership of the work to be accomplished. Even the most experienced program planners sometimes make the mistake of thinking an innovative design will accomplish goals without checking the linkages among assumptions and plans.
Developing a theory of change is a team effort. The collective knowledge and experience of program staff, stakeholders, and participants contribute to formulating a clear, precise statement about how and why a program will work. Using a theory-based approach, program collaborators state what they are doing and why by working backwards from the outcomes they seek to the interventions they plan, and forward from interventions to desired outcomes. When defining a theory of change, program planners usually begin by deciding expected outcomes, aligning outcomes with goals, deciding on the best indicators to evaluate progress toward desired outcomes, and developing specific measures for evaluating results. The end product is a statement of the expected change that specifies how implementation, resources, and evaluation translate into desired outcomes.
Continuously evaluating a theory of change encourages program planners to keep an eye on their goals. Statements about how and why a program will work must be established using the knowledge of program staff, stakeholders, and participants. This statement represents the theory underlying the program plan and shows planners how resources and activities translate to desired improvements and outcomes. It also becomes a framework for program implementation and evaluation.
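To make the working-backwards idea concrete, a theory of change can be sketched as a simple chain from activities to outcomes. The following Python sketch uses an invented tutoring-program example; none of these entries come from the guide, and the dictionary layout is only one illustrative way to record the chain and its assumptions:

```python
# Hedged sketch: a theory of change as a chain from activities to
# short-term, interim, and long-term outcomes, plus the assumptions
# that link them. All entries below are invented for illustration.
theory_of_change = {
    "activities": ["after-school tutoring", "study-skills workshops"],
    "short_term_outcomes": ["homework completion improves"],
    "interim_outcomes": ["course grades improve"],
    "long_term_outcomes": ["more students enroll in higher education"],
    "assumptions": ["students attend regularly", "tutors are trained"],
}

# Working backwards, as the guide describes: start from the outcome
# sought and check each link in the chain down to the interventions.
for stage in ["long_term_outcomes", "interim_outcomes",
              "short_term_outcomes", "activities"]:
    print(stage, "->", theory_of_change[stage])
```

Writing the chain down this explicitly is what lets planners check the linkages among assumptions and plans instead of treating them as self-evident.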
Source: https://ebookscheaper.com/2022/04/06/a-good-program-can-improve-educational-outcomes/
This gives information about programme evaluation: planning the evaluation, its requirements and purpose, the steps involved, uses of evaluation, stakeholders and their roles, finding and analysing evaluation results, standards of effective evaluation, and utilization of evaluation.
Milestone 4
Student’s Name
University Affiliation
Southern New Hampshire University
Description of the Initiative Evaluation Plan
Initiative evaluation involves systematic mechanisms for gathering, reviewing, and using information to answer questions about the initiative, its policies, and its programs, specifically their effectiveness and efficiency. Initiative evaluation can employ both qualitative and quantitative social research techniques. The evaluation plan also states the intended use of the evaluation outcomes for program enhancement and decision making. The plan serves to clarify the initiative’s purpose and expected results (Dudley, 2020), and it directs the monitoring effort based on the initiative’s priorities and the resources, time, and skills required to complete the evaluation.
The initiative will have a well-documented plan to foster transparency and ensure that stakeholders share the same understanding of the purpose, use, and beneficiaries of the evaluation outcomes. Use of the evaluation outcomes cannot simply be wished for when implementing an initiative; it must be planned, directed, and pursued intentionally (Dudley, 2020). The evaluation plan for this initiative will have many benefits, including building strong connections with partners and stakeholders and making the initiative transparent to stakeholders and decision-makers. The plan also serves as a means of advocating for evaluation resources based on negotiated priorities, and it helps identify whether there are enough resources and time to carry out the desired evaluation activities and answer the priority evaluation questions.
When developing the plan for evaluating an initiative intended to promote health and wellbeing in the community, the key is to develop an effective strategy. The steps for creating the evaluation plan differ depending on the type of project being evaluated. The first step is engaging the stakeholders. It is crucial to determine the purpose of the evaluation and the stakeholders involved in implementing the intervention, because these two components serve as the basis for evaluation planning, targeting, design, and interpretation of the outcomes. Stakeholder engagement is necessary to build support for the evaluation process and has many advantages. Stakeholders comprise the people who use the evaluation outcomes, those who support and sustain the initiative, and those affected by the intervention activities or evalu ...
Organizational Capacity-Building Series - Session 6: Program Evaluation, by INGENAES
This session describes different kinds of program evaluations and key evaluation considerations. These presentations are part of a workshop series implemented in Nepal in 2016 as part of the INGENAES initiative.
Project Management Fundamentals: Project Organization and Integration
Key General Management Skills
Encompass planning, organizing, executing, and controlling the operations of an ongoing enterprise
Provide the foundation for building project management skills
Required general management skills for a PM:
Leading
Communicating
Negotiating
Problem solving
Influencing the organization
Advanced Project Management: Project Organization and Integration
Project Proposal
Project Contract
Project Charter
Elicitation of Project Requirements and Specifications
Project Statement of Work
Project Scope Statement
Project Work Breakdown Structure
Scope Creep, Control and Verification
Project Change Management
Project Integration Management
Lecture 05: Advanced Project Management PM Processes and Framework, by Fida Karim 🇵🇰
Advanced project management comprises a set of interrelated processes and tools, ranging from simple to complex, based on accepted principles of management. These are used for planning, estimating, and controlling work activities so that specifically defined outputs are delivered by a certain time, to a defined quality standard, and with a given level of resources, realizing the project goal and its outcomes/benefits.
Effective project management is essential for the success of any project, whether in the private or public sector, and irrespective of its category, size, and complexity.
Advanced Project Management PM Processes and Framework
PM Framework and Integration
Key Definitions
Accountability: acceptance of success or failure
Responsibility: assignment for completion of a specific event or activity
Authority: the right of an individual to make the decisions required to achieve his objectives or responsibility
Power: granted to an individual by subordinates and peers; a measure of their respect for the individual
Project Management Discipline
Start and End date, allocated budget and available resources
Dedicated Stakeholders
Informed and Knowledgeable End user
Empowered Project Office personnel
Strict documentation
Change management and risk mitigating process
Estimation process for additional or in-scope deliverables
PLANNING, CONTROLLING AND MANAGING.
Introduction
Meeting Objectives
Project Oriented Industries
Project Manager, Power and Authority
PM Discipline
Managing Your Stakeholders
Talk the Talk and Walk the Walk
Communication
Project Closure
CONSOLIDATE THE RELEVANT HISTORICAL RECORD ANALYZE AND INTERPRET RELEVANT ..., by Fida Karim 🇵🇰
REPORT ON
HISTORICAL PERSPECTIVE OF GILGIT-BALTISTAN
BY
COMMITTEE CONSTITUTED VIDE
CHIEF SECRETARY GILGIT BALTISTAN’S OFFICE MEMORANDUM
21 NOVEMBER 2015
THE TERMS OF REFERENCE
CONSOLIDATE THE RELEVANT HISTORICAL RECORD
ANALYZE AND INTERPRET RELEVANT TREATIES
DELIBERATE ON BOUNDARIES TILL NOVEMBER 1, 1947
COMPOSITION OF THE COMMITTEE
Brig. Hisamullah Beg SI(M), blogger (Hunza Development Forum, http://hisamullahbeg.blogspot.com/, with many posts on this topic): Convener
Mr. Usman Ali, Professor (Retired), historian and writer: Member
Col. Imtiaz Ul Haq SI(M), author of a thesis on a related topic: Member
AIG Muhammad Dil Pazeer (Retired), expert on GB affairs: Member
Mr. Sher Baz Ali Bercha, historian, writer, and researcher: Member
Mr. Qasim Naseem, journalist and writer: Member
Mr. Israruddin Israr, columnist and human rights activist: Member
The 2nd Quarterly Progress Report covers the activities carried out during the period April 1 to June 30, 2014. These comprise activities carried out by HGISF itself and its member organizations, along with its multipurpose human development project, the Jiwani Development Center in Karachi, Pakistan.
Sample Project Proposal Template, by Fida Karim 🇵🇰
A project proposal template from a training on proposal writing and resource mobilization by Fida Karim
Hunza Gilgit Ismaili Students Federation (HGISF) is a youth-led civil society initiative working on a voluntary basis for the holistic development of youth from the Hunza and Gilgit region living in Karachi, as well as in their respective villages and towns in the Hunza and Gilgit region of Pakistan. HGISF is a federation of more than eighteen youth-led social welfare and academic development organizations comprising several thousand members, mainly based in Karachi and other parts of Pakistan. It is a social entity registered with the namdar Regional Council for Karachi and Baluchistan, Pakistan.
What is the point of small housing associations.pptx, by Paul Smith
Given the small scale of housing associations and their relatively high cost per home, what is the point of them, and how do we justify their continued existence?
Jennifer Schaus and Associates hosts a complimentary webinar series on The FAR in 2024. Join the webinars on Wednesdays and Fridays at noon, eastern.
Recordings are on YouTube and the company website.
https://www.youtube.com/@jenniferschaus/videos
Understanding the Challenges of Street Children, by SERUDS INDIA
By raising awareness, providing support, advocating for change, and offering assistance to children in need, individuals can play a crucial role in improving the lives of street children and helping them realize their full potential.
Donate: https://serudsindia.org/how-individuals-can-support-street-children-in-india/
ZGB - The Role of Generative AI in Government Transformation.pdf, by Saeed Al Dhaheri
This keynote was presented during the 7th edition of the UAE Hackathon 2024. It highlights the role of AI and generative AI in addressing government transformation to achieve zero government bureaucracy.
2024: The FAR - Federal Acquisition Regulations, Part 37
USAID TIPS 01: Conducting a Participatory Evaluation (2011-05)
PERFORMANCE MONITORING & EVALUATION TIPS
CONDUCTING A PARTICIPATORY EVALUATION
NUMBER 1, 2011 Printing

ABOUT TIPS
These TIPS provide practical advice and suggestions to USAID managers on issues related to performance monitoring and evaluation. This publication is a supplemental reference to the Automated Directives System (ADS) Chapter 203.
USAID is promoting participation in all aspects of its development work. This TIPS outlines how to conduct a participatory evaluation.
Participatory evaluation provides for active involvement in the evaluation process of those with a stake in the program: providers, partners, customers (beneficiaries), and any other interested parties. Participation typically takes place throughout all phases of the evaluation: planning and design; gathering and analyzing the data; identifying the evaluation findings, conclusions, and recommendations; disseminating results; and preparing an action plan to improve program performance.
CHARACTERISTICS OF PARTICIPATORY EVALUATION
Participatory evaluations typically share several characteristics that set them apart from traditional evaluation approaches. These include:
Participant focus and ownership. Participatory evaluations are primarily oriented to the information needs of program stakeholders rather than of the donor agency. The donor agency simply helps the participants conduct their own evaluations, thus building their ownership and commitment to the results and facilitating their follow-up action.

Scope of participation. The range of participants included and the roles they play may vary. For example, some evaluations may target only program providers or beneficiaries, while others may include the full array of stakeholders.

Participant negotiations. Participating groups meet to communicate and negotiate to reach a consensus on evaluation findings, solve problems, and make plans to improve performance.

Diversity of views. Views of all participants are sought and recognized. More powerful stakeholders allow participation of the less powerful.

Learning process. The process is a learning experience for participants. Emphasis is on identifying lessons learned that will help participants improve program implementation, as well as on assessing whether targets were achieved.

Flexible design. While some preliminary planning for the evaluation may be necessary, design issues are decided (as much as possible) in the participatory process. Generally, evaluation questions and data collection and analysis methods are determined by the participants, not by outside evaluators.

Empirical orientation. Good participatory evaluations are based on empirical data. Typically, rapid appraisal techniques are used to determine what happened and why.

Use of facilitators. Participants actually conduct the evaluation, not outside evaluators as is traditional. However, one or more outside experts usually serve as facilitators, providing supporting roles as mentor, trainer, group processor, negotiator, and/or methodologist.
WHY CONDUCT A PARTICIPATORY EVALUATION?
Experience has shown that participatory evaluations improve program performance. Listening to and learning from program beneficiaries, field staff, and other stakeholders who know why a program is or is not working is critical to making improvements. Also, the more these insiders are involved in identifying evaluation questions and in gathering and analyzing data, the more likely they are to use the information to improve performance. Participatory evaluation empowers program providers and beneficiaries to act on the knowledge gained.
Advantages of participatory evaluations are that they:
• Examine relevant issues by involving key players in evaluation design
• Promote participants' learning about the program and its performance and enhance their understanding of other stakeholders' points of view
• Improve participants' evaluation skills
• Mobilize stakeholders, enhance teamwork, and build shared commitment to act on evaluation recommendations
• Increase the likelihood that evaluation information will be used to improve performance

But there may be disadvantages. For example, participatory evaluations may:
• Be viewed as less objective because program staff, customers, and other stakeholders with possible vested interests participate
• Be less useful in addressing highly technical aspects
• Require considerable time and resources to identify and involve a wide array of stakeholders
• Take participating staff away from ongoing activities
• Be dominated and misused by some stakeholders to further their own interests
STEPS IN CONDUCTING A PARTICIPATORY EVALUATION
Step 1: Decide if a participatory evaluation approach is appropriate. Participatory evaluations are especially useful when there are questions about implementation difficulties or program effects on beneficiaries, or when information is wanted on stakeholders' knowledge of program goals or their views of progress. Traditional evaluation approaches may be more suitable when there is a need for independent outside judgment, when specialized information is needed that only technical experts can provide, when key stakeholders don't have time to participate, or when such serious lack of agreement exists among stakeholders that a collaborative approach is likely to fail.

Step 2: Decide on the degree of participation. What groups will participate and what roles will they play? Participation may be broad, with a wide array of program staff, beneficiaries, partners, and others. It may, alternatively, target one or two of these groups. For example, if the aim is to uncover what hinders program implementation, field staff may need to be involved. If the issue is a program's effect on local communities, beneficiaries may be the most appropriate participants. If the aim is to know if all stakeholders understand a program's goals and view progress similarly, broad participation may be best. Roles may range from serving as a resource or informant to participating fully in some or all phases of the evaluation.

Step 3: Prepare the evaluation scope of work. Consider the evaluation approach: the basic methods, schedule, logistics, and funding. Special attention should go to defining the roles of the outside facilitator and participating stakeholders. As much as possible, decisions such as the evaluation questions to be addressed and the development of data collection instruments and analysis plans should be left to the participatory process rather than be predetermined in the scope of work.
Step 4: Conduct the team planning meeting. Typically, the participatory evaluation process begins with a workshop of the facilitator and participants. The purpose is to build consensus on the aim of the evaluation; refine the scope of work and clarify the roles and responsibilities of the participants and facilitator; review the schedule, logistical arrangements, and agenda; and train participants in basic data collection and analysis. Assisted by the facilitator, participants identify the evaluation questions they want answered. The approach taken to identify questions may be open ended or may stipulate broad areas of inquiry. Participants then select appropriate methods and develop the data-gathering instruments and analysis plans needed to answer the questions.

Step 5: Conduct the evaluation. Participatory evaluations seek to maximize stakeholders' involvement in conducting the evaluation in order to promote learning. Participants define the questions and consider the data collection skills, methods, and commitment of time and labor required. Participatory evaluations usually use rapid appraisal techniques, which are simpler, quicker, and less costly than conventional sample surveys. They include methods such as those in the box below. Typically, facilitators are skilled in these methods, and they help train and guide other participants in their use.

Step 6: Analyze the data and build consensus on results. Once the data are gathered, participatory approaches to analyzing and interpreting them help participants build a common body of knowledge. Once the analysis is complete, facilitators work with participants to reach consensus on findings, conclusions, and recommendations. Facilitators may need to negotiate among stakeholder groups if disagreements emerge. Developing a common understanding of the results, on the basis of empirical evidence, becomes the cornerstone for group commitment to a plan of action.

Step 7: Prepare an action plan. Facilitators work with participants to prepare an action plan to improve program performance. The knowledge shared by participants about a program's strengths and weaknesses is turned into action. Empowered by knowledge, participants become agents of change and apply the lessons they have learned to improve performance.
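The seven steps above could be tracked with a simple checklist structure. The following Python sketch is only an illustration: the EvaluationPlan class and the exact step phrasings are ours, not part of the TIPS.

```python
# Hedged sketch: the seven TIPS steps as a checklist a facilitator
# might track. Step names paraphrase the text above.
from dataclasses import dataclass, field

STEPS = [
    "Decide if a participatory approach is appropriate",
    "Decide on the degree of participation",
    "Prepare the evaluation scope of work",
    "Conduct the team planning meeting",
    "Conduct the evaluation",
    "Analyze the data and build consensus on results",
    "Prepare an action plan",
]

@dataclass
class EvaluationPlan:
    completed: set = field(default_factory=set)

    def complete(self, step_number: int) -> None:
        """Mark a 1-based step as done."""
        if not 1 <= step_number <= len(STEPS):
            raise ValueError("unknown step")
        self.completed.add(step_number)

    def next_step(self):
        """Return (number, name) of the first unfinished step, or None."""
        for i, name in enumerate(STEPS, start=1):
            if i not in self.completed:
                return i, name
        return None

plan = EvaluationPlan()
plan.complete(1)
plan.complete(2)
print(plan.next_step())  # prints (3, 'Prepare the evaluation scope of work')
```

The point of the sketch is simply that the steps are sequential: each one builds on decisions made in the previous ones.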
WHAT'S DIFFERENT ABOUT PARTICIPATORY EVALUATIONS?
Participatory Evaluation
• Participant focus and ownership of evaluation
• Broad range of stakeholders participate
• Focus is on learning
• Flexible design
• Rapid appraisal methods
• Outsiders are facilitators
Traditional Evaluation
• Donor focus and ownership of evaluation
• Stakeholders often don't participate
• Focus is on accountability
• Predetermined design
• Formal methods
• Outsiders are evaluators
Rapid Appraisal Methods

Key informant interviews. This involves interviewing 15 to 35 individuals selected for their knowledge and experience in a topic of interest. Interviews are qualitative, in-depth, and semistructured. They rely on interview guides that list topics or open-ended questions. The interviewer subtly probes the informant to elicit information, opinions, and experiences.

Focus group interviews. In these, 8 to 12 carefully selected participants freely discuss issues, ideas, and experiences among themselves. A moderator introduces the subject, keeps the discussion going, and tries to prevent domination of the discussion by a few participants. Focus groups should be homogeneous, with participants of similar backgrounds as much as possible.

Community group interviews. These take place at public meetings open to all community members. The primary interaction is between the participants and the interviewer, who presides over the meeting and asks questions, following a carefully prepared questionnaire.

Direct observation. Using a detailed observation form, observers record what they see and hear at a program site. The information may be about physical surroundings or about ongoing activities, processes, or discussions.

Minisurveys. These are usually based on a structured questionnaire with a limited number of mostly close-ended questions. They are usually administered to 25 to 50 people. Respondents may be selected through probability or nonprobability sampling techniques, or through "convenience" sampling (interviewing stakeholders at locations where they're likely to be, such as a clinic for a survey on health care programs). The major advantage of minisurveys is that the data can be collected and analyzed within a few days. It is the only rapid appraisal method that generates quantitative data.

Case studies. Case studies record anecdotes that illustrate a program's shortcomings or accomplishments. They tell about incidents or concrete events, often from one person's experience.

Village imaging. This involves groups of villagers drawing maps or diagrams to identify and visualize problems and solutions.
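Because minisurveys are the one rapid appraisal method that yields quantitative data, a quick tally is often all the analysis needed. The following Python sketch uses invented responses, not data from any actual survey, to show such a tabulation with the standard library:

```python
# Hedged sketch: tabulating close-ended minisurvey answers.
# The question and the responses below are invented examples.
from collections import Counter

# Answers to a single close-ended question, one entry per respondent.
responses = [
    "satisfied", "satisfied", "dissatisfied", "satisfied",
    "no opinion", "dissatisfied", "satisfied", "satisfied",
]

counts = Counter(responses)
total = len(responses)
for answer, n in counts.most_common():
    print(f"{answer}: {n} ({100 * n / total:.0f}%)")
```

A tally like this is why minisurvey data can be collected and analyzed within a few days, as the method description notes.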