Program Evaluations
1. Program Evaluations:
Using evaluation data to set direction, expand
impact and maintain accountability
Boston Non-profit Leadership Series
November 19, 2015
A Lecture by Dumas F. Lafontant
Director, Lower Roxbury Coalition
2. TODAY’S AGENDA
The shoestring approach to evaluation: working with a limited budget,
limited time frames, and insufficient data
Steps in Evaluation
Understanding the limits of evaluation
Best uses for program evaluation data
3. THE SHOESTRING
APPROACH TO EVALUATION
Working with a limited budget
Budgetary resources should not be a factor limiting the quality of an
evaluation. For instance, managers can do more preparatory work within
the program, e.g. in terms of information collection and compilation. Aiding
the evaluator with concrete inputs can help reduce the evaluation budget.
If only a limited budget is available there are several options:
•Have a very focused evaluation that requires only a limited budget. Prepare
several options and ask the evaluation team to indicate which elements have the
highest priority;
•Request a smaller number of beneficiary surveys, less detailed data collection, and
fewer stakeholder consultations. In general, these elements are rather time-
consuming and therefore budget-consuming. Indicate, for example, that phone
interviews or a web-based questionnaire would also be possible; and
•Organize brainstorming meetings with stakeholders to encourage self-reflection on
the program, moderated by an external expert.
4. THE SHOESTRING
APPROACH TO EVALUATION
Working with limited time frames
An evaluation need not be time-consuming or expensive to be
worthwhile; however, it requires resources, planning, data collection, and
dissemination of the results to strengthen the program. With limited time
frames, managers must make the most of program evaluation,
e.g., by planning early to determine the usefulness of collected information
and setting goals that match the budget.
Working with insufficient data
When evaluations do not begin until after the project has been underway
for some time, the evaluator will often find that no baseline data have been
collected and that no comparison group has been identified or studied.
Several strategies can be used to reconstruct the baseline conditions that
existed at the time the project began. These include the use of
documentary (secondary) data sources, interviews with key informants,
participatory methods that help recreate historical data and timelines, and
the use of recall.
5. STEPS IN EVALUATION
I. INTRODUCTION
The purpose of an evaluation is to learn from a program in order to improve
future programming. An evaluation must assess the processes and progress of
the project and whether it is having the expected effects and impacts.
A good quality, well-designed evaluation will expand the evidence base, as
well as help an organization learn more about what does not work.
II. EVALUATION GOAL
To meet and/or exceed the evaluation goal, managers must prioritize, before
they begin to gather information, which information is most vital to obtain
given their resources:
Which clients?
What aspects of your program?
What outcomes?
6. STEPS IN EVALUATION
III. LOGIC MODEL
A logic model is a one-page diagram that presents the conceptual
framework for a proposed project and explains the links among program
elements, such as: Goals, Assumptions, Inputs, Target Population,
Activities, Outputs, and Outcomes.
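To make the elements concrete, a logic model can be sketched as a simple data structure. The program details below are hypothetical illustrations, not taken from the lecture:

```python
# A minimal sketch of a logic model as a Python dictionary.
# All program details here are invented examples.
logic_model = {
    "goal": "Improve youth literacy in the neighborhood",
    "assumptions": ["Families will enroll if sessions are free"],
    "inputs": ["2 staff members", "volunteer tutors", "donated books"],
    "target_population": "Children ages 6-10",
    "activities": ["weekly tutoring sessions", "family reading nights"],
    "outputs": ["number of sessions held", "number of children served"],
    "outcomes": ["improved reading scores", "more reading at home"],
}

# The elements chain together: inputs enable activities, activities
# produce outputs, and outputs lead to the intended outcomes.
for element in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{element}: {logic_model[element]}")
```

Laying the model out this way makes the links between elements explicit, which is the whole point of the one-page diagram.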
IV. PROGRAM OUTCOMES, OUTPUTS AND MEASURES
The program outcomes should relate to the overall goals of the project. If
research is part of the proposed work, outcomes must include
hypothesized results and implications of the proposed research.
The evaluation plan must include a valid and reliable measurement plan
and a sound methodological design, as well as describe how the results of
the evaluation will provide greater understanding and improvement of the
funded activities.
7. STEPS IN EVALUATION
V. DATA COLLECTION ACTIVITIES
Managers must think about the type of method to use for collecting the
information. The best way to obtain data often depends on understanding
the social, cultural, and political context. Often, managers rely on
surveys, tests, and end-of-session questionnaires. Focus groups
have gained popularity, but there is a variety of techniques
from which to choose. Select the method that suits the purpose; don’t
let the method determine the approach. Be creative and experiment with
different techniques, such as interviews, observation, case studies,
photographs, testimonials, and so on.
8. STEPS IN EVALUATION
VI. ANALYSIS AND INTERPRETATION OF THE DATA
Review submitted data to assess and swiftly address problems;
Conduct evaluation in collaboration with an independent evaluator. The key
components of data analysis are as follows:
•Purpose of the evaluation;
•Questions;
•What you hope to learn from the question;
•Analysis technique; and
•How data will be presented
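As one illustration of an analysis technique, a simple pre-post comparison of average scores can be run with Python's standard library. The scores below are invented for the example; they do not come from any real program:

```python
from statistics import mean

# Hypothetical pre- and post-program scores for the same ten participants.
pre  = [52, 61, 48, 70, 55, 63, 58, 49, 66, 57]
post = [60, 64, 55, 72, 62, 61, 65, 54, 70, 63]

# Per-participant change, then the average change across the group.
change = [after - before for before, after in zip(pre, post)]
print(f"Average pre-program score:  {mean(pre):.1f}")
print(f"Average post-program score: {mean(post):.1f}")
print(f"Average change:             {mean(change):+.1f}")
```

Note that, per the limits discussed later, an average change like this describes the participants; it does not by itself establish that the program caused the change.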
VII. REPORT, DISSEMINATION AND USE OF FINDINGS
The evaluation, as designed, will inform future programming, expand the
evidence base, support final reported results, and use descriptions
of the results to communicate the implications of the study to others in the
field (e.g., narrative, photo-voice, annual report, article, op-ed, letter to
the editor).
9. UNDERSTANDING THE LIMITS
Limits of Evaluation Data
Descriptive evaluations use data and analysis to describe and explain the
importance and implications of a program’s processes (such as a process or
program implementation study) and/or a program’s population (e.g., a pre-post
study). Impact evaluations include and describe a comparison group that
does not receive the services of interest and is comparable (at baseline, i.e.,
before the program begins) to those who participate in the service program.
Written reports:
•Be explicit about your limitations
Oral reports:
•Be prepared to discuss limitations;
•Be honest about limitations; and
•Know the claims you cannot make
Do not claim causation without a true experimental design
Do not generalize to the population without a random sample and quality
administration (e.g., a survey response rate below 60% undermines
generalization)
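The response-rate check above can be sketched as a small helper. The 60% cutoff mirrors the rule of thumb on this slide; the function name and survey figures are hypothetical, and real standards vary by field and sampling design:

```python
def can_generalize(num_sent: int, num_returned: int,
                   threshold: float = 0.60) -> bool:
    """Return True if the survey response rate meets the example
    60% threshold used on the slide; otherwise False."""
    response_rate = num_returned / num_sent
    return response_rate >= threshold

# Hypothetical survey of 200 questionnaires:
print(can_generalize(200, 110))  # 55% response rate falls short
print(can_generalize(200, 130))  # 65% response rate meets the cutoff
```

A response rate above the cutoff is necessary but not sufficient; the sample must also be random for generalization to the population.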
10. USING EVALUATION DATA
Best Uses for program evaluation data
Setting Direction
Put comprehensive processes in place to collect data in a uniform,
systematic manner while maintaining privacy. Evaluations seek to look at a
program and its results through the eyes of the participant. Hence, data
collection is designed to avoid preconceived views and to include stakeholders’
interests and concerns.
Expanding Impact
Local evaluations must be designed to help inform future programming and
expand the evidence base. The goal is to obtain trustworthy, authentic, and
credible evidence. Being credible means that people (e.g., funders, the board)
have confidence in the process and believe the results.
11. USING EVALUATION DATA
Maintaining Accountability
The management of data collection, documentation, and reporting allows for
internal accountability and performance monitoring. Linking program
performance to program budget is the final step in accountability. Called
“activity-based budgeting” or “performance budgeting,” it requires an
understanding of program components and the links between activities and
intended outcomes. The early steps in the program evaluation approach
(such as logic modeling) clarify these relationships, making the link between
budget and performance easier and more apparent.
12. INTERACTIVE ACTIVITY
I am going to have you form five groups by asking each of you to take
a single number from 1 to 5. The purpose of this activity is to get each of
you to start seeing the steps of program evaluation.
Let’s imagine, for instance, that you are a member of the committee
formed to bring the Olympics to Boston. In 15 minutes, write the steps for
developing the program evaluation that will be incorporated into the
Boston Olympics Project.