Evaluation Plan Workbook
Innovation Network, Inc.
www.innonet.org • info@innonet.org
Table of Contents

Introduction
  How to Use this Workbook
  What is Evaluation?
  Proving vs. Improving: A Brief History of Evaluation
  Our Approach
  Evaluation Principles
  Why Evaluate?
Developing an Evaluation Plan
  Implementation and Outcomes: One Without the Other?
  Evaluating Implementation
    What You Did: Activities and Outputs
    How Well You Did It: Additional Evaluation Questions
  Evaluating Outcomes
    What Difference Did You Make?
    Indicators
    Elements of a Strong Indicator Statement
    Indicator Examples
    Setting Indicator Targets
    Multiple Indicators
    Direct vs. Indirect Indicators
Data Collection Preview
  Data Collection Methods
  Ease of Data Collection
Review Your Plan
Appendices: Evaluation Templates
  Implementation
  Outcomes
Introduction
How to Use this Workbook
Welcome to Innovation Network’s Evaluation Plan Workbook, which offers an
introduction to the concepts and processes of planning a program evaluation. We hope
that after following this workbook, you will understand evaluation as a tool for
empowerment. You will learn how evaluation can help your organization be more
effective, and you will be able to develop plans to evaluate both the implementation and
outcomes of your programs.
This is the second workbook in a series. We recommend that before using this
workbook, you go through the Logic Model Workbook.
You may use this book in whatever way suits you best:
• As a stand-alone guide to help create an evaluation plan for a program
• As an extra resource for users of the online Evaluation Plan Builder (see the Point
K Learning Center at www.innonet.org)
• As a supplement to an in-person training on evaluation planning.
This checklist icon appears at points in the workbook where you should take an
action or record something: gather information, write something in your
template, or enter something into your online Evaluation Plan Builder.
You can create your evaluation plan online using the Evaluation Plan Builder in
Innovation Network’s Point K Learning Center, our suite of online planning and
evaluation tools and resources at www.innonet.org. This online tool walks you through
the evaluation planning process; allows you to use an existing logic model as a
framework for your evaluation plan; saves your work so you can come back to it later;
and lets you share your work with colleagues for review and critique.
For those of you who prefer to work on paper, an evaluation plan template is located at
the end of this workbook. You may want to make several copies of the template, to
allow for adjustments and updates to your evaluation plan over time.
One of the negative connotations often associated with evaluation is that it is something
done to people. One is evaluated. Participatory evaluation, in contrast, is a process
controlled by the people in the program or community. It is something they undertake as
a formal, reflective process for their own development and empowerment.
M. Patton, Qualitative Evaluation Methods [1]
What is Evaluation?
Evaluation is the systematic collection of information about a program that enables
stakeholders to better understand the program, improve its effectiveness, and/or make
decisions about future programming.
Proving vs. Improving: A Brief History of Evaluation
Evaluation has not always been—and still is not always—viewed as a tool to help those
involved with a program to better understand and improve it.
Historically, evaluation focused on proving whether a program worked, rather than on
improving it to be more successful. This focus on proof meant that “objective,”
external evaluators conducted the evaluations. Research designs followed rigorous
scientific standards, using control or comparison groups to assess causation.
Evaluations occurred at the end of a project and focused only on whether the
program was a success or a failure; they did not seek to learn what contributed
to or hindered success. Finally, this
type of evaluation often disengaged program staff and others from the evaluation
process; these stakeholders rarely learned answers to their questions about a program
and rarely received information to help them improve the program.
Our Approach
We believe evaluation can be a form of empowerment. Participatory evaluation
empowers an organization to define its own success, to pose its own evaluation
questions, and to involve stakeholders and constituents in the process. Rather than
being imposed from the outside, evaluation can help program stakeholders identify
what a program is expected to accomplish (and when), thereby making sure everyone’s
expectations for the program are aligned. By looking systematically at what goes into a
program, what the program is doing and producing, and what the program is achieving,
this evaluation approach enables program stakeholders both to be accountable for
results and to learn how to improve the program.
[1] 2nd edition, 1990; p. 129.
Evaluation Principles [2]
We believe that evaluation is most effective when it:
• Links to program planning and delivery. Evaluation should inform planning
and implementation. Evaluation shouldn’t be done only if you have some extra
time or only when you are required to do it. Rather, evaluation is a process
integral to a program’s effectiveness.
• Involves the participation of stakeholders. Those affected by the results of an
evaluation have a right to be involved in the process. Participation will help them
understand and inform the evaluation’s purpose. Participation will also promote
stakeholder contribution to, and acceptance of, the evaluation results. This
increases the likely use of the evaluation results for program improvement.
• Supports an organization’s capacity to learn and reflect. Evaluation is not an
end in itself; it should be a part of an organization’s core management processes,
so it can contribute to ongoing learning.
• Respects the community served by the program. Evaluation needs to be
respectful of constituents and judicious in what is asked of them. Evaluation
should not be something that is “done to” program participants and others
affected by or associated with the program. Rather, it should draw on their
knowledge and experience to produce information that will help improve
programs and better meet the needs of the community.
• Enables the collection of the most information with the least effort. You
can’t—and don’t need to—evaluate everything! Focus on what you need to
know. What are the critical pieces of information you and your stakeholders
need to know to remain accountable and to improve your program?
[2] Some information in this section is drawn from: Earl, Sarah, et al. Outcome
Mapping: Building Learning and Reflection into Development Programs. International
Development Research Centre (Canada), 2002.
Why Evaluate?
Conducting a well-conceived and implemented evaluation will be in your interest. It
will help you:
• Understand and improve your program. Even the best-run programs are not
always complete successes. Every program can improve; the information
collected in an evaluation can provide guidance for program improvement. As
you incorporate evaluation into your ongoing work, you will gain useful
information and become a “learning organization”—one that is constantly
gathering information, changing, and improving.
• Test the theory underlying your program. The systematic data you collect about
your program’s short-, intermediate-, and long-term achievements, as well as its
implementation, helps you understand whether (and under what conditions)
the hypotheses underlying your program are accurate, or whether they need to
be modified.
• Tell your program’s story. The data collected through evaluation can provide
compelling information to help you describe what your program is doing and
achieving. Evaluation results provide a strong framework for making your
program’s case before stakeholders, funders, and policy-makers.
• Be accountable. Evaluation helps you demonstrate responsible stewardship of
funding dollars.
• Inform the field. Nonprofits that have evaluated and refined their programs can
share credible results with the broader nonprofit community. A community that
can share results can be more effective.
• Support fundraising efforts. A clear understanding of your program—what
you did well, and precisely how you accomplished your outcomes—helps you
raise additional funds to continue your work and expand or replicate your
efforts.
Developing an Evaluation Plan
Evaluation planning identifies and organizes questions you have about your program
and plots a route to get answers. Most questions that organizations probe through
evaluation are in three categories:
What did we do?
How well did we do it?
What difference did our program make? (What changes occurred because of our
program?)
Your program’s logic model will form the foundation of your evaluation plan. As you
look at your logic model, you will find questions about your program that you hope to
answer. The purpose of evaluation planning is to identify these questions and plan a
route to finding the answers.
Two major forms of evaluation help answer these questions.
1) Implementation Evaluation: Are you performing the services or activities as
planned? Are you reaching the intended target population? Are you reaching the
intended number of participants? Is it leading to the products you expected?
How do the participants perceive these services and activities? These questions
are about implementation.
2) Outcomes Evaluation: Is your target audience experiencing the changes in
knowledge, attitudes, behaviors, or awareness that you sought? What are the
results of your work? What is it accomplishing among your target audience?
These questions are about outcomes.
Implementation and Outcomes: One without the Other?
We believe that an effective evaluation should answer both types of questions. You can
do one without the other, but you will not learn as much as if you conduct both types.
In the “old days”, nonprofit evaluation focused on documenting and reporting on
program activities. Nonprofits and others then assumed that if they implemented program
activities as planned, desired results would occur for the individuals, families,
organizations, or communities they served. In general, the nonprofit community focused
on reporting on implementation to the exclusion of outcomes. In recent years, the
pendulum has swung in the opposite direction: nonprofits are under pressure to
measure and report outcomes, with little emphasis on implementation.
The evaluation framework we use incorporates both outcomes and implementation.
Programs are complex. You need to understand the degree to which you accomplished
your desired outcomes. You also want to learn what aspects of your program
contributed to those achievements and what barriers exist to your program getting its
ideal results.
This workbook will:
• Lead you through the development of a plan to evaluate your program’s
implementation.
• Help you create a plan to evaluate your program’s outcomes.
It doesn’t matter which you do first – if you would prefer to start with outcomes, please
do so.
Evaluating Implementation:
What Did You Do? How Well Did You Do It?
[Illustration: the implementation evaluation plan template. For each activity
group, the plan records Activities; Outputs & Implementation Questions; Data
Collection Method (how to measure); and Data Collection Effort (have, low,
medium, high).]
The illustration above outlines the components of your implementation evaluation plan.
What You Did: Activities & Outputs
Your implementation evaluation plan starts with identification of the activities of your
program. The activities are the actions that the program takes to achieve desired
outcomes. If your program entails many activities, you may have organized these
activities into activity categories—closely related groups of activities in your program.
Your outputs are the tangible products of your program’s activities. Outputs are also the
evidence of your activities. In implementation evaluation, outputs are the items you will
actually measure to evaluate your activities. Measuring outputs answers the question:
What Did We Do? This is often the easiest and most direct process in evaluation.
Build Your Evaluation Plan: Information from the Logic Model
(Visit the Point K Learning Center at www.innonet.org to download our Logic Model
Workbook or use the online Logic Model Builder.)
• If you are working on paper:
o Take your activity categories and individual activities from your logic
model and place them in the “Activities” column in your Implementation
Plan template.
o Take your outputs from the logic model, and place them in the “Outputs”
boxes in your plan.
o Review your outputs and determine if you want to track any additional
“products” from your activities. Include these in the outputs boxes.
• If you are working online at Point K, this information will pre-fill from your
Logic Model into your Evaluation Plan.
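If it helps to see the template’s structure concretely, here is a minimal sketch,
in Python, of how one row of an implementation evaluation plan might be represented
as data. The class and field names are illustrative assumptions, not part of the
Point K tools:

```python
from dataclasses import dataclass, field

# A minimal sketch of one implementation-plan row; all names are hypothetical.
@dataclass
class ImplementationPlanRow:
    activity_group: str                        # e.g., "Parent education"
    activities: list[str]                      # actions the program takes
    outputs: list[str]                         # tangible products to measure
    questions: list[str] = field(default_factory=list)  # "how well did we do it?"
    data_collection_method: str = ""           # filled in later (see Data Collection Preview)
    effort: str = ""                           # one of: "have", "low", "medium", "high"

# Example row, drawn from the kind of program described in this workbook:
row = ImplementationPlanRow(
    activity_group="Parent education",
    activities=["Hold monthly workshops on childhood immunization"],
    outputs=["Number of workshops held", "Number of parents attending"],
)
```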
How Well You Did It: Additional Evaluation Questions
Documenting activities and associated outputs tells us what you did. However, that
isn’t sufficient for evaluation purposes (after all, you already had your activities and
outputs identified in your logic model). The purpose of implementation evaluation is to
understand how well you did it.
The next step is to identify other questions you have about your activities and their
outputs. What information will help you better understand the implementation of your
program? The following are examples of the types of questions you might consider:
• Participation: Did the targeted audience participate in the activities as expected?
Why? Were some individuals over- or under-represented? Why?
• Quality: Were the services/materials you provided perceived as valuable by the
intended audience? Were they appropriate? How did others in the field view
their quality?
• Satisfaction: Did those affected by your program’s services approve of them?
Why? Who was most/least satisfied?
• Context: What other factors influenced your ability to implement your program
as planned? What political, economic, or leadership issues intervened, changing
the expected outcomes in your program?
Choosing Questions
Brainstorm additional questions you may want to answer regarding the
implementation of your program. Use “Participation/Quality/Satisfaction/Context”
only as a guide; there is no need to have one of each question type for every
activity, or to limit your questions to these categories. Keep your list targeted to
those “need-to-know” questions that will have the biggest impact on program
improvement.
The answers to these questions can offer rich feedback for program
improvement. They move beyond a simple inventory of outputs and activities,
often probing the viewpoint of those you are serving. Still, they are related to
your program’s implementation rather than outcomes; they address what you did,
not what changes occurred because of your program.
Build Your Evaluation Plan: Insert your questions into the Questions box
in your evaluation plan template, or on the “Implementation Questions” tab of
the online Evaluation Plan Builder.
Implementation evaluation offers important information about what you did and
how well you did it. The lessons you learn can serve as benchmarks for progress
against your original program plan.
Perhaps you aren’t conducting the activities as planned; or
You are conducting those activities, but they are not leading to the
products/outputs you intended, or
They did lead to the intended outputs, but the quality or satisfaction levels
are not what you had hoped.
This information can help you determine if you need to adjust your plan, change
activities, or reconsider your theoretical assumptions. Evaluating your
implementation can provide a feedback loop in the midst of your effort, before
you may be able to evaluate outcomes.
Evaluating Outcomes: What Difference Did You Make?
These days it’s no longer acceptable for nonprofits to assume that good
intentions, good-faith effort, or even exemplary program implementation will
result in the desired outcomes for those we serve. It is important to spend time
developing a plan to measure the achievement of outcomes.
In your logic model, you identified your desired outcomes—the changes you
expect to see as a result of your work. Outcomes are frequently expressed as
changes in knowledge, skill, attitudes, behavior, motivation, decisions, policies,
and conditions. They occur among individuals, communities, organizations, or
systems.
Indicators
In order to evaluate how successfully you have achieved your outcomes, you
will need to determine indicators for your outcomes.
An indicator is the evidence or information that will tell you whether your
program is achieving its intended outcomes. Indicators are measurable and
observable characteristics. They answer the question: “How will we know
change occurred?”
We often state outcomes as abstract concepts or ambitions. Indicators are the
measurement of outcomes. They are specific characteristics or behaviors that
provide tangible information about those concepts or ambitions. Often, one
outcome will have more than one indicator. When you develop your indicators,
it may be helpful to ask: “What does the outcome look like when it occurs? How
will I know if it has happened? What will I be able to see?”
An indicator should be:
• Meaningful: The indicator presents information that is important to key
stakeholders of the program. Keep in mind that different people can have
different perceptions about what defines “success” for a program.
Reaching consensus among key stakeholders regarding what success
looks like is essential to ensuring buy-in to your evaluation results.
• Direct: The indicator or combination of indicators captures enough of the
essential components of the outcome to represent the outcome. Several
indicators can be necessary to measure an outcome adequately. However,
there is no standard for the number of indicators to use. While multiple
indicators are often necessary, more than three or four may mean that the
outcome is too complex and should be better defined. An indicator must
also reflect the same type of change as the outcome. For example, if an
outcome is about a change in attitude or opinion, the indicator should not
reflect a behavior change.
• Useful: The information provided by this indicator can be put to
practical use for program improvement.
• Practical to Collect: The data for the indicator shouldn’t be a burden to
collect. Consider whether you can collect data about your indicator in a
timely manner and at reasonable cost. Sometimes an indicator meets the
other criteria described above, but the effort to collect it would be too
burdensome. Our evaluation template offers you an opportunity to note
the level of effort involved.
Elements of a Strong Indicator Statement
To assist in evaluation, a strong indicator statement should include these four
elements:
How much. Identify the amount of change among your target population
that would indicate a successful level of achievement. This sets the target
for your work; base this on an understanding of your baseline and a level
of change that is reasonable for your program.
Who. Specify the target population you will measure.
What. Describe the condition, behavior, or characteristic that you will
measure.
When. Note the timeframe in which this change should occur.
For example, an outcome of an economic empowerment training program may
be that participants institute improved money management practices. One
indicator for that outcome would be the statement:
75% of participants open a free bank checking account within six months
of beginning the program.
Assume that you know, through an intake process, that most of a program’s
participants manage their money through expensive financial services such as
payday loan businesses. If 75% of the participants open a free bank checking
account within six months of beginning the program, you might view that as
ONE indicator of improved money management practices.
Breaking down the statement into the elements above, you see:
How Much = 75%
Who = program participants
What = open a free bank checking account
When = within six months of starting the program
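To show how such an indicator statement translates into a concrete check, here is
a minimal sketch in Python against hypothetical participant records; the field
names and dates are assumptions for illustration:

```python
from datetime import date, timedelta

# Hypothetical records: enrollment date and the date (if any) a free
# checking account was opened.
participants = [
    {"enrolled": date(2023, 1, 10), "account_opened": date(2023, 4, 2)},
    {"enrolled": date(2023, 1, 10), "account_opened": None},
    {"enrolled": date(2023, 2, 1),  "account_opened": date(2023, 9, 15)},  # past six months
]

SIX_MONTHS = timedelta(days=182)  # "when": within six months of enrollment
TARGET = 0.75                     # "how much": 75% of participants ("who")

# "What": opened a free checking account within the timeframe.
met = sum(
    1 for p in participants
    if p["account_opened"] is not None
    and p["account_opened"] - p["enrolled"] <= SIX_MONTHS
)
rate = met / len(participants)
print(f"{rate:.0%} opened an account within six months (target {TARGET:.0%})")
```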
The following are examples of outcomes and indicators.

Outcome: New mothers increase their knowledge of child development.
Indicator: 75% of new mothers in the program satisfactorily complete a short
survey about child development at the end of the course.

Outcome: Target audiences increase knowledge about the signs of child abuse
and neglect.
Indicator: 50% of community focus group members can identify the signs of child
abuse and neglect six months after the education campaign ends.

Outcome: Residents feel the neighborhood is a safer place for children.
Indicator: 60% of neighborhood residents report in one year that they believe
the neighborhood is safer for children than it was one year before.

Outcome: Diversified program resources.
Indicator: In one year, each of four funding sources (public, private
corporations, individual donors, foundations) comprises at least 15% and not
more than 55% of program income.

Outcome: Increased cultural competency of legal aid attorneys.
Indicators: 80% of legal aid attorneys self-report learning about cultural
issues at the end of the workshop; within 6 months, 60% of clients of trained
attorneys indicate the attorney was knowledgeable about cultural issues.

Outcome: Increased legislators’ awareness of policy options.
Indicator: In six months, 45% of surveyed legislators indicate awareness of
different policy options.

Outcome: Students (K-6) demonstrate knowledge about opera.
Indicators: By the end of class, 65% of children can identify the story of the
opera being performed; 65% can describe 3 of the characters in the opera; and
50% can identify the type of voice (soprano, alto, tenor, bass) of the
character singing a part.

Outcome: Youth have increased knowledge about the consequences of long-term
ATOD use/abuse.
Indicators: At the end of the course, 90% of participants report that they
gained knowledge about the risks/harms associated with ATOD use; 80% report
that it is important not to use alcohol or other drugs.
While developing your indicators, you may realize that your outcomes are
unclear or ambiguous. This process offers an opportunity to reconsider or
further clarify your outcomes.
Build Your Evaluation Plan: Check over your outcomes—are they clear
and unambiguous?
Setting Indicator Targets
An indicator shows whether your program is achieving its intended outcomes,
which are the changes you want to see in individuals, groups, communities, or
systems. To be meaningful, an indicator should specify the amount of change
that you believe demonstrates successful achievement of the related outcome.
Setting targets may seem challenging. No one wants to be held accountable for
unrealistic expectations. At the same time, we all want programs to lead to real
change.
How do you choose a realistic target? Consider these guideposts:
What is your baseline?
What does your experience tell you?
What are standards in the field?
What do stakeholders expect?
Perhaps your program is new or you have no baseline information because you
have never gathered this information before. In that case, your first evaluation
may provide baseline information; in future years, you will want to create
realistic targets to show progress.
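As a small numerical illustration of working from a baseline, here is a hedged
sketch in Python; the intake numbers and the assumed gain are hypothetical:

```python
# Hypothetical intake data: how many participants already show the
# behavior the program aims to produce.
intake_total = 120
intake_with_account = 18

baseline_rate = intake_with_account / intake_total  # 15% baseline

# Assume experience and field standards suggest the program can add
# roughly 60 percentage points over baseline (an illustrative figure).
assumed_gain = 0.60
target_rate = min(baseline_rate + assumed_gain, 1.0)

print(f"Baseline: {baseline_rate:.0%}; indicator target: {target_rate:.0%}")
```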
Multiple Indicators
Some outcomes have just one closely related indicator. If an outcome is to
increase awareness among your members of a new service your organization
provides, your sole indicator may be that 80% of your members report awareness
of the new service within six months. Similarly, if a smoking reduction
program’s outcome is for participants to stop smoking, the single necessary
indicator is the percentage of participants who stop smoking after
participating in the program.
Complex outcomes may have multiple indicators: if an outcome of a program is
for participants to improve their job-seeking skills, this will require
multiple indicators that likely reflect the skills you emphasize in your
program. Indicators might include the following:
• The percentage of participants who meet criteria in mock interviews at the
end of the training course
• The percentage of participants who develop quality resumes within 1 month of
completing the course
• The percentage of participants who can identify three key sources of job
listings by the end of the training course
When developing your logic model, you identified a chain of outcomes. This
chain is useful in both clarifying your theory of change and in setting realistic
expectations. The chain is also important from an evaluation standpoint—
knowing what you are achieving. Each link in the chain has corresponding
indicators.
Short-Term Outcomes (LEARNING: the knowledge parents and guardians gain from
the literature & PSAs)
• Increased understanding among targeted parents of the importance of
childhood immunization
• Increased knowledge among targeted parents of where to go to have their
children immunized
Indicators:
• 47% of targeted parents showed increased knowledge about immunization after
the program
• 39% of participating parents could identify an accessible clinic that
provides immunizations

Intermediate Outcomes (BEHAVIOR: the actions parents & guardians take as a
result of that knowledge)
• Increased number of targeted parents who take their children to be immunized
Indicator:
• 26% increase in parents taking their children for immunizations in the
target area in the year following the program

Long-Term Outcomes (CONDITION: the conditions that change as a result of those
actions)
• Increased number of children of targeted parents who continue to receive
up-to-date immunizations
• Healthier children
Indicators:
• 86% of children in the target area who received 2-month DTaP immunizations
also received 4-month DTaP immunizations
• 68% of children in the target area who received 2-month DTaP immunizations
also received 4-year DTaP immunizations
Direct versus Indirect Indicators
Ensure that your indicators relate directly to the outcome you are evaluating, and are
evidence of the same type of change. Below are some examples of direct versus indirect
indicators for the sample outcomes.
Outcome: Participating new mothers have their children immunized (behavior)
• Indirect: #/% of participating new mothers who are aware of the importance
of immunization (awareness)
• Direct: #/% of children of participating mothers who are up-to-date in
immunizations within 1 year (behavior)

Outcome: Participating children understand principles of good sportsmanship
(knowledge)
• Indirect: #/% of children who participate on teams after finishing the
program (unrelated behavior)
• Direct: #/% of participating youth who are able to identify five good
sportsmanship behaviors by the end of the season (knowledge); #/% of fights
and arguments among student athletes decreases each year of the program
(behavior, but shows knowledge in action)

Outcome: Targeted teens increase knowledge of certain environmental health
hazards (knowledge)
• Indirect: #/% of students who receive a brochure on the topic during the
first 6 months of the program (output)
• Direct: #/% of targeted students who can identify 3 health hazards at the
end of the first year of the program (knowledge)
Build Your Evaluation Plan: Insert your Indicators into the Indicators box in your
evaluation plan template, or on the “Indicators/Data Collection” tab of the online
Evaluation Plan Builder.
Data Collection Preview
Of all the parts of evaluation, data collection can be the most daunting. Many of
us believe we need to be statisticians or evaluation professionals to engage in
quality data collection, but that simply isn’t true.
This workbook will introduce you to the basic concepts of data collection, to help
you complete your evaluation plans. [A separate data collection workbook will
be available soon at the Point K Learning Center.]
Data Collection Methods:
What’s the best way to gather the information you need?
So far, you have identified what you want to evaluate and what you will
measure. In implementation evaluation, these are activities and their related
outputs and additional questions. In outcome evaluation, these are program
outcomes and their related indicators.
Now you will consider methods to collect the data. Outputs, implementation
questions, and indicators are what you will measure; data collection methods are
how you will measure these.
The goal in data collection is to minimize the number of collection instruments
you use and maximize the amount of information you collect from each one!
When choosing the best data collection method to obtain the information you
need, consider the following:
• Which methods will be least disruptive to your program and to those you
serve?
• Which methods can you afford and implement well?
• Which methods are best suited to obtain information from your sources
(considering cultural appropriateness and other contextual issues)?
The most common data collection strategies fall into the following broad categories.
1. Review documents
Analysis of printed material, including program records, research reports,
census data, health records, and budgets. Document review is a common method
of collecting data about activities and outputs for implementation evaluation.
2. Observe
Observe situations, behaviors and activities in a formalized and systematic way,
usually using observational checklists and trained observers. This is a good
method to use in settings where experiencing actual events or settings (rather
than hearing about them) is an important part of the evaluation.
3. Talk to people
Collect verbal responses from participants and other stakeholders through
interviews (in-person or phone) or focus groups. This method is helpful when it
is important to hear complex or highly individual thoughts of a certain group of
individuals.
4. Collect written responses from people
Collect written responses through surveys (in-person, e-mail, online, mail,
phone), tests, or journals/logs. Except in the case of journals, this method is often
used when you need a lot of information from a large number of people or when
it is important that identical information be available from all respondents.
5. Other methods
Review pictorial/multi-media data in photographs, audiotapes, compact discs,
visual artwork. Conduct expert or peer reviews in which professionals in the
field with specific expertise assess a set of activities or products. Use a
case study, an intensive investigation of one unit, for learning purposes:
either as an exemplar to follow or as a model to avoid.
Brainstorm: Consider data collection methods for each item you will
measure.
Ease of Data Collection
Once you have identified data collection methods, the next step is to assess
the level of effort required to collect the data. Consider the cost and time required to
create new data collection tools. Also, consider the cost involved in actually
collecting and analyzing the data. Some methods are more expensive than others.
For example, interviews take more time (and therefore resources) than surveys.
For many methods you’ll need database software. For each data collection
method you identify, consider whether you already have a tool in place that you
could use (such as an intake survey). If not, think about the amount of effort
required to create and use the new data collection tool. Assess whether it will
require a “low,” “medium,” or “high” level of effort.
Build Your Evaluation Plan: Assess the level of effort required to collect the
data using the methods you have identified so far. Use the scale: have (for those
methods your organization already has available); low; medium; or high.
Enter your data collection methods and their corresponding levels of effort into
your evaluation plan template or the online Evaluation Plan Builder.
Review Your Plan
Together, your implementation and outcome templates form one evaluation
plan. Take time to review your plan so far to make sure it would lead to a
worthwhile evaluation.
Here are some tips for reviewing your plan:
• For implementation evaluation, have you identified additional questions
that will help you improve the quality of the activities you are
conducting?
• Have you narrowed that list of questions to those few “need-to-know”
topics?
• Have you identified indicators for all your outcomes? Has any outcome
required more than three indicators? If so, consider whether that outcome
needs to be re-defined or separated into several intended changes.
• Do most or all of your indicator statements include the key elements of
“who,” “how much,” “what” and by “when”?
• Have you identified reasonable targets for your indicators?
If you have any questions about program planning or evaluation, or are interested in our
in-person services, please visit our website at www.innonet.org or contact us at:
Innovation Network, Inc.
1625 K St., NW, 11th Floor
Washington, DC 20006
(202) 728-0727
info@innonet.org
www.innonet.org
Copyright Policy
Innovation Network, Inc. grants permission for noncommercial, share-alike use of these
materials provided that attribution is given to Innovation Network, Inc. For more
information, please see our Creative Commons License.
Appendix: Implementation Evaluation Plan Template

Columns: Activities | Outputs & Implementation Questions | Data Collection
Method | Data Collection Effort (Have, Low, Medium, High)

For each program component, list its activities and outputs, add your
implementation questions, and then record the data collection method and level
of effort for each item you will measure.
Appendix: Outcomes Evaluation Plan Template

Columns: Outcomes | Indicators | Data Collection Method | Data Collection
Effort (Have, Low, Medium, High)
2024: The FAR, Federal Acquisition Regulations - Part 262024: The FAR, Federal Acquisition Regulations - Part 26
2024: The FAR, Federal Acquisition Regulations - Part 26JSchaus & Associates
 
history of 1935 philippine constitution.pptx
history of 1935 philippine constitution.pptxhistory of 1935 philippine constitution.pptx
history of 1935 philippine constitution.pptxhellokittymaearciaga
 
(多少钱)Dal毕业证国外本科学位证
(多少钱)Dal毕业证国外本科学位证(多少钱)Dal毕业证国外本科学位证
(多少钱)Dal毕业证国外本科学位证mbetknu
 
Powering Britain: Can we decarbonise electricity without disadvantaging poore...
Powering Britain: Can we decarbonise electricity without disadvantaging poore...Powering Britain: Can we decarbonise electricity without disadvantaging poore...
Powering Britain: Can we decarbonise electricity without disadvantaging poore...ResolutionFoundation
 
call girls in DLF Phase 1 gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝...
call girls in DLF Phase 1  gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝...call girls in DLF Phase 1  gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝...
call girls in DLF Phase 1 gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝...saminamagar
 
2024: The FAR, Federal Acquisition Regulations - Part 25
2024: The FAR, Federal Acquisition Regulations - Part 252024: The FAR, Federal Acquisition Regulations - Part 25
2024: The FAR, Federal Acquisition Regulations - Part 25JSchaus & Associates
 
call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️saminamagar
 
call girls in sector 22 Gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in sector 22 Gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in sector 22 Gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in sector 22 Gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️saminamagar
 
call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️saminamagar
 
Call Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls Service
Call Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls ServiceCall Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls Service
Call Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls Servicenarwatsonia7
 
Action Toolkit - Earth Day 2024 - April 22nd.
Action Toolkit - Earth Day 2024 - April 22nd.Action Toolkit - Earth Day 2024 - April 22nd.
Action Toolkit - Earth Day 2024 - April 22nd.Christina Parmionova
 
WORLD CREATIVITY AND INNOVATION DAY 2024.
WORLD CREATIVITY AND INNOVATION DAY 2024.WORLD CREATIVITY AND INNOVATION DAY 2024.
WORLD CREATIVITY AND INNOVATION DAY 2024.Christina Parmionova
 
Club of Rome: Eco-nomics for an Ecological Civilization
Club of Rome: Eco-nomics for an Ecological CivilizationClub of Rome: Eco-nomics for an Ecological Civilization
Club of Rome: Eco-nomics for an Ecological CivilizationEnergy for One World
 
Earth Day 2024 - AMC "COMMON GROUND'' movie night.
Earth Day 2024 - AMC "COMMON GROUND'' movie night.Earth Day 2024 - AMC "COMMON GROUND'' movie night.
Earth Day 2024 - AMC "COMMON GROUND'' movie night.Christina Parmionova
 
call girls in Mehrauli DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mehrauli  DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Mehrauli  DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mehrauli DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️saminamagar
 
productionpost-productiondiary-240320114322-5004daf6.pptx
productionpost-productiondiary-240320114322-5004daf6.pptxproductionpost-productiondiary-240320114322-5004daf6.pptx
productionpost-productiondiary-240320114322-5004daf6.pptxHenryBriggs2
 
Disciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdf
Disciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdfDisciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdf
Disciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdfDeLeon9
 
High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...
High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...
High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...narwatsonia7
 

Recently uploaded (20)

call girls in moti bagh DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in moti bagh DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in moti bagh DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in moti bagh DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
 
call girls in Laxmi Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Laxmi Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Laxmi Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Laxmi Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
 
2024: The FAR, Federal Acquisition Regulations - Part 26
2024: The FAR, Federal Acquisition Regulations - Part 262024: The FAR, Federal Acquisition Regulations - Part 26
2024: The FAR, Federal Acquisition Regulations - Part 26
 
history of 1935 philippine constitution.pptx
history of 1935 philippine constitution.pptxhistory of 1935 philippine constitution.pptx
history of 1935 philippine constitution.pptx
 
(多少钱)Dal毕业证国外本科学位证
(多少钱)Dal毕业证国外本科学位证(多少钱)Dal毕业证国外本科学位证
(多少钱)Dal毕业证国外本科学位证
 
Powering Britain: Can we decarbonise electricity without disadvantaging poore...
Powering Britain: Can we decarbonise electricity without disadvantaging poore...Powering Britain: Can we decarbonise electricity without disadvantaging poore...
Powering Britain: Can we decarbonise electricity without disadvantaging poore...
 
call girls in DLF Phase 1 gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝...
call girls in DLF Phase 1  gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝...call girls in DLF Phase 1  gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝...
call girls in DLF Phase 1 gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝...
 
2024: The FAR, Federal Acquisition Regulations - Part 25
2024: The FAR, Federal Acquisition Regulations - Part 252024: The FAR, Federal Acquisition Regulations - Part 25
2024: The FAR, Federal Acquisition Regulations - Part 25
 
call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Kirti Nagar DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
 
call girls in sector 22 Gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in sector 22 Gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in sector 22 Gurgaon  🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in sector 22 Gurgaon 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
 
call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mayapuri DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
 
Call Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls Service
Call Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls ServiceCall Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls Service
Call Girls Service AECS Layout Just Call 7001305949 Enjoy College Girls Service
 
Action Toolkit - Earth Day 2024 - April 22nd.
Action Toolkit - Earth Day 2024 - April 22nd.Action Toolkit - Earth Day 2024 - April 22nd.
Action Toolkit - Earth Day 2024 - April 22nd.
 
WORLD CREATIVITY AND INNOVATION DAY 2024.
WORLD CREATIVITY AND INNOVATION DAY 2024.WORLD CREATIVITY AND INNOVATION DAY 2024.
WORLD CREATIVITY AND INNOVATION DAY 2024.
 
Club of Rome: Eco-nomics for an Ecological Civilization
Club of Rome: Eco-nomics for an Ecological CivilizationClub of Rome: Eco-nomics for an Ecological Civilization
Club of Rome: Eco-nomics for an Ecological Civilization
 
Earth Day 2024 - AMC "COMMON GROUND'' movie night.
Earth Day 2024 - AMC "COMMON GROUND'' movie night.Earth Day 2024 - AMC "COMMON GROUND'' movie night.
Earth Day 2024 - AMC "COMMON GROUND'' movie night.
 
call girls in Mehrauli DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mehrauli  DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️call girls in Mehrauli  DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
call girls in Mehrauli DELHI 🔝 >༒9540349809 🔝 genuine Escort Service 🔝✔️✔️
 
productionpost-productiondiary-240320114322-5004daf6.pptx
productionpost-productiondiary-240320114322-5004daf6.pptxproductionpost-productiondiary-240320114322-5004daf6.pptx
productionpost-productiondiary-240320114322-5004daf6.pptx
 
Disciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdf
Disciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdfDisciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdf
Disciplines-and-Ideas-in-the-Applied-Social-Sciences-DLP-.pdf
 
High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...
High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...
High Class Call Girls Bangalore Komal 7001305949 Independent Escort Service B...
 

Evaluation Plan Workbook

  • 3. How to Use this Workbook (continued): The checklist icon appears at points in the workbook at which you should take an action or record something: gather information, write something in your template, or enter something into your online Logic Model Builder. You can create your evaluation plan online using the Evaluation Plan Builder in Innovation Network's Point K Learning Center, our suite of online planning and evaluation tools and resources at www.innonet.org. This online tool walks you through the evaluation planning process; allows you to use an existing logic model as a framework for your evaluation plan; saves your work so you can come back to it later; and lets you share work with colleagues to review and critique. For those of you who prefer to work on paper, an evaluation plan template is located at the end of this workbook. You may want to make several copies of the template, to allow for adjustments and updates to your evaluation plan over time.
  • 4. What is Evaluation? Evaluation is the systematic collection of information about a program that enables stakeholders to better understand the program, improve its effectiveness, and/or make decisions about future programming. "One of the negative connotations often associated with evaluation is that it is something done to people. One is evaluated. Participatory evaluation, in contrast, is a process controlled by the people in the program or community. It is something they undertake as a formal, reflective process for their own development and empowerment." (M. Patton, Qualitative Evaluation Methods, 2nd ed., 1990, p. 129.)
Proving vs. Improving: A Brief History of Evaluation. Evaluation has not always been, and still is not always, viewed as a tool to help those involved with a program better understand and improve it. Historically, evaluation focused on proving whether a program worked, rather than on improving it to be more successful. This focus on proof meant that "objective," external evaluators conducted the evaluations. Research designs used rigorous scientific standards, with control or comparison groups to assess causation. Evaluations occurred at the end of a project and focused only on whether the program was a success or failure; they did not seek to learn what contributed to or hindered success. Finally, this type of evaluation often disengaged program staff and others from the evaluation process; these stakeholders rarely learned answers to their questions about a program and rarely received information to help them improve it.
Our Approach. We believe evaluation can be a form of empowerment. Participatory evaluation empowers an organization to define its own success, to pose its own evaluation questions, and to involve stakeholders and constituents in the process. Rather than being imposed from the outside, evaluation can help program stakeholders identify what a program is expected to accomplish (and when), thereby making sure everyone's expectations for the program are aligned. By looking systematically at what goes into a program, what the program is doing and producing, and what the program is achieving, this evaluation approach enables program stakeholders both to be accountable for results and to learn how to improve the program.
  • 5. Evaluation Principles. We believe that evaluation is most effective when it:
• Links to program planning and delivery. Evaluation should inform planning and implementation. It shouldn't be done only when you have extra time or only when you are required to do it; rather, evaluation is a process integral to a program's effectiveness.
• Involves the participation of stakeholders. Those affected by the results of an evaluation have a right to be involved in the process. Participation will help them understand and inform the evaluation's purpose, and will promote stakeholder contribution to, and acceptance of, the evaluation results. This increases the likelihood that the results will be used for program improvement.
• Supports an organization's capacity to learn and reflect. Evaluation is not an end in itself; it should be part of an organization's core management processes, so it can contribute to ongoing learning.
• Respects the community served by the program. Evaluation needs to be respectful of constituents and judicious in what is asked of them. Evaluation should not be something that is "done to" program participants and others affected by or associated with the program. Rather, it should draw on their knowledge and experience to produce information that will help improve programs and better meet the needs of the community.
• Enables the collection of the most information with the least effort. You can't, and don't need to, evaluate everything. Focus on what you need to know: what are the critical pieces of information you and your stakeholders need in order to remain accountable and to improve your program?
(Some information in this section is drawn from: Earl, Sarah, et al. Outcome Mapping: Building Learning and Reflection into Development Programs. International Development Research Centre, Canada, 2002.)
  • 6. Why Evaluate? Conducting a well-conceived and well-implemented evaluation will be in your interest. It will help you:
• Understand and improve your program. Even the best-run programs are not always complete successes. Every program can improve, and the information collected in an evaluation can provide guidance for program improvement. As you incorporate evaluation into your ongoing work, you will gain useful information and become a "learning organization": one that is constantly gathering information, changing, and improving.
• Test the theory underlying your program. The systematic data you collect about your program's short-, intermediate-, and long-term achievements, as well as its implementation, helps you understand whether (and under what conditions) the hypotheses underlying your program are accurate, or whether they need to be modified.
• Tell your program's story. The data collected through evaluation can provide compelling information to help you describe what your program is doing and achieving. Evaluation results provide a strong framework for making your program's case before stakeholders, funders, and policy-makers.
• Be accountable. Evaluation helps you demonstrate responsible stewardship of funding dollars.
• Inform the field. Nonprofits that have evaluated and refined their programs can share credible results with the broader nonprofit community. A community that can share results can be more effective.
• Support fundraising efforts. A clear understanding of your program (what you did well, and precisely how you accomplished your outcomes) helps you raise additional funds to continue your work and expand or replicate your efforts.
  • 7. Developing an Evaluation Plan. Evaluation planning identifies and organizes the questions you have about your program and plots a route to get answers. Most questions that organizations probe through evaluation fall into three categories: What did we do? How well did we do it? What difference did our program make (that is, what changes occurred because of our program)? Your program's logic model will form the foundation of your evaluation plan. As you look at your logic model, you will find questions about your program that you hope to answer. The purpose of evaluation planning is to identify these questions and plan a route to finding the answers. Two major forms of evaluation help answer these questions:
1) Implementation evaluation: Are you performing the services or activities as planned? Are you reaching the intended target population, and the intended number of participants? Are the activities leading to the products you expected? How do participants perceive these services and activities? These questions are about implementation.
2) Outcomes evaluation: Is your target audience experiencing the changes in knowledge, attitudes, behaviors, or awareness that you sought? What are the results of your work? What is it accomplishing among your target audience? These questions are about outcomes.
  • 8. Implementation and Outcomes: One Without the Other? We believe that an effective evaluation should answer both types of questions. You can do one without the other, but you will not learn as much as if you conduct both. In the "old days," nonprofit evaluation focused on documenting and reporting on program activities. Nonprofits and others then assumed that if they implemented program activities as planned, the desired results would occur for the individuals, families, organizations, or communities they served. In general, the nonprofit community focused on reporting on implementation to the exclusion of outcomes. In recent years, the pendulum has swung in the opposite direction: nonprofits are under pressure to measure and report outcomes, with little emphasis on implementation. The evaluation framework we use incorporates both outcomes and implementation. Programs are complex. You need to understand the degree to which you accomplished your desired outcomes, and you also want to learn what aspects of your program contributed to those achievements and what barriers stood between your program and its ideal results. This workbook will lead you through the development of a plan to evaluate your program's implementation, and help you create a plan to evaluate your program's outcomes. It doesn't matter which you do first; if you would prefer to start with outcomes, please do so.
  • 9. Evaluating Implementation: What Did You Do? How Well Did You Do It? [Illustration: the implementation evaluation plan template, with a column of Activities (grouped by activity group), an Outputs box and a Questions box for each group under Outputs & Implementation Questions, a Data Collection Method column (how to measure), and a Data Collection Effort column (have, low, medium, high).] The illustration above outlines the components of your implementation evaluation plan.
What You Did: Activities & Outputs. Your implementation evaluation plan starts with identifying the activities of your program. Activities are the actions the program takes to achieve its desired outcomes. If your program entails many activities, you may have organized them into activity categories: closely related groups of activities in your program. Your outputs are the tangible products of your program's activities; they are also the evidence of your activities. In implementation evaluation, outputs are the items you will actually measure to evaluate your activities. Measuring outputs answers the question "What did we do?" and is often the easiest and most direct process in evaluation.
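To make the template concrete, here is a minimal sketch in Python of how one implementation-plan entry could be recorded. The structure and every name in it (ImplementationPlanRow, the field names, the sample outreach content) are illustrative assumptions, not part of the workbook or its online tools:

```python
from dataclasses import dataclass

@dataclass
class ImplementationPlanRow:
    """One row of the implementation evaluation plan (illustrative structure)."""
    activity_group: str           # a closely related group of program activities
    activities: list[str]         # the actions the program takes
    outputs: list[str]            # tangible products of the activities; what you measure
    questions: list[str]          # "how well did we do it?" questions
    data_collection_method: str   # how you will measure the outputs and questions
    effort: str                   # "have", "low", "medium", or "high"

row = ImplementationPlanRow(
    activity_group="Parent outreach",
    activities=["Distribute immunization literature to targeted parents"],
    outputs=["# of brochures distributed", "# of parents reached"],
    questions=["Did targeted parents find the materials valuable?"],
    data_collection_method="Document review of distribution logs",
    effort="have",
)
print(row.activity_group, "->", ", ".join(row.outputs))
```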
  • 10. Build Your Evaluation Plan: Information from the Logic Model. (Visit the Point K Learning Center at www.innonet.org to download our Logic Model Workbook or use the online Logic Model Builder.)
If you are working on paper: take your activity categories and individual activities from your logic model and place them in the "Activities" column of your Implementation Plan template; take your outputs from the logic model and place them in the "Outputs" boxes in your plan; then review your outputs, decide whether you want to track any additional "products" of your activities, and include these in the outputs boxes. If you are working online at Point K, this information will pre-fill from your Logic Model into your Evaluation Plan.
How Well You Did It: Additional Evaluation Questions. Documenting activities and associated outputs tells us what you did. However, that isn't sufficient for evaluation purposes (after all, you had already identified your activities and outputs in your logic model). The purpose of implementation evaluation is to understand how well you did it. The next step is to identify other questions you have about your activities and their outputs. What information will help you better understand the implementation of your program? The following are examples of the types of questions you might consider:
• Participation: Did the targeted audience participate in the activities as expected? Why? Were some individuals over- or under-represented? Why?
• Quality: Were the services and materials you provided perceived as valuable by the intended audience? Were they appropriate? How did others in the field view their quality?
• Satisfaction: Did those affected by your program's services approve of them? Why? Who was most and least satisfied?
• Context: What other factors influenced your ability to implement your program as planned? What political, economic, or leadership issues intervened, changing the expected outcomes of your program?
  • 11. Choosing Questions. Brainstorm additional questions you may want to answer regarding the implementation of your program. Use "participation / quality / satisfaction / context" only as a guide; there is no need to have one of each type of question for every activity, or to limit your questions to these categories. Keep your list targeted to the "need-to-know" questions that will have the biggest impact on program improvement. The answers to these questions can offer rich feedback for program improvement. They move beyond a simple inventory of outputs and activities, often probing the viewpoint of those you are serving. Still, they relate to your program's implementation rather than its outcomes; they address what you did, not what changes occurred because of your program.
Build Your Evaluation Plan: Insert your questions into the Questions box in your evaluation plan template, or on the "Implementation Questions" tab of the online Evaluation Plan Builder.
Implementation evaluation offers important information about what you did and how well you did it. The lessons you learn can serve as benchmarks for progress against your original program plan. Perhaps you aren't conducting the activities as planned; or you are conducting them, but they are not leading to the products and outputs you intended; or they did lead to the intended outputs, but the quality or satisfaction levels are not what you had hoped. This information can help you determine whether you need to adjust your plan, change activities, or reconsider your theoretical assumptions. Evaluating your implementation can provide a feedback loop in the midst of your effort, before you are able to evaluate outcomes.
  • 12. Evaluating Outcomes: What Difference Did You Make? These days it is no longer acceptable for nonprofits to assume that good intentions, good-faith effort, or even exemplary program implementation will produce the desired outcomes for those we serve. It is important to spend time developing a plan to measure the achievement of outcomes. In your logic model, you identified your desired outcomes: the changes you expect to see as a result of your work. Outcomes are frequently expressed as changes in knowledge, skills, attitudes, behavior, motivation, decisions, policies, and conditions, and they occur among individuals, communities, organizations, or systems.
Indicators. To evaluate how successfully you have achieved your outcomes, you will need to determine indicators for them. An indicator is the evidence or information that will tell you whether your program is achieving its intended outcomes. Indicators are measurable and observable characteristics; they answer the question "How will we know change occurred?" We often state outcomes as abstract concepts or ambitions. Indicators are the measurement of outcomes: specific characteristics or behaviors that provide tangible information about those concepts or ambitions. Often, one outcome will have more than one indicator. When you develop your indicators, it may be helpful to ask: "What does the outcome look like when it occurs? How will I know if it has happened? What will I be able to see?"
  • 13. An indicator should be:
• Meaningful: The indicator presents information that is important to key stakeholders of the program. Keep in mind that different people can have different perceptions of what defines "success" for a program. Reaching consensus among key stakeholders about what success looks like is essential to ensuring buy-in to your evaluation results.
• Direct: The indicator, or combination of indicators, captures enough of the essential components of the outcome to represent it. Several indicators can be necessary to measure an outcome adequately; however, there is no standard number of indicators to use. While multiple indicators are often necessary, needing more than three or four may mean that the outcome is too complex and should be better defined. An indicator must also reflect the same type of change as the outcome: for example, if an outcome is about a change in attitude or opinion, the indicator should not reflect a behavior change.
• Useful: The information provided by the indicator can be put to practical use for program improvement.
• Practical to collect: The data for the indicator shouldn't be a burden to collect. Consider whether you can collect data for your indicator in a timely manner and at reasonable cost. Sometimes an indicator meets the other criteria described above, but the effort to collect it would be too burdensome. Our evaluation template offers you an opportunity to note the level of effort involved.
  • 14. Elements of a Strong Indicator Statement. To assist in evaluation, a strong indicator statement should include four elements:
• How much: the amount of change among your target population that would indicate a successful level of achievement. This sets the target for your work; base it on an understanding of your baseline and a level of change that is reasonable for your program.
• Who: the target population you will measure.
• What: the condition, behavior, or characteristic that you will measure.
• When: the timeframe in which this change should occur.
For example, an outcome of an economic empowerment training program may be that participants institute improved money-management practices. One indicator for that outcome would be the statement: "75% of participants open a free bank checking account within six months of beginning the program." Assume that you know, through an intake process, that most of the program's participants manage their money through expensive financial services such as payday loan businesses. If 75% of participants open a free bank checking account within six months of beginning the program, you might view that as ONE indicator of improved money-management practices. Breaking the statement down into the elements above, you see: How much = 75%; Who = program participants; What = open a free bank checking account; When = within six months of starting the program.
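As a supplement to the worked example above, here is a small Python sketch showing the four elements stored separately and assembled into an indicator statement. The Indicator class and its field names are hypothetical conveniences for illustration; the workbook itself prescribes no code:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A strong indicator statement broken into its four elements."""
    how_much: str  # amount of change that signals success, e.g. "75%"
    who: str       # the target population you will measure
    what: str      # the condition, behavior, or characteristic measured
    when: str      # the timeframe in which the change should occur

    def statement(self) -> str:
        # Assemble the four elements into one indicator statement.
        return f"{self.how_much} of {self.who} {self.what} {self.when}."

money_mgmt = Indicator(
    how_much="75%",
    who="participants",
    what="open a free bank checking account",
    when="within six months of beginning the program",
)
print(money_mgmt.statement())
# 75% of participants open a free bank checking account within six
# months of beginning the program.
```

Keeping the four elements as separate fields makes an incomplete indicator easy to spot: any empty field means one of "how much," "who," "what," or "when" is missing.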
  • 15. The following are examples of outcomes and their indicators.
• Outcome: New mothers increase their knowledge of child development. Indicator: 75% of new mothers in the program satisfactorily complete a short survey about child development at the end of the course.
• Outcome: Target audiences increase knowledge about the signs of child abuse and neglect. Indicator: 50% of community focus group members can identify the signs of child abuse and neglect six months after the education campaign ends.
• Outcome: Residents feel the neighborhood is a safer place for children. Indicator: 60% of neighborhood residents report in one year that they believe the neighborhood is safer for children than it was one year before.
• Outcome: Diversified program resources. Indicator: In one year, each of four funding sources (public, private corporations, individual donors, foundations) comprises at least 15% and not more than 55% of program income.
• Outcome: Increased cultural competency of legal aid attorneys. Indicators: 80% of legal aid attorneys self-report learning about cultural issues at the end of the workshop; within six months, 60% of clients of trained attorneys indicate the attorney was knowledgeable about cultural issues.
• Outcome: Increased legislators' awareness of policy options. Indicator: In six months, 45% of surveyed legislators indicate awareness of different policy options.
• Outcome: Students (K-6) demonstrate knowledge about opera. Indicators: By the end of class, 65% of children can identify the story of the opera being performed; 65% can describe three of the characters in the opera; 50% can identify the type of voice (soprano, alto, tenor, bass) of the character singing a part.
• Outcome: Youth have increased knowledge about the consequences of long-term ATOD (alcohol, tobacco, and other drugs) use and abuse. Indicators: At the end of the course, 90% of participants report that they gained knowledge about the risks and harms associated with ATOD use, and 80% report that it is important not to use alcohol or other drugs.
While developing your indicators, you may realize that your outcomes are unclear or ambiguous. This process offers an opportunity to reconsider or further clarify your outcomes.
Build Your Evaluation Plan: Check over your outcomes. Are they clear and unambiguous?
  • 16. Setting Indicator Targets. An indicator shows whether your program is achieving its intended outcomes: the changes you want to see in individuals, groups, communities, or systems. To be meaningful, an indicator should specify the amount of change that you believe demonstrates successful achievement of the related outcome. Setting targets may seem challenging. No one wants to be held accountable for unrealistic expectations; at the same time, we all want programs to lead to real change. How do you choose a realistic target? Consider these guideposts: What is your baseline? What does your experience tell you? What are the standards in the field? What do stakeholders expect? Perhaps your program is new, or you have no baseline information because you have never gathered this information before. In that case, your first evaluation may provide baseline information; in future years, you will want to set realistic targets to show progress.
Multiple Indicators. Some outcomes have just one closely related indicator. If an outcome is to increase awareness among your members of a new service your organization provides, your sole indicator may be that 80% of your members report awareness of the new service within six months. Likewise, if a smoking reduction program's outcome is for participants to stop smoking, the single necessary indicator is the percentage of participants who stop smoking after participating in the program. Complex outcomes, by contrast, may have multiple indicators. If an outcome of a program is for participants to improve their job-seeking skills, measuring it will require multiple indicators that reflect the skills you emphasize in your program. Indicators might include the following:
  • 17. (Continued from the previous page.)
• The percentage of participants who meet criteria in mock interviews at the end of the training course.
• The percentage of participants who develop quality resumes within one month of completing the course.
• The percentage of participants who can identify three key sources of job listings by the end of the training course.
When developing your logic model, you identified a chain of outcomes. This chain is useful both in clarifying your theory of change and in setting realistic expectations. The chain is also important from an evaluation standpoint: knowing what you are achieving. Each link in the chain has corresponding indicators. For example, in a childhood immunization campaign:
• Short-term outcomes (LEARNING: the knowledge parents and guardians gain from the literature and PSAs): increased understanding among targeted parents of the importance of childhood immunization; increased knowledge among targeted parents of where to go to have their children immunized. Indicators: 47% of targeted parents showed increased knowledge about immunization after the program; 39% of participating parents could identify an accessible clinic that provides immunizations.
• Intermediate outcomes (BEHAVIOR: the actions parents and guardians take as a result of that knowledge): an increased number of targeted parents take their children to be immunized. Indicator: a 26% increase in parents taking their children for immunizations in the target area in the year following the program.
• Long-term outcomes (CONDITION: the conditions that change as a result of those actions): an increased number of children of targeted parents continue to receive up-to-date immunizations; healthier children. Indicators: 86% of children in the target area who received 2-month DTaP immunizations also received 4-month DTaP immunizations; 68% of children in the target area who received 2-month DTaP immunizations also received 4-year DTaP immunizations.
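If you record indicators along with targets, a simple comparison can flag which links in the chain are on track. The sketch below, in Python, uses the short-term indicators from the immunization example; the target figures (50% and 35%) are hypothetical values invented for illustration, since the example above reports only measured results:

```python
from dataclasses import dataclass

@dataclass
class IndicatorResult:
    description: str
    target_pct: float    # hypothetical target, set during planning
    measured_pct: float  # value observed in the evaluation

    def on_track(self) -> bool:
        # The link is on track when the measured value meets or beats the target.
        return self.measured_pct >= self.target_pct

short_term = [
    IndicatorResult("Parents showing increased knowledge about immunization",
                    target_pct=50.0, measured_pct=47.0),
    IndicatorResult("Parents who can identify an accessible immunization clinic",
                    target_pct=35.0, measured_pct=39.0),
]

for ind in short_term:
    status = "on track" if ind.on_track() else "below target"
    print(f"{ind.description}: {ind.measured_pct:.0f}% measured "
          f"vs. {ind.target_pct:.0f}% target ({status})")
```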
  • 18. Direct versus Indirect Indicators. Ensure that your indicators relate directly to the outcome you are evaluating and are evidence of the same type of change. Below are examples of direct versus indirect indicators for sample outcomes.
• Outcome: Participating new mothers have their children immunized (behavior). Indirect indicator: #/% of participating new mothers who are aware of the importance of immunization (awareness). Direct indicator: #/% of children of participating mothers who are up to date on immunizations within one year (behavior).
• Outcome: Participating children understand principles of good sportsmanship (knowledge). Indirect indicator: #/% of children who participate on teams after finishing the program (unrelated behavior). Direct indicators: #/% of participating youth who are able to identify five good sportsmanship behaviors by the end of the season (knowledge); the #/% of fights and arguments among student athletes decreases each year of the program (behavior, but shows knowledge in action).
• Outcome: Targeted teens increase knowledge of certain environmental health hazards (knowledge). Indirect indicator: #/% of students who receive a brochure on the topic during the first six months of the program (an output). Direct indicator: #/% of targeted students who can identify three health hazards at the end of the first year of the program (knowledge).
Build Your Evaluation Plan: Insert your indicators into the Indicators box in your evaluation plan template, or on the "Indicators/Data Collection" tab of the online Evaluation Plan Builder.
  • 19. Data Collection Preview. Of all the parts of evaluation, data collection can be the most daunting. Many of us believe we need to be statisticians or evaluation professionals to engage in quality data collection, but that simply isn't true. This workbook introduces the basic concepts of data collection to help you complete your evaluation plans. [A separate data collection workbook will be available soon at the Point K Learning Center.]
Data Collection Methods: What's the best way to gather the information you need? So far, you have identified what you want to evaluate and what you will measure. In implementation evaluation, these are activities, their related outputs, and additional questions. In outcome evaluation, these are program outcomes and their related indicators. Now you will consider methods to collect the data. Outputs, implementation questions, and indicators are what you will measure; data collection methods are how you will measure them. The goal in data collection is to minimize the number of collection instruments you use and maximize the amount of information you collect from each one. When choosing the best data collection method to obtain the information you need, consider the following:
• Which methods will be least disruptive to your program and to those you serve?
• Which methods can you afford and implement well?
• Which methods are best suited to obtaining information from your sources (considering cultural appropriateness and other contextual issues)?
  • 20. The most common data collection strategies fall into the following broad categories:
1. Review documents. Analyze printed material, including program records, research reports, census data, health records, and budgets. Document review is a common method of collecting data about activities and outputs for implementation evaluation.
2. Observe. Observe situations, behaviors, and activities in a formalized and systematic way, usually using observational checklists and trained observers. This is a good method for settings where experiencing actual events (rather than hearing about them) is an important part of the evaluation.
3. Talk to people. Collect verbal responses from participants and other stakeholders through interviews (in person or by phone) or focus groups. This method is helpful when it is important to hear the complex or highly individual thoughts of a certain group of people.
4. Collect written responses from people. Collect written responses through surveys (in person, e-mail, online, mail, phone), tests, or journals/logs. Except in the case of journals, this method is often used when you need a lot of information from a large number of people, or when it is important that identical information be available from all respondents.
5. Other methods. Review pictorial or multimedia data in photographs, audiotapes, compact discs, or visual artwork. Conduct expert or peer reviews, in which professionals in the field with specific expertise assess a set of activities or products. Use a case study, an intensive investigation of one unit, for learning purposes, often as an exemplar or as a model to be avoided.
Brainstorm: Consider data collection methods for each item you will measure.
  • 21. Ease of Data Collection. Once you identify data collection methods, the next step is to assess the level of effort required to collect the data. Consider the cost and time required to create new data collection tools, as well as the cost involved in actually collecting and analyzing the data. Some methods are more expensive than others: for example, interviews take more time (and therefore more resources) than surveys, and many methods require database software. For each data collection method you identify, consider whether you already have a tool in place that you could use (such as an intake survey). If not, think about the amount of effort required to create and use the new data collection tool, and assess whether it will require a "low," "medium," or "high" level of effort.
Build Your Evaluation Plan: Assess the level of effort required to collect the data using the methods you have identified so far. Use the scale: have (for those methods your organization already has available), low, medium, or high. Enter your data collection methods and their corresponding levels of effort into your evaluation plan template or the online Evaluation Plan Builder.
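One lightweight way to keep the have/low/medium/high scale consistent across a plan is an ordered enumeration. This Python sketch is an illustration under assumed names (the Effort enum and the sample methods are invented for the example, not prescribed by the workbook); it sorts candidate methods so the least burdensome options surface first:

```python
from enum import IntEnum

class Effort(IntEnum):
    """Level of effort to collect data with a given method (ordered scale)."""
    HAVE = 0    # a tool is already in place, e.g. an existing intake survey
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Candidate methods for one indicator, with their estimated effort levels.
methods = {
    "Review existing intake survey records": Effort.HAVE,
    "Online survey of participants": Effort.LOW,
    "Phone interviews with participants": Effort.MEDIUM,
    "Site visits with trained observers": Effort.HIGH,
}

# Surface the least burdensome options first.
for method, effort in sorted(methods.items(), key=lambda item: item[1]):
    print(f"{effort.name:<6} {method}")
```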
  • 22. Review Your Plan. Together, your implementation and outcome templates form one evaluation plan. Take time to review your plan so far to make sure it will lead to a worthwhile evaluation. Here are some tips for reviewing your plan:
• For implementation evaluation, have you identified additional questions that will help you improve the quality of the activities you are conducting?
• Have you narrowed that list of questions to the few "need-to-know" topics?
• Have you identified indicators for all your outcomes? Does any outcome require more than three indicators? If so, consider whether that outcome needs to be redefined or separated into several intended changes.
• Do most or all of your indicator statements include the key elements of "who," "how much," "what," and "when"?
• Have you identified reasonable targets for your indicators?
If you have any questions about program planning or evaluation, or are interested in our in-person services, please visit our website, www.innonet.org, or contact us at: Innovation Network, Inc., 1625 K St. NW, 11th Floor, Washington, DC 20006; (202) 728-0727; info@innonet.org.
Copyright Policy: Innovation Network, Inc. grants permission for noncommercial, share-alike use of these materials provided that attribution is given to Innovation Network, Inc. For more information, please see our Creative Commons License.
  • 23. Appendix: Implementation Evaluation Plan template. Columns: Activities (listed by program component), Outputs & Implementation Questions (an Outputs box and a Questions box for each program component), Data Collection Method, and Data Collection Effort (Have, Low, Medium, High).
  • 24. Appendix: Implementation Evaluation Plan template (continued). Columns: Activities, Outputs & Implementation Questions, Data Collection Method, and Data Collection Effort (Have, Low, Medium, High).
  • 25. Appendix: Outcomes Evaluation Plan template. Columns: Outcomes, Indicators, Data Collection Method, and Data Collection Effort (Have, Low, Medium, High).