Workbook
for
Designing
a Process
Evaluation
Produced for the
Georgia Department of Human
Resources
Division of Public Health
By
Melanie J. Bliss, M.A.
James G. Emshoff, Ph.D.
Department of Psychology
Georgia State University
July 2002
What is process evaluation?
Process evaluation uses empirical data to assess the delivery of
programs. In contrast to outcome evaluation, which assesses the
impact of the program, process evaluation verifies what the
program is and whether it is being implemented as designed.
Thus, process evaluation asks "what," and outcome evaluation
asks "so what?"
When conducting a process evaluation, keep in mind these three
questions:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
This workbook will serve as a guide for designing your own
process evaluation for a program of your choosing. There are
many steps involved in the implementation of a process
evaluation, and this workbook will attempt to direct you through
some of the main stages. It will be helpful to think of a service
delivery program that you can use as your example as you
complete these activities.
Why is process evaluation important?
1. To determine the extent to which the program is being
implemented according to plan
2. To assess and document the degree of fidelity and variability in
program implementation, expected or unexpected, planned or
unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention
and the outcomes
5. To provide information on what components of the intervention
are responsible for outcomes
6. To understand the relationship between program context (i.e.,
setting characteristics) and program processes (i.e., levels of
implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public,
clients, and funders
10. To improve the quality of the program, as the act of evaluating
is an intervention
Stages of Process Evaluation
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**
Also included in this workbook:
a. Logic Model Template
b. Pitfalls to Avoid
c. References
Evaluation can be an exciting, challenging, and fun experience.
Enjoy!
* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to the
Evaluation Framework and Evaluation Module of the FHB Best
Practice Manual for more details.
Forming collaborative relationships
A strong, collaborative relationship with program delivery staff
and management will likely result in the following:
Feedback regarding evaluation design and implementation
Ease in conducting the evaluation due to increased cooperation
Participation in interviews, panel discussions, meetings, etc.
Increased utilization of findings
Seek to establish a mutually respectful relationship characterized
by trust, commitment, and flexibility.
Key points in establishing a collaborative relationship:
Start early. Introduce yourself and the evaluation team to as many
delivery staff and management personnel as early as possible.
Emphasize that THEY are the experts, and that you will be utilizing
their knowledge and information to inform your evaluation
development and implementation.
Be respectful of their time, both in person and on the telephone.
Set up meeting places that are geographically accessible to all
parties involved in the evaluation process.
Remain aware that, even if they have requested the evaluation, it
may often appear as an intrusion upon their daily activities.
Attempt to be as unobtrusive as possible and request their
feedback regarding appropriate times for on-site data collection.
Involve key policy makers, managers, and staff in a series of
meetings throughout the evaluation process. The evaluation should
be driven by the questions that are of greatest interest to the
stakeholders. Set agendas for meetings and provide an overview of
the goals of the meeting before beginning. Obtain their feedback
and provide them with updates regarding the evaluation process.
You may wish to obtain structured feedback; sample feedback
forms appear throughout the workbook.
Provide feedback regarding evaluation findings to the key policy
makers, managers, and staff when and as appropriate. Use visual
aids and handouts. Tabulate and summarize information. Make it as
interesting as possible.
Consider establishing a resource or expert "panel" or advisory
board: an official group of people willing to be contacted when you
need feedback or have questions.
Determining Program Components
Program components are identified by answering the questions
who, what, when, where, and how as they pertain to your program.
Who: the program clients/recipients and staff
What: activities, behaviors, materials
When: frequency and length of the contact or intervention
Where: the community context and physical setting
How: strategies for operating the program or intervention
BRIEF EXAMPLE:
Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students’ classroom
How: group-administered intervention, small group practice
1. Instruct students what to do in case of fire (stop, drop, and roll).
2. Educate students on calling 911 and have them practice on play
telephones.
3. Educate students on how to pull a fire alarm, how to test a home
fire alarm, and how to change batteries in a home fire alarm. Have
students practice each of these activities.
4. Provide students with written information and have them take it
home to share with their parents. Request parental signature to
indicate compliance and target a 75% return rate.
Points to keep in mind when determining program components
Specify activities as behaviors that can be observed
If you have a logic model, use the "activities" column as a starting
point
Ensure that each component is separate and distinguishable from
others
Include all activities and materials intended for use in the
intervention
Identify the aspects of the intervention that may need to be
adapted, and those that should always be delivered as designed
Consult with program staff, mission statements, and program
materials as needed
Your Program Components
After you have identified your program components, create a logic
model that graphically portrays the link between program
components and the outcomes expected from these components.
Now, write out a succinct list of the components of your program.
WHO:
WHAT:
WHEN:
WHERE:
HOW:
What is a logic model?
A logical series of statements that link the problems your program
is attempting to address (conditions), how it will address them
(activities), and what the expected results are (immediate and
intermediate outcomes, long-term goals).
Benefits of the logic model include:
helps develop clarity about a project or program,
helps to develop consensus among people,
helps to identify gaps or redundancies in a plan,
helps to identify core hypotheses,
helps to succinctly communicate what your project or program is
about.
When do you use a logic model?
Use...
- During any work, to clarify what is being done, why, and with
what intended results
- During project or program planning, to make sure that the project
or program is logical and complete
- During evaluation planning, to focus the evaluation
- During project or program implementation, as a template for
comparing to the actual program and as a filter to determine
whether proposed changes fit or not.
This information was extracted from the Logic Models: A
Multi-Purpose Tool materials developed by Wellsys Corporation
for the Evaluation Planning Workshop Training. Please see the
Evaluation Planning Workshop materials for more information.
Appendix A has a sample template of the tabular format.
Determining Evaluation Questions
As you design your process evaluation, consider what questions
you would like to answer. It is only after your questions are
specified that you can begin to develop your methodology.
Considering the importance and purpose of each question is
critical.
BROADLY...
What questions do you hope to answer? You may wish to turn the
program components that you have just identified into questions
assessing:
Was the component completed as indicated?
What were the strengths in implementation?
What were the barriers or challenges in implementation?
What were the apparent strengths and weaknesses of each step of
the intervention?
Did the recipient understand the intervention?
Were resources available to sustain project activities?
What were staff perceptions?
What were community perceptions?
What was the nature of the interaction between staff and clients?
These are examples. Check off what is applicable to you, and use
the space below to write additional broad, overarching questions
that you wish to answer.
SPECIFICALLY...
Now, make a list of all the specific questions you wish to answer,
and organize your questions categorically. Your list of questions
will likely be much longer than your list of program components.
This step of developing your evaluation will inform your
methodologies and instrument choice.
Remember that you must collect information on what the program
is intended to be and what it is in reality, so you may need to ask
some questions in two formats. For example:
"How many people are intended to complete this intervention per
week?"
"How many actually go through the intervention during an average
week?"
Consider what specific questions you have. The questions below
are only examples! Some may not be appropriate for your
evaluation, and you will most likely need to add additional
questions. Check off the questions that are applicable to you, and
add your own questions in the space provided.
WHO (regarding client):
Who is the target audience, client, or recipient?
How many people have participated?
How many people have dropped out?
How many people have declined participation?
What are the demographic characteristics of clients?
Race
Ethnicity
National Origin
Age
Gender
Sexual Orientation
Religion
Marital Status
Employment
Income Sources
Education
Socio-Economic Status
What factors do the clients have in common?
What risk factors do clients have?
Who is eligible for participation?
How are people referred to the program? How are they screened?
How satisfied are the clients?
YOUR QUESTIONS:
WHO (regarding staff):
Who delivers the services?
How are they hired?
How supportive are staff and management of each other?
What qualifications do staff have?
How are staff trained?
How congruent are staff and recipients with one another?
What are staff demographics? (see client demographic list for
specifics.)
YOUR QUESTIONS:
WHAT:
What happens during the intervention?
What is being delivered?
What are the methods of delivery for each service (e.g.,
one-on-one, group session, didactic instruction, etc.)?
What are the standard operating procedures?
What technologies are in use?
What types of communication techniques are implemented?
What type of organization delivers the program?
How many years has the organization existed? How many years
has the program been operating?
What type of reputation does the agency have in the community?
What about the program?
What are the methods of service delivery?
How is the intervention structured?
How is confidentiality maintained?
YOUR QUESTIONS:
WHEN:
When is the intervention conducted?
How frequently is the intervention conducted?
At what intervals?
At what time of day, week, month, year?
What is the length and/or duration of each service?
YOUR QUESTIONS:
WHERE:
Where does the intervention occur?
What type of facility is used?
What is the age and condition of the facility?
In what part of town is the facility? Is it accessible to the target
audience? Does public transportation access the facility? Is
parking available?
Is child care provided on site?
YOUR QUESTIONS:
WHY:
Why are these activities or strategies implemented and why not
others?
Why has the intervention varied in its ability to maintain interest?
Why are clients not participating?
Why is the intervention conducted at a certain time or at a certain
frequency?
YOUR QUESTIONS:
Validating Your Evaluation Questions
Even though all of your questions may be interesting, it is
important to narrow your list to questions that will be particularly
helpful to the evaluation and that can be answered given your
specific resources, staff, and time.
Go through each of your questions and consider it with respect to
the questions below, which may be helpful in streamlining your
final list of questions.
Revise your worksheet/list of questions until you can answer "yes"
to all of these questions. If you cannot answer "yes" for a question,
consider omitting that question from your evaluation.
Validation checklist (answer Yes or No for each):
Will I use the data that will stem from these questions?
Do I know why each question is important and/or valuable?
Is someone interested in each of these questions?
Have I ensured that no questions are omitted that may be important
to someone else?
Is the wording of each question sufficiently clear and
unambiguous?
Do I have a hypothesis about what the “correct” answer will be for
each question?
Is each question specific without inappropriately limiting the scope
of the evaluation or probing for a specific response?
Do they constitute a sufficient set of questions to achieve the
purpose(s) of the evaluation?
Is it feasible to answer the question, given what I know about the
resources for evaluation?
Is each question worth the expense of answering it?
Derived from "A Design Manual" Checklist, page 51.
Determining Methodology
Process evaluation is characterized by collection of data primarily
through two formats:
1) Quantitative, archival, recorded data that may be managed by a
computerized tracking or management system, and
2) Qualitative data that may be obtained through a variety of
formats, such as surveys or focus groups.
When considering what methods to use, it is critical to have a
thorough understanding and knowledge of the questions you want
answered. Your questions will inform your choice of methods.
After this section on types of methodologies, you will complete an
exercise in which you consider what method of data collection is
most appropriate for each question.
Do you have a thorough understanding of your questions?
Furthermore, it is essential to consider what data the organization
you are evaluating already has. Data may exist in the form of an
existing computerized management information system, records,
or a tracking system of some other sort. Using these data may
provide the best reflection of what is "going on," and it will also
save you time, money, and energy because you will not have to
devise your own data collection method! However, keep in mind
that you may have to adapt these data to meet your own needs:
you may need to add or replace fields, records, or variables.
What data does your organization already have?
Will you need to adapt it?
If the organization does not already have existing data, consider
devising a method for the organizational staff to collect their own
data. This process will ultimately be helpful for them so that they
can continue to self-evaluate, track their activities, and assess
progress and change. It will be helpful for the evaluation process
because, again, it will save you time, money, and energy that you
can better devote towards other aspects of the evaluation.
Management information systems will be described more fully in a
later section of this workbook.
Do you have the capacity and resources to devise such a system?
(You may need to refer to a later section of this workbook before
answering.)
Who should collect the data?
Given all of this, what thoughts do you have on who should collect
data for your evaluation? Program staff, evaluation staff, or some
combination?
Program staff: May collect data such as attendance, demographics,
participation, characteristics of participants, and dispositions; may
conduct intake interviews, note changes regarding service delivery,
and monitor program implementation.
Advantages: Cost-efficient, accessible, resourceful, available,
time-efficient, and increased understanding of the program.
Disadvantages: May exhibit bias and/or social desirability; may use
data for critical judgment; may compromise the validity of the
program; data collection may put staff in an uncomfortable or
inappropriate position; staff who collect data may also carry
increased burden and responsibility outside of their usual or typical
job responsibilities. If you utilize staff for data collection, provide
frequent reminders as well as messages of gratitude.
Evaluation staff: May collect qualitative information regarding
implementation, general characteristics of program participants,
and other information that may otherwise be subject to bias or
distortion.
Advantages: Data are collected in a manner consistent with the
overall goals and timeline of the evaluation; prevents bias and
inappropriate use of information; promotes overall fidelity and
validity of data.
Disadvantages: May be costly and take extensive time; may require
additional training on the part of the evaluator; the presence of the
evaluator in the organization may be intrusive, inconvenient, or
burdensome.
When should data be collected?
Conducting the evaluation according to your timeline can be
challenging. Consider how much time you have for data collection,
and make decisions regarding what to collect and how much based
on your timeline.
In many cases, outcome evaluation is not considered appropriate
until the program has stabilized. However, when conducting a
process evaluation, it can be important to start the evaluation at the
beginning so that a story may be told regarding how the program
was developed, information may be provided on refinements, and
program growth and progress may be noted.
If you have the luxury of collecting data from the start of the
intervention to the end of the intervention, space out data collection
as appropriate. If you are evaluating an ongoing intervention that is
fairly quick (e.g., an 8-week educational group), you may choose to
evaluate one or more "cycles."
How much time do you have to conduct your evaluation?
How much time do you have for data collection (as opposed to
designing the evaluation, training, organizing and analyzing results,
and writing the report)?
Is the program you are evaluating time specific?
How long does the program or intervention last?
At what stages do you think you will most likely collect data?
Soon after a program has begun: descriptive information on
program characteristics that will not change; information requiring
a baseline.
During the intervention: ongoing process information such as
recruitment and program implementation.
After the intervention: demographics, attendance ratings,
satisfaction ratings.
Before you consider methods
A list of various methods follows this section. Before choosing
which methods are most appropriate for your evaluation, review
the following questions. (Some may already be answered in
another section of this workbook.)
What questions do I want answered? (See previous section.)
Does the organization already have existing data, and if so, what
kind?
Does the organization have staff to collect data?
What data can the organization staff collect?
Must I maintain anonymity (participant is not identified at all) or
confidentiality (participant is identified but responses remain
private)? This consideration pertains to existing archival data as
well as original data collection.
How much time do I have to conduct the evaluation?
How much money do I have in my budget?
How many evaluation staff do I have to manage the data collection
activities?
Can I (and/or members of my evaluation staff) travel on site?
What time of day is best for collecting data? For example, if you
plan to conduct focus groups or interviews, remember that your
population may work during the day and need evening times.
Types of methods
A number of different methods exist that can be used to collect
process information. Consider each of the following, and check
those that you think would be helpful in addressing the specific
questions in your evaluation. When "see sample" is indicated, refer
to the samples that follow this table.
Method: Description
Activity, participation, or client tracking log: Brief record
completed on site at frequent intervals by participant or deliverer.
May use a form developed by the evaluator if none previously
exists. Examples: sign-in log, daily records of food consumption,
medication management.
Case studies: Collection of in-depth information regarding a small
number of intervention recipients; use multiple methods of data
collection.
Ethnographic analysis: Obtain in-depth information regarding the
experience of the recipient by partaking in the intervention,
attending meetings, and talking with delivery staff and recipients.
Expert judgment: Convene a panel of experts or conduct individual
interviews to obtain their understanding of and reaction to program
delivery.
Focus groups: Small group discussion among program delivery
staff or recipients, focusing on their thoughts and opinions
regarding their experiences with the intervention.
Meeting minutes (see sample): Qualitative information regarding
agendas, tasks assigned, and coordination and implementation of
the intervention, recorded on a consistent basis.
Observation (see sample): Observe actual delivery in vivo or on
video; record findings using a check sheet or make qualitative
observations.
Open-ended interviews, telephone or in person: Evaluator asks
open questions (i.e., who, what, when, where, why, how) of
delivery staff or recipients. Use an interview protocol without
preset response options.
Questionnaire: Written survey with structured questions. May be
administered in individual, group, or mail format. May be
anonymous or confidential.
Record review: Obtain indicators from intervention records such
as patient files, time sheets, telephone logs, registration forms,
student charts, sales records, or records specific to the service
delivery.
Structured interviews, telephone or in person: Interviewer asks
direct questions using an interview protocol with preset response
options.
Sample activity log
This is a common process evaluation methodology because it
systematically records exactly what is happening during
implementation. You may wish to devise a log such as the one
below and alter it to meet your specific needs. Consider
computerizing such a log for efficiency (a minimal scripted sketch
follows the column list below). Your program may already have
existing logs that you can utilize and adapt for your evaluation
purposes.
Site: ____________ Recorder: ____________
Columns: Code | Service | Date | Location | # People | # Hours |
Notes
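If you computerize the log, even a small script can standardize entry. The sketch below is a minimal illustration in Python, not part of the original workbook materials: the file name (activity_log.csv) and the record_activity() helper are hypothetical, and the columns mirror the list above. Adapt the fields to your own program.

    import csv
    import os
    from datetime import date

    LOG_FILE = "activity_log.csv"  # hypothetical file name; use your own
    COLUMNS = ["site", "recorder", "code", "service", "date",
               "location", "num_people", "num_hours", "notes"]

    def record_activity(site, recorder, code, service, location,
                        num_people, num_hours, notes="", when=None):
        """Append one activity record, writing the header row on first use."""
        when = when or date.today().isoformat()
        write_header = not os.path.exists(LOG_FILE)
        with open(LOG_FILE, "a", newline="") as f:
            writer = csv.writer(f)
            if write_header:
                writer.writerow(COLUMNS)
            writer.writerow([site, recorder, code, service, when,
                             location, num_people, num_hours, notes])

    # Example: one fire safety session delivered to a classroom of 25
    record_activity("Lincoln Elementary", "M. Smith", "FS-1",
                    "Fire safety intervention", "Classroom 3B", 25, 1.5)

A log kept this way can be opened directly in a spreadsheet program and later imported into an analysis package.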
Meeting Minutes
Taking notes at meetings may provide extensive and invaluable
process information that can later be organized and structured into
a comprehensive report. Minutes may be taken by program staff or
by the evaluator if necessary. You may find it helpful to use a
structured form, such as the one below, which is derived from
Evaluating Collaboratives, University of Wisconsin-Cooperative
Extension, 1998.
Meeting Place: __________________ Start time: ____________
Date: _____________________________ End time: ____________
Attendance (names):
Agenda topic: _________________________________________________
Discussion: _____________________________________________________
Decision | Related Tasks | Who is responsible | Deadline
1.
2.
3.
Agenda topic: _________________________________________________
Discussion: _____________________________________________________
Decision | Related Tasks | Who is responsible | Deadline
1.
2.
3.
Sample observation log
Observation may be conducted in various ways, but one of the
most common is hand-recording specific details during a small
time period. The following is several rows from an observation log
utilized during an evaluation examining school classrooms.
CLASSROOM OBSERVATIONS (School Environment Scale)
Classroom 1: Grade level _________________ (Goal: 30 minutes of
observation)
Time began observation: _________ Time ended observation: _________
Subjects taught during observation period: ___________________
PHYSICAL ENVIRONMENT (record an answer for each question)
1. Number of students
2. Number of adults in room: a. teachers, b. para-pros, c. parents
(record each and the total)
3. Desks/tables: a. number of desks, b. number of tables for
students’ use, c. any other furniture, including number (note the
arrangement of desks/tables/other furniture)
4. Number of computers, type
5. How are computers being used?
6. What is the general classroom setup? (Are there walls, windows,
mirrors, carpet, rugs, cabinets, curtains, etc.?)
7. Other technology (overhead projector, PowerPoint, VCR, etc.)
8. Are books and other materials accessible for students?
9. Is there adequate space for whole-class instruction?
12. What type of lighting is used?
13. Are there animals or fish in the room?
14. Is there background music playing?
15. Rate the classroom condition: Poor / Average / Excellent
16. Are rules/discipline procedures posted? If so, where?
17. Is the classroom noisy or quiet? Very Quiet / Very Noisy
Choosing or designing measurement instruments
Consider using a resource panel, advisory panel, or focus group to
offer feedback regarding your instrument. This group may be
composed of any of the people listed below. You may also wish to
consult with one or more of these individuals throughout the
development of your overall methodology.
Who should be involved in the design of your instrument(s) and/or
provide feedback?
Program service delivery staff / volunteers
Project director
Recipients of the program
Board of directors
Community leader
Collaborating organizations
Experts on the program or service being evaluated
Evaluation experts
_________________________
_________________________
_________________________
Conduct a pilot study: administer the instrument to a group of
recipients, and then obtain feedback regarding their experience.
This is a critical component of the development of your
instruments, as it will help ensure clarity of questions and reduce
the degree of discomfort or burden that questions or processes
(e.g., intakes or computerized data entry) elicit.
How can you ensure that you pilot your methods? When will you
do it, and whom will you use as participants in the study?
Ensure that written materials are at an appropriate reading level for
the population. Ensure that verbal information is at an appropriate
terminology level for the population. A third- to sixth-grade
reading level is often utilized.
Remember that you are probably collecting data that are
program-specific. This may increase the difficulty of finding
previously constructed instruments to use for questionnaires, etc.
However, instruments used for conducting process evaluations of
other programs may provide you with ideas for how to structure
your own instruments.
Linking program components and methods (an example)
Now that you have identified your program components, broad
questions, specific questions, and possible measures, it is time to
link them together. Let's start with your program components. Here
is an example of three program components of an intervention, the
American Cancer Society's Man to Man (M2M) prostate cancer
program.
Program Components and Essential Elements: There are six
program components to M2M. There are essential elements in each
component that must be present for the program to achieve its
intended results and outcomes, and for the program to be identified
as a program of the American Cancer Society. For each component
below, the essential elements are listed first, followed by possible
process measures.
1) Man to Man Self-Help and/or Support Groups
The essential elements within this component are:
• Offer information and support to all men with prostate cancer at
all points along the cancer care continuum
• Directly, or through collaboration and referral, offer community
access to prostate cancer self-help and/or support groups
• Provide recruitment and on-going training and monitoring for
M2M leaders and volunteers
• Monitor, track and report program activities
Possible process measures:
• Descriptions of attempts to schedule and advertise group meetings
• Documented efforts to establish the program
• Documented local needs assessments
• # of meetings held per independent group
• Documented meetings held
• # of people who attended different topics and speakers
• Perceptions of need of survey participants for additional groups
and current satisfaction levels
• # of new and # of continuing group members
• Documented sign-up sheets for group meetings
• Documented attempts to contact program dropouts
• # of referrals to other PC groups documented
• # of times corresponding with other PC groups
• # of training sessions for new leaders
• # of continuing education sessions for experienced leaders
• # and types of other on-going support activities for volunteer
leaders
• # of volunteers trained as group facilitators
• Perceptions of trained volunteers for readiness to function as
group facilitators
2) One-to-One Contacts
The essential elements within this component are:
• Offer one-to-one contact to provide information and support to
all men with prostate cancer, including those in the diagnostic
process
• Provide recruitment and on-going training and monitoring for
M2M leaders and volunteers
• Monitor, track and report program activities
Possible process measures:
• # of contact pairings
• Frequency and duration of contact pairings
• Types of information shared during contact pairings
• # of volunteers trained
• Perception of readiness by trained volunteers
• Documented attempts for recruiting volunteers
• Documented on-going training activities for volunteers
• Documented support activities
3) Community Education and Awareness
The essential elements within this component are:
• Conduct public awareness activities to inform the public about
prostate cancer and M2M
• Monitor, track and report program activities
Possible process measures:
• # of screenings provided by various health care
providers/agencies over the assessment period
• Documented ACS staff and volunteer efforts to publicize the
availability and importance of PC screenings, including health
fairs, public service announcements, billboard advertising, etc.
• # of addresses to which newsletters are mailed
• Documented efforts to increase the newsletter mailing list
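Many of the measures above are simple counts that can be tallied automatically once sign-in sheets are computerized. The following sketch is illustrative only, not part of the M2M program materials: it assumes hypothetical sign-in records of (group, meeting date, attendee) and derives two of the measures listed above, the number of meetings held per independent group and the number of new versus continuing members.

    from collections import defaultdict

    # Hypothetical sign-in records: (group, meeting_date, attendee)
    signins = [
        ("Northside", "2002-03-05", "J. Smith"),
        ("Northside", "2002-03-05", "R. Jones"),
        ("Northside", "2002-04-02", "J. Smith"),
        ("Westside",  "2002-03-12", "T. Brown"),
    ]

    # Number of meetings held per independent group
    meetings = defaultdict(set)
    for group, meeting_date, _ in signins:
        meetings[group].add(meeting_date)
    for group, dates in sorted(meetings.items()):
        print(f"{group}: {len(dates)} meeting(s) held")

    # New vs. continuing members: here a "continuing" member is one
    # who has attended more than one meeting.
    attendance = defaultdict(set)
    for _, meeting_date, attendee in signins:
        attendance[attendee].add(meeting_date)
    new = sum(1 for dates in attendance.values() if len(dates) == 1)
    print(f"{new} new member(s), {len(attendance) - new} continuing member(s)")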
Linking YOUR program components, questions, and methods
Consider each of your program components and the questions that
you have devised in an earlier section of this workbook, and the
methods that you checked off on the "types of methods" table.
Now ask yourself: how will I use the information I have obtained
from this question? And what method is most appropriate for
obtaining this information?
Worksheet columns: Program Component | Specific questions that
go with this component | How will I use this information? | Best
method?
Data Collection Plan
Now let's put your data collection activities on one sheet: what
you're collecting, how you're doing it, when, your sample, and who
will collect it. Identifying the methods you have just picked, your
instruments, and your data collection techniques in a structured
manner will facilitate this process.
Plan columns: Method | Type of data (questions, briefly indicated) |
Instrument used | When implemented | Sample | Who collects
Example row:
Method: Patient interviews in health dept clinics
Type of data: Qualitative (what services they are using, length of
visit, why they came in, how long they waited); some quantitative
satisfaction ratings
Instrument used: Interview created by evaluation team and piloted
with patients
When implemented: Oct-Dec; days and hours randomly selected
Sample: 10 interviews in each clinic
Who collects: Trained interviewers
Consider a Management Information System
Process data are frequently collected through a management
information system (MIS) designed to record the characteristics of
participants, their participation, and the characteristics of the
activities and services provided. An MIS is a computerized record
system that enables service providers and evaluators to accumulate
and display data quickly and efficiently in various ways.
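As an illustration of what such a record system can look like, the sketch below uses Python's built-in sqlite3 module. The database name, tables, and fields are hypothetical examples, not a prescribed design; they reflect the kinds of data discussed in this section (client characteristics and services delivered).

    import sqlite3

    conn = sqlite3.connect("evaluation_mis.db")  # hypothetical file name
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS clients (
        client_id   INTEGER PRIMARY KEY,
        enrolled_on TEXT,     -- ISO date of program entry
        age         INTEGER,
        gender      TEXT,
        zip_code    TEXT
    );
    CREATE TABLE IF NOT EXISTS services (
        service_id  INTEGER PRIMARY KEY,
        client_id   INTEGER REFERENCES clients(client_id),
        service_on  TEXT,     -- ISO date the service was delivered
        service     TEXT,     -- e.g., 'intake interview', 'group session'
        minutes     INTEGER,  -- length of the contact
        staff       TEXT
    );
    """)

    # Example entries: one client receiving one service
    conn.execute("INSERT INTO clients (enrolled_on, age, gender, zip_code) "
                 "VALUES ('2002-07-01', 34, 'F', '30303')")
    conn.execute("INSERT INTO services (client_id, service_on, service, "
                 "minutes, staff) VALUES (1, '2002-07-16', "
                 "'intake interview', 45, 'MB')")
    conn.commit()

Because each service is a dated row linked to a client, counts of contacts, services, and participation can be displayed "in various ways" with simple queries; an example appears later in this section.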
Will your evaluation be enhanced by periodic data presentations in
tables or other structured formats? For example, should the
evaluation utilize a monthly print-out of services utilized, or
monitor recipient tracking data (such as date, time, and length of
service)? YES / NO
Does the agency create monthly (or other periodic) print-outs
reflecting services rendered or clients served? YES / NO
Will the evaluation be conducted in a more efficient manner if
program delivery staff enter data on a consistent basis? YES / NO
Does the agency already have hard copies of files or records that
would be better utilized if computerized? YES / NO
Does the agency already have an MIS or a similar computerized
database? YES / NO
If the answer to any of these questions is YES, consider using an
MIS for your evaluation.
If an MIS does not already exist, you may wish to design a
database in which you can enter information from records obtained
by the agency. This process decreases missing data and is generally
efficient.
If you do create a database that can be used on an ongoing basis by
the agency, consider offering it to them for future use.
Information to be included in your MIS
Examples include:
Client demographics
Client contacts
Client services
Referrals offered
Client outcomes
Program activities
Staff notes
Jot down the important data you would like to be included in your
MIS.
Managing your MIS
What software do you wish to utilize to manage your data?
What type of data do you have?
How much information will you need to enter?
How will you ultimately analyze the data? You may wish to create
a database directly in the program you will eventually use, such as
SPSS.
Will you be utilizing laptops?
If so, will you be taking them on site and directly entering your
data into them?
How will you download or transfer the information, if applicable?
What will the impact be on your audience if you have a laptop?
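If the MIS is a database like the hypothetical sketch shown earlier in this section, a periodic print-out (such as the monthly services report discussed above) reduces to a single query. A minimal sketch, reusing the hypothetical services table:

    import sqlite3

    conn = sqlite3.connect("evaluation_mis.db")  # same hypothetical database
    # Count and total contact time for each service, month by month
    rows = conn.execute("""
        SELECT substr(service_on, 1, 7) AS month,  -- 'YYYY-MM'
               service,
               COUNT(*)     AS times_delivered,
               SUM(minutes) AS total_minutes
        FROM services
        GROUP BY month, service
        ORDER BY month, service
    """).fetchall()

    print(f"{'Month':8} {'Service':24} {'Count':>6} {'Minutes':>8}")
    for month, service, count, minutes in rows:
        print(f"{month:8} {service:24} {count:>6} {minutes:>8}")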
Tips on using an MIS
If service delivery personnel will be collecting and/or entering
information into the MIS for the evaluator's use, it is generally a
good idea to provide frequent reminders of the importance of
entering the appropriate information in a timely, consistent, and
regular manner.
For example, if an MIS is dependent upon patient data collected
during public health officers' daily activities, the officers should be
entering data on at least a daily basis. Otherwise, important data
are lost, and the database will only reflect what was salient enough
to be remembered and entered at the end of the week.
Don't forget that this may be burdensome and/or inconvenient for
the program staff. Provide them with frequent thank-yous.
Remember that your database is only as good as you make it. It
must be organized and arranged so that it is most helpful in
answering your questions.
If you are collecting from existing records, at what level are the
data currently available? For example, is it state, county, or city
information? How is it defined? Consider whether adaptations need
to be made or additions need to be included for your evaluation.
Back up your data frequently and in at least one additional format
(e.g., zip disk, server).
Consider file security. Will you be saving data on a network
server? You may need to consider password protection.
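One simple way to follow this advice is a dated copy of the database file. The sketch below assumes the hypothetical MIS file from the earlier examples and a backup folder, which could point at a network server or removable disk.

    import shutil
    from datetime import date
    from pathlib import Path

    SOURCE = Path("evaluation_mis.db")  # hypothetical MIS file
    BACKUP_DIR = Path("backups")        # e.g., a network share or zip disk
    BACKUP_DIR.mkdir(exist_ok=True)

    # Keep one dated copy per day, e.g., backups/evaluation_mis_2002-07-16.db
    target = BACKUP_DIR / f"{SOURCE.stem}_{date.today().isoformat()}{SOURCE.suffix}"
    shutil.copy2(SOURCE, target)  # copy2 preserves file timestamps
    print(f"Backed up {SOURCE} to {target}")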
Allocate time for data entry and checking.
Allow additional time to contemplate the meaning of the data
before writing the report.
Implement Data Collection and Analysis
Data collection cannot be fully reviewed in this workbook, but this
page offers a few tips regarding the process.
General reminders:
THANK everyone who helps you, directs you, or participates in
any way.
Obtain clear directions and give yourself plenty of time, especially
if you are traveling a long distance (e.g., several hours away).
Bring all of your own materials; do not expect the program to
provide you with writing utensils, paper, a clipboard, etc.
Address each person that you meet with respect, and attempt to
make your meeting as convenient for their schedule as possible.
Most process evaluation will be in the form of routine record
keeping (e.g., MIS). However, you may wish to interview clients
and staff. If so:
Ensure that you have sufficient time to train evaluation staff, data
collectors, and/or organization staff who will be collecting data.
After they have been trained in the data collection materials and
procedure, require that they practice the technique, whether it is an
interview or entering a sample record in an MIS.
If planning to use a tape recorder during interviews or focus
groups, request permission from participants before beginning.
You may need to turn the tape recorder off on occasion if doing so
will make participants more comfortable.
If planning to use laptop computers, attempt to make consistent
eye contact and spend time establishing rapport before beginning.
Some participants may be uncomfortable with technology, and you
may need to provide education regarding the process of data
collection and how the information will be utilized.
If planning to hand-write responses, warn the participant that you
may move slowly and may need to ask them to repeat themselves.
Prepare for this process by developing shorthand specific to the
evaluation. A sample shorthand page follows.
Annual Evaluation Reports
The ultimate aim of all the Branch’s evaluation efforts is to
increase the intelligent use of information in Branch
decision-making in order to improve health outcomes. Because we
understand that many evaluation efforts fail because the data are
never collected, and that even more fail because the data are
collected but never used in decision-making, we have struggled to
find a way to institutionalize the use of evaluation results in Branch
decision-making.
These reports will serve multiple purposes:
The need to complete the report will increase the likelihood that
evaluation is done and data are collected.
The need to review reports from lower levels in order to complete
one’s own report hopefully will cause managers at all levels to
consciously consider, at least once a year, the effectiveness of their
activities and how evaluation results suggest that effectiveness can
be improved.
The summaries of evaluation findings in the reports should
simplify preparation of other reports to funders, including the
General Assembly.
Each evaluation report forms the basis of the evaluation report at
the next level. The contents and length of the report should be
determined by what is most helpful to the manager who is
receiving the report. Rather than simply reporting every possible
piece of data, these reports should present summary data,
summarize important conclusions, and suggest recommendations
based on the evaluation findings. A program-level annual
evaluation report should be ten pages or less. Many may be less
than five pages. Population team and Branch-level annual
evaluation reports may be longer than ten pages, depending on
how many findings are being reported. However, reports that go
beyond ten pages should also contain a shorter Executive
Summary, to ensure that those with the power to make decisions
actually read the findings.
The initial reports, especially, may reflect formative work and
consist primarily of updates on the progress of evaluation planning
and implementation. This is fine and to be expected. However,
within a year or two the reports should begin to include process
data, and later actual outcome findings.
This information was extracted from the FHB Evaluation
Framework developed by Monica Herk and Rebekah Hudgins.
Suggested shorthand - a sample
The list below was derived for a process evaluation regarding
charter schools. Note the use of general shorthand as well as
shorthand derived specifically for the evaluation.
Evaluation-specific shorthand:
CS = Charter School
Sch = School
Tch = Teacher, teach
P = Principal
VP = Vice Principal
Admin = Administration, administrators
DOE = Dept of Education
BOE = Board of Education
Comm = Community
Stud = Students, pupils
Kids = Students, children, teenagers
K = Kindergarten
Cl = Class
CR = Classroom
W = White
B = Black
AA = African American
SES = Socio-economic status
Lib = Library, librarian
Caf = Cafeteria
Ch = Charter
Conv = Conversion (school)
S-up = Start-up school
App = Application, applied
ITBS = Iowa Test of Basic Skills
LA = Language arts
SS = Social Studies
QCC = Quality Core Curriculum
Pol = Policy, politics
Curr = Curriculum
LP = Lesson plans
Disc = Discipline
F = Father, dad
P = Parent
M = Mom, mother

General shorthand:
mst = Most
b/c = Because
st = Something
b = Be
c = See
r = Are
w/ = When
@ = At
~ = About
= = Is, equals, equivalent
≠ = Does not equal, is not the same
Sone = Someone
# = Number
$ = Money, finances, financial, funding, expenses, etc.
+ = Add, added, in addition
< = Less than
> = Greater/more than
??? = What does this mean? Get more info on; I'm confused
DWA = Don't worry about (e.g., if you wrote something unnecessary)
Ψ = Psychology, psychologist
∴ = Therefore
∆ = Change, is changing
mm = Movement
↑ = Increases, up, promotes
↓ = Decreases, down, inhibits
x = Times (e.g., many x we laugh)
÷ = Divided (e.g., we ÷ up the classrooms)
C = With
c/out = Without
[symbol] = Home, house
♥ = Love, adore (e.g., the kids ♥ this)
[symbol] = Church, religious activity
O = No, doesn't, not
♀ = Girls, women, female
♂ = Boys, men, male
1/2 = Half (e.g., we took 1/2)
2 = To
2B = To be
e.g. = For example
i.e. = That is
… = If the person trails off, you missed information
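If hand-written notes are later typed up, a shorthand list like the one above doubles as a lookup table for expansion. A minimal sketch, assuming space-separated tokens and using a few entries drawn from the list:

    # A few entries from the shorthand list above; extend as needed.
    SHORTHAND = {
        "CS": "charter school",
        "Tch": "teacher",
        "b/c": "because",
        "~": "about",
        "#": "number",
    }

    def expand(notes: str) -> str:
        """Replace shorthand tokens with their expansions, word by word."""
        return " ".join(SHORTHAND.get(token, token) for token in notes.split())

    print(expand("Tch talked ~ CS funding b/c # of kids grew"))
    # -> teacher talked about charter school funding because number of kids grew

Note that shorthand that can also appear in ordinary text (e.g., "2" for "to") is ambiguous and is safer to expand by hand.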
Appendix A
Logic Model Worksheet
Population Team/Program Name __________________________ Date _______________________
Worksheet columns (left to right):
1. If the following CONDITIONS AND ASSUMPTIONS exist...
2. And if the following ACTIVITIES are implemented to address
these conditions and assumptions...
3. Then these SHORT-TERM OUTCOMES may be achieved...
4. And these LONG-TERM OUTCOMES may be achieved...
5. And these LONG-TERM GOALS can be reached....
Appendix B
Pitfalls To Avoid
Avoid heightening expectations of delivery staff, program
recipients, policy makers, or community members. Make clear that
feedback will be provided as appropriate but may or may not be
utilized.
Avoid any implication that you are evaluating impact or outcome.
Stress that you are evaluating "what is happening," not how well
any one person is performing or what the outcomes of the
intervention are.
Make sure that the right information gets to the right people. Your
findings are most likely to be utilized in a constructive and
effective manner if you ensure that your final report does not end
up on the desk of someone who has little motivation or interest in
utilizing them.
Ensure that data collection and entry are managed on a consistent
basis; avoid developing an evaluation design and then having the
contract lapse because staff did not enter the data.
Appendix C
References
References used for completion of this workbook and/or that you
may find helpful for additional information.
Centers for Disease Control and Prevention. 1995. Evaluating
Community Efforts to Prevent Cardiovascular Diseases. Atlanta, GA.
Centers for Disease Control and Prevention. 2001. Introduction to
Program Evaluation for Comprehensive Tobacco Control Programs.
Atlanta, GA.
Freeman, H. E., Rossi, P. H., Sandefur, G. D. 1993. Workbook for
Evaluation: A Systematic Approach. Sage Publications: Newbury
Park, CA.
Georgia Policy Council for Children and Families; The Family
Connection; Metis Associates, Inc. 1997. Pathways for Assessing
Change: Strategies for Community Partners.
Grembowski, D. 2001. The Practice of Health Program Evaluation.
Sage Publications: Thousand Oaks, CA.
Hawkins, J. D., Nederhood, B. 1987. Handbook for Evaluating Drug
and Alcohol Prevention Programs. U.S. Department of Health and
Human Services; Public Health Service; Alcohol, Drug Abuse, and
Mental Health Administration: Washington, D.C.
Muraskin, L. D. 1993. Understanding Evaluation: The Way to Better
Prevention Programs. Westat, Inc.
National Community AIDS Partnership. 1993. Evaluating HIV/AIDS
Prevention Programs in Community-based Organizations.
Washington, D.C.
NIMH. Overview of Needs Assessment. Chapter 3: Selecting the
Needs Assessment Approach.
Patton, M. Q. 1982. Practical Evaluation. Sage Publications, Inc.:
Beverly Hills, CA.
Posavac, E. J., Carey, R. G. 1980. Program Evaluation: Methods and
Case Studies. Prentice-Hall, Inc.: Englewood Cliffs, NJ.
Rossi, P. H., Freeman, H. E., Lipsey, M. W. 1999. Evaluation: A
Systematic Approach (6th edition). Sage Publications, Inc.:
Thousand Oaks, CA.
Scheirer, M. A. 1994. Designing and using process evaluation. In:
J. S. Wholey, H. P. Hatry, & K. E. Newcomer (eds.), Handbook of
Practical Program Evaluation. Jossey-Bass Publishers: San
Francisco, CA.
Taylor-Powell, E., Rossing, B., Geran, J. 1998. Evaluating
Collaboratives: Reaching the Potential. Program Development and
Evaluation: Madison, WI.
U.S. Department of Health and Human Services; Administration for
Children and Families; Office of Community Services. 1994.
Evaluation Guidebook: Demonstration Partnership Program
Projects.
W.K. Kellogg Foundation. 1998. W. K. Kellogg Foundation
Evaluation Handbook.
Websites:
www.cdc.gov/eval/resources
www.eval.org (has online text books)
www.wmich.edu/evalctr (has online checklists)
www.preventiondss.org
When conducting literature reviews or searching for additional
information, consider using alternative names for "process
evaluation," including:
formative evaluation
program fidelity
implementation assessment
implementation evaluation
program monitoring
Briefly discuss the meaning of the so-called social contract. In.docxBriefly discuss the meaning of the so-called social contract. In.docx
Briefly discuss the meaning of the so-called social contract. In.docxMikeEly930
 

More from MikeEly930 (20)

Building on the Report Analysis you completed in Week 4, create a 10.docx
Building on the Report Analysis you completed in Week 4, create a 10.docxBuilding on the Report Analysis you completed in Week 4, create a 10.docx
Building on the Report Analysis you completed in Week 4, create a 10.docx
 
Bullet In the BrainHow to date a brown girl (black girl, white.docx
Bullet In the BrainHow to date a brown girl (black girl, white.docxBullet In the BrainHow to date a brown girl (black girl, white.docx
Bullet In the BrainHow to date a brown girl (black girl, white.docx
 
Budgeting and Financial ManagementPart 1There is a mounting publ.docx
Budgeting and Financial ManagementPart 1There is a mounting publ.docxBudgeting and Financial ManagementPart 1There is a mounting publ.docx
Budgeting and Financial ManagementPart 1There is a mounting publ.docx
 
Building aswimmingpoolTaskWorkerCategoryPerson.docx
Building aswimmingpoolTaskWorkerCategoryPerson.docxBuilding aswimmingpoolTaskWorkerCategoryPerson.docx
Building aswimmingpoolTaskWorkerCategoryPerson.docx
 
Bringing about Change in the Public Sector Please respond to the.docx
Bringing about Change in the Public Sector Please respond to the.docxBringing about Change in the Public Sector Please respond to the.docx
Bringing about Change in the Public Sector Please respond to the.docx
 
Briefly share with the class the  issue analysis paper written in .docx
Briefly share with the class the  issue analysis paper written in .docxBriefly share with the class the  issue analysis paper written in .docx
Briefly share with the class the  issue analysis paper written in .docx
 
Bronsen acquired a biblical manuscript in 1955.In 1962, he told .docx
Bronsen acquired a biblical manuscript in 1955.In 1962, he told .docxBronsen acquired a biblical manuscript in 1955.In 1962, he told .docx
Bronsen acquired a biblical manuscript in 1955.In 1962, he told .docx
 
BrochureInclude the following in your resource (Hyperten.docx
BrochureInclude the following in your resource (Hyperten.docxBrochureInclude the following in your resource (Hyperten.docx
BrochureInclude the following in your resource (Hyperten.docx
 
BSBMKG607B Manage market researchAssessment Task 1Procedure Fr.docx
BSBMKG607B Manage market researchAssessment Task 1Procedure Fr.docxBSBMKG607B Manage market researchAssessment Task 1Procedure Fr.docx
BSBMKG607B Manage market researchAssessment Task 1Procedure Fr.docx
 
Briefly provide an overview of Sir Robert Peel’s contributions to po.docx
Briefly provide an overview of Sir Robert Peel’s contributions to po.docxBriefly provide an overview of Sir Robert Peel’s contributions to po.docx
Briefly provide an overview of Sir Robert Peel’s contributions to po.docx
 
Brain-Based Innovative Teaching and Learning Strategies Chapter .docx
Brain-Based Innovative Teaching and Learning Strategies Chapter .docxBrain-Based Innovative Teaching and Learning Strategies Chapter .docx
Brain-Based Innovative Teaching and Learning Strategies Chapter .docx
 
Brief Exercise 4-2Brisky Corporation had net sales of $2,400,000 a.docx
Brief Exercise 4-2Brisky Corporation had net sales of $2,400,000 a.docxBrief Exercise 4-2Brisky Corporation had net sales of $2,400,000 a.docx
Brief Exercise 4-2Brisky Corporation had net sales of $2,400,000 a.docx
 
Both Germany and Finland, among a large number of other nation state.docx
Both Germany and Finland, among a large number of other nation state.docxBoth Germany and Finland, among a large number of other nation state.docx
Both Germany and Finland, among a large number of other nation state.docx
 
Brief Exercise 5-2Koch Corporation’s adjusted trial balance contai.docx
Brief Exercise 5-2Koch Corporation’s adjusted trial balance contai.docxBrief Exercise 5-2Koch Corporation’s adjusted trial balance contai.docx
Brief Exercise 5-2Koch Corporation’s adjusted trial balance contai.docx
 
Brief Exercise 4-2Brisky Corporation had net sales of $2,400,000.docx
Brief Exercise 4-2Brisky Corporation had net sales of $2,400,000.docxBrief Exercise 4-2Brisky Corporation had net sales of $2,400,000.docx
Brief Exercise 4-2Brisky Corporation had net sales of $2,400,000.docx
 
Briefly describe how the following tools can be Applied to a psychol.docx
Briefly describe how the following tools can be Applied to a psychol.docxBriefly describe how the following tools can be Applied to a psychol.docx
Briefly describe how the following tools can be Applied to a psychol.docx
 
Branding ConceptsBranding is one of the marketing-orig.docx
Branding ConceptsBranding is one of the marketing-orig.docxBranding ConceptsBranding is one of the marketing-orig.docx
Branding ConceptsBranding is one of the marketing-orig.docx
 
Briefly discuss the key phases of the SDLC methodology.Discuss the.docx
Briefly discuss the key phases of the SDLC methodology.Discuss the.docxBriefly discuss the key phases of the SDLC methodology.Discuss the.docx
Briefly discuss the key phases of the SDLC methodology.Discuss the.docx
 
Briefly describe a time when you received a job description and fe.docx
Briefly describe a time when you received a job description and fe.docxBriefly describe a time when you received a job description and fe.docx
Briefly describe a time when you received a job description and fe.docx
 
Briefly discuss the meaning of the so-called social contract. In.docx
Briefly discuss the meaning of the so-called social contract. In.docxBriefly discuss the meaning of the so-called social contract. In.docx
Briefly discuss the meaning of the so-called social contract. In.docx
 

Recently uploaded

Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxUnboundStockton
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitolTechU
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 

Recently uploaded (20)

Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docx
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptx
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 

Workbook for Designing a Process Evaluation

  • 6. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible. Emphasize that THEY are the experts, and that you will be drawing on their knowledge and information to inform the evaluation's development and implementation. Be respectful of their time, both in person and on the telephone. Set up meeting places that are geographically accessible to all parties involved in the evaluation process.
  • 7. Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible and request their feedback regarding appropriate times for on-site data collection. Involve key policy makers, managers, and staff in a series of meetings throughout the evaluation process. The evaluation should be driven by the questions that are of greatest interest to the stakeholders. Set agendas for meetings and provide an overview of the goals of the meeting before beginning. Obtain their feedback and provide them with updates regarding the evaluation process. You may wish to obtain structured feedback; sample feedback forms appear throughout the workbook. Provide feedback regarding evaluation findings to the key policy makers, managers, and staff when and as appropriate. Use visual aids and handouts. Tabulate and summarize information. Make it as interesting as possible. Consider establishing a resource or expert "panel" or advisory board that is an official group of people willing to be contacted when you need feedback
  • 8. or have questions. Evaluation Expert Session July 16, 2002 Page 4 Determining Program Components Program components are identified by answering the questions who, what, when, where, and how as they pertain to your program. Who: the program clients/recipients and staff What: activities, behaviors, materials When: frequency and length of the contact or intervention Where: the community context and physical setting How: strategies for operating the program or intervention BRIEF EXAMPLE: Who: elementary school students What: fire safety intervention When: 2 times per year Where: in students’ classroom How: group administered intervention, small group practice
  • 9. 1. Instruct students what to do in case of fire (stop, drop and roll). 2. Educate students on calling 911 and have them practice on play telephones. 3. Educate students on how to pull a fire alarm, how to test a home fire alarm and how to change batteries in a home fire alarm. Have students practice each of these activities. 4. Provide students with written information and have them take it home to share with their parents. Request parental signature to indicate compliance and target a 75% return rate. Points to keep in mind when determining program components Specify activities as behaviors that can be observed If you have a logic model, use the "activities" column as a starting point Ensure that each component is separate and distinguishable from others Include all activities and materials intended for use in the intervention Identify the aspects of the intervention that may need to be adapted, and those that should
  • 10. always be delivered as designed. Consult with program staff, mission statements, and program materials as needed. Evaluation Expert Session July 16, 2002 Page 5 Your Program Components After you have identified your program components, create a logic model that graphically portrays the link between program components and outcomes expected from these components. Now, write out a succinct list of the components of your program. WHO:
  • 12. What is a logic model? A logical series of statements that link the problems your program is attempting to address (conditions), how it will address them (activities), and what the expected results are (immediate and intermediate outcomes, long-term goals). Benefits of the logic model include: it helps develop clarity about a project or program, helps to develop consensus among people, helps to identify gaps or redundancies in a plan, helps to identify core hypotheses, and helps to succinctly communicate what your project or program is about. When do you use a logic model? Use one... - During any work, to clarify what is being done, why, and with what intended results - During project or program planning, to make sure that the project or program is logical and complete - During evaluation planning, to focus the evaluation - During project or program implementation, as a template for comparing to the actual program and as a filter to determine whether proposed changes fit or not.
  • 13. This information was extracted from the Logic Models: A Multi-Purpose Tool materials developed by Wellsys Corporation for the Evaluation Planning Workshop Training. Please see the Evaluation Planning Workshop materials for more information. Appendix A has a sample template of the tabular format. Evaluation Expert Session July 16, 2002 Page 7
  • 14. Determining Evaluation Questions As you design your process evaluation, consider what questions you would like to answer. It is only after your questions are specified that you can begin to develop your methodology. Considering the importance and purpose of each question is critical. BROADLY.... What questions do you hope to answer? You may wish to turn the program components that you have just identified into questions assessing: Was the component completed as indicated? What were the strengths in implementation? What were the barriers or challenges in implementation? What were the apparent strengths and weaknesses of each step of the intervention? Did the recipient understand the intervention? Were resources available to sustain project activities? What were staff perceptions? What were community perceptions? What was the nature of the interaction between staff and clients? These are examples. Check off what is applicable to you, and use the space below to write additional broad, overarching questions that you wish to answer.
  • 15. Evaluation Expert Session July 16, 2002 Page 8 SPECIFICALLY... Now, make a list of all the specific questions you wish to answer, and organize your questions categorically. Your list of questions will likely be much longer than your list of program components. This step of developing your evaluation will inform your methodologies and instrument choice. Remember that you must collect information on what the program is intended to be and what it is in reality, so you may need to ask some questions in two formats. For example: "How many people are intended to complete this intervention per week?" "How many actually go through the intervention during an average week?" Consider what specific questions you have. The questions below are only examples! Some may not be appropriate for your evaluation, and you will most likely need to add additional questions. Check off the questions that are applicable to you, and add your own questions in the space provided.
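To make the intended-versus-actual pairing concrete, here is a toy sketch in Python that quantifies the gap once both numbers are in hand. All figures are invented for illustration; in practice the "intended" number comes from the program design and the "actual" number from your tracking data.

```python
# Hypothetical weekly throughput for one program component.
intended_per_week = 25   # from the program design / logic model
actual_per_week = 17     # from attendance or tracking logs

gap = intended_per_week - actual_per_week
pct_of_plan = 100 * actual_per_week / intended_per_week
print(f"Delivered {actual_per_week} of {intended_per_week} planned contacts per week "
      f"({pct_of_plan:.0f}% of plan; shortfall of {gap}).")
```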
  • 16. WHO (regarding client): Who is the target audience, client, or recipient? How many people have participated? How many people have dropped out? How many people have declined participation? What are the demographic characteristics of clients? Race Ethnicity National Origin Age Gender Sexual Orientation Religion Marital Status Employment Income Sources Education Socio-Economic Status What factors do the clients have in common? What risk factors do clients have? Who is eligible for participation? How are people referred to the program? How are they screened? How satisfied are the clients? YOUR QUESTIONS:
  • 17. Evaluation Expert Session July 16, 2002 Page 9 WHO (regarding staff): Who delivers the services? How are they hired? How supportive are staff and management of each other? What qualifications do staff have? How are staff trained? How congruent are staff and recipients with one another? What are staff demographics? (See client demographic list for specifics.) YOUR QUESTIONS: WHAT: What happens during the intervention? What is being delivered? What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction)? What are the standard operating procedures?
  • 18. What technologies are in use? What types of communication techniques are implemented? What type of organization delivers the program? How many years has the organization existed? How many years has the program been operating? What type of reputation does the agency have in the community? What about the program? What are the methods of service delivery? How is the intervention structured? How is confidentiality maintained? YOUR QUESTIONS: WHEN: When is the intervention conducted? How frequently is the intervention conducted? At what intervals? At what time of day, week, month, year? What is the length and/or duration of each service?
  • 19. Evaluation Expert Session July 16, 2002 Page 10 YOUR QUESTIONS: WHERE: Where does the intervention occur? What type of facility is used? What is the age and condition of the facility? In what part of town is the facility? Is it accessible to the target audience? Does public transportation access the facility? Is parking available? Is child care provided on site? YOUR QUESTIONS:
  • 20. WHY: Why are these activities or strategies implemented and why not others? Why has the intervention varied in ability to maintain interest? Why are clients not participating? Why is the intervention conducted at a certain time or at a certain frequency? YOUR QUESTIONS: Evaluation Expert Session July 16, 2002 Page 11 Validating Your Evaluation Questions Even though all of your questions may be interesting, it is important to narrow your list to questions that will be particularly helpful to the evaluation and that can be answered given your specific resources, staff, and time. Go through each of your questions and consider it with respect to the questions below, which may be helpful in streamlining your final list of questions.
  • 21. Revise your worksheet/list of questions until you can answer "yes" to all of the items below. If you cannot answer "yes" for a question, consider omitting that question from your evaluation. Validation checklist (answer Yes or No for each):
Will I use the data that will stem from these questions?
Do I know why each question is important and/or valuable?
Is someone interested in each of these questions?
Have I ensured that no questions are omitted that may be important to someone else?
  • 22. Is the wording of each question sufficiently clear and unambiguous?
Do I have a hypothesis about what the "correct" answer will be for each question?
Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response?
Do the questions constitute a sufficient set to achieve the purpose(s) of the evaluation?
Is it feasible to answer each question, given what I know about the resources for evaluation?
  • 23. Is each question worth the expense of answering it?
Derived from the "A Design Manual" checklist, page 51. Evaluation Expert Session July 16, 2002 Page 12 Determining Methodology Process evaluation is characterized by the collection of data primarily in two formats: 1) quantitative, archival, recorded data that may be managed by a computerized tracking or management system, and 2) qualitative data that may be obtained through a variety of formats, such as
  • 24. surveys or focus groups. When considering what methods to use, it is critical to have a thorough understanding and knowledge of the questions you want answered. Your questions will inform your choice of methods. After this section on types of methodologies, you will complete an exercise in which you consider what method of data collection is most appropriate for each question. Do you have a thorough understanding of your questions? Furthermore, it is essential to consider what data the organization you are evaluating already has. Data may exist in the form of an existing computerized management information system, records, or a tracking system of some other sort. Using this data may provide the best reflection of what is "going on," and it will also save you time, money, and energy because you will not have to devise your own data collection method! However, keep in mind that you may have to adapt this data to meet your own needs - you may need to add or replace fields, records, or variables. What data does your organization already have?
  • 25. Will you need to adapt it? If the organization does not already have existing data, consider devising a method for the organizational staff to collect their own data. This process will ultimately be helpful for them so that they can continue to self-evaluate, track their activities, and assess progress and change. It will be helpful for the evaluation process because, again, it will save you time, money, and energy that you can better devote towards other aspects of the evaluation. Management information systems will be described more fully in a later section of this workbook. Do you have the capacity and resources to devise such a system? (You may need to refer to a later section of this workbook before answering.) Evaluation Expert Session July 16, 2002 Page 13 Who should collect the data?
  • 26. Given all of this, what thoughts do you have on who should collect data for your evaluation? Program staff, evaluation staff, or some combination? Program staff: May collect data from activities such as attendance, demographics, participation, characteristics of participants, dispositions, etc.; may conduct intake interviews, note changes regarding service delivery, and monitor program implementation. Advantages: Cost-efficient, accessible, resourceful, available, time-efficient, and increased understanding of the program. Disadvantages: May exhibit bias and/or social desirability, may use data for critical judgment, may compromise the validity of the program; may put staff in an uncomfortable or inappropriate position; also, if staff collect data, they may have an increased burden and responsibility placed upon them outside of their usual job responsibilities. If you utilize staff for data collection, provide frequent reminders as well as messages of gratitude.
  • 27. Evaluation staff: May collect qualitative information regarding implementation, general characteristics of program participants, and other information that may otherwise be subject to bias or distortion. Advantages: Data are collected in a manner consistent with the overall goals and timeline of the evaluation; prevents bias and inappropriate use of information; promotes overall fidelity and validity of data. Disadvantages: May be costly and take extensive time; may require additional training on the part of the evaluator; the evaluator's presence in the organization may be intrusive, inconvenient, or burdensome. Evaluation Expert Session July 16, 2002 Page 14 When should data be collected?
  • 28. Conducting the evaluation according to your timeline can be challenging. Consider how much time you have for data collection, and make decisions regarding what to collect and how much based on your timeline. In many cases, outcome evaluation is not considered appropriate until the program has stabilized. However, when conducting a process evaluation, it can be important to start the evaluation at the beginning so that a story may be told regarding how the program was developed, information may be provided on refinements, and program growth and progress may be noted. If you have the luxury of collecting data from the start of the intervention to the end of the intervention, space out data collection as appropriate. If you are evaluating an ongoing intervention that is fairly quick (e.g., an 8-week educational group), you may choose to evaluate one or more "cycles." How much time do you have to conduct your evaluation? How much time do you have for data collection (as opposed to designing the evaluation, training, organizing and analyzing results, and writing the report)? Is the program you are evaluating time-specific? How long does the program or intervention last?
  • 29. At what stages do you think you will most likely collect data? Soon after a program has begun: descriptive information on program characteristics that will not change, and information requiring a baseline. During the intervention: ongoing process information such as recruitment and program implementation. After the intervention: demographics, attendance ratings, and satisfaction ratings. Evaluation Expert Session July 16, 2002 Page 15 Before you consider methods A list of various methods follows this section. Before choosing what methods are most appropriate for your evaluation, review the following questions. (Some may already be answered in another section of this workbook.)
  • 30. What questions do I want answered? (see previous section) Does the organization already have existing data, and if so, what kind? Does the organization have staff to collect data? What data can the organization staff collect? Must I maintain anonymity (participant is not identified at all) or confidentiality (participant is identified but responses remain private)? This consideration pertains to existing archival data as well as original data collection. How much time do I have to conduct the evaluation? How much money do I have in my budget? How many evaluation staff do I have to manage the data collection activities? Can I (and/or members of my evaluation staff) travel on site? What time of day is best for collecting data? For example, if
  • 31. you plan to conduct focus groups or interviews, remember that your population may work during the day and need evening times. Evaluation Expert Session July 16, 2002 Page 16 Types of methods A number of different methods exist that can be used to collect process information. Consider each of the following, and check (√) those that you think would be helpful in addressing the specific questions in your evaluation. When "see sample" is indicated, refer to the pages that follow this table.
  • 32. Activity, participation, or client tracking log: a brief record completed on site at frequent intervals by the participant or deliverer. May use a form developed by the evaluator if none previously exists. Examples: sign-in log, daily records of food consumption, medication management.
Case studies: collection of in-depth information regarding a small number of intervention recipients; uses multiple methods of data collection.
Ethnographic analysis: obtain in-depth information regarding the experience of the recipient by partaking in the intervention, attending meetings, and talking with delivery staff and recipients.
Expert judgment: convene a panel of experts or conduct individual interviews to obtain their understanding of and reaction to program delivery.
Focus groups: small-group discussion among program delivery staff or recipients, focusing on their thoughts and opinions regarding their experiences with the intervention.
  • 33. Meeting minutes (see sample): qualitative information regarding agendas, tasks assigned, and coordination and implementation of the intervention, recorded on a consistent basis.
Observation (see sample): observe actual delivery in vivo or on video; record findings using a check sheet or make qualitative observations.
Open-ended interviews, telephone or in person: the evaluator asks open questions (i.e., who, what, when, where, why, how) of delivery staff or recipients, using an interview protocol without preset response options.
Questionnaire: a written survey with structured questions. May be administered in individual, group, or mail format; may be anonymous or confidential.
Record review: obtain indicators from intervention records such as patient files, time sheets, telephone logs, registration forms, student charts, sales records, or records
  • 34. specific to the service delivery.
Structured interviews, telephone or in person: the interviewer asks direct questions using an interview protocol with preset response options.
Evaluation Expert Session July 16, 2002 Page 17 Sample activity log This is a common process evaluation methodology because it systematically records exactly what is happening during implementation. You may wish to devise a log such as the one below and alter it to meet your specific needs. Consider computerizing such a log for efficiency. Your program may already have existing logs that you can utilize and adapt for your evaluation purposes.
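As one way of "computerizing such a log," the minimal sketch below appends one row per service contact to a CSV file. It is hypothetical Python: the field names (staff_id, client_id, activity, duration_min) are illustrative choices, not fields prescribed by this workbook, and should be adapted to your own program components.

```python
import csv
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("activity_log.csv")
# Illustrative fields only; adapt them to your program's components.
FIELDS = ["timestamp", "staff_id", "client_id", "activity", "duration_min", "notes"]

def log_activity(staff_id, client_id, activity, duration_min, notes=""):
    """Append one service contact to the log, writing a header row on first use."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="minutes"),
            "staff_id": staff_id,
            "client_id": client_id,
            "activity": activity,
            "duration_min": duration_min,
            "notes": notes,
        })

# Example entry: one 45-minute group education session.
log_activity("S01", "C123", "fire safety group session", 45)
```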
  • 37. Evaluation Expert Session July 16, 2002 Page 18 Meeting Minutes Taking notes at meetings may provide extensive and invaluable process information that can later be organized and structured into a comprehensive report. Minutes may be taken by program staff or by the evaluator if necessary. You may find it helpful to use a structured form, such as the one below that is derived from Evaluating Collaboratives, University of Wisconsin-Cooperative Extension, 1998.
  • 38. Meeting place: __________________ Date: _____________________________
Start time: ____________ End time: ____________
Attendance (names):
Agenda topic: _________________________________________________
Discussion: _____________________________________________________
Decision | Related tasks | Who responsible | Deadline
1.
2.
3.
Agenda topic: _________________________________________________
Discussion: _____________________________________________________
Decision | Related tasks | Who responsible | Deadline
1.
2.
3.
  • 39. Sample observation log Evaluation Expert Session July 16, 2002 Page 19 Observation may occur through various methods, but one of the most common is hand-recording specific details during a small time period. The following shows several rows from an observation log used during an evaluation examining school classrooms. CLASSROOM OBSERVATIONS (School Environment Scale) Classroom 1: Grade level _________________ (Goal: 30 minutes of observation) Time began observation: _________ Time ended observation: _________ Subjects taught during observation period:
  • 40. ___________________
PHYSICAL ENVIRONMENT (Question / Answer)
1. Number of students
2. Number of adults in room: a. teachers, b. para-pros, c. parents (record a total for each)
3. Desks/tables: a. number of desks, b. number of tables for students' use, c. any other furniture (include number); note the arrangement of desks/tables/other furniture
  • 41. 4. Number of computers, and type
5. How are computers being used?
6. What is the general classroom setup? (Are there walls, windows, mirrors, carpet, rugs, cabinets, curtains, etc.?)
7. Other technology (overhead projector, PowerPoint, VCR, etc.)
8. Are books and other materials accessible for students?
9. Is there adequate space for whole-class instruction?
12. What type of lighting is used?
  • 42. 13. Are there animals or fish in the room?
14. Is there background music playing?
15. Rate the classroom condition: Poor / Average / Excellent
16. Are rules/discipline procedures posted? If so, where?
17. Is the classroom noisy or quiet? Very quiet / Very noisy
Choosing or designing measurement instruments Consider using a resource panel, advisory panel, or focus group to offer feedback
  • 43. Evaluation Expert Session July 16, 2002 Page 20 regarding your instrument. This group may be composed of any of the people listed below. You may also wish to consult with one or more of these individuals throughout the development of your overall methodology. Who should be involved in the design of your instrument(s) and/or provide feedback? Program service delivery staff / volunteers Project director Recipients of the program Board of directors Community leader Collaborating organizations Experts on the program or service being evaluated Evaluation experts _________________________ _________________________ _________________________ Conduct a pilot study and administer the instrument to a group of recipients, and then obtain feedback regarding their experience. This is a critical
  • 44. component of the development of your instruments, as it will help ensure clarity of questions and reduce the degree of discomfort or burden that questions or processes (e.g., intakes or computerized data entry) elicit. How can you ensure that you pilot your methods? When will you do it, and whom will you use as participants in the study? Ensure that written materials are at an appropriate reading level for the population. Ensure that verbal information is at an appropriate terminology level for the population. A third- or sixth-grade reading level is often used. Remember that you are probably collecting data that is program-specific. This may increase the difficulty of finding previously constructed instruments, such as questionnaires. However, instruments used for conducting process evaluations of other programs may provide you with ideas for how to structure your own instruments.
  • 45. Evaluation Expert Session July 16, 2002 Page 21 Linking program components and methods (an example) Now that you have identified your program components, broad questions, specific questions, and possible measures, it is time to link them together. Let's start with your program components. Here is an example of 3 program components of an intervention. Program Components and Essential Elements: There are six program components to M2M. There are essential elements in each component that must be present for the program to achieve its intended results and outcomes, and for the program to be identified as a program of the American Cancer Society. Possible Process Measures 1) Man to Man Self-Help and/or Support Groups The essential elements within this component are: • Offer information and support to all men with prostate cancer at all points along the
  • 46. cancer care continuum • Directly, or through collaboration and referral, offer community access to prostate cancer self-help and/or support groups • Provide recruitment and on-going training and monitoring for M2M leaders and volunteers • Monitor, track and report program activities • Descriptions of attempts to schedule and advertise group meetings • Documented efforts to establish the program • Documented local needs assessments • # of meetings held per independent group • Documented meetings held • # of people who attended different topics and speakers • Perceptions of need of survey participants for additional groups and current satisfaction levels • # of new and # of continuing group members • Documented sign-up sheets for group meetings • Documented attempts to contact program dropouts • # of referrals to other PC groups documented • # of times corresponding with other PC groups • # of training sessions for new leaders • # of continuing education sessions for experienced leaders
  • 47. • # and types of other on-going support activities for volunteer leaders • # of volunteers trained as group facilitators • Perceptions of trained volunteers for readiness to function as group facilitators Evaluation Expert Session July 16, 2002 Page 22 2) One-to-One Contacts The essential elements within this component are: • Offer one-to-one contact to provide information and support to all men with prostate cancer, including those in the diagnostic process • Provide recruitment and on-going training and monitoring for M2M leaders and volunteers
  • 48. • Monitor, track and report program activities • # of contact pairings • Frequency and duration of contact pairings • Types of information shared during contact pairings • # of volunteers trained • Perception of readiness by trained volunteers • Documented attempts for recruiting volunteers • Documented on-going training activities for volunteers • Documented support activities 3) Community Education and Awareness The essential elements within this component are: • Conduct public awareness activities to inform the public about prostate cancer and M2M • Monitor, track and report program activities
  • 49. • # of screenings provided by various health care providers/agencies over assessment period • Documented ACS staff and volunteer efforts to publicize the availability and importance of PC and screenings, including health fairs, public service announcements, billboard advertising, etc. • # of addresses to which newsletters are mailed • Documented efforts to increase newsletter mailing list Page 23 Linking YOUR program components, questions, and methods Consider each of your program components and questions that you have devised in an earlier section of this workbook, and the methods that you checked off on the "types of methods" table. Now ask yourself, how will I use the information I have obtained from this question? And, what method is most appropriate for obtaining this information? Program Component
  • 50. Specific questions that go with this component How will I use this information? Best method? Page 24
  • 51. Program Component Specific questions that go with this component How will I use this information? Best method? Evaluation Expert Session
  • 52. July 16, 2002 Page 25 Data Collection Plan Now let's put your data collection activities on one sheet - what you're collecting, how you're doing it, when, your sample, and who will collect it. Laying out the methods you have just chosen, the instruments, and the data collection techniques in a structured manner will facilitate this process. Method | Type of data (questions, briefly indicated) | Instrument used | When implemented | Sample | Who collects. E.g.: Patient interviews in health
  • 53. dept clinics | Qualitative: what services they are using, length of visit, why they came in, how long they waited; plus some quantitative satisfaction ratings | Interview created by the evaluation team and piloted with patients | Oct-Dec; days and hours randomly selected | 10 interviews in each clinic | Trained interviewers
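The "days and hours randomly selected" cell in the example above implies a small sampling step. Below is a minimal, hypothetical Python sketch of how such an interview schedule might be drawn; the clinic names and time slots are invented placeholders, and a real sampling frame would come from each clinic's actual operating hours.

```python
import random

# Hypothetical sampling frame; replace with each clinic's real schedule.
clinics = ["Clinic A", "Clinic B", "Clinic C"]
weekdays = ["Mon", "Tue", "Wed", "Thu", "Fri"]
hours = ["9am", "11am", "1pm", "3pm"]

random.seed(42)  # fixed seed so the drawn schedule can be reproduced and audited

slots = [(day, hour) for day in weekdays for hour in hours]  # 20 possible slots
schedule = {clinic: random.sample(slots, k=10) for clinic in clinics}

for clinic, picks in schedule.items():
    print(clinic, sorted(picks))
```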
  • 54. Page 26 Evaluation Expert Session July 16, 2002
  • 55. Consider a Management Information System Process data is frequently collected through a management information system (MIS) that is designed to record characteristics of participants, participation of participants, and characteristics of activities and services provided. An MIS is a computerized record system that enables service providers and evaluators to accumulate and display data quickly and efficiently in various ways. Will your evaluation be enhanced by periodic data presentations in tables or other structured formats? For example, should the evaluation use a monthly printout of services utilized, or use the MIS to monitor recipient tracking (such as date, time, and length of service)? YES NO Does the agency create monthly (or other periodic) printouts reflecting services rendered or clients served?
  • 56. YES NO Will the evaluation be conducted in a more efficient manner if program delivery staff enter data on a consistent basis? YES NO Does the agency already have hard copies of files or records that would be better utilized if computerized? YES NO Does the agency already have an MIS or a similar computerized database? YES
  • 57. NO If the answers to any of these questions are YES, consider using an MIS for your evaluation. If an MIS does not already exist, you may desire to design a database in which you can enter information from records obtained by the agency. This process decreases missing data and is generally efficient. If you do create a database that can be used on an ongoing basis by the agency, you may consider offering it to them for future use. Page 27 Evaluation Expert Session July 16, 2002
  • 58. Examples include: Client demographics Client contacts Client services Referrals offered Client outcomes Program activities Staff notes Jot down the important data you would like to be included in your MIS. Managing your MIS What software do you wish to utilize to manage your data? What type of data do you have? How much information will you need to enter? How will you ultimately analyze the data? You may wish to create a database directly in the program you will eventually use, such as SPSS? Will you be utilizing lap tops?
  • 59. Page 28 Evaluation Expert Session July 16, 2002 If so, will you be taking them onsite and directly entering your data into them? How will you download or transfer the information, if applicable? What will the impact be on your audience if you have a laptop? Tips on using an MIS If service delivery personnel will be collecting and/or entering information into the MIS for the evaluator's use, it is generally a good idea to provide frequent reminders of the
  • 60. importance of entering the appropriate information in a timely, consistent, and regular manner. For example, if an MIS depends upon patient data collected during public health officers' daily activities, the officers should be entering data on at least a daily basis. Otherwise, important data is lost and the database will only reflect what was salient enough to be remembered and entered at the end of the week. Don't forget that this may be burdensome and/or inconvenient for the program staff. Provide them with frequent thank-yous. Remember that your database is only as good as you make it. It must be organized and arranged so that it is most helpful in answering your questions. If you are collecting from existing records, at what level is the data currently available? For example, is it state, county, or city information? How is it defined? Consider whether adaptations need to be made or additions need to be included for your evaluation. Back up your data frequently and in at least one additional format (e.g., zip, disk, server).
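To illustrate the backup advice above, the hypothetical Python sketch below copies a database file to a dated backup. The file and directory names are invented placeholders; in practice, the additional copy should live on separate media (e.g., a server or an external drive) rather than the same disk.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup(db_path="evaluation_mis.db", backup_dir="backups"):
    """Copy the evaluation database to a timestamped file in backup_dir."""
    Path(backup_dir).mkdir(exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M")
    dest = Path(backup_dir) / f"{Path(db_path).stem}-{stamp}.db"
    shutil.copy2(db_path, dest)  # copy2 preserves file timestamps
    return dest
```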
  • 61. Consider file security. Will you be saving data on a network server? You may need to consider password protection. Page 29 Evaluation Expert Session July 16, 2002 Allocate time for data entry and checking. Allow additional time to contemplate the meaning of the data before writing the report. Page 30
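Pulling together the MIS ideas above (client demographics, client services, and periodic printouts of services rendered), here is a minimal, hypothetical sketch using Python's built-in sqlite3 module. The table and column names are illustrative assumptions, not a schema prescribed by this workbook; extend them with the fields you listed earlier.

```python
import sqlite3

conn = sqlite3.connect("evaluation_mis.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS clients (
    client_id       INTEGER PRIMARY KEY,
    enrolled_on     TEXT,    -- ISO date, e.g. 2002-07-16
    age             INTEGER,
    gender          TEXT,
    referral_source TEXT
);
CREATE TABLE IF NOT EXISTS services (
    service_id   INTEGER PRIMARY KEY,
    client_id    INTEGER REFERENCES clients(client_id),
    service_date TEXT,       -- ISO date
    service_type TEXT,
    duration_min INTEGER,
    staff_id     TEXT
);
""")

# A monthly "printout" of services rendered and clients served,
# in the spirit of the periodic reports discussed above.
rows = conn.execute("""
    SELECT substr(service_date, 1, 7) AS month,
           service_type,
           COUNT(*)                  AS contacts,
           COUNT(DISTINCT client_id) AS clients_served
    FROM services
    GROUP BY month, service_type
    ORDER BY month
""").fetchall()
for row in rows:
    print(row)
```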
  • 62. Evaluation Expert Session July 16, 2002 Implement Data Collection and Analysis Data collection cannot be fully reviewed in this workbook, but this page offers a few tips regarding the process. General reminders: THANK everyone who helps you, directs you, or participates in any way. Obtain clear directions and give yourself plenty of time, especially if you are traveling a long distance (e.g., several hours away). Bring all of your own materials - do not expect the program to provide you with writing utensils, paper, a clipboard, etc. Address each person that you meet with respect and attempt to make your meeting as conducive to their schedule as possible. Most process evaluation will be in the form of routine record keeping (e.g., an MIS). However, you may wish to interview clients and staff. If so:
  • 63. Ensure that you have sufficient time to train evaluation staff, data collectors, and/or organization staff who will be collecting data. After they have been trained in the data collection materials and procedure, require that they practice the technique, whether it is an interview or entering a sample record in an MIS. If planning to use a tape recorder during interviews or focus groups, request permission from participants before beginning. You may need to turn the tape recorder off on occasion if it will facilitate increased comfort by participants. If planning to use laptop computers, attempt to make consistent eye contact and spend time establishing rapport before beginning. Some participants may be uncomfortable with technology and you may need to provide education regarding the process of data collection and how the information will be utilized. If planning to hand write responses, warn the participant that you may move slowly and
  • 64. Page 31 Evaluation Expert Session July 16, 2002 may need to ask them to repeat themselves. However, prepare for this process by developing shorthand specific to the evaluation. A sample shorthand page follows. Page 32 Evaluation Expert Session July 16, 2002 Annual Evaluation Reports The ultimate aim of all the Branch’s evaluation efforts is to increase the intelligent use of information in Branch decision-making in order to improve health outcomes. Because we understand that many evaluation efforts fail because the data are never collected and that even more fail because the data are collected but never used in
Because we understand that many evaluation efforts fail because the data are never collected, and that even more fail because the data are collected but never used in decision-making, we have struggled to find a way to institutionalize the use of evaluation results in Branch decision-making.

These reports will serve multiple purposes:

The need to complete the report will increase the likelihood that evaluation is done and data are collected.

The need to review reports from lower levels in order to complete one's own report will, we hope, cause managers at all levels to consciously consider, at least once a year, the effectiveness of their activities and how evaluation results suggest that effectiveness can be improved.

The summaries of evaluation findings in the reports should simplify preparation of other reports to funders, including the General Assembly.

Each evaluation report forms the basis of the evaluation report at the next level.

The contents and length of the report should be determined by what is most helpful to the manager who is receiving it. Rather than simply reporting every possible piece of data, these reports should present summary data, summarize important conclusions, and suggest recommendations based on the evaluation findings. (A sketch of how raw records can be rolled up into such summary data follows this section.) A program-level annual evaluation report should be ten pages or less. Many may be less than five pages. Population team and Branch-level annual evaluation reports may be longer than ten pages, depending on how many findings are being reported. However, reports that go beyond ten pages should also contain a shorter Executive Summary, to ensure that those with the power to make decisions actually read the findings.

The initial reports, especially, may reflect formative work and consist primarily of updates on the progress of evaluation planning and implementation. This is fine and to be expected. However, within a year or two the reports should begin to include process data, and later actual outcome findings.

This information was extracted from the FHB Evaluation Framework developed by Monica Herk and Rebekah Hudgins.
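To illustrate what "summary data" can mean in practice, here is a minimal sketch that reuses the hypothetical service_contacts table from the earlier MIS sketch and rolls raw records up into contacts by month and service type; it is one possible approach, not a prescribed procedure.

    import sqlite3

    # Roll raw MIS records up into summary data for an annual report:
    # contacts by month and service type, rather than every single record.
    conn = sqlite3.connect("mis.db")
    rows = conn.execute("""
        SELECT substr(service_date, 1, 7) AS month,   -- 'YYYY-MM'
               service_type,
               COUNT(*) AS contacts
        FROM service_contacts
        GROUP BY month, service_type
        ORDER BY month, service_type
    """).fetchall()
    conn.close()

    print("Month    Service type     Contacts")
    for month, service_type, contacts in rows:
        print(f"{month}  {service_type:<15}{contacts:>8}")

A table like this, with a few sentences of interpretation and recommendations, is far more useful to a manager than the raw records behind it.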
Suggested shorthand - a sample

The list below was derived for a process evaluation regarding charter schools. Note the use of general shorthand as well as shorthand derived specifically for the evaluation. (Four entries used hand-drawn pictographs in the original; the house, church, female, and male symbols shown below are close stand-ins.)

Evaluation-specific shorthand:
CS = Charter School
Sch = School
Tch = Teacher, teach
P = Principal
VP = Vice Principal
Admin = Administration, administrators
DOE = Dept of Education
BOE = Board of Education
Comm = Community
Stud = Students, pupils
Kids = Students, children, teenagers
K = Kindergarten
Cl = Class
CR = Classroom
W = White
B = Black
AA = African American
SES = Socio-economic status
Lib = Library, librarian
Caf = Cafeteria
Ch = Charter
Conv = Conversion (school)
S-up = Start up school
App = Application, applied
ITBS = Iowa Test of Basic Skills
LA = Language arts
SS = Social Studies
QCC = Quality Core Curriculum
Pol = Policy, politics
Curr = Curriculum
LP = Lesson plans
Disc = Discipline
F = Father, dad
P = Parent
M = Mom, mother

General shorthand:
mst = Most
b/c = Because
st = Something
b = Be
c = See
r = Are
w/ = When
@ = At
~ = About
= = Is, equals, equivalent
≠ = Does not equal, is not the same
Sone = Someone
# = Number
$ = Money, finances, financial, funding, expenses, etc.
+ = Add, added, in addition
< = Less than
> = Greater/more than
??? = What does this mean? Get more info on, I'm confused…
DWA = Don't worry about (e.g., if you wrote something unnecessary)
Ψ = Psychology, psychologist
∴ = Therefore
∆ = Change, is changing
mm = Movement
↑ = Increases, up, promotes
↓ = Decreases, down, inhibits
x = Times (e.g., many x we laugh)
÷ = Divided (we ÷ up the classrooms)
C = With
⌂ = Home, house
♥ = Love, adore (e.g., the kids ♥ this)
✝ = Church, religious activity
O = No, doesn't, not
♀ = Girls, women, female
1/2 = Half (e.g., we took 1/2)
♂ = Boys, men, male
2 = To
2B = To be
c/out = Without
e.g. = For example
i.e. = That is
… = If the person trails off, you missed information
Appendix A
Logic Model Worksheet

Population Team/Program Name __________________________ Date _______________________

If the following CONDITIONS AND ASSUMPTIONS exist...
And if the following ACTIVITIES are implemented to address these conditions and assumptions...
Then these SHORT-TERM OUTCOMES may be achieved...
And these LONG-TERM OUTCOMES may be achieved...
And these LONG-TERM GOALS can be reached....
Appendix B
Pitfalls To Avoid

Avoid heightening expectations of delivery staff, program recipients, policy makers, or community members. Ensure that feedback will be provided as appropriate, but explain that it may or may not be utilized.

Avoid any implication that you are evaluating impact or outcomes. Stress that you are evaluating "what is happening," not how well any one person is performing or what the outcomes of the intervention are.

Make sure that the right information gets to the right people - it is most likely to be utilized in a constructive and effective manner if you ensure that your final report does not end up on the desk of someone who has little motivation or interest in utilizing your findings.

Ensure that data collection and entry are managed on a consistent basis - avoid developing an evaluation design and then having the contract lapse because staff did not enter the data. (A simple automated check for lapses in data entry is sketched below.)
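As one way to catch such lapses early, here is a minimal sketch, again assuming the hypothetical service_contacts table from the earlier MIS sketch, that flags recent weekdays with no entries; adapt the idea to whatever system you actually use.

    import sqlite3
    from datetime import date, timedelta

    # Flag recent weekdays with no MIS entries, so lapses in data entry are
    # caught while they can still be corrected rather than at report time.
    conn = sqlite3.connect("mis.db")
    entered = {row[0] for row in
               conn.execute("SELECT DISTINCT service_date FROM service_contacts")}
    conn.close()

    today = date.today()
    for offset in range(1, 8):   # look back one week
        day = today - timedelta(days=offset)
        if day.weekday() < 5 and day.isoformat() not in entered:
            print("No entries recorded for " + day.isoformat()
                  + " - follow up with staff")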
Appendix C
References

References used for completion of this workbook and/or that you may find helpful for additional information.

Centers for Disease Control and Prevention. 1995. Evaluating Community Efforts to Prevent Cardiovascular Diseases. Atlanta, GA.

Centers for Disease Control and Prevention. 2001. Introduction to Program Evaluation for Comprehensive Tobacco Control Programs. Atlanta, GA.

Freeman, H. E., Rossi, P. H., & Sandefur, G. D. 1993. Workbook for Evaluation: A Systematic Approach. Sage Publications: Newbury Park, CA.

Georgia Policy Council for Children and Families; The Family Connection; Metis Associates, Inc. 1997. Pathways for Assessing Change: Strategies for Community Partners.

Grembowski, D. 2001. The Practice of Health Program Evaluation. Sage Publications: Thousand Oaks, CA.

Hawkins, J. D., & Nederhood, B. 1987. Handbook for Evaluating Drug and Alcohol Prevention Programs. U.S. Department of Health and Human Services; Public Health Service; Alcohol, Drug Abuse, and Mental Health Administration: Washington, D.C.

Muraskin, L. D. 1993. Understanding Evaluation: The Way to Better Prevention Programs. Westat, Inc.

National Community AIDS Partnership. 1993. Evaluating HIV/AIDS Prevention Programs in Community-Based Organizations. Washington, D.C.

NIMH. Overview of Needs Assessment. Chapter 3: Selecting the needs assessment approach.

Patton, M. Q. 1982. Practical Evaluation. Sage Publications: Beverly Hills, CA.

Posavac, E. J., & Carey, R. G. 1980. Program Evaluation: Methods and Case Studies. Prentice-Hall: Englewood Cliffs, NJ.

Rossi, P. H., Freeman, H. E., & Lipsey, M. W. 1999. Evaluation: A Systematic Approach (6th edition). Sage Publications: Thousand Oaks, CA.

Scheirer, M. A. 1994. Designing and using process evaluation. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (eds.), Handbook of Practical Program Evaluation. Jossey-Bass: San Francisco, CA.

Taylor-Powell, E., Rossing, B., & Geran, J. 1998. Evaluating Collaboratives: Reaching the Potential. Program Development and Evaluation: Madison, WI.

U.S. Department of Health and Human Services; Administration for Children and Families; Office of Community Services. 1994. Evaluation Guidebook: Demonstration Partnership Program Projects.

W. K. Kellogg Foundation. 1998. W. K. Kellogg Foundation Evaluation Handbook.

Websites:
www.cdc.gov/eval/resources
www.eval.org (has online textbooks)
www.wmich.edu/evalctr (has online checklists)
www.preventiondss.org

When conducting literature reviews or searching for additional information, consider using alternative names for "process evaluation," including:
formative evaluation
program fidelity
implementation assessment
implementation evaluation
program monitoring