Guide to Helping With Paper
· A description of the key program elements:
https://obamawhitehouse.archives.gov/blog/2011/11/30/prisoner-reentry-programs-ensuring-safe-and-successful-return-community
Drake, E. B., & Lafrance, S. (2007). Findings on Best Practices of Community Re-Entry Programs ... Retrieved from http://www.eisenhowerfoundation.org/docs/Ex-Offender Best Practices.pdf
Mosteller, J. (2019). Why Reentry Programs are Important. Retrieved from https://www.charleskochinstitute.org/issue-areas/criminal-justice-policing-reform/reentry-programs/
· A description of the strategies that the program uses to
produce change
Caprizzo, C. (2011, November 30). Prisoner Reentry Programs: Ensuring a Safe and Successful Return to the Community. Retrieved from https://obamawhitehouse.archives.gov/blog/2011/11/30/prisoner-reentry-programs-ensuring-safe-and-successful-return-community
Integrated Reentry and Employment. (2013). Retrieved from https://www.bja.gov/Publications/CSG-Reentry-and-Employment.pdf
· A description of the needs of the target population
· An explanation of why a process evaluation is important for
the program
See the attachment (Workbook for Designing a Process Evaluation) to answer this question; also see the link below.
Berghuis, M. (2018, October). Reentry Programs for Adult Male Offender Recidivism and Reintegration: A Systematic Review and Meta-Analysis. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6139987/
· A plan for building relationships with the staff and
management
STRONG PROFESSIONAL RELATIONSHIPS - Leading Teams. (2015). Retrieved from http://www.leadingteams.net.au/wp-content/uploads/2015/10/Whitepaper-Strong-Professional-Relationships-Drive-High-Performance.pdf
The attachment (Workbook for Designing a Process Evaluation) can help you answer this question.
· Broad questions to be answered by the process evaluation
Rossman, S., Willison, J., Lindquist, C., Walters, J., & Lattimore, P. (2016, December). The author(s) shown below used Federal funding provided by ... Retrieved from https://www.ncjrs.gov/pdffiles1/nij/grants/250469.pdf
The attachment (Workbook for Designing a Process Evaluation) can help you answer this question.
· Specific questions to be answered by the process evaluation
· A plan for gathering and analyzing the information
https://www.ncjrs.gov/pdffiles1/nij/grants/213675.pdf
Make sure all bullets are answered:
· A description of the key program elements
· A description of the strategies that the program uses to
produce change
· A description of the needs of the target population
· An explanation of why a process evaluation is important for
the program
· A plan for building relationships with the staff and
management
· Broad questions to be answered by the process evaluation
· Specific questions to be answered by the process evaluation
· A plan for gathering and analyzing the information
Workbook for Designing a Process Evaluation
Produced for the Georgia Department of Human Resources
Division of Public Health
By
Melanie J. Bliss, M.A.
James G. Emshoff, Ph.D.
Department of Psychology
Georgia State University
July 2002
Evaluation Expert Session
July 16, 2002 Page 1
What is process evaluation?
Process evaluation uses empirical data to assess the delivery of
programs. In contrast to outcome evaluation, which assesses the
impact of the program, process evaluation verifies what the
program is and whether it is being implemented as designed.
Thus, process evaluation asks "what," and outcome evaluation
asks "so what?"
When conducting a process evaluation, keep in mind these three
questions:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
This workbook will serve as a guide for designing your own
process
evaluation for a program of your choosing. There are many
steps involved
in the implementation of a process evaluation, and this
workbook will
attempt to direct you through some of the main stages. It will be
helpful to
think of a delivery service program that you can use as your
example as
you complete these activities.
Why is process evaluation important?
1. To determine the extent to which the program is being
implemented according to plan
2. To assess and document the degree of fidelity and variability
in
program implementation, expected or unexpected, planned or
unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the
intervention
and the outcomes
5. To provide information on what components of the
intervention
are responsible for outcomes
6. To understand the relationship between program context
(i.e.,
setting characteristics) and program processes (i.e., levels of
implementation).
7. To provide managers feedback on the quality of
implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public,
clients,
and funders
10. To improve the quality of the program, as the act of
evaluating is
an intervention.
Stages of Process Evaluation
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**
Also included in this workbook:
a. Logic Model Template
b. Pitfalls to avoid
c. References
Evaluation can be an exciting, challenging, and fun experience.
Enjoy!
* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to
the Evaluation Framework
and Evaluation Module of FHB Best Practice Manual for more
details.
Forming collaborative relationships
A strong, collaborative relationship with program delivery staff
and management will
likely result in the following:
Feedback regarding evaluation design and implementation
Ease in conducting the evaluation due to increased cooperation
Participation in interviews, panel discussions, meetings, etc.
Increased utilization of findings
Seek to establish a mutually respectful relationship
characterized by trust, commitment,
and flexibility.
Key points in establishing a collaborative
relationship:
Start early. Introduce yourself and the evaluation team to as
many delivery staff and
management personnel as early as possible.
Emphasize that THEY are the experts, and you will be utilizing
their knowledge and
information to inform your evaluation development and
implementation.
Be respectful of their time both in-person and on the
telephone. Set up meeting places
that are geographically accessible to all parties involved in the
evaluation process.
Remain aware that, even if they have requested the evaluation,
it may often appear as
an intrusion upon their daily activities. Attempt to be as
unobtrusive as possible and
request their feedback regarding appropriate times for on-site
data collection.
Involve key policy makers, managers, and staff in a series of
meetings throughout the
evaluation process. The evaluation should be driven by the
questions that are of
greatest interest to the stakeholders. Set agendas for meetings
and provide an
overview of the goals of the meeting before beginning. Obtain
their feedback and
provide them with updates regarding the evaluation process.
You may wish to obtain structured feedback. Sample feedback forms appear
throughout the workbook.
Provide feedback regarding evaluation findings to the key
policy makers, managers,
and staff when and as appropriate. Use visual aids and
handouts. Tabulate and
summarize information. Make it as interesting as possible.
Consider establishing a resource or expert "panel" or advisory
board that is an official
group of people willing to be contacted when you need feedback
or have questions.
Determining Program Components
Program components are identified by answering the questions
who, what, when, where,
and how as they pertain to your program.
Who: the program clients/recipients and staff
What: activities, behaviors, materials
When: frequency and length of the contact or intervention
Where: the community context and physical setting
How: strategies for operating the program or intervention
BRIEF EXAMPLE:
Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students’ classroom
How: group administered intervention, small group practice
1. Instruct students what to do in case of fire (stop, drop and
roll).
2. Educate students on calling 911 and have them practice on
play telephones.
3. Educate students on how to pull a fire alarm, how to test a
home fire alarm and how to
change batteries in a home fire alarm. Have students practice
each of these activities.
4. Provide students with written information and have them take
it home to share with their
parents. Request parental signature to indicate compliance and
target a 75% return rate.
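The 75% return-rate target in the example above is easy to monitor once signed forms are counted. The sketch below is illustrative only; the function names and threshold parameter are assumptions, not part of the workbook.

```python
def return_rate(forms_returned, forms_sent):
    """Fraction of take-home forms that came back signed."""
    if forms_sent == 0:
        return 0.0
    return forms_returned / forms_sent

def meets_target(forms_returned, forms_sent, target=0.75):
    """Check the observed return rate against the 75% target."""
    return return_rate(forms_returned, forms_sent) >= target
```

For example, 80 signed forms out of 100 sent home meets the target, while 70 out of 100 does not.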
Points to keep in mind when determining program
components
Specify activities as behaviors that can be observed
If you have a logic model, use the "activities" column as a
starting point
Ensure that each component is separate and distinguishable
from others
Include all activities and materials intended for use in the
intervention
Identify the aspects of the intervention that may need to be
adapted, and those that should
always be delivered as designed.
Consult with program staff, mission statements, and program
materials as needed.
Your Program Components
After you have identified your program components, create a
logic model that graphically
portrays the link between program components and outcomes
expected from these
components.
Now, write out a succinct list of the components of your
program.
WHO:
WHAT:
WHEN:
WHERE:
HOW:
What is a Logic Model?
A logical series of statements that link the problems your
program is attempting to
address (conditions), how it will address them (activities), and
the expected
results (immediate and intermediate outcomes, long-term goals).
Benefits of the logic model include:
helps develop clarity about a project or program,
helps to develop consensus among people,
helps to identify gaps or redundancies in a plan,
helps to identify the core hypothesis,
helps to succinctly communicate what your project or program
is about.
When do you use a logic model?
Use...
- During any work to clarify what is being done, why, and with
what intended results
- During project or program planning to make sure that the
project or program is logical and
complete
- During evaluation planning to focus the evaluation
- During project or program implementation as a template for
comparing to the actual program
and as a filter to determine whether proposed changes fit or
not.
This information was extracted from the Logic Models: A
Multi-Purpose Tool materials developed by Wellsys
Corporation for the Evaluation Planning Workshop Training.
Please see the Evaluation Planning Workshop
materials for more information. Appendix A has a sample
template of the tabular format.
Determining Evaluation Questions
As you design your process evaluation, consider what questions
you would like to answer. It is only after
your questions are specified that you can begin to develop your
methodology. Considering the importance
and purpose of each question is critical.
BROADLY....
What questions do you hope to answer? You may wish to turn
the program components that you have just identified
into questions assessing:
Was the component completed as indicated?
What were the strengths in implementation?
What were the barriers or challenges in implementation?
What were the apparent strengths and weaknesses of each step
of the intervention?
Did the recipient understand the intervention?
Were resources available to sustain project activities?
What were staff perceptions?
What were community perceptions?
What was the nature of the interaction between staff and
clients?
These are examples. Check off what is applicable to you, and
use the space below to write additional broad,
overarching questions that you wish to answer.
SPECIFICALLY ...
Now, make a list of all the specific questions you wish to
answer, and organize your questions categorically. Your
list of questions will likely be much longer than your list of
program components. This step of developing your
evaluation will inform your methodologies and instrument
choice.
Remember that you must collect information on what the
program is intended to be and what it is in reality, so you
may need to ask some questions in 2 formats.
For example:
"How many people are intended to complete this intervention per week?"
"How many actually go through the intervention during an average week?"
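When such intended-versus-actual question pairs are tracked as numbers, the gap between design and delivery can be computed directly. The measure names below are hypothetical, a minimal sketch rather than a prescribed format:

```python
# Hypothetical paired records: what the program design calls for
# versus what is actually delivered in an average week.
intended = {"participants_per_week": 25, "sessions_per_week": 3}
actual = {"participants_per_week": 18, "sessions_per_week": 3}

def delivery_gaps(intended, actual):
    """Shortfall between design and delivery for each measure."""
    return {key: intended[key] - actual.get(key, 0) for key in intended}
```

A gap of zero means that measure is delivered as designed; a positive gap flags where delivery falls short of the plan.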
Consider what specific questions you have. The questions below
are only examples! Some may not be appropriate
for your evaluation, and you will most likely need to add
additional questions. Check off the questions that are
applicable to you, and add your own questions in the space
provided.
WHO (regarding client):
Who is the target audience, client, or recipient?
How many people have participated?
How many people have dropped out?
How many people have declined participation?
What are the demographic characteristics of clients?
Race
Ethnicity
National Origin
Age
Gender
Sexual Orientation
Religion
Marital Status
Employment
Income Sources
Education
Socio-Economic Status
What factors do the clients have in common?
What risk factors do clients have?
Who is eligible for participation?
How are people referred to the program? How are they
screened?
How satisfied are the clients?
YOUR QUESTIONS:
WHO (Regarding staff):
Who delivers the services?
How are they hired?
How supportive are staff and management of each other?
What qualifications do staff have?
How are staff trained?
How congruent are staff and recipients with one another?
What are staff demographics? (see client demographic list for
specifics.)
YOUR QUESTIONS:
WHAT:
What happens during the intervention?
What is being delivered?
What are the methods of delivery for each service (e.g., one-
on-one, group session, didactic instruction,
etc.)
What are the standard operating procedures?
What technologies are in use?
What types of communication techniques are implemented?
What type of organization delivers the program?
How many years has the organization existed? How many
years has the program been operating?
What type of reputation does the agency have in the
community? What about the program?
What are the methods of service delivery?
How is the intervention structured?
How is confidentiality maintained?
YOUR QUESTIONS:
WHEN:
When is the intervention conducted?
How frequently is the intervention conducted?
At what intervals?
At what time of day, week, month, year?
What is the length and/or duration of each service?
YOUR QUESTIONS:
WHERE:
Where does the intervention occur?
What type of facility is used?
What is the age and condition of the facility?
In what part of town is the facility? Is it accessible to the
target audience? Does public transportation access
the facility? Is parking available?
Is child care provided on site?
YOUR QUESTIONS:
WHY:
Why are these activities or strategies implemented and why
not others?
Why has the intervention varied in ability to maintain interest?
Why are clients not participating?
Why is the intervention conducted at a certain time or at a
certain frequency?
YOUR QUESTIONS:
Validating Your Evaluation Questions
Even though all of your questions may be interesting, it is
important to narrow your list to questions that
will be particularly helpful to the evaluation and that can be
answered given your specific resources, staff,
and time.
Go through each of your questions and consider it with respect
to the questions below, which may be helpful in
streamlining your final list of questions.
Revise your worksheet/list of questions until you can answer
"yes" to all of these questions. If you cannot answer
"yes" to your question, consider omitting the question from your
evaluation.
Validation (answer Yes or No for each)
Will I use the data that will stem from these questions?
Do I know why each question is important and /or valuable?
Is someone interested in each of these questions?
Have I ensured that no questions are omitted that may be
important to
someone else?
Is the wording of each question sufficiently clear and
unambiguous?
Do I have a hypothesis about what the “correct” answer will be
for each
question?
Is each question specific without inappropriately limiting the
scope of the
evaluation or probing for a specific response?
Do they constitute a sufficient set of questions to achieve the
purpose(s) of
the evaluation?
Is it feasible to answer the question, given what I know about
the
resources for evaluation?
Is each question worth the expense of answering it?
Derived from "A Design Manual" Checklist, page 51.
Determining Methodology
Process evaluation is characterized by collection of data
primarily through two formats:
1) Quantitative, archival, recorded data that may be managed
by a computerized
tracking or management system, and
2) Qualitative data that may be obtained through a variety of
formats, such as
surveys or focus groups.
When considering what methods to use, it is critical to have a
thorough
understanding and knowledge of the questions you want
answered. Your
questions will inform your choice of methods. After this section
on types of
methodologies, you will complete an exercise in which you
consider what method
of data collection is most appropriate for each question.
Do you have a thorough understanding of your
questions?
Furthermore, it is essential to consider what data the
organization you are
evaluating already has. Data may exist in the form of an
existing computerized
management information system, records, or a tracking system
of some other
sort. Using this data may provide the best reflection of what is
"going on," and it
will also save you time, money, and energy because you will not
have to devise
your own data collection method! However, keep in mind that
you may have to
adapt this data to meet your own needs - you may need to add or
replace fields,
records, or variables.
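As a rough sketch of such an adaptation, the helper below adds a missing field to existing records without disturbing the data already collected. The record and field names are hypothetical, chosen only for illustration:

```python
def add_field(rows, field, default=""):
    """Return copies of the records with the new field filled in
    where it is missing; existing values are left untouched."""
    return [{**row, field: row.get(field, default)} for row in rows]

# Example: an agency's attendance records lack a referral-source field
# that the evaluation questions require.
records = [{"client_id": "001", "visits": 4},
           {"client_id": "002", "visits": 2}]
adapted = add_field(records, "referral_source", default="unknown")
```

Working on copies keeps the agency's original data intact, which matters when staff continue to use the same records for their own tracking.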
What data does your organization already have?
Will you need to adapt it?
If the organization does not already have existing data, consider
devising a
method for the organizational staff to collect their own data.
This process will
ultimately be helpful for them so that they can continue to self-
evaluate, track
their activities, and assess progress and change. It will be
helpful for the
evaluation process because, again, it will save you time, money,
and energy that
you can better devote towards other aspects of the evaluation.
Management
information systems will be described more fully in a later
section of this
workbook.
Do you have the capacity and resources to devise
such a system? (You may need to refer to a later
section of this workbook before answering.)
Who should collect the data?
Given all of this, what thoughts do you have on who should
collect data for your
evaluation? Program staff, evaluation staff, or some
combination?
Program Staff: May collect data from activities such as
attendance, demographics,
participation, characteristics of participants, dispositions, etc;
may
conduct intake interviews, note changes regarding service
delivery,
and monitor program implementation.
Advantages: Cost-efficient, accessible, resourceful, available,
time-efficient,
and increased understanding of the program.
Disadvantages: May exhibit bias and/or social desirability, may
use data for critical
judgment, may compromise the validity of the program; may put
staff in uncomfortable or inappropriate position; also, if staff
collect
data, may have an increased burden and responsibility placed
upon
them outside of their usual or typical job responsibilities. If you
utilize staff for data collection, provide frequent reminders as
well
as messages of gratitude.
Evaluation staff: May collect qualitative information regarding
implementation,
general characteristics of program participants, and other
information that may otherwise be subject to bias or distortion.
Advantages: Data collected in manner consistent with overall
goals and timeline
of evaluation; prevents bias and inappropriate use of
information;
promotes overall fidelity and validity of data.
Disadvantages: May be costly and take extensive time; may
require additional
training on part of evaluator; presence of evaluator in
organization
may be intrusive, inconvenient, or burdensome.
When should data be collected?
Conducting the evaluation according to your timeline can be
challenging. Consider how
much time you have for data collection, and make decisions
regarding what to collect
and how much based on your timeline.
In many cases, outcome evaluation is not considered appropriate
until the program has
stabilized. However, when conducting a process evaluation, it
can be important to start
the evaluation at the beginning so that a story may be told
regarding how the program
was developed, information may be provided on refinements,
and program growth and
progress may be noted.
If you have the luxury of collecting data from the start of the
intervention to the end of
the intervention, space out data collection as appropriate. If you
are evaluating an
ongoing intervention that is fairly quick (e.g., an 8-week
educational group), you may
choose to evaluate one or more "cycles."
How much time do you have to conduct your evaluation?
How much time do you have for data collection (as opposed to
designing the evaluation,
training, organizing and analyzing results, and writing the
report?)
Is the program you are evaluating time specific?
How long does the program or intervention last?
At what stages do you think you will most likely collect data?
Soon after a program has begun
Descriptive information on program characteristics that will not
change; information
requiring baseline information
During the intervention
Ongoing process information such as recruitment, program
implementation
After the intervention
Demographics, attendance ratings, satisfaction ratings
Before you consider methods
A list of various methods follows this section. Before choosing
what methods are
most appropriate for your evaluation, review the following
questions. (Some may
already be answered in another section of this workbook.)
What questions do I want answered? (see previous section)
Does the organization already have existing data, and if so,
what kind?
Does the organization have staff to collect data?
What data can the organization staff collect?
Must I maintain anonymity (participant is not identified at all)
or confidentiality
(participant is identified but responses remain private)? This
consideration
pertains to existing archival data as well as original data
collection.
How much time do I have to conduct the evaluation?
How much money do I have in my budget?
How many evaluation staff do I have to manage the data
collection activities?
Can I (and/or members of my evaluation staff) travel on site?
What time of day is best for collecting data? For example, if
you plan to conduct
focus groups or interviews, remember that your population may
work during the
day and need evening times.
Types of methods
A number of different methods exist that can be used to collect
process
information. Consider each of the following, and check those
that you think would
be helpful in addressing the specific questions in your
evaluation. When "see
sample" is indicated, refer to the pages that follow this table.
√ Method Description
Activity,
participation, or
client tracking log
Brief record completed on site at frequent intervals by
participant or deliverer.
May use form developed by evaluator if none previously exists.
Examples: sign
in log, daily records of food consumption, medication
management.
Case Studies
Collection of in-depth information regarding small number of
intervention
recipients; use multiple methods of data collection.
Ethnographic
analysis
Obtain in-depth information regarding the experience of the
recipient by
partaking in the intervention, attending meetings, and talking
with delivery staff
and recipients.
Expert judgment
Convene a panel of experts or conduct individual interviews to
obtain their
understanding of and reaction to program delivery.
Focus groups
Small group discussion among program delivery staff or
recipients. Focus on
their thoughts and opinions regarding their experiences with the
intervention.
Meeting minutes
(see sample)
Qualitative information regarding agendas, tasks assigned, and
coordination and
implementation of the intervention as recorded on a consistent
basis.
Observation
(see sample)
Observe actual delivery in vivo or on video, record findings
using check sheet
or make qualitative observations.
Open-ended
interviews –
telephone or in
person
Evaluator asks open questions (i.e., who, what, when, where,
why, how) to
delivery staff or recipients. Use interview protocol without
preset response
options.
Questionnaire
Written survey with structured questions. May administer in
individual, group,
or mail format. May be anonymous or confidential.
Record review
Obtain indicators from intervention records such as patient files,
time sheets,
telephone logs, registration forms, student charts, sales records,
or records
specific to the service delivery.
Structured
interviews –
telephone or in
person
Interviewer asks direct questions using interview protocol with
preset response
options.
Sample activity log
This is a common process evaluation methodology because it
systematically records exactly what is happening during
implementation. You may wish to devise a log such as the one
below and alter it to meet your specific needs. Consider
computerizing such a log for efficiency. Your program may
already have existing logs that you can utilize and adapt for
your
evaluation purposes.
Site:
Recorder:
Code
Service
Date
Location
# People
# Hours
Notes
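One minimal way to computerize such a log is a plain CSV file whose columns mirror the sample above. The function and column names below are an illustrative sketch under that assumption, not a prescribed system:

```python
import csv
import os

# Columns mirror the sample activity log above.
LOG_FIELDS = ["site", "recorder", "code", "service",
              "date", "location", "n_people", "n_hours", "notes"]

def append_entry(path, entry):
    """Append one activity record to the CSV log, writing the
    header row the first time the file is used."""
    write_header = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow(entry)
```

Each site's recorder would call `append_entry` once per service delivered; the resulting file can later be tallied by service code, site, or date for the evaluation report.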
Meeting Minutes
Taking notes at meetings may provide extensive and invaluable
process information that
can later be organized and structured into a comprehensive
report. Minutes may be taken
by program staff or by the evaluator if necessary. You may find
it helpful to use a
structured form, such as the one below that is derived from
Evaluating Collaboratives,
University of Wisconsin-Cooperative Extension, 1998.
Meeting Place: __________________ Start time: ____________
Date: _____________________________ End time: ____________
Attendance (names):
Agenda topic:
_________________________________________________
Discussion:
_____________________________________________________
Decision Related Tasks Who responsible Deadline
1.
2.
3.
Agenda topic:
_________________________________________________
Discussion:
_____________________________________________________
Decision Related Tasks Who responsible Deadline
1.
2.
3.
Sample observation log
Observation may occur in various ways, but one of the most common is
hand-recording specific details during a small time period. The following shows
several rows from an observation log utilized during an evaluation examining
school classrooms.
CLASSROOM OBSERVATIONS (School Environment Scale)
Classroom 1: Grade level _________________ (Goal: 30
minutes of observation)
Time began observation: _________ Time ended observation: _________
Subjects taught during the observation period: ___________________
PHYSICAL ENVIRONMENT
Question
Answer
1. Number of students
2. Number of adults in room:
a. Teachers
b. Para-pros
c. Parents
Total:
a.
b.
c.
3. Desks/Tables
a. Number of Desks
b. Number of Tables for students’ use
c. Any other furniture/include number
(Arrangement of desks/tables/other furniture)
a.
b.
c.
4. Number of computers, type
5. How are computers being used?
6. What is the general classroom setup? (are there walls,
windows, mirrors,
carpet, rugs, cabinets, curtains, etc.)
7. Other technology (overhead projector, power point, VCR,
etc.)
8. Are books and other materials accessible for students?
9. Is there adequate space for whole-class instruction?
12. What type of lighting is used?
13. Are there animals or fish in the room?
14. Is there background music playing?
15. Rate the classroom condition
Poor Average Excellent
16. Are rules/discipline procedures posted? If so, where?
17. Is the classroom Noisy or Quiet?
Very Quiet Very Noisy
Choosing or designing measurement instruments
Consider using a resource panel, advisory panel, or focus group
to offer feedback
regarding your instrument. This group may be composed of any
of the people listed
below. You may also wish to consult with one or more of these
individuals throughout
the development of your overall methodology.
Who should be involved in the design of your instrument(s)
and/or provide feedback?
Program service delivery staff / volunteers
Project director
Recipients of the program
Board of directors
Community leader
Collaborating organizations
Experts on the program or service being evaluated
Evaluation experts
_________________________
_________________________
_________________________
Conduct a pilot study and administer the instrument to a group
of recipients, and then
obtain feedback regarding their experience. This is a critical
component of the
development of your instruments, as it will help ensure clarity
of questions, and reduce
the degree of discomfort or burden that questions or processes
(e.g., intakes or
computerized data entry) elicit.
How can you ensure that you pilot your methods? When will
you do it, and whom will you use
as participants in the study?
Ensure that written materials are at an appropriate reading
level for the population.
Ensure that verbal information is at an appropriate terminology
level for the population.
A third- to sixth-grade reading level is often utilized.
Remember that you are probably collecting data that is
program-specific. This may
increase the difficulty in finding instruments previously
constructed to use for
questionnaires, etc. However, instruments used for conducting
process evaluations of
other programs may provide you with ideas for how to structure
your own instruments.
Linking program components and methods (an example)
Now that you have identified your program components, broad
questions, specific
questions, and possible measures, it is time to link them
together. Let's start with your
program components. Here is an example of 3 program
components of an intervention.
Program Components and Essential Elements:
There are six program components to M2M. There are essential
elements in each component that must be present for the program
to achieve its intended results and outcomes, and for the
program to be identified as a program of the American Cancer
Society.

1) Man to Man Self-Help and/or Support Groups
The essential elements within this component are:
• Offer information and support to all men with prostate cancer
at all points along the cancer care continuum
• Directly, or through collaboration and referral, offer
community access to prostate cancer self-help and/or support
groups
• Provide recruitment and on-going training and monitoring for
M2M leaders and volunteers
• Monitor, track and report program activities

Possible process measures:
• Descriptions of attempts to schedule and advertise group
meetings
• Documented efforts to establish the program
• Documented local needs assessments
• # of meetings held per independent group
• Documented meetings held
• # of people who attended different topics and speakers
• Perceptions of need of survey participants for additional
groups and current satisfaction levels
• # of new and # of continuing group members
• Documented sign-up sheets for group meetings
• Documented attempts to contact program dropouts
• # of referrals to other PC groups documented
• # of times corresponding with other PC groups
• # of training sessions for new leaders
• # of continuing education sessions for experienced leaders
• # and types of other on-going support activities for
volunteer leaders
• # of volunteers trained as group facilitators
• Perceptions of trained volunteers for readiness to function
as group facilitators
2) One-to-One Contacts
The essential elements within this component are:
• Offer one-to-one contact to provide information and support
to all men with prostate cancer, including those in the
diagnostic process
• Provide recruitment and on-going training and monitoring for
M2M leaders and volunteers
• Monitor, track and report program activities

Possible process measures:
• # of contact pairings
• Frequency and duration of contact pairings
• Types of information shared during contact pairings
• # of volunteers trained
• Perception of readiness by trained volunteers
• Documented attempts for recruiting volunteers
• Documented on-going training activities for volunteers
• Documented support activities
3) Community Education and Awareness
The essential elements within this component are:
• Conduct public awareness activities to inform the public
about prostate cancer and M2M
• Monitor, track and report program activities

Possible process measures:
• # of screenings provided by various health care
providers/agencies over assessment period
• Documented ACS staff and volunteer efforts to publicize the
availability and importance of PC and screenings, including
health fairs, public service announcements, billboard
advertising, etc.
• # of addresses to which newsletters are mailed
• Documented efforts to increase newsletter mailing list
Linking YOUR program components, questions, and methods
Consider each of your program components and questions that
you have devised in an earlier section of this workbook, and the
methods that you checked off on the "types of methods" table.
Now ask yourself, how will I use the information I have
obtained from this question? And, what method is most
appropriate for obtaining this information?
Program Component
Specific questions that go with this
component
How will I use this
information?
Best method?
Program Component
Specific questions that go with this
component
How will I use this
information?
Best method?
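The linking exercise above can also be captured in a small machine-readable mapping that keeps each component together with its questions, intended use, and method. A Python sketch; the entries are illustrative, loosely drawn from the M2M example, and are not prescribed by the workbook.

```python
# One linked record per program component. The question wordings, use
# statement, and method below are illustrative assumptions.
linkage = {
    "One-to-One Contacts": {
        "specific_questions": [
            "How many contact pairings were made?",
            "What information was shared during pairings?",
        ],
        "use_of_information": "Assess whether the component reached all men "
                              "with prostate cancer, including those in the "
                              "diagnostic process",
        "best_method": "Volunteer activity logs plus brief volunteer surveys",
    },
}

# Review the plan at a glance: each component with its chosen method.
for component, row in linkage.items():
    print(component, "->", row["best_method"])
```

Keeping the linkage in one structure makes it easy to spot components that still lack questions or a method.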
Data Collection Plan
Now let's put your data collection activities on one sheet - what
you're collecting, how you're doing it, when, your sample, and
who will collect it. Laying out the methods you have just picked,
your instruments, and your data collection techniques in a
structured manner will facilitate this process.
Method: e.g., patient interviews in health dept clinics
Type of data (questions, briefly indicated): qualitative - what
services they are using, length of visit, why they came in, how
long they waited; some quantitative satisfaction ratings
Instrument used: interview created by evaluation team and
piloted with patients
When implemented: Oct-Dec; days and hrs randomly selected
Sample: 10 interviews in each clinic
Who collects: trained interviewers
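A plan like the example row can also be held in a small data structure so rows are easy to review and sort as the plan grows. A Python sketch; the field names simply mirror the table headers, and the row content is the workbook's own example.

```python
from dataclasses import dataclass, asdict

@dataclass
class PlanRow:
    """One row of the data collection plan, mirroring the table headers."""
    method: str
    data_type: str        # type of data (questions, briefly indicated)
    instrument: str
    when_implemented: str
    sample: str
    who_collects: str

plan = [
    PlanRow(
        method="Patient interviews in health dept clinics",
        data_type="Qualitative: services used, length of visit, reason for "
                  "visit, wait time; some quantitative satisfaction ratings",
        instrument="Interview created by evaluation team and piloted with patients",
        when_implemented="Oct-Dec; days and hrs randomly selected",
        sample="10 interviews in each clinic",
        who_collects="Trained interviewers",
    ),
]

# Print the plan one labeled field per line for team review.
for row in plan:
    for field, value in asdict(row).items():
        print(f"{field:>17}: {value}")
```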
Consider a Management Information System
Process data is frequently collected through a management
information system (MIS) that
is designed to record characteristics of participants,
participation of participants, and
characteristics of activities and services provided. An MIS is a
computerized record
system that enables service providers and evaluators to
accumulate and display data
quickly and efficiently in various ways.
Will your evaluation be enhanced by periodic data presentations
in tables or other
structured formats? For example, should the evaluation use a
monthly print-out of services utilized, or monitor recipient
tracking data (such as date, time, and length of service)?
YES
NO
Does the agency create monthly (or other periodic) print outs
reflecting
services rendered or clients served?
YES
NO
Will the evaluation be conducted in a more efficient manner if
program
delivery staff enter data on a consistent basis?
YES
NO
Does the agency already have hard copies of files or records
that would be
better utilized if computerized?
YES
NO
Does the agency already have an MIS or a similar computerized
database?
YES
NO
If the answers to any of these questions are YES,
consider using an MIS for your evaluation.
If an MIS does not already exist, you may desire to design a
database in which you can
enter information from records obtained by the agency. This
process decreases missing
data and is generally efficient.
If you do create a database that can be used on an ongoing
basis by the agency, you may
consider offering it to them for future use.
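If no MIS exists yet, even a small relational database covers the basics the text describes: participant characteristics, participation, and services provided. The sketch below uses Python's built-in sqlite3 module; the table and column names are illustrative assumptions, not a prescribed layout.

```python
import sqlite3

# In-memory for the sketch; use a file path (e.g., "mis.db") in practice.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clients (
    client_id    INTEGER PRIMARY KEY,
    enrolled     TEXT NOT NULL,       -- ISO date of intake
    demographics TEXT                 -- e.g., age group, gender
);
CREATE TABLE services (
    service_id INTEGER PRIMARY KEY,
    client_id  INTEGER NOT NULL REFERENCES clients(client_id),
    date       TEXT NOT NULL,
    minutes    INTEGER,               -- length of service
    kind       TEXT                   -- e.g., 'intake', 'referral', 'group'
);
""")
conn.execute("INSERT INTO clients VALUES (1, '2002-07-01', 'male, 55-64')")
conn.executemany(
    "INSERT INTO services (client_id, date, minutes, kind) VALUES (?, ?, ?, ?)",
    [(1, "2002-07-02", 45, "intake"), (1, "2002-07-16", 60, "group")],
)

# A periodic print-out, as in the checklist above: services rendered per month.
for month, n, total_minutes in conn.execute(
    "SELECT substr(date, 1, 7), COUNT(*), SUM(minutes) FROM services GROUP BY 1"
):
    print(month, n, total_minutes)
```

Because the data live in a database rather than paper files, the monthly summary is one query instead of a hand count, and the same tables can later feed outcome analyses.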
Information to be included in your MIS
Examples include:
Client demographics
Client contacts
Client services
Referrals offered
Client outcomes
Program activities
Staff notes
Jot down the important data you would like to be included in
your MIS.
Managing your MIS
What software do you wish to utilize to manage your data?
What type of data do you have?
How much information will you need to enter?
How will you ultimately analyze the data? You may wish to
create a database directly in the program you will eventually
use, such as SPSS.
Will you be utilizing laptops?
If so, will you be taking them onsite and directly entering your
data into them?
How will you download or transfer the information, if
applicable?
What will the impact be on your audience if you have a laptop?
Tips on using an MIS
If service delivery personnel will be collecting and/or entering
information into the MIS
for the evaluator's use, it is generally a good idea to provide
frequent reminders of the
importance of entering the appropriate information in a timely,
consistent, and regular
manner.
For example, if an MIS depends on patient data collected during
public health officers' daily activities, the officers should be
entering data on at least a daily basis. Otherwise, important
data is lost and the database will only reflect what was salient
enough to be remembered and entered at the end of the week.
Don't forget that this may be burdensome and/or inconvenient
for the program staff.
Provide them with frequent thank-yous.
Remember that your database is only as good as you make it.
It must be organized and
arranged so that it is most helpful in answering your questions.
If you are collecting from existing records, at what level is
the data currently available?
For example, is it state, county, or city information? How is it
defined? Consider whether
adaptations need to be made or additions need to be included for
your evaluation.
Back up your data frequently and in at least one additional
format (e.g., zip, disk, server).
Consider file security. Will you be saving data on a network
server? You may need to
consider password protection.
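The backup habit is easy to automate. A minimal sketch using only the Python standard library; the file and directory names are placeholders.

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup(db_path: str, backup_dir: str) -> Path:
    """Copy the data file into backup_dir under a timestamped name."""
    src = Path(db_path)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"{src.stem}-{stamp}{src.suffix}"
    shutil.copy2(src, dest)  # copy2 preserves file timestamps
    return dest

# Example (paths are placeholders): back up the MIS database file.
# backup("mis.db", "backups/")
```

Running a call like this at the end of each data-entry session gives you the dated copies recommended above; store at least one set on a second medium or server.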
Allocate time for data entry and checking.
Allow additional time to contemplate the meaning of the data
before writing the report.
Implement Data Collection and Analysis
Data collection cannot be fully reviewed in this workbook, but
this page offers a few tips
regarding the process.
General reminders:
THANK everyone who helps you, directs you, or participates
in any way.
Obtain clear directions and give yourself plenty of time,
especially if you are traveling a long distance (e.g., several
hours away).
Bring all of your own materials - do not expect the program to
provide you with writing
utensils, paper, a clipboard, etc.
Address each person that you meet with respect, and attempt to
make your meeting as convenient for their schedule as possible.
Most process evaluation data will come from routine record
keeping (e.g., an MIS). However, you may wish to interview
clients and staff. If so:
Ensure that you have sufficient time to train evaluation staff,
data collectors, and/or
organization staff who will be collecting data. After they have
been trained in the data
collection materials and procedure, require that they practice
the technique, whether it is
an interview or entering a sample record in an MIS.
If planning to use a tape recorder during interviews or focus
groups, request permission
from participants before beginning. You may need to turn the
tape recorder off on occasion if doing so makes participants
more comfortable.
If planning to use laptop computers, attempt to make
consistent eye contact and spend
time establishing rapport before beginning. Some participants
may be uncomfortable with
technology and you may need to provide education regarding
the process of data
collection and how the information will be utilized.
If planning to hand write responses, warn the participant that
you may move slowly and
may need to ask them to repeat themselves. However, prepare
for this process by
developing shorthand specific to the evaluation. A sample
shorthand page follows.
Annual Evaluation Reports
The ultimate aim of all the Branch’s evaluation efforts is to
increase the intelligent use of
information in Branch decision-making in order to improve
health outcomes. Because we
understand that many evaluation efforts fail because the data are
never collected and that even
more fail because the data are collected but never used in
decision-making, we have struggled to
find a way to institutionalize the use of evaluation results in
Branch decision-making.
These reports will serve multiple purposes:
The need to complete the report will increase the likelihood
that evaluation is done and
data are collected.
The need to review reports from lower levels in order to
complete one’s own report
hopefully will cause managers at all levels to consciously
consider, at least once a year,
the effectiveness of their activities and how evaluation results
suggest that effectiveness
can be improved.
The summaries of evaluation findings in the reports should
simplify preparation of other
reports to funders including the General Assembly.
Each evaluation report forms the basis of the evaluation report
at the next level. The contents and length of the report should
be determined by what is most helpful to the manager who is
receiving the report.
receiving the report. Rather than simply reporting every
possible piece of data, these reports
should present summary data, summarize important conclusions,
and suggest recommendations
based on the evaluation findings. A program-level annual
evaluation report should be ten pages or less. Many may be less
than five pages. Population team and Branch-level annual
evaluation reports may be longer than ten pages, depending on
how many findings are being reported.
However, reports that go beyond ten pages should also contain a
shorter Executive Summary, to ensure that those with the power
to make decisions actually read the findings.
The initial reports, especially, may reflect formative work and
consist primarily of updates on the progress of evaluation
planning and implementation. This is fine and to be expected.
However, within a year or two the reports should begin to
include process data, and later actual
outcome findings.
This information was extracted from the FHB Evaluation
Framework developed by Monica Herk and Rebekah Hudgins.
Suggested shorthand - a sample
The list below was derived for a process evaluation regarding
charter schools. Note the use of general shorthand as well as
shorthand derived specifically for the evaluation.

CS = Charter School
mst = most
Sch = School
b/c = because
Tch = teacher, teach
st = something
P = Principal
b = be
VP = Vice Principal
c = see
Admin = administration, administrators
r = are
DOE = Dept of Education
w/ = when
BOE = Board of Education
@ = at
Comm = community
~ = about
= = is, equals, equivalent
Stud = students, pupils
≠ = does not equal, is not the same
Kids = students, children, teenagers
K = Kindergarten
Sone = someone
Cl = class
# = number
CR = classroom
$ = money, finances, financial, funding, expenses, etc.
W = White
+ = add, added, in addition
B = Black
< = less than
AA = African American
> = greater/more than
SES = socio-economic status
??? = what does this mean? get more info on, I'm confused...
Lib = library, librarian
DWA = don't worry about (e.g., if you wrote something unnecessary)
Caf = cafeteria
Ψ = psychology, psychologist
Ch = charter
∴ = therefore
Conv = conversion (school)
∆ = change, is changing
S-up = start-up school
mm = movement
App = application, applied
↑ = increases, up, promotes
ITBS = Iowa Test of Basic Skills
↓ = decreases, down, inhibits
LA = language arts
x = times (e.g., many x we laugh)
SS = Social Studies
÷ = divided (we ÷ up the classrooms)
QCC = Quality Core Curriculum
C = with
Pol = policy, politics
[symbol] = home, house
Curr = curriculum
♥ = love, adore (e.g., the kids ♥ this)
LP = lesson plans
[symbol] = church, religious activity
Disc = discipline
O = no, doesn't, not
♀ = girls, women, female
1/2 = half (e.g., we took 1/2)
♂ = boys, men, male
2 = to
F = father, dad
c/out = without
P = parent
2B = to be
M = mom, mother
e.g. = for example
i.e. = that is
... = the person trailed off, or you missed information
Appendix A
Logic Model Worksheet
Population Team/Program Name
__________________________ Date
_______________________
If the following CONDITIONS AND ASSUMPTIONS exist...
And if the following ACTIVITIES are implemented to address
these conditions and assumptions...
Then these SHORT-TERM OUTCOMES may be achieved...
And these LONG-TERM OUTCOMES may be achieved...
And these LONG-TERM GOALS can be reached...
Appendix B
Pitfalls To Avoid
Avoid heightening the expectations of delivery staff, program
recipients, policy makers, or community members. Make clear
that feedback will be provided as appropriate, but that it may
or may not be acted upon.
Avoid any implication that you are evaluating the impact or
outcome. Stress that you are
evaluating "what is happening," not how well any one person is
performing or what the
outcomes of the intervention are.
Make sure that the right information gets to the right people -
your findings are most likely to be used in a constructive and
effective manner if you ensure that your final report does not
end up on the desk of someone who has little motivation or
interest in using them.
Ensure that data collection and entry are managed on a
consistent basis - avoid developing an evaluation design and
then having the contract lapse because staff did not enter the
data.
Appendix C
References
References used for completion of this workbook and/or that
you may find helpful for
additional information.
Centers for Disease Control and Prevention. 1995. Evaluating
Community Efforts to Prevent
Cardiovascular Diseases. Atlanta, GA.
Centers for Disease Control and Prevention. 2001. Introduction
to Program Evaluation for
Comprehensive Tobacco Control Programs. Atlanta, GA.
Freeman, H. E., Rossi, P. H., Sandefur, G. D. 1993. Workbook
for evaluation: A systematic
approach. Sage Publications: Newbury Park, CA.
Georgia Policy Council for Children and Families; The Family
Connection; Metis Associates,
Inc. 1997. Pathways for assessing change: Strategies for
community partners.
Grembowski, D. 2001. The practice of health program
evaluation. Sage Publications: Thousand
Oaks.
Hawkins, J. D., Nederhood, B. 1987. Handbook for Evaluating
Drug and Alcohol Prevention
Programs. U.S. Department of Health and Human Services;
Public Health Service; Alcohol,
Drug Abuse, and Mental Health Administration: Washington, D.
C.
Muraskin, L. D. 1993. Understanding evaluation: The way to
better prevention programs.
Westat, Inc.
National Community AIDS Partnership 1993. Evaluating
HIV/AIDS Prevention Programs in
Community-based Organizations. Washington, D.C.
NIMH Overview of Needs Assessment. Chapter 3: Selecting the
needs assessment approach.
Patton, M. Q. 1982. Practical Evaluation. Sage Publications,
Inc.: Beverly Hills, CA.
Posavac, E. J., Carey, R. G. 1980. Program Evaluation: Methods
and Case Studies.
Prentice-Hall, Inc.: Englewood Cliffs, N.J.
Rossi, P. H., Freeman, H. E., Lipsey, M. W. 1999. Evaluation:
A Systematic Approach. (6th
edition). Sage Publications, Inc.: Thousand Oaks, CA.
Scheirer, M. A. 1994. Designing and using process evaluation.
In: J. S. Wholey, H. P. Hatry, &
K. E. Newcomer (eds) Handbook of practical program
evaluation. Jossey-Bass Publishers: San
Francisco.
Taylor-Powell, E., Rossing, B., Geran, J. 1998. Evaluating
Collaboratives: Reaching the
potential. Program Development and Evaluation: Madison, WI.
U.S. Department of Health and Human Services; Administration
for Children and Families;
Office of Community Services. 1994. Evaluation Guidebook:
Demonstration partnership
program projects.
W.K. Kellogg Foundation. 1998. W. K. Kellogg Foundation
Evaluation Handbook.
Websites:
www.cdc.gov/eval/resources
www.eval.org (has online text books)
www.wmich.edu/evalctr (has online checklists)
www.preventiondss.org
When conducting literature reviews or searching for additional
information, consider using
alternative names for "process evaluation," including:
formative evaluation
program fidelity
implementation assessment
implementation evaluation
program monitoring
You are to interview a woman 50 and older and write up the interview.docxYou are to interview a woman 50 and older and write up the interview.docx
You are to interview a woman 50 and older and write up the interview.docx
 
You are to complete TWO essays and answer the following questions.  .docx
You are to complete TWO essays and answer the following questions.  .docxYou are to complete TWO essays and answer the following questions.  .docx
You are to complete TWO essays and answer the following questions.  .docx
 
You are the vice president of a human resources department and Susan.docx
You are the vice president of a human resources department and Susan.docxYou are the vice president of a human resources department and Susan.docx
You are the vice president of a human resources department and Susan.docx
 
You are the purchasing manager of a company that has relationships w.docx
You are the purchasing manager of a company that has relationships w.docxYou are the purchasing manager of a company that has relationships w.docx
You are the purchasing manager of a company that has relationships w.docx
 
You are to briefly describe how the Bible is related to the topics c.docx
You are to briefly describe how the Bible is related to the topics c.docxYou are to briefly describe how the Bible is related to the topics c.docx
You are to briefly describe how the Bible is related to the topics c.docx
 
You are the manager of an accounting department and would like to hi.docx
You are the manager of an accounting department and would like to hi.docxYou are the manager of an accounting department and would like to hi.docx
You are the manager of an accounting department and would like to hi.docx
 
You are the new chief financial officer (CFO) hired by a company. .docx
You are the new chief financial officer (CFO) hired by a company. .docxYou are the new chief financial officer (CFO) hired by a company. .docx
You are the new chief financial officer (CFO) hired by a company. .docx
 
You are the manager of a team of six proposal-writing professionals..docx
You are the manager of a team of six proposal-writing professionals..docxYou are the manager of a team of six proposal-writing professionals..docx
You are the manager of a team of six proposal-writing professionals..docx
 
You are the environmental compliance officer at a company that is .docx
You are the environmental compliance officer at a company that is .docxYou are the environmental compliance officer at a company that is .docx
You are the environmental compliance officer at a company that is .docx
 

Recently uploaded

Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsanshu789521
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitolTechU
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
MICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptxMICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptxabhijeetpadhi001
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxDr.Ibrahim Hassaan
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.arsicmarija21
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 

Recently uploaded (20)

Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha elections
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptx
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
MICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptxMICROBIOLOGY biochemical test detailed.pptx
MICROBIOLOGY biochemical test detailed.pptx
 
OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptx
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 

Guide to Helping With Paper· Description of the key program .docx

used Federal funding provided by ... Retrieved from https://www.ncjrs.gov/pdffiles1/nij/grants/250469.pdf
The attachment (Workbook for Designing a Process Evaluation) can help you in answering this question.
· Specific questions to be answered by the process evaluation
· A plan for gathering and analyzing the information
https://www.ncjrs.gov/pdffiles1/nij/grants/213675.pdf
Make sure all bullets are answered:
· A description of the key program elements
· A description of the strategies that the program uses to produce change
· A description of the needs of the target population
· An explanation of why a process evaluation is important for the program
· A plan for building relationships with the staff and management
· Broad questions to be answered by the process evaluation
· Specific questions to be answered by the process evaluation
· A plan for gathering and analyzing the information

Workbook for Designing a Process Evaluation
Produced for the Georgia Department of Human Resources, Division of Public Health
By Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.
Department of Psychology, Georgia State University
July 2002

Evaluation Expert Session, July 16, 2002

What is process evaluation?

Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks "what," and outcome evaluation asks "so what?"

When conducting a process evaluation, keep these three questions in mind:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?

This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. There are many steps involved in implementing a process evaluation, and this workbook will direct you through the main stages. It will be helpful to think of a service delivery program that you can use as your example as you complete these activities.

Why is process evaluation important?
1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on what components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is itself an intervention

Stages of Process Evaluation
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**

Also included in this workbook:
a. Logic Model Template
b. Pitfalls to avoid
c. References

Evaluation can be an exciting, challenging, and fun experience. Enjoy!

* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to the Evaluation Framework and Evaluation Module of the FHB Best Practice Manual for more details.
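The gap analysis at the heart of the three guiding questions, comparing what a program is intended to be with what is actually delivered, can be sketched in a few lines. This is an illustrative sketch only; the component names and session counts below are hypothetical, not drawn from any real program.

```python
# Sketch: comparing intended program components against what was delivered.
# Component names and session counts are hypothetical examples.

planned = {
    "fire-safety lesson": 2,   # sessions intended per year
    "911 practice": 2,
    "take-home materials": 2,
}

delivered = {
    "fire-safety lesson": 2,
    "911 practice": 1,         # one session was cancelled
    "take-home materials": 0,
}

# Question 3: where are the gaps between program design and delivery?
gaps = {name: planned[name] - delivered.get(name, 0)
        for name in planned
        if delivered.get(name, 0) < planned[name]}

for name, shortfall in gaps.items():
    print(f"{name}: {shortfall} fewer session(s) than designed")
```

A tally like this only answers the "what" question; interpreting why the gaps occurred still requires the qualitative methods discussed later in the workbook.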
Forming Collaborative Relationships

A strong, collaborative relationship with program delivery staff and management will likely result in the following:
· Feedback regarding evaluation design and implementation
· Ease in conducting the evaluation due to increased cooperation
· Participation in interviews, panel discussions, meetings, etc.
· Increased utilization of findings

Seek to establish a mutually respectful relationship characterized by trust, commitment, and flexibility.

Key points in establishing a collaborative relationship:
· Start early. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible.
· Emphasize that THEY are the experts, and that you will be using their knowledge and information to inform the development and implementation of your evaluation.
· Be respectful of their time, both in person and on the telephone.
· Set up meeting places that are geographically accessible to all parties involved in the evaluation process.
· Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible, and request their feedback regarding appropriate times for on-site data collection.
· Involve key policy makers, managers, and staff in a series of meetings throughout the evaluation process. The evaluation should be driven by the questions that are of greatest interest to the stakeholders.
· Set agendas for meetings, and provide an overview of the goals of each meeting before beginning.
· Obtain their feedback, and provide them with updates regarding the evaluation process. You may wish to obtain structured feedback; sample feedback forms appear throughout the workbook.
· Provide feedback regarding evaluation findings to the key policy makers, managers, and staff when and as appropriate. Use visual aids and handouts. Tabulate and summarize information. Make it as interesting as possible.
· Consider establishing a resource or expert "panel" or advisory board: an official group of people willing to be contacted when you need feedback or have questions.

Determining Program Components

Program components are identified by answering the questions who, what, when, where, and how as they pertain to your program.
· Who: the program clients/recipients and staff
· What: activities, behaviors, materials
· When: frequency and length of the contact or intervention
· Where: the community context and physical setting
· How: strategies for operating the program or intervention

BRIEF EXAMPLE:
Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students' classroom
How: group-administered intervention, small-group practice
1. Instruct students what to do in case of fire (stop, drop, and roll).
2. Educate students on calling 911 and have them practice on play telephones.
3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change batteries in a home fire alarm. Have students practice each of these activities.
4. Provide students with written information and have them take it home to share with their parents. Request a parental signature to indicate compliance, and target a 75% return rate.

Points to keep in mind when determining program components:
· Specify activities as behaviors that can be observed.
· If you have a logic model, use the "activities" column as a starting point.
· Ensure that each component is separate and distinguishable from the others.
· Include all activities and materials intended for use in the intervention.
· Identify the aspects of the intervention that may need to be adapted, and those that should always be delivered as designed.
· Consult with program staff, mission statements, and program materials as needed.

Your Program Components

After you have identified your program components, create a logic model that graphically portrays the link between program components and the outcomes expected from those components. Now, write out a succinct list of the components of your program.

WHO:
WHAT:
WHEN:
WHERE:
HOW:

What is a Logic Model?

A logic model is a logical series of statements that link the problems your program is attempting to address (conditions), how it will address them (activities), and the expected results (immediate and intermediate outcomes, long-term goals).

Benefits of the logic model include: it helps develop clarity about a project or program, helps to develop consensus among people, helps to identify gaps or redundancies in a plan, helps to identify the core hypothesis, and helps to succinctly communicate what your project or program is about.

When do you use a logic model? Use it:
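The fire-safety example sets a concrete, checkable target: a 75% parental-signature return rate. The arithmetic behind monitoring such a target can be sketched as follows; the enrollment and return counts are made-up illustration values, not data from the example program.

```python
# Sketch: checking a component target from the brief example, the 75%
# parental-signature return rate. Counts below are hypothetical.

students_enrolled = 120
signatures_returned = 84

return_rate = signatures_returned / students_enrolled
target = 0.75

print(f"Return rate: {return_rate:.0%} (target {target:.0%})")
print("Target met" if return_rate >= target else "Target not met")
```

Specifying components as observable, countable behaviors, as the points above recommend, is what makes this kind of simple fidelity check possible.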
- During any work, to clarify what is being done, why, and with what intended results
- During project or program planning, to make sure that the project or program is logical and complete
- During evaluation planning, to focus the evaluation
- During project or program implementation, as a template for comparing to the actual program and as a filter to determine whether proposed changes fit or not

This information was extracted from the Logic Models: A Multi-Purpose Tool materials developed by Wellsys Corporation for the Evaluation Planning Workshop training. Please see the Evaluation Planning Workshop materials for more information. Appendix A has a sample template of the tabular format.
Determining Evaluation Questions

As you design your process evaluation, consider what questions you would like to answer. Only after your questions are specified can you begin to develop your methodology. Considering the importance and purpose of each question is critical.

BROADLY...

What questions do you hope to answer? You may wish to turn the program components that you have just identified into questions assessing:
· Was the component completed as indicated?
· What were the strengths in implementation?
· What were the barriers or challenges in implementation?
· What were the apparent strengths and weaknesses of each step of the intervention?
· Did the recipients understand the intervention?
· Were resources available to sustain project activities?
· What were staff perceptions?
· What were community perceptions?
· What was the nature of the interaction between staff and clients?

These are examples. Check off what is applicable to you, and use the space below to write additional broad, overarching questions that you wish to answer.

SPECIFICALLY...

Now, make a list of all the specific questions you wish to answer, and organize your questions categorically. Your list of questions will likely be much longer than your list of program components. This step of developing your evaluation will inform your choice of methodologies and instruments. Remember that you must collect information on what the program is intended to be and what it is in reality, so you may need to ask some questions in two formats. For example:
· "How many people are intended to complete this intervention per week?"
· "How many actually go through the intervention during an average week?"

Consider what specific questions you have. The questions below are only examples! Some may not be appropriate for your evaluation, and you will most likely need to add additional questions. Check off the questions that are applicable to you, and add your own questions in the space provided.

WHO (regarding clients):
· Who is the target audience, client, or recipient?
· How many people have participated?
· How many people have dropped out?
· How many people have declined participation?
· What are the demographic characteristics of clients? (Race, ethnicity, national origin, age, gender, sexual orientation, religion, marital status, employment, income sources, education, socio-economic status)
· What factors do the clients have in common?
· What risk factors do clients have?
· Who is eligible for participation?
· How are people referred to the program? How are they screened?
· How satisfied are the clients?

YOUR QUESTIONS:

WHO (regarding staff):
· Who delivers the services? How are they hired?
· How supportive are staff and management of each other?
· What qualifications do staff have? How are staff trained?
· How congruent are staff and recipients with one another?
· What are staff demographics? (See the client demographic list for specifics.)

YOUR QUESTIONS:
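Many of the "WHO" questions above (how many participated, dropped out, or declined; what are the client demographics) reduce to simple tallies once intake records exist. A minimal sketch of that tabulation, using entirely hypothetical record fields and values:

```python
# Sketch: tabulating participation and demographics from intake records.
# Record fields and values are hypothetical illustration data.
from collections import Counter

records = [
    {"status": "completed",   "gender": "F", "age": 34},
    {"status": "dropped out", "gender": "M", "age": 27},
    {"status": "completed",   "gender": "M", "age": 45},
    {"status": "declined",    "gender": "F", "age": 31},
]

status_counts = Counter(r["status"] for r in records)
gender_counts = Counter(r["gender"] for r in records)
mean_age = sum(r["age"] for r in records) / len(records)

print("Participation:", dict(status_counts))
print("Gender:", dict(gender_counts))
print(f"Mean age: {mean_age:.1f}")
```

Whatever form the real records take, deciding these fields up front is what makes the later data-collection stage straightforward.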
WHAT:
· What happens during the intervention? What is being delivered?
· What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction, etc.)?
· What are the standard operating procedures?
· What technologies are in use?
· What types of communication techniques are implemented?
· What type of organization delivers the program?
· How many years has the organization existed? How many years has the program been operating?
· What type of reputation does the agency have in the community? What about the program?
· How is the intervention structured?
· How is confidentiality maintained?

YOUR QUESTIONS:

WHEN:
· When is the intervention conducted?
· How frequently is the intervention conducted? At what intervals?
· At what time of day, week, month, or year?
· What is the length and/or duration of each service?

YOUR QUESTIONS:

WHERE:
· Where does the intervention occur?
· What type of facility is used? What is the age and condition of the facility?
· In what part of town is the facility? Is it accessible to the target audience?
· Does public transportation serve the facility? Is parking available?
· Is child care provided on site?

YOUR QUESTIONS:

WHY:
· Why are these activities or strategies implemented, and why not others?
· Why has the intervention varied in its ability to maintain interest?
· Why are clients not participating?
· Why is the intervention conducted at a certain time or at a certain frequency?

YOUR QUESTIONS:
Validating Your Evaluation Questions

Even though all of your questions may be interesting, it is important to narrow your list to questions that will be particularly helpful to the evaluation and that can be answered given your specific resources, staff, and time. Go through each of your questions and consider it with respect to the checklist below, which may be helpful in streamlining your final list of questions. Revise your worksheet or list of questions until you can answer "yes" to all of these questions. If you cannot answer "yes" for a question, consider omitting it from your evaluation.

Validation checklist (answer Yes or No for each):
· Will I use the data that will stem from these questions?
· Do I know why each question is important and/or valuable?
· Is someone interested in each of these questions?
· Have I ensured that no questions are omitted that may be important to someone else?
· Is the wording of each question sufficiently clear and unambiguous?
· Do I have a hypothesis about what the "correct" answer will be for each question?
· Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response?
· Do the questions constitute a sufficient set to achieve the purpose(s) of the evaluation?
· Is it feasible to answer each question, given what I know about the resources for evaluation?
· Is each question worth the expense of answering it?

Derived from the "A Design Manual" checklist, page 51.
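The screening rule above (keep a question only if every checklist item is a "yes") can be sketched mechanically. The candidate questions and criterion flags below are hypothetical, and a real checklist would carry all ten criteria rather than the three shown.

```python
# Sketch: screening candidate evaluation questions against a validation
# checklist. Keep a question only if every criterion is answered "yes".
# Questions and criterion flags are hypothetical; only three of the
# workbook's ten criteria are shown.

candidates = [
    {"question": "How many clients completed the program?",
     "will_use_data": True,  "feasible": True, "worth_expense": True},
    {"question": "What brand of coffee does the office buy?",
     "will_use_data": False, "feasible": True, "worth_expense": False},
]

criteria = ("will_use_data", "feasible", "worth_expense")

retained = [c["question"] for c in candidates
            if all(c[k] for k in criteria)]

print(retained)
```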
  • 26. Determining Methodology Process evaluation is characterized by collection of data primarily through two formats: 1) Quantitative, archival, recorded data that may be managed by an computerized tracking or management system, and 2) Qualitative data that may be obtained through a variety of formats, such as surveys or focus groups. When considering what methods to use, it is critical to have a thorough understanding and knowledge of the questions you want answered. Your questions will inform your choice of methods. After this section on types of methodologies, you will complete an exercise in which you consider what method of data collection is most appropriate for each question. Do you have a thorough understanding of your questions? Furthermore, it is essential to consider what data the organization you are evaluating already has. Data may exist in the form of an existing computerized management information system, records, or a tracking system
  • 27. of some other sort. Using this data may provide the best reflection of what is "going on," and it will also save you time, money, and energy because you will not have to devise your own data collection method! However, keep in mind that you may have to adapt this data to meet your own needs - you may need to add or replace fields, records, or variables. What data does your organization already have? Will you need to adapt it? If the organization does not already have existing data, consider devising a method for the organizational staff to collect their own data. This process will ultimately be helpful for them so that they can continue to self- evaluate, track their activities, and assess progress and change. It will be helpful for the evaluation process because, again, it will save you time, money, and energy that you can better devote towards other aspects of the evaluation. Management information systems will be described more fully in a later section of this workbook. Do you have the capacity and resources to devise such a system? (You may need to refer to a later
  • 28. section of this workbook before answering.) Evaluation Expert Session July 16, 2002 Page 13 Who should collect the data? Given all of this, who should collect data for your evaluation - program staff, evaluation staff, or some combination? Program staff: may collect data on activities such as attendance, demographics, participation, characteristics of participants, and dispositions; may conduct intake interviews, note changes in service delivery, and monitor program implementation. Advantages: cost-efficient, accessible, resourceful, available, time-efficient, and increased understanding of the program.
  • 29. Disadvantages: may exhibit bias and/or social desirability effects, may use the data for critical judgment, and may compromise the validity of the program; data collection may put staff in an uncomfortable or inappropriate position; also, if staff collect data, an increased burden and responsibility may be placed upon them outside of their typical job responsibilities. If you utilize staff for data collection, provide frequent reminders as well as messages of gratitude. Evaluation staff: may collect qualitative information regarding implementation, general characteristics of program participants, and other information that may otherwise be subject to bias or distortion. Advantages: data are collected in a manner consistent with the overall goals and timeline of the evaluation; prevents bias and inappropriate use of information; promotes overall fidelity and validity of the data. Disadvantages: may be costly and take extensive time; may require additional training on the part of the evaluator; the presence of the evaluator in the organization may be intrusive, inconvenient, or burdensome.
  • 30. When should data be collected? Conducting the evaluation according to your timeline can be challenging. Consider how much time you have for data collection, and make decisions about what to collect, and how much, based on that timeline. In many cases, outcome evaluation is not considered appropriate until the program has stabilized. When conducting a process evaluation, however, it can be important to start the evaluation at the beginning, so that a story may be told about how the program was developed, information may be provided on refinements, and program growth and progress may be noted. If you have the luxury of collecting data from the start of the intervention to the end, space out data collection as appropriate. If you are evaluating an ongoing intervention that is fairly quick (e.g., an 8-week educational group), you may
  • 31. choose to evaluate one or more "cycles." How much time do you have to conduct your evaluation? How much time do you have for data collection (as opposed to designing the evaluation, training, organizing and analyzing results, and writing the report)? Is the program you are evaluating time-specific? How long does the program or intervention last? At what stages do you think you will most likely collect data? Soon after the program has begun: descriptive information on program characteristics that will not change; information requiring a baseline. During the intervention: ongoing process information, such as recruitment and program implementation. After the intervention: demographics, attendance ratings, satisfaction ratings.
  • 32. Before you consider methods A list of various methods follows this section. Before choosing which methods are most appropriate for your evaluation, review the following questions. (Some may already be answered in another section of this workbook.) What questions do I want answered? (See the previous section.) Does the organization already have existing data, and if so, what kind? Does the organization have staff to collect data? What data can the organization's staff collect? Must I maintain anonymity (the participant is not identified at all) or confidentiality (the participant is identified but responses remain private)? This consideration pertains to existing archival data as well as original data collection. How much time do I have to conduct the evaluation?
  • 33. How much money do I have in my budget? How many evaluation staff do I have to manage the data collection activities? Can I (and/or members of my evaluation staff) travel on site? What time of day is best for collecting data? For example, if you plan to conduct focus groups or interviews, remember that your population may work during the day and need evening times. Types of methods A number of different methods can be used to collect process information. Consider each of the following, and check those that you think would
  • 34. be helpful in addressing the specific questions in your evaluation. When "see sample" is indicated, refer to the pages that follow this table.
Activity, participation, or client tracking log - Brief record completed on site at frequent intervals by the participant or deliverer. May use a form developed by the evaluator if none previously exists. Examples: sign-in log, daily records of food consumption, medication management.
Case studies - Collection of in-depth information regarding a small number of intervention recipients; uses multiple methods of data collection.
Ethnographic analysis - Obtain in-depth information regarding the experience of the recipient by partaking in the intervention, attending meetings, and talking with delivery staff and recipients.
Expert judgment - Convene a panel of experts or conduct individual interviews to obtain their understanding of and reaction to program delivery.
Focus groups - Small-group discussion among program delivery staff or recipients, focusing on their thoughts and opinions regarding their experiences with the intervention.
Meeting minutes (see sample) - Qualitative information regarding agendas, tasks assigned, and coordination and implementation of the intervention, recorded on a consistent basis.
Observation (see sample) - Observe actual delivery in vivo or on video; record findings using a check sheet or make qualitative observations.
Open-ended interviews, telephone or in person - Evaluator asks open questions (i.e., who, what, when, where, why, how) of delivery staff or recipients, using an interview protocol without preset response options.
Questionnaire - Written survey with structured questions. May be administered in individual, group, or mail format; may be anonymous or confidential.
Record review - Obtain indicators from intervention records such as patient files, time sheets, telephone logs, registration forms, student charts, sales records, or records specific to the service delivery.
Structured interviews, telephone or in person - Interviewer asks direct questions using an interview protocol with preset response options.
  • 37. Sample activity log This is a common process evaluation methodology because it systematically records exactly what is happening during implementation. You may wish to devise a log such as the one below and alter it to meet your specific needs. Consider computerizing such a log for efficiency. Your program may already have existing logs that you can utilize and adapt for your evaluation purposes. Site: __________ Recorder: __________ Code | Service | Date | Location | # People | # Hours
  • 38. Notes
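The workbook recommends computerizing such a log. As one hedged sketch of what that could look like in Python (the field names follow the sample log above; the CSV layout and file name are assumptions, not part of the original workbook):

```python
import csv
from pathlib import Path

# Columns mirror the paper log: Site, Recorder, Code, Service, Date,
# Location, # People, # Hours, Notes.
LOG_FIELDS = ["site", "recorder", "code", "service", "date",
              "location", "n_people", "n_hours", "notes"]

def append_entry(path, entry):
    """Append one activity record, writing a header row on first use."""
    path = Path(path)
    new_file = not path.exists() or path.stat().st_size == 0
    with path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(entry)

# Illustrative entry; site and service values are hypothetical.
append_entry("activity_log.csv", {
    "site": "Clinic A", "recorder": "JS", "code": "ED1",
    "service": "Education group", "date": "2002-07-16",
    "location": "Room 2", "n_people": 12, "n_hours": 1.5, "notes": "",
})
```

Because every field is recorded at the time of service, such a log can later be summarized by site, service code, or month without re-keying paper forms.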
  • 40. Meeting Minutes Taking notes at meetings may provide extensive and invaluable process information that can later be organized and structured into a comprehensive report. Minutes may be taken by program staff or by the evaluator if necessary. You may find it helpful to use a structured form, such as the one below that is derived from Evaluating Collaboratives, University of Wisconsin-Cooperative Extension, 1998. Meeting Place: __________________ Start time: ____________ Date: _____________________________ End time: ____________ Attendance (names): Agenda topic: _________________________________________________ Discussion: _____________________________________________________ Decision Related Tasks Who responsible Deadline 1. 2.
  • 41. 3. Agenda topic: _________________________________________________ Discussion: _____________________________________________________ Decision Related Tasks Who responsible Deadline 1. 2. 3. Sample observation log Observation may occur in various ways, but one of the most common is
  • 42. hand-recording specific details during a small time period. The following shows several rows from an observation log used during an evaluation of school classrooms. CLASSROOM OBSERVATIONS (School Environment Scale) Classroom 1: Grade level _________________ (Goal: 30 minutes of observation) Time began observation: _________ Time ended observation: _________ Subjects taught during observation period: ___________________ PHYSICAL ENVIRONMENT Question Answer 1. Number of students 2. Number of adults in room: a. Teachers b. Para-pros c. Parents Total: a. b.
  • 43. c. 3. Desks/Tables a. Number of Desks b. Number of Tables for students’ use c. Any other furniture/include number (Arrangement of desks/tables/other furniture) a. b. c. 4. Number of computers, type 5. How are computers being used? 6. What is the general classroom setup? (are there walls, windows, mirrors, carpet, rugs, cabinets, curtains, etc.) 7. Other technology (overhead projector, power point, VCR, etc.)
  • 44. 8. Are books and other materials accessible for students? 9. Is there adequate space for whole-class instruction? 12. What type of lighting is used? 13. Are there animals or fish in the room? 14. Is there background music playing? 15. Rate the classroom condition Poor Average Excellent 16. Are rules/discipline procedures posted? If so, where?
  • 45. 17. Is the classroom Noisy or Quiet? Very Quiet Very Noisy Choosing or designing measurement instruments Consider using a resource panel, advisory panel, or focus group to offer feedback Evaluation Expert Session July 16, 2002 Page 20 regarding your instrument. This group may be composed of any of the people listed below. You may also wish to consult with one or more of these individuals throughout the development of your overall methodology. Who should be involved in the design of your instrument(s) and/or provide feedback? Program service delivery staff / volunteers Project director Recipients of the program Board of directors
  • 46. Community leader Collaborating organizations Experts on the program or service being evaluated Evaluation experts _________________________ _________________________ _________________________ Conduct a pilot study and administer the instrument to a group of recipients, and then obtain feedback regarding their experience. This is a critical component of the development of your instruments, as it will help ensure clarity of questions, and reduce the degree of discomfort or burden that questions or processes (e.g., intakes or computerized data entry) elicit. How can you ensure that you pilot your methods? When will you do it, and whom will you use as participants in the study? Ensure that written materials are at an appropriate reading level for the population. Ensure that verbal information is at an appropriate terminology level for the population. A third or sixth-grade reading level is often utilized. Remember that you are probably collecting data that is
  • 47. program-specific. This may increase the difficulty in finding instruments previously constructed to use for questionnaires, etc. However, instruments used for conducting process evaluations of other programs may provide you with ideas for how to structure your own instruments. Evaluation Expert Session July 16, 2002 Page 21 Linking program components and methods (an example) Now that you have identified your program components, broad questions, specific questions, and possible measures, it is time to link them together. Let's start with your program components. Here is an example of 3 program components of an intervention. Program Components and Essential Elements: There are six program components to M2M. There are essential elements in each component that must be present for the program to achieve its intended results and outcomes, and for the program to be
  • 48. identified as a program of the American Cancer Society. Possible Process Measures 1) Man to Man Self-Help and/or Support Groups The essential elements within this component are: • Offer information and support to all men with prostate cancer at all points along the cancer care continuum • Directly, or through collaboration and referral, offer community access to prostate cancer self-help and/or support groups • Provide recruitment and on-going training and monitoring for M2M leaders and volunteers • Monitor, track and report program activities • Descriptions of attempts to schedule and advertise group meetings • Documented efforts to establish the program • Documented local needs assessments • # of meetings held per independent group • Documented meetings held
  • 49. # of people who attended different topics and speakers • Perceptions of need of survey participants for additional groups and current satisfaction levels • # of new and # of continuing group members • Documented sign-up sheets for group meetings • Documented attempts to contact program dropouts • # of referrals to other PC groups documented • # of times corresponding with other PC groups • # of training sessions for new leaders • # of continuing education sessions for experienced leaders • # and types of other on-going support activities for volunteer leaders • # of volunteers trained as group facilitators • Perceptions of trained volunteers for readiness to function as group facilitators
  • 50. 2) One-to-One Contacts The essential elements within this component are: • Offer one-to-one contact to provide information and support to all men with prostate cancer, including those in the diagnostic process • Provide recruitment and on-going training and monitoring for M2M leaders and volunteers • Monitor, track and report program activities • # of contact pairings • Frequency and duration of contact pairings • Types of information shared during contact pairings • # of volunteers trained • Perception of readiness by trained volunteers • Documented attempts for recruiting volunteers • Documented on-going training activities for volunteers • Documented support activities
  • 51. 3) Community Education and Awareness The essential elements within this component are: • Conduct public awareness activities to inform the public about prostate cancer and M2M • Monitor, track and report program activities • # of screenings provided by various health care providers/agencies over assessment period • Documented ACS staff and volunteer efforts to publicize the availability and importance of PC and screenings, including health fairs, public service announcements, billboard advertising, etc. • # of addresses to which newsletters are mailed • Documented efforts to increase newsletter mailing list
  • 52. Linking YOUR program components, questions, and methods Consider each of your program components and questions that you have devised in an earlier section of this workbook, and the methods that you checked off on the "types of methods" table. Now ask yourself, how will I use the information I have obtained from this question? And, what method is most appropriate for obtaining this information? Program Component Specific questions that go with this component How will I use this information? Best method?
  • 53. Program Component Specific questions that go with this component How will I use this information? Best method?
  • 54. Data Collection Plan Now let's put your data collection activities on one sheet - what you're collecting, how you're doing it, when, your sample, and who will collect it. Identifying the methods you have just picked, your instruments, and your data collection techniques in a structured manner will facilitate this process. Method Type of data (questions, briefly indicated) Instrument used
  • 55. When implemented Sample Who collects E.g.: Patient interviews in health dept clinics Qualitative - what services they are using, length of visit, why came in, how long wait, some quantitative satisfaction ratings Interview created by evaluation team and piloted with patients Oct-Dec; days and hrs randomly selected 10 interviews in each
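A plan like this can also be kept as structured records so that missing cells are easy to spot. A minimal sketch in Python (the field names mirror the plan's columns; the example values, including the sample size and collector, are illustrative assumptions, since the worked example above is truncated):

```python
# One record per row of the data collection plan.
plan = [{
    "method": "Patient interviews in health dept clinics",
    "type_of_data": "Qualitative: services used, reason for visit, "
                    "length of visit, wait time; some satisfaction ratings",
    "instrument": "Interview created by evaluation team, piloted with patients",
    "when": "Oct-Dec; days and hours randomly selected",
    "sample": "10 interviews per clinic",   # assumed for illustration
    "who_collects": "Evaluation staff",     # assumed for illustration
}]

def incomplete(rows):
    """Return the methods whose plan rows still have an empty cell."""
    return [r["method"] for r in rows if not all(r.values())]

print(incomplete(plan))
```

Running `incomplete` before data collection begins flags any method whose timing, sample, or collector has not yet been decided.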
  • 57. Consider a Management Information System Process data are frequently collected through a management information system (MIS) designed to record characteristics of participants, participation of participants, and characteristics of activities and services provided. An MIS is a computerized record system that enables service providers and evaluators to accumulate and display data quickly and efficiently in various ways. Will your evaluation be enhanced by periodic data presentations in tables or other structured formats? For example, should the evaluation utilize a monthly print-out of services utilized, or monitor recipient tracking (such as date, time, and
  • 58. length of service)? YES NO Does the agency create monthly (or other periodic) print-outs reflecting services rendered or clients served? YES NO Will the evaluation be conducted in a more efficient manner if program delivery staff enter data on a consistent basis? YES NO Does the agency already have hard copies of files or records that would be better utilized if computerized?
  • 59. YES NO Does the agency already have an MIS or a similar computerized database? YES NO If the answer to any of these questions is YES, consider using an MIS for your evaluation. If an MIS does not already exist, you may wish to design a database in which you can enter information from records obtained by the agency. This process decreases missing data and is generally efficient. If you do create a database that can be used on an ongoing basis by the agency, consider offering it to them for future use.
  • 60. Information to be included in your MIS Examples include: Client demographics Client contacts Client services Referrals offered Client outcomes Program activities Staff notes Jot down the important data you would like to be included in your MIS.
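As one illustration of how these categories might be organized, here is a hedged sketch of a two-table MIS in SQLite (the schema, table names, and sample rows are assumptions for the sketch, not a prescribed design):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent MIS
conn.executescript("""
CREATE TABLE clients (
    client_id INTEGER PRIMARY KEY,
    age INTEGER, sex TEXT, zip TEXT            -- client demographics
);
CREATE TABLE services (
    service_id INTEGER PRIMARY KEY,
    client_id INTEGER REFERENCES clients(client_id),
    service_date TEXT, service_type TEXT, minutes INTEGER
);
""")
# Hypothetical records for illustration only.
conn.execute("INSERT INTO clients VALUES (1, 42, 'M', '30030')")
conn.execute("INSERT INTO services VALUES (NULL, 1, '2002-07-16', 'intake', 45)")

# A periodic print-out of services rendered, as discussed above.
rows = conn.execute("""
    SELECT service_type, COUNT(*), SUM(minutes)
    FROM services GROUP BY service_type
""").fetchall()
print(rows)   # [('intake', 1, 45)]
```

Keeping clients and services in separate tables lets the same MIS answer both "who was served?" and "how much service was delivered?" without duplicating client records.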
  • 61. Managing your MIS What software do you wish to utilize to manage your data? What type of data do you have? How much information will you need to enter? How will you ultimately analyze the data? You may wish to create a database directly in the program you will eventually use, such as SPSS. Will you be utilizing laptops? If so, will you be taking them onsite and directly entering your data into them? How will you download or transfer the information, if applicable?
  • 62. What will the impact be on your audience if you have a laptop? Tips on using an MIS If service delivery personnel will be collecting and/or entering information into the MIS for the evaluator's use, it is generally a good idea to provide frequent reminders of the importance of entering the appropriate information in a timely, consistent, and regular manner. For example, if an MIS depends on patient data collected during public health officers' daily activities, the officers should be entering data on at least a daily basis. Otherwise, important data are lost, and the database will only reflect what was salient enough to be remembered and entered at the end of the week. Don't forget that this may be burdensome and/or inconvenient for the program staff; provide them with frequent thank-yous. Remember that your database is only as good as you make it. It must be organized and
  • 63. arranged so that it is most helpful in answering your questions. If you are collecting from existing records, at what level are the data currently available? For example, are they state, county, or city information? How are they defined? Consider whether adaptations or additions need to be made for your evaluation. Back up your data frequently and in at least one additional format (e.g., zip disk, server). Consider file security. Will you be saving data on a network server? You may need to consider password protection. Allocate time for data entry and checking.
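The back-up advice above can be automated rather than left to memory. A small sketch, assuming the evaluation data live in a single file (the paths and naming convention are illustrative):

```python
import shutil
from datetime import datetime
from pathlib import Path

def back_up(data_file, backup_dir="backups"):
    """Copy the data file under a time-stamped name in backup_dir."""
    Path(backup_dir).mkdir(exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_dir) / f"{stamp}-{Path(data_file).name}"
    shutil.copy2(data_file, dest)   # copy2 also preserves file timestamps
    return dest
```

Running this at the end of every data-entry session provides the "additional format" the text recommends; copying the backup directory to a server or removable disk covers a second medium.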
  • 64. Allow additional time to contemplate the meaning of the data before writing the report. Implement Data Collection and Analysis Data collection cannot be fully reviewed in this workbook, but this page offers a few tips regarding the process. General reminders: THANK everyone who helps you, directs you, or participates in any way. Obtain clear directions and give yourself plenty of time, especially if you are traveling a long distance (e.g., several hours away).
  • 65. Bring all of your own materials - do not expect the program to provide you with writing utensils, paper, a clipboard, etc. Address each person you meet with respect, and attempt to make your meeting as convenient for their schedule as possible. Most process evaluation data will come from routine record keeping (e.g., an MIS). However, you may wish to interview clients and staff. If so: Ensure that you have sufficient time to train evaluation staff, data collectors, and/or organization staff who will be collecting data. After they have been trained in the data collection materials and procedure, require that they practice the technique, whether it is an interview or entering a sample record in an MIS. If planning to use a tape recorder during interviews or focus groups, request permission from participants before beginning. You may need to turn the tape recorder off on occasion if doing so will make participants more comfortable. If planning to use laptop computers, attempt to make consistent eye contact and spend time establishing rapport before beginning. Some participants
  • 66. may be uncomfortable with technology, and you may need to provide education regarding the process of data collection and how the information will be utilized. If planning to hand-write responses, warn the participant that you may move slowly and may need to ask them to repeat themselves. Prepare for this process by developing shorthand specific to the evaluation. A sample shorthand page follows.
  • 67. Annual Evaluation Reports The ultimate aim of all the Branch's evaluation efforts is to increase the intelligent use of information in Branch decision-making in order to improve health outcomes. Because we understand that many evaluation efforts fail because the data are never collected, and that even more fail because the data are collected but never used in decision-making, we have struggled to find a way to institutionalize the use of evaluation results in Branch decision-making. These reports will serve multiple purposes: The need to complete the report will increase the likelihood that evaluation is done and data are collected. The need to review reports from lower levels in order to complete one's own report will, we hope, cause managers at all levels to consciously consider, at least once a year, the effectiveness of their activities and how evaluation results suggest that effectiveness can be improved. The summaries of evaluation findings in the reports should simplify preparation of other reports to funders, including the General Assembly.
  • 68. Each evaluation report forms the basis of the evaluation report at the next level. The contents and length of the report should be determined by what is most helpful to the manager who is receiving it. Rather than simply reporting every possible piece of data, these reports should present summary data, summarize important conclusions, and suggest recommendations based on the evaluation findings. A program-level annual evaluation report should be ten pages or less; many may be less than five pages. Population team and Branch-level annual evaluation reports may be longer than ten pages, depending on how many findings are being reported. However, reports that go beyond ten pages should also contain a shorter Executive Summary, to ensure that those with the power to make decisions actually read the findings. The initial reports especially may reflect formative work and consist primarily of updates on the progress of evaluation planning and implementation. This is fine and to be expected. However, within a year or two the reports should begin to include process data, and later actual outcome findings. This information was extracted from the FHB Evaluation Framework developed by Monica Herk and Rebekah Hudgins.
  • 69. Suggested shorthand - a sample The list below was derived for a process evaluation regarding charter schools. Note the use of general shorthand as well as shorthand derived specifically for the evaluation.
Evaluation-specific shorthand: CS = Charter School; Sch = School; Tch = Teacher, teach; P = Principal; VP = Vice Principal; Admin = Administration, administrators; DOE = Dept of Education; BOE = Board of Education; Comm = Community; Stud = Students, pupils; Kids = Students, children, teenagers; K = Kindergarten; Cl = Class; CR = Classroom; W = White; B = Black; AA = African American; SES = Socio-economic status; Lib = Library, librarian; Caf = Cafeteria; Ch = Charter; Conv = Conversion (school); S-up = Start-up school; App = Application, applied; ITBS = Iowa Test of Basic Skills; LA = Language arts; SS = Social Studies; QCC = Quality Core Curriculum; Pol = Policy, politics; Curr = Curriculum; LP = Lesson plans; Disc = Discipline; F = Father, dad; P = Parent; M = Mom, mother
General shorthand: mst = Most; b/c = Because; st = Something; b = Be; c = See; r = Are; w/ = When; @ = At; ~ = About; = = Is, equals, equivalent; ≠ = Does not equal, is not the same; Sone = Someone; # = Number; $ = Money, finances, funding, expenses; + = Add, added, in addition; < = Less than; > = Greater/more than; ??? = What does this mean? Get more info, I'm confused; DWA = Don't worry about (e.g., if you wrote something unnecessary); Ψ = Psychology, psychologist; ∴ = Therefore; ∆ = Change, is changing; mm = Movement; ↑ = Increases, up, promotes; ↓ = Decreases, down, inhibits; X = Times (e.g., many X we laugh); ÷ = Divided (we ÷ up the classrooms); C = With; ♥ = Love, adore (e.g., the kids ♥ this); (house symbol) = Home, house; (church symbol) = Church, religious activity; O = No, doesn't, not; ♀ = Girls, women, female; ♂ = Boys, men, male; 1/2 = Half (e.g., we took 1/2); 2 = To; 2B = To be; c/out = without; e.g. = For example; i.e. = That is; … = If the person trails off, you missed information
Appendix A Logic Model Worksheet Population Team/Program Name __________________________ Date _______________________ If the following CONDITIONS AND ASSUMPTIONS
  • 73. exist... And if the following ACTIVITIES are implemented to address these conditions and assumptions... Then these SHORT-TERM OUTCOMES may be achieved... And these LONG-TERM OUTCOMES may be achieved... And these LONG-TERM GOALS can be reached....
  • 75. Appendix B Pitfalls To Avoid Avoid heightening expectations of delivery staff, program recipients, policy makers, or community members. Make clear that feedback will be provided as appropriate, but may or may not be acted upon. Avoid any implication that you are evaluating impact or outcome. Stress that you are evaluating "what is happening," not how well any one person is performing or what the outcomes of the intervention are. Make sure that the right information gets to the right people - your findings are most likely to be utilized in a constructive and effective manner if you ensure that your final report does not end up on the desk of someone with little motivation or interest in using it.
  • 76. Ensure that data collection and entry are managed on a consistent basis - avoid developing an evaluation design and then having the contract lapse because staff did not enter the data. Appendix C References References used in the completion of this workbook and/or that you may find helpful for additional information: Centers for Disease Control and Prevention. 1995. Evaluating Community Efforts to Prevent Cardiovascular Diseases. Atlanta, GA. Centers for Disease Control and Prevention. 2001. Introduction to Program Evaluation for Comprehensive Tobacco Control Programs. Atlanta, GA.
  • 77. Freeman, H. E., Rossi, P. H., Sandefur, G. D. 1993. Workbook for evaluation: A systematic approach. Sage Publications: Newbury Park, CA. Georgia Policy Council for Children and Families; The Family Connection; Metis Associates, Inc. 1997. Pathways for assessing change: Strategies for community partners. Grembowski, D. 2001. The practice of health program evaluation. Sage Publications: Thousand Oaks. Hawkins, J. D., Nederhood, B. 1987. Handbook for Evaluating Drug and Alcohol Prevention Programs. U.S. Department of Health and Human Services; Public Health Service; Alcohol, Drug Abuse, and Mental Health Administration: Washington, D. C. Muraskin, L. D. 1993. Understanding evaluation: The way to better prevention programs. Westat, Inc. National Community AIDS Partnership 1993. Evaluating HIV/AIDS Prevention Programs in Community-based Organizations. Washington, D.C. NIMH Overview of Needs Assessment. Chapter 3: Selecting the needs assessment approach. Patton, M. Q. 1982. Practical Evaluation. Sage Publications, Inc.: Beverly Hills, CA.
  • 78. Page 37 Evaluation Expert Session July 16, 2002 Posavac, E. J., Carey, R. G. 1980. Program Evaluation: Methods and Case Studies. Prentice-Hall, Inc.: Englewood Cliffs, N.J. Rossi, P. H., Freeman, H. E., Lipsey, M. W. 1999. Evaluation: A Systematic Approach. (6th edition). Sage Publications, Inc.: Thousand Oaks, CA. Scheirer, M. A. 1994. Designing and using process evaluation. In: J. S. Wholey, H. P. Hatry, & K. E. Newcomer (eds) Handbook of practical program evaluation. Jossey-Bass Publishers: San Francisco. Taylor-Powell, E., Rossing, B., Geran, J. 1998. Evaluating Collaboratives: Reaching the potential. Program Development and Evaluation: Madison, WI. U.S. Department of Health and Human Services; Administration for Children and Families; Office of Community Services. 1994. Evaluation Guidebook: Demonstration partnership
  • 79. program projects. W.K. Kellogg Foundation. 1998. W. K. Kellogg Foundation Evaluation Handbook. Websites: www.cdc.gov/eval/resources www.eval.org (has online text books) www.wmich.edu/evalctr (has online checklists) www.preventiondss.org When conducting literature reviews or searching for additional information, consider using alternative names for "process evaluation," including: formative evaluation program fidelity implementation assessment implementation evaluation program monitoring