Social Work Research: Planning a Program Evaluation
Joan is a social worker who is currently enrolled in a social work PhD program. She is planning to conduct her dissertation research project with a large nonprofit child welfare organization where she has worked as a site coordinator for many years. She has already approached the agency director with her interest, and the leadership team of the agency stated that they would like to collaborate on the research project.
The child welfare organization at the center of the planned study has seven regional centers that operate fairly independently. The primary focus of work is on foster care; that is, recruiting and training foster parents and running a regular foster care program with an emphasis on family foster care. The agency has a residential program as well, but it will not participate in the study. Each of the regional centers services about 45–50 foster parents and approximately 100 foster children. On average, five to six new foster families are recruited at each center on a quarterly basis. This number has been consistent over the past 2 years.
Recently it was decided that a new training program for incoming foster parents would be used by the organization. The primary goals of this new training program include reducing foster placement disruptions, improving the quality of services delivered, and increasing child well-being through better trained and skilled foster families. Each of the regional centers will participate and implement the new training program. Three of the sites will start the program immediately, while the other four centers will not start until 12 months from now. The new training program consists of six separate 3-hour training sessions that are typically conducted in a biweekly format. It is a fairly proceduralized training program; that is, a very detailed set of manuals and training materials exists. All trainings will be conducted by the same two instructors. The current training program that it will replace differs considerably in its focus, but it also uses a 6-week, 3-hour format. It will be used by those sites not immediately participating until the new program is implemented.
Joan has done a thorough review of the foster care literature and has found that there has been no research on the training program to date, even though it is being used by a growing number of agencies. She also found that there are some standardized instruments that she could use for her study. In addition, she would need to create a set of Likert-type scales for the study. She will be able to use a group design because all seven regional centers are interested in participating and they are starting the training at different times.
(Plummer 66-67)
Plummer, Sara-Beth, Sara Makris, Sally Brocksen. Social Work Case Studies: Concentration Year. Laureate Publishing, 10/21/13. VitalBook file.
The citation provided is a guideline. Please check each citation for accuracy before use.
Contents of an Evaluation Plan
Develop an evaluation plan to ensure your program evaluations
are carried out efficiently in the future. Note that bankers or
funders may want or benefit from a copy of this plan.
Ensure your evaluation plan is documented so you can regularly
and efficiently carry out your evaluation activities. Record
enough information in the plan so that someone outside of the
organization can understand what you're evaluating and how.
Consider the following format for your report:
1. Title Page (name of the organization being evaluated, or whose
product/service/program is being evaluated; date)
2. Table of Contents
3. Executive Summary (one-page, concise overview of findings
and recommendations)
4. Purpose of the Report (what type of evaluation(s) was
conducted, what decisions are being aided by the findings of the
evaluation, who is making the decision, etc.)
5. Background About Organization and
Product/Service/Program that is being evaluated
a) Organization Description/History
b) Product/Service/Program Description (that is being
evaluated)
i) Problem Statement (in the case of nonprofits, description of
the community need that is being met by the
product/service/program)
ii) Overall Goal(s) of Product/Service/Program
iii) Outcomes (or client/customer impacts) and Performance
Measures (that can be measured as indicators toward the
outcomes)
iv) Activities/Technologies of the Product/Service/Program
(general description of how the product/service/program is
developed and delivered)
v) Staffing (description of the number of personnel and roles in
the organization that are relevant to developing and delivering
the product/service/program)
6) Overall Evaluation Goals (e.g., what questions are being
answered by the evaluation)
7) Methodology
a) Types of data/information that were collected
b) How data/information were collected (what instruments were
used, etc.)
c) How data/information were analyzed
d) Limitations of the evaluation (e.g., cautions about
findings/conclusions and how to use the findings/conclusions,
etc.)
8) Interpretations and Conclusions (from analysis of the
data/information)
9) Recommendations (regarding the decisions that must be made
about the product/service/program)
Appendices: content of the appendices depends on the goals of
the evaluation report, e.g.:
a) Instruments used to collect data/information
b) Data, e.g., in tabular format, etc.
c) Testimonials, comments made by users of the
product/service/program
d) Case studies of users of the product/service/program
e) Any related literature
Pitfalls to Avoid
1. Don't balk at evaluation because it seems far too "scientific."
It's not. Usually the first 20% of effort will generate the first
80% of the plan, and this is far better than nothing.
2. There is no "perfect" evaluation design. Don't worry about
the plan being perfect. It's far more important to do something,
than to wait until every last detail has been tested.
3. Work hard to include some interviews in your evaluation
methods. Questionnaires don't capture "the story," and the story
is usually the most powerful depiction of the benefits of your
services.
4. Don't interview just the successes. You'll learn a great deal
about the program by understanding its failures, dropouts, etc.
5. Don't throw away evaluation results once a report has been
generated. Results don't take up much room, and they can
provide precious information later when trying to understand
changes in the program.
McNamara, C. (2006a). Contents of an evaluation plan. In Basic guide to program evaluation (including outcomes evaluation). http://managementhelp.org/evaluation/program-evaluation-guide.htm#anchor1586742
McNamara, C. (2006b). Reasons for priority on implementing outcomes-based evaluation. In Basic guide to outcomes-based evaluation for nonprofit organizations with very limited resources. http://managementhelp.org/evaluation/outcomes-evaluation-guide.htm#anchor30249
JACQUELINE FULTON
[email protected] | 404-409-5067
▪ The top 1/3 of your resume is critical. Begin with a roughly
five-sentence Career Summary that explains why you’re the perfect
fit for the company and position. Start by highlighting your
years of experience and the industries you have worked in.
▪ Note your biggest strengths and the areas where you excel.
▪ Don’t forget to list hard skills, areas of expertise and your
knowledge of specific functions highlighted in the job posting.
▪ Share your Unique Value Proposition – what sets you apart
from other candidates.
▪ You can conclude with what you’re like as a colleague, the
work environments you thrive in and/or your passions.
AREAS OF EXPERTISE
Legal Compliance | Supply Chain Risk Management | Public Relations | Cybersecurity | Cyber Investigations | Physical Cybersecurity Monitoring
TECHNOLOGY PROFILE
Microsoft Office Suite | ATIS Recording | Verint CCTV Cameras | Oral and Written Communication
WORK EXPERIENCE
Life Insurance Agent | The Johnson Agency |
12.2015 – 09.2016
• Participated in meetings and conducted briefs to raise
awareness of new services and programs
• Protected users’ Personally Identifiable Information (PII)
within the database
• Monitored and updated users’ policies based on client
requirements
• Developed marketing strategies to compete with other insurance
companies
• Analyzed insurance programs to recommend best practices
Access Control Security Technology Specialist | Howard
University | 10.2012 – 07.2015
• Responded promptly to customer service-related issues
• Participated in meetings to raise awareness of cybersecurity
updates
• Managed various technology programs within the University
• Created security protocols for emerging technologies with the
Department of Public Safety
• Monitored CCTV within the University
• Assigned access controls to secure the University CCTV
system
• Served as administrator for ATIS Recording and WALES/NCIC
Technical Support Specialist | G4 Secure Solution |
08.2010 – 08.2011
• Developed technical documentation and reports to deter criminal
activity and misconduct
• Served as liaison between various departments to assign risk
assessments to hazardous conditions
• Monitored CCTV and access controls for information systems
• Provided timely and accurate responses to customer-related
issues
Medical Assistant / Executive Assistant | Dr. Vigila Harris,
F.A.A.P. | 09.2004 – 08.2010
EDUCATION & CIVIC ENGAGEMENT
Master of Science in Cybersecurity, May 2014 – University of
Maryland University College, Adelphi, MD
Bachelor of Science in Criminal Justice, May 2010 –
Westwood College, Annandale, VA