The Rough Draft
Nicole Jensen
Grantham University
Introduction
Service learning is an experiential form of education in which learning occurs through a structured progression of action and reflection (Bringle, 1999). Students work as a team, applying what they learn to issues affecting the community while, at the same time, reflecting on their experience during the educational process, for the benefit of their community and their own well-being. Community partnerships, in turn, refer to a formal arrangement that links a school with individuals, associations, and private or public sector organizations to provide a specific program, service, or resource, with the goal of enabling students to accomplish more.
When communities partner in service learning, they reap some noticeable benefits. Discipline improves among students because specific standards guide them, and communities likewise develop discipline in their interactions with students and other residents. It is through service activities that real community needs are met and issues affecting the community are addressed. Students left idle, with no constructive outlet for their energy, may end up doing things that are not considered morally upright.
Nonetheless, several challenges are likely to affect students during their service learning in the community. Such issues may range from a hostile culture to dissatisfaction with the work and time constraints.
The World Organization of the Scout Movement has been striving to build a better world through various strategies, such as encouraging and empowering young people to join it and introducing scout programs into communities.
Community organization
Baden-Powell is credited with the founding and development of the World Organization of the Scout Movement (WOSM). In 1907, he brought together twenty boys from different communities in an experimental camp held at Brownsea Island, England. A year later, an office for Boy Scouts was opened in London. The World Scout Bureau was later established in 1920, at the time of the first World Scout Conference.
The organization strives to create a better world by introducing scout programs in communities and encouraging and empowering young people to join them. It focuses on involvement, recognition, and intergenerational exchange to engage youth and allow them to achieve their full potential. Its mission is "to contribute to the education of young people, through a value system based on the Scout Promise and Law, to help build a better world where people are self-fulfilled as individuals and play a constructive role in society." This means the organization provides educational programs that improve the lives of young people as well as their communities.
The organization also has a vision of promoting scouting so as to make it the leading educational youth movement across the globe and to enroll more than 100 million young people by 2023. It addresses the needs of communities by providing service learning to the youth and students who engage with community members.
Community Partnership
The World Scout Association has partnered with various institutions and organizations. Some of its partners include the United Nations Association, international boy scout groups such as Troop 1, the boy scouts of the United Nations, and numerous universities worldwide. These partnerships are valuable in ensuring that the organization's roles are publicized and carried out in various ways, that the problems affecting a group of people or the whole community are resolved with ease, and that most people gain knowledge of the organization and its partners and understand their roles. The organization should also partner with bodies such as the W.H.O. to ensure that food security is achieved even in hunger-stricken areas.
Cross-cultural challenges
The most common challenge is language: people from different communities have different languages and cultures, which can impede communication, slowing the organization's projects and sometimes leaving them unachieved. Another issue is difference of opinion, since members of the organization view various matters in different ways and an incoming group may bring yet another way of thinking about alternative points of view (Serc.carleton.edu, 2018).
Humanitarian considerations
Today, American youth face various challenges that hinder their path to success, including drugs, materialism, and disparities in education. Young people preoccupied with these matters lose sight of their goals in life. To eradicate this problem from the community, programs and events are created that bring people from different communities together; here the organization engages in unity work, binding different people together as one and enabling them to cooperate and combine their minds to solve problems in their communities.
Volunteers
The organization uses volunteers to accomplish most of its tasks. For instance, members of the scouting movement help ensure that the environment is kept clean and that young people who have engaged in immoral conduct are rehabilitated and taught the value of remaining morally upright. The organization's partners may also volunteer to provide aid and resources where required.
When volunteers give their time to the organization, they benefit both themselves and the community. The volunteers can devote their time to useful matters rather than merely remaining idle, while the community benefits from having its problems addressed and solved (Astin, 2000). Volunteering involves freely offering to do something, whereas service learning provides education that combines action and reflection.
Roadblocks
The main risk the organization faces is fully funding the entire initiative. Since different students hold different perspectives, aligning them with the organization's procedures will require further training; hence, in many instances, the financial aspects of organizational management will be the primary obligation. The organization has had no issues in the past that reflect poorly on the university. Students face the risk of time constraints alongside significant commitments to other activities, and they also face the challenge of integrating the experience with their course content.
Future Vision
The organization envisions advancing scouting with the goal of making it the leading youth movement as far as education is concerned. It aims to enroll more than one hundred million young people across the globe by the year 2023, and it addresses communities' needs through the provision of service learning for the students and young people who engage community members.
Challenges
As the World Organization of the Scout Movement engages university students in service learning, it may face numerous challenges. A few of these are bound to affect the students themselves: a hostile culture, dissatisfaction with the work, and time constraints. Given a heavy load of schoolwork, students may not have sufficient energy left to appreciate service learning (Vizenor, Souza, & Ertmer, 2017).
Improvements
The organization should partner with bodies such as the W.H.O. to ensure that food security is achieved even in hunger-stricken areas. Such affiliations are significant in ensuring that the vast majority of people receive information about the organization and its partners and understand their roles. The organization should also consider recruiting more volunteers to help achieve its objectives; members of the scouting movement can be used to ensure that the environment is kept clean and that young people are educated.
Conclusions
The World Scout Organization gives young people many opportunities to take part in scouting programs, ensuring that particular community needs are met in an organized way. Student-community partnerships make a substantial positive impact on community members.
Partnering with other organizations is highly effective for running an organization, and the benefits generally outweigh the challenges, particularly in relation to service learning. The partners use volunteers to build capacity for positive change, to strengthen and expand services and programs, and finally to build relationships with other organizational offices.
Because of language issues, the integration of course content will remain a pressing problem for the organization in its efforts to support youth development. Moreover, language barriers may lead to disagreement in cooperative projects, which consequently may not work out or fulfill their potential.
References
Astin, A. W., Vogelgesang, L. J., Ikeda, E. K., & Yee, J. A. (2000). How service learning affects students. Higher Education Research Institute, UCLA.
Bringle, R. G., & Hatcher, J. A. (1999). Reflection in service learning: Making meaning of experience. Educational Horizons, 77(4), 179-185.
Haddad, M. A. (2007). Politics and volunteering in Japan: A global perspective. Cambridge University Press.
Johnson, R., & Clifton, J. (2010). Younger generations less likely to join Boy Scouts. Retrieved from http://www.gallup.com/poll/145187/Younger-Generations-Less-Likely-Join-Boy-Scouts.aspx
Lonegren, L. (2011). Scouting for sustainable development: A study of young people's environmental views within the scout movement.
Polson, E., Kim, Y., Jang, S., Johnson, B., & Smith, B. (2013). Being prepared and staying connected: Scouting's influence on social capital and community involvement. Social Science Quarterly, 94(3), 758-776.
Scheidlinger, S. (1948). A comparative study of the boy scout movement in different national and social groups. American Sociological Review, 13(6), 739-750.
Serc.carleton.edu. (2018). Challenges of service-learning. Retrieved from https://serc.carleton.edu/introgeo/service/challenges.html
Pathfinder International Tool Series
Monitoring and Evaluation – 2

CONDUCTING IN-DEPTH INTERVIEWS:
A Guide for Designing and Conducting In-Depth Interviews for Evaluation Input

By
Carolyn Boyce, MA, Evaluation Associate
Palena Neale, PhD, Senior Evaluation Associate

May 2006
Acknowledgements
The authors would like to thank the following Pathfinder employees and partners for their technical inputs into this document: Anne Palmer (Futures Group International), Ugo Daniels (African Youth Alliance (AYA)), Veronique Dupont (Pathfinder/Extending Service Delivery (ESD)), Cathy Solter, Lauren Dunnington, and Shannon Pryor (Pathfinder headquarters). Jenny Wilder and Mary Burket are also thanked for their inputs and assistance in editing and producing this document.
What is an In-Depth Interview?
In-depth interviewing is a qualitative research technique that involves conducting intensive individual interviews with a small number of respondents to explore their perspectives on a particular idea, program, or situation. For example, we might ask participants, staff, and others associated with a program about their experiences and expectations related to the program, the thoughts they have concerning program operations, processes, and outcomes, and about any changes they perceive in themselves as a result of their involvement in the program.
When are In-Depth Interviews Appropriate?
In-depth interviews are useful when you want detailed information about a person's thoughts and behaviors or want to explore new issues in depth. Interviews are often used to provide context to other data (such as outcome data), offering a more complete picture of what happened in the program and why. For example, you may have measured an increase in youth visits to a clinic, and through in-depth interviews you find out that a youth noted that she went to the clinic because she saw a new sign outside of the clinic advertising youth hours. You might also interview a clinic staff member to find out their perspective on the clinic's "youth friendliness."

In-depth interviews should be used in place of focus groups if the potential participants may not be included or comfortable talking openly in a group, or when you want to distinguish individual (as opposed to group) opinions about the program. They are often used to refine questions for future surveys of a particular group.
What are the Advantages and Limitations of In-Depth Interviews?
The primary advantage of in-depth interviews is that they provide much more detailed information than what is available through other data collection methods, such as surveys. They also may provide a more relaxed atmosphere in which to collect information; people may feel more comfortable having a conversation with you about their program as opposed to filling out a survey. However, there are a few limitations and pitfalls, each of which is described below.
Prone to bias: Because program or clinic staff might want to "prove" that a program is working, their interview responses might be biased. Responses from community members and program participants could also be biased due to their stake in the program or for a number of other reasons. Every effort should be made to design a data collection effort, create instruments, and conduct interviews to allow for minimal bias.
Can be time-intensive: Interviews can be a time-intensive evaluation activity because of the time it takes to conduct interviews, transcribe them, and analyze the results. In planning your data collection effort, care must be taken to include time for transcription and analysis of this detailed data.
Interviewer must be appropriately trained in interviewing techniques: To elicit the most detailed and rich data from an interviewee, the interviewer must make that person comfortable and appear interested in what they are saying. They must also be sure to use effective interview techniques, such as avoiding yes/no and leading questions, using appropriate body language, and keeping their personal opinions in check.
Not generalizable: When in-depth interviews are conducted, generalizations about the results usually cannot be made, because small samples are chosen and random sampling methods are not used. In-depth interviews, however, provide valuable information for programs, particularly when supplementing other methods of data collection. It should be noted that the general rule on sample size for interviews is that when the same stories, themes, issues, and topics are emerging from the interviewees, a sufficient sample size has been reached.
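The sample-size rule above lends itself to a simple check: after each coded interview, note whether it introduced any theme not seen before, and treat several consecutive no-new-theme interviews as a sign of saturation. The sketch below is illustrative only; the theme labels and the three-interview window are assumptions, not Pathfinder guidance.

```python
def reached_saturation(interviews, window=3):
    """Return True once the last `window` interviews added no new themes.

    `interviews` is a list of sets of theme labels, one set per
    interview, in the order the interviews were conducted.
    """
    seen = set()
    streak = 0  # consecutive interviews contributing nothing new
    for themes in interviews:
        new_themes = themes - seen
        seen |= themes
        streak = 0 if new_themes else streak + 1
    return streak >= window

# Hypothetical coded interviews: each set holds the themes raised.
coded = [
    {"access", "cost"},
    {"cost", "staff attitude"},
    {"access"},
    {"cost", "staff attitude"},
    {"access", "cost"},
]
print(reached_saturation(coded))  # True: the last three add nothing new
```

In practice the judgment is qualitative, of course; the point of the sketch is only that "the same stories keep emerging" can be tracked explicitly while interviews are still being scheduled.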
What is the Process for Conducting In-Depth Interviews?
The process for conducting in-depth interviews follows the same general process as is followed for other research: plan, develop instruments, train data collectors, collect data, analyze data, and disseminate findings. More detailed steps are given below.
1. Plan
• Identify stakeholders who will be involved.
• Identify what information is needed and from whom. (See "What are Potential Sources of Information?")
• List stakeholders to be interviewed. Identify stakeholder groups from national, facility, and beneficiary levels and then identify individuals within those groups; additional interviewees may be identified during data collection. Determine sample if necessary.
• Ensure research will follow international and national ethical research standards, including review by ethical research committees. For more information, please see the International Ethical Guidelines for Biomedical Research Involving Human Subjects, available at http://www.cioms.ch/frame_guidelines_nov_2002.htm.
2. Develop Instruments
• Develop an interview protocol: the rules that guide the administration and implementation of the interviews. Put simply, these are the instructions that are followed for each interview, to ensure consistency between interviews and thus increase the reliability of the findings. The following instructions for the interviewer should be included in the protocol:
•• What to say to interviewees when setting up the interview;
•• What to say to interviewees when beginning the interview, including ensuring informed consent and confidentiality of the interviewee (see Appendix 1 for an example);
•• What to say to interviewees in concluding the interview;
•• What to do during the interview (Example: Take notes? Audiotape? Both?); and
•• What to do following the interview (Example: Fill in notes? Check audiotape for clarity? Summarize key information for each? Submit written findings?).
• Develop an interview guide that lists the questions or issues to be explored during the interview and includes an informed consent form. There should be no more than 15 main questions to guide the interview, and probes should be included where helpful (see "Interview Question Tips"). An example is provided in Appendix 1. Please note that you will likely need interview guides for each group of stakeholders, as questions may differ.
• Where necessary, translate guides into local languages and test the translation.
Interview Question Tips
• Questions should be open-ended rather than closed-ended. For example, instead of asking "Do you know about the clinic's services?" ask "Please describe the clinic's services."
• You should ask factual questions before opinion questions. For example, ask, "What activities were conducted?" before asking, "What did you think of the activities?"
• Use probes as needed. These include:
•• Would you give me an example?
•• Can you elaborate on that idea?
•• Would you explain that further?
•• I'm not sure I understand what you're saying.
•• Is there anything else?
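One way to keep interviews consistent with the tips above is to store the guide as plain data and check it mechanically before fieldwork. The questions, probe lists, and the crude closed-ended heuristic below are hypothetical illustrations, not part of this toolkit:

```python
# A hypothetical interview guide stored as data: main questions paired
# with optional probes. Factual questions are listed before opinion ones.
GUIDE = [
    {"question": "Please describe the clinic's services.",
     "probes": ["Would you give me an example?", "Is there anything else?"]},
    {"question": "What activities were conducted?",
     "probes": ["Can you elaborate on that idea?"]},
    {"question": "What did you think of the activities?",
     "probes": ["Would you explain that further?"]},
]

MAX_MAIN_QUESTIONS = 15  # rule of thumb from the guidance above

def check_guide(guide):
    """Apply two rules from the tips box: at most 15 main questions,
    and no question phrased in an obviously closed-ended way."""
    assert len(guide) <= MAX_MAIN_QUESTIONS, "too many main questions"
    # A closed-ended question typically opens with an auxiliary verb.
    closed_openers = {"do", "did", "is", "are", "was", "were"}
    for item in guide:
        first_word = item["question"].split()[0].lower()
        assert first_word not in closed_openers, item["question"]

check_guide(GUIDE)  # passes silently for this guide
```

Keeping the guide as data also makes it easy to produce per-stakeholder variants, as the bullet above recommends, without retyping the shared questions.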
3. Train Data Collectors
• Identify and train interviewers (see "Training Tips for Data Collectors"1). Where necessary, use interviewers who speak the local language.
4. Collect Data
• Set up interviews with stakeholders (be sure to explain the purpose of the interview, why the stakeholder has been chosen, and the expected duration of the interview).
• Seek informed consent of the interviewee (written or documented oral). Re-explain the purpose of the interview, why the stakeholder has been chosen, the expected duration of the interview, whether and how the information will be kept confidential, and the use of a note taker and/or tape recorder.
• If the interviewee has consented, conduct the interview.
• Summarize key data immediately following the interview.
• Verify information given in interviews as necessary. For example, if an interviewee says that a clinic has a policy of not providing services to anyone under 16, you should verify that information on your own with the clinic.
Training Tips for Data Collectors
Staff, youth program participants, or professional interviewers may be involved in data collection. Regardless of what experience data collectors have, training should include:
• An introduction to the evaluation objectives,
• A review of data collection techniques,
• A thorough review of the data collection items and instruments,
• Practice in the use of the instruments,
• Skill-building exercises on interviewing and interpersonal communication, and
• Discussion of ethical issues.
1 Adamchak, S., et al. (2000). A Guide To Monitoring and Evaluating Adolescent Reproductive Health Programs. Available at http://www.pathfind.org/site/PageServer?pagename=Publications_FOCUS_Guides_and_Tools
5. Analyze Data
• Transcribe and/or review data.
• Analyze all interview data (see "Tips on Analyzing Interview Responses"2).

6. Disseminate Findings
• Write report (see "How are In-Depth Interviews Presented?").
• Solicit feedback from interviewees and program stakeholders.
• Revise.
• Disseminate to interviewees, program stakeholders, funders, and the community as appropriate.
What are Potential Sources of Information?
In-depth interviews typically rely on multiple sources of information to provide as complete a picture as possible. Information sources could include:
• Policy Makers
• Project Staff
• Clinic Staff
• Program Participants/Clients
• Community Members

When choosing interviewees, one should consider a sample that best represents the diverse stakeholders and opinions of those stakeholders. The general rule about interviewing is that you will know when you have done enough when you hear the same information from a number of stakeholders.
Tips on Analyzing Interview Responses
• Read through the interview responses and look for patterns or themes among the participants.
• If you get a variety of themes, see if you can group them in any meaningful way, such as by type of participant. You may, for example, find that younger participants tend to think and feel differently from older ones or that men and women respond differently.
• You can also identify the responses that seem to have been given with enthusiasm, as opposed to those that the participants answered in only a few words.
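The grouping step suggested above can be sketched as a simple tally of which themes each type of participant raises, so that group differences stand out. The participant groups and theme labels below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical coded responses: (participant group, theme mentioned).
responses = [
    ("youth", "cost"), ("youth", "staff attitude"), ("youth", "cost"),
    ("staff", "training"), ("staff", "cost"),
    ("community", "awareness"), ("community", "cost"),
]

def themes_by_group(coded):
    """Tally how often each participant group raised each theme."""
    counts = defaultdict(lambda: defaultdict(int))
    for group, theme in coded:
        counts[group][theme] += 1
    # Convert nested defaultdicts to plain dicts for readability.
    return {group: dict(themes) for group, themes in counts.items()}

for group, themes in themes_by_group(responses).items():
    print(group, themes)
```

A cross-tabulation like this does not replace reading the transcripts; it only makes it easier to spot, say, that cost comes up in every group while training is raised only by staff.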
2 University of California San Francisco's Center for AIDS Prevention Studies. (1998). Good Questions, Better Answers. California Department of Health Services and Northern California Grantmakers AIDS Task Force. Available at http://goodquestions.ucsf.edu
How are In-Depth Interviews Presented?
In-depth interviews are flexible in that they can be presented in a number of ways; there is no specific format to follow. However, like all evaluation results, the justification and methodology of the study should be provided, as well as any supporting information (i.e., copies of instruments and guides used in the study). In-depth interview data may stand alone or be included in a larger evaluation report. If presented as a stand-alone report, the following outline is suggested:

1. Introduction and Justification
2. Methodology
a. How was the process carried out? (Describe the process of selecting the interviewees and conducting the interviews.)
b. What assumptions are there (if any)?
c. Are there any limitations with this method?
d. What instruments were used to collect data? (You may want to include some or all in the appendix.)
e. What sample(s) is/are being used?
f. Over which period of time was this data collected?
3. Results
a. What are the key findings?
b. What were the strengths and limitations of the information?
c. Where and how are the results similar and dissimilar to other findings (if other studies have been done)?
4. Conclusion and Recommendations
5. Appendices (including the interview guide(s))
In presenting results of in-depth interviews, you need to use care in presenting the data and use qualitative descriptors rather than try to "quantify" the information. You might consider using qualifiers such as "the prevalent feeling was that . . .," or "several participants strongly felt that . . .," or even "most participants agreed that . . ." Numbers and percentages sometimes convey the impression that results can be projected to a population, and this is not within the capabilities of this qualitative research procedure.
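The advice to prefer qualitative descriptors over percentages can be sketched as a small mapping from counts to hedged phrases; the cut-off points and wording below are arbitrary assumptions for illustration, not a standard:

```python
def descriptor(mentions, total):
    """Map a count of interviewees to a hedged qualitative phrase.

    The thresholds are arbitrary illustrations, not a standard.
    """
    share = mentions / total
    if share >= 0.75:
        return "most participants"
    if share >= 0.5:
        return "many participants"
    if share >= 0.25:
        return "several participants"
    return "a few participants"

# 9 of 12 interviewees mentioned long waiting times.
print(descriptor(9, 12), "felt that waiting times were too long")
```

The exact counts can stay in the analyst's notes; only the qualitative phrasing reaches the report, which avoids implying that the sample projects to a population.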
Providing quotes from respondents throughout the report adds credibility to the information. Do be careful that you don't identify the respondent or provide quotes that are easily traced back to an individual, especially if you have promised confidentiality. For example, if you have interviewed only one youth as part of your sample, and in the report you note that, "one respondent described the program as having no impact on accessibility for youth because the services are 'way too expensive for someone my age,'" it would be clear to the reader that the quote was from the youth. Ensure that you have a good sample of interviewees and/or ask permission from the interviewee before including quotes such as these.

Data can be displayed in tables, boxes, and figures to make it easier to read. For example, if you have a number of quotes that you want to highlight, you might want to display them in a box like the one below. You could also highlight recommendations made by your key stakeholders in a table such as this.
Key Stakeholder Recommendations for Improving ASRH
1. Train more outreach peers so that they can reach more youth outside the clinics.
2. Provide more assistance in implementing action plans for clinic improvements.
3. Community mobilization efforts are needed to enhance future work.
Examples of youth-friendly staff interactions
“She taught me a lot and made funny jokes.” — female, age 16
“He said to feel at home with a big smile.” — male, age 14
“They greeted me with a smile and showed me where to go.” —
female, age 17
Where Can More Information on In-Depth Interviews be Found?
Adamchak, S., et al. (2000). A Guide To Monitoring and Evaluating Adolescent Reproductive Health Programs. Available at http://www.pathfind.org/site/PageServer?pagename=Publications_FOCUS_Guides_and_Tools
Patton, Michael Q. (2002). Qualitative Research & Evaluation Methods. Thousand Oaks: Sage Publications.
Prairie Research Associates, Inc. (2001). The In-Depth Interview. Prairie Research Associates, Inc. (TechNotes). Available at http://www.pra.ca/resources/indepth.pdf
United States Agency for International Development's Center for Development Information and Evaluation. (1996). Conducting Key Informant Interviews. (Performance Monitoring and Evaluation TIPS) Available at http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs541.pdf
University of California San Francisco's Center for AIDS Prevention Studies. (1998). Good Questions, Better Answers. California Department of Health Services and Northern California Grantmakers AIDS Task Force. Available at http://goodquestions.ucsf.edu
Appendix 1: Sample Key Stakeholder Interview Guide
The following is an example of an interview guide that you might use with key staff members of your program, to determine what they found to be the strengths and weaknesses of the initiative. Interview guides should contain an introduction (including informed consent), a set of questions, and closing comments, as illustrated in this example.
Introduction Key Components:
• Thank you
• Your name
• Purpose
• Confidentiality
• Duration
• How interview will be conducted
• Opportunity for questions
• Signature of consent
I want to thank you for taking the time to meet with me today.
My name is ____________________________ and I would like
to talk to you about your experiences participating in the
African
Youth Alliance (AYA) project. Specifically, as one of the
components of our overall program evaluation we are assessing
program effectiveness in order to capture lessons that can be
used
in future interventions.
The interview should take less than an hour. I will be taping the
session because I don’t want to miss any of your comments.
Although I will be taking some notes during the session, I can’t
possibly write fast enough to get it all down. Because we’re on
tape,
please be sure to speak up so that we don’t miss your comments.
All responses will be kept confidential. This means that your
interview responses will only be shared with research team
members and we will ensure that any information we include in
our report does not identify you as the respondent. Remember,
you don’t have to talk about anything you don’t want to and you
may end the interview at any time.
Are there any questions about what I have just explained?
Are you willing to participate in this interview?
__________________ __________________ __________
Interviewee Witness Date
______________________________________
Legal guardian (if interviewee is under 18)
Questions:
• No more than 15 open-ended questions
• Ask factual before opinion
• Use probes as needed

Closing Key Components:
• Additional comments
• Next steps
• Thank you
1. What YFS strategies (e.g., facility assessment and quality improvement process, other), interventions (preservice training, facility strengthening, training of facility supervisors, training of outreach staff, NTCDs, service providers, community and stakeholder mobilization, other), and tools (facility assessment tool, curricula, etc.) were used? Please list.
2. Which of these strategies, interventions and tools would you
consider to be key program elements? Please explain.
3. To what extent did participation in the AYA UNFPA,
Pathfinder,
PATH partnership advance or hinder project implementation?
Please explain.
4. What worked well? Please elaborate.
5. What would you do differently next time? Please explain
why.
6. What strategies, interventions, tools, etc., would you
recommend be sustained and/or scaled up? Please provide a
justification for your response.
7. What strategies, interventions, tools should be discontinued?
Why?
8. What were some barriers, if any, that you encountered?
Staff turnover? Lack of key support? Lack of technical
assistance?
9. How did you overcome the barrier(s)?
10. What effect, if any, do you feel the AYA project had on the
community in which you work?
Increased use of services by youth? Increased knowledge of
youth-
friendly services by clinic staff? Changes to the clinic(s) to
make
them more youth friendly?
11. What recommendations do you have for future efforts such
as these?
Is there anything more you would like to add?
I’ll be analyzing the information you and others gave me and
submitting a draft report to the organization in one month. I’ll
be
happy to send you a copy to review at that time, if you are
interested.
Thank you for your time.
Pathfinder International
9 Galen Street, Suite 217
Watertown, MA 02472
USA
Tel: 617-924-7200
Email: [email protected]
Workbook for Designing a Process Evaluation

Produced for the Georgia Department of Human Resources, Division of Public Health
By Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.
Department of Psychology, Georgia State University
July 2002
Evaluation Expert Session, July 16, 2002
What is process evaluation?
Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks "what," and outcome evaluation asks "so what?"
When conducting a process evaluation, keep in mind these three
questions:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
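Once program components are listed, the third question can even be checked mechanically by comparing what was designed against what was delivered. A minimal sketch, using hypothetical component names (not from any real program):

```python
# Minimal sketch: find gaps between program design and delivery.
# The component names below are illustrative, not from the workbook.
planned = {"intake interview", "weekly group session", "home visit", "follow-up call"}
delivered = {"intake interview", "weekly group session", "follow-up call"}

missing = planned - delivered    # designed but not delivered
unplanned = delivered - planned  # delivered but not in the design

print(sorted(missing))    # components with a delivery gap
print(sorted(unplanned))  # drift away from the original design
```

In practice the "delivered" set would come from activity logs or observation data rather than being typed in by hand.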
This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. There are many steps involved in the implementation of a process evaluation, and this workbook will attempt to direct you through some of the main stages. It will be helpful to think of a delivery service program that you can use as your example as you complete these activities.
Why is process evaluation important?
1. To determine the extent to which the program is being
implemented according to plan
2. To assess and document the degree of fidelity and variability
in
program implementation, expected or unexpected, planned or
unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the
intervention
and the outcomes
5. To provide information on what components of the
intervention
are responsible for outcomes
6. To understand the relationship between program context
(i.e.,
setting characteristics) and program processes (i.e., levels of
implementation).
7. To provide managers feedback on the quality of
implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public,
clients,
and funders
10. To improve the quality of the program, as the act of evaluating is an intervention.
Stages of Process Evaluation
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**

Also included in this workbook:
a. Logic Model Template
b. Pitfalls to avoid
c. References
Evaluation can be an exciting, challenging, and fun experience. Enjoy!
* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to
the Evaluation Framework
and Evaluation Module of FHB Best Practice Manual for more
details.
Forming collaborative relationships
A strong, collaborative relationship with program delivery staff and management will likely result in the following:
• Feedback regarding evaluation design and implementation
• Ease in conducting the evaluation due to increased cooperation
• Participation in interviews, panel discussions, meetings, etc.
• Increased utilization of findings
Seek to establish a mutually respectful relationship characterized by trust, commitment, and flexibility.
Key points in establishing a collaborative
relationship:
Start early. Introduce yourself and the evaluation team to as
many delivery staff and
management personnel as early as possible.
Emphasize that THEY are the experts, and you will be utilizing
their knowledge and
information to inform your evaluation development and
implementation.
Be respectful of their time both in-person and on the
telephone. Set up meeting places
that are geographically accessible to all parties involved in the
evaluation process.
Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible and request their feedback regarding appropriate times for on-site data collection.
Involve key policy makers, managers, and staff in a series of
meetings throughout the
evaluation process. The evaluation should be driven by the
questions that are of
greatest interest to the stakeholders. Set agendas for meetings
and provide an
overview of the goals of the meeting before beginning. Obtain
their feedback and
provide them with updates regarding the evaluation process.
You may wish to obtain structured feedback. Sample feedback forms are provided throughout the workbook.
Provide feedback regarding evaluation findings to the key
policy makers, managers,
and staff when and as appropriate. Use visual aids and
handouts. Tabulate and
summarize information. Make it as interesting as possible.
Consider establishing a resource or expert "panel" or advisory
board that is an official
group of people willing to be contacted when you need feedback
or have questions.
Determining Program Components
Program components are identified by answering the questions
who, what, when, where,
and how as they pertain to your program.
Who: the program clients/recipients and staff
What: activities, behaviors, materials
When: frequency and length of the contact or intervention
Where: the community context and physical setting
How: strategies for operating the program or intervention
BRIEF EXAMPLE:
Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students’ classroom
How: group administered intervention, small group practice
1. Instruct students what to do in case of fire (stop, drop and
roll).
2. Educate students on calling 911 and have them practice on
play telephones.
3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change batteries in a home fire alarm. Have students practice each of these activities.
4. Provide students with written information and have them take
it home to share with their
parents. Request parental signature to indicate compliance and
target a 75% return rate.
Points to keep in mind when determining program
components
Specify activities as behaviors that can be observed
If you have a logic model, use the "activities" column as a
starting point
Ensure that each component is separate and distinguishable
from others
Include all activities and materials intended for use in the
intervention
Identify the aspects of the intervention that may need to be
adapted, and those that should
always be delivered as designed.
Consult with program staff, mission statements, and program materials as needed.
Your Program Components
After you have identified your program components, create a
logic model that graphically
portrays the link between program components and outcomes
expected from these
components.
Now, write out a succinct list of the components of your
program.
WHO:
WHAT:
WHEN:
WHERE:
HOW:

Developing a Logic Model
A logic model graphically portrays what issues a program will address (conditions), how it will address them (activities), and what the expected results are (immediate and intermediate outcomes, long-term goals).
Benefits of the logic model include:
helps develop clarity about a project or program,
helps to develop consensus among people,
helps to identify gaps or redundancies in a plan,
helps to identify core hypothesis,
helps to succinctly communicate what your project or program
is about.
When do you use a logic model?
Use...
- During any work to clarify what is being done, why, and with
what intended results
- During project or program planning to make sure that the
project or program is logical and
complete
- During evaluation planning to focus the evaluation
- During project or program implementation as a template for
comparing to the actual program
and as a filter to determine whether proposed changes fit or
not.
This information was extracted from the Logic Models: A
Multi-Purpose Tool materials developed by Wellsys
Corporation for the Evaluation Planning Workshop Training.
Please see the Evaluation Planning Workshop
materials for more information. Appendix A has a sample
template of the tabular format.
Determining Evaluation Questions
As you design your process evaluation, consider what questions
you would like to answer. It is only after
your questions are specified that you can begin to develop your
methodology. Considering the importance
and purpose of each question is critical.
BROADLY....
What questions do you hope to answer? You may wish to turn
the program components that you have just identified
into questions assessing:
Was the component completed as indicated?
What were the strengths in implementation?
What were the barriers or challenges in implementation?
What were the apparent strengths and weaknesses of each step
of the intervention?
Did the recipient understand the intervention?
Were resources available to sustain project activities?
What were staff perceptions?
What were community perceptions?
What was the nature of the interaction between staff and
clients?
These are examples. Check off what is applicable to you, and
use the space below to write additional broad,
overarching questions that you wish to answer.
SPECIFICALLY ...
Now, make a list of all the specific questions you wish to
answer, and organize your questions categorically. Your
list of questions will likely be much longer than your list of
program components. This step of developing your
evaluation will inform your methodologies and instrument
choice.
Remember that you must collect information on what the
program is intended to be and what it is in reality, so you
may need to ask some questions in 2 formats.
For example:
"How many people are intended to complete this intervention per week?"
"How many actually go through the intervention during an average week?"
Consider what specific questions you have. The questions below
are only examples! Some may not be appropriate
for your evaluation, and you will most likely need to add
additional questions. Check off the questions that are
applicable to you, and add your own questions in the space
provided.
WHO (regarding client):
Who is the target audience, client, or recipient?
How many people have participated?
How many people have dropped out?
How many people have declined participation?
What are the demographic characteristics of clients?
Race
Ethnicity
National Origin
Age
Gender
Sexual Orientation
Religion
Marital Status
Employment
Income Sources
Education
Socio-Economic Status
What factors do the clients have in common?
What risk factors do clients have?
Who is eligible for participation?
How are people referred to the program? How are they screened?
How satisfied are the clients?
YOUR QUESTIONS:
WHO (regarding staff):
Who delivers the services?
How are they hired?
How supportive are staff and management of each other?
What qualifications do staff have?
How are staff trained?
How congruent are staff and recipients with one another?
What are staff demographics? (see client demographic list for
specifics.)
YOUR QUESTIONS:
WHAT:
What happens during the intervention?
What is being delivered?
What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction, etc.)?
What are the standard operating procedures?
What technologies are in use?
What types of communication techniques are implemented?
What type of organization delivers the program?
How many years has the organization existed? How many years has the program been operating?
What type of reputation does the agency have in the
community? What about the program?
What are the methods of service delivery?
How is the intervention structured?
How is confidentiality maintained?
YOUR QUESTIONS:
WHEN:
When is the intervention conducted?
How frequently is the intervention conducted?
At what intervals?
At what time of day, week, month, year?
What is the length and/or duration of each service?
YOUR QUESTIONS:
WHERE:
Where does the intervention occur?
What type of facility is used?
What is the age and condition of the facility?
In what part of town is the facility? Is it accessible to the
target audience? Does public transportation access
the facility? Is parking available?
Is child care provided on site?
YOUR QUESTIONS:
WHY:
Why are these activities or strategies implemented and why not others?
Why has the intervention varied in ability to maintain interest?
Why are clients not participating?
Why is the intervention conducted at a certain time or at a
certain frequency?
YOUR QUESTIONS:
Validating Your Evaluation Questions
Even though all of your questions may be interesting, it is
important to narrow your list to questions that
will be particularly helpful to the evaluation and that can be
answered given your specific resources, staff,
and time.
Go through each of your questions and consider it with respect
to the questions below, which may be helpful in
streamlining your final list of questions.
Revise your worksheet/list of questions until you can answer
"yes" to all of these questions. If you cannot answer
"yes" to your question, consider omitting the question from your
evaluation.
Validation checklist (mark Yes or No for each):
• Will I use the data that will stem from these questions?
• Do I know why each question is important and/or valuable?
• Is someone interested in each of these questions?
• Have I ensured that no questions are omitted that may be important to someone else?
• Is the wording of each question sufficiently clear and unambiguous?
• Do I have a hypothesis about what the "correct" answer will be for each question?
• Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response?
• Do the questions constitute a sufficient set to achieve the purpose(s) of the evaluation?
• Is it feasible to answer each question, given what I know about the resources for evaluation?
• Is each question worth the expense of answering it?
Derived from "A Design Manual" Checklist, page 51.
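The checklist lends itself to a simple screening routine: keep a candidate question only when every criterion is answered "yes." A minimal sketch, where the questions and their yes/no answers are hypothetical:

```python
# Minimal sketch: screen candidate evaluation questions against the
# validation checklist. Questions and answers below are hypothetical.
candidates = {
    "How many clients completed the program?":
        {"will_use_data": True, "feasible": True, "worth_expense": True},
    "What color are the clinic walls?":
        {"will_use_data": False, "feasible": True, "worth_expense": False},
}

# A question survives only if it passes every checklist criterion.
kept = [q for q, checks in candidates.items() if all(checks.values())]
print(kept)
```

A fuller version would carry one key per checklist item; the logic is the same.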
Determining Methodology
Process evaluation is characterized by collection of data primarily through two formats:
1) Quantitative, archival, recorded data that may be managed by a computerized tracking or management system, and
2) Qualitative data that may be obtained through a variety of formats, such as surveys or focus groups.
When considering what methods to use, it is critical to have a
thorough
understanding and knowledge of the questions you want
answered. Your
questions will inform your choice of methods. After this section
on types of
methodologies, you will complete an exercise in which you
consider what method
of data collection is most appropriate for each question.
Do you have a thorough understanding of your
questions?
Furthermore, it is essential to consider what data the
organization you are
evaluating already has. Data may exist in the form of an
existing computerized
management information system, records, or a tracking system
of some other
sort. Using this data may provide the best reflection of what is
"going on," and it
will also save you time, money, and energy because you will not
have to devise
your own data collection method! However, keep in mind that
you may have to
adapt this data to meet your own needs - you may need to add or
replace fields,
records, or variables.
What data does your organization already have?
Will you need to adapt it?
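Adapting existing data often amounts to adding or replacing fields in records the organization already keeps. A minimal sketch of what that could look like, with hypothetical field names:

```python
# Minimal sketch: adapt existing tracking records for evaluation use.
# The field names are hypothetical, not from any real system.
existing_records = [
    {"client_id": 1, "visit_date": "2002-03-01"},
    {"client_id": 2, "visit_date": "2002-03-04"},
]

# Add a field the evaluation needs that the original system lacked;
# program staff would fill it in going forward.
for record in existing_records:
    record["attended_full_session"] = None

print(existing_records[0])
```

The same idea applies whether the data live in a spreadsheet, a database, or paper forms being keyed in: reuse what exists, then extend it only where the evaluation questions require.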
If the organization does not already have existing data, consider
devising a
method for the organizational staff to collect their own data.
This process will
ultimately be helpful for them so that they can continue to self-
evaluate, track
their activities, and assess progress and change. It will be
helpful for the
evaluation process because, again, it will save you time, money,
and energy that
you can better devote towards other aspects of the evaluation.
Management
information systems will be described more fully in a later
section of this
workbook.
Do you have the capacity and resources to devise
such a system? (You may need to refer to a later
section of this workbook before answering.)
Who should collect the data?
Given all of this, what thoughts do you have on who should
collect data for your
evaluation? Program staff, evaluation staff, or some
combination?
Program Staff: May collect data from activities such as
attendance, demographics,
participation, characteristics of participants, dispositions, etc.;
may
conduct intake interviews, note changes regarding service
delivery,
and monitor program implementation.
Advantages: Cost-efficient, accessible, resourceful, available,
time-efficient,
and increased understanding of the program.
Disadvantages: May exhibit bias and/or social desirability, may
use data for critical
judgment, may compromise the validity of the program; may put
staff in uncomfortable or inappropriate position; also, if staff
collect
data, may have an increased burden and responsibility placed
upon
them outside of their usual or typical job responsibilities. If you
utilize staff for data collection, provide frequent reminders as
well
as messages of gratitude.
Evaluation staff: May collect qualitative information regarding
implementation,
general characteristics of program participants, and other
information that may otherwise be subject to bias or distortion.
Advantages: Data collected in manner consistent with overall
goals and timeline
of evaluation; prevents bias and inappropriate use of
information;
promotes overall fidelity and validity of data.
Disadvantages: May be costly and take extensive time; may
require additional
training on part of evaluator; presence of evaluator in
organization
may be intrusive, inconvenient, or burdensome.
When should data be collected?
Conducting the evaluation according to your timeline can be
challenging. Consider how
much time you have for data collection, and make decisions
regarding what to collect
and how much based on your timeline.
In many cases, outcome evaluation is not considered appropriate
until the program has
stabilized. However, when conducting a process evaluation, it
can be important to start
the evaluation at the beginning so that a story may be told
regarding how the program
was developed, information may be provided on refinements,
and program growth and
progress may be noted.
If you have the luxury of collecting data from the start of the
intervention to the end of
the intervention, space out data collection as appropriate. If you
are evaluating an
ongoing intervention that is fairly quick (e.g., an 8-week
educational group), you may
choose to evaluate one or more "cycles."
How much time do you have to conduct your evaluation?
How much time do you have for data collection (as opposed to
designing the evaluation,
training, organizing and analyzing results, and writing the
report?)
Is the program you are evaluating time specific?
How long does the program or intervention last?
At what stages do you think you will most likely collect data?
Soon after a program has begun:
Descriptive information on program characteristics that will not change; information requiring a baseline
During the intervention:
Ongoing process information such as recruitment and program implementation
After the intervention:
Demographics, attendance ratings, satisfaction ratings
Before you consider methods
A list of various methods follows this section. Before choosing
what methods are
most appropriate for your evaluation, review the following
questions. (Some may
already be answered in another section of this workbook.)
What questions do I want answered? (see previous section)
Does the organization already have existing data, and if so,
what kind?
Does the organization have staff to collect data?
What data can the organization staff collect?
Must I maintain anonymity (participant is not identified at all)
or confidentiality
(participant is identified but responses remain private)? This
consideration
pertains to existing archival data as well as original data
collection.
How much time do I have to conduct the evaluation?
How much money do I have in my budget?
How many evaluation staff do I have to manage the data
collection activities?
Can I (and/or members of my evaluation staff) travel on site?
What time of day is best for collecting data? For example, if
you plan to conduct
focus groups or interviews, remember that your population may
work during the
day and need evening times.
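When confidentiality (rather than anonymity) applies, one common precaution is to replace direct identifiers with coded IDs before data reach analysis files. A minimal sketch, with hypothetical field names; this illustrates the idea rather than any workbook-prescribed procedure:

```python
import hashlib

# Minimal sketch: replace names with stable coded IDs so analysis
# files carry no direct identifiers. Field names are hypothetical.
def code_id(name: str, salt: str = "evaluation-2002") -> str:
    """Derive a short, repeatable code from a participant name."""
    return hashlib.sha256((salt + name).encode()).hexdigest()[:8]

record = {"name": "Jane Doe", "attended": True}
coded = {"participant_id": code_id(record["name"]),
         "attended": record["attended"]}
print(coded)
```

The salt keeps codes consistent across files while making them hard to reverse; the linkage table (name to code), if kept at all, would be stored separately under restricted access.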
Types of methods
A number of different methods exist that can be used to collect
process
information. Consider each of the following, and check those
that you think would
be helpful in addressing the specific questions in your
evaluation. When "see
sample" is indicated, refer to the pages that follow this table.
Method: Description
• Activity, participation, or client tracking log: Brief record completed on site at frequent intervals by participant or deliverer. May use a form developed by the evaluator if none previously exists. Examples: sign-in log, daily records of food consumption, medication management.
• Case studies: Collection of in-depth information regarding a small number of intervention recipients; use multiple methods of data collection.
• Ethnographic analysis: Obtain in-depth information regarding the experience of the recipient by partaking in the intervention, attending meetings, and talking with delivery staff and recipients.
• Expert judgment: Convene a panel of experts or conduct individual interviews to obtain their understanding of and reaction to program delivery.
• Focus groups: Small group discussion among program delivery staff or recipients. Focus on their thoughts and opinions regarding their experiences with the intervention.
• Meeting minutes (see sample): Qualitative information regarding agendas, tasks assigned, and coordination and implementation of the intervention as recorded on a consistent basis.
• Observation (see sample): Observe actual delivery in vivo or on video; record findings using a check sheet or make qualitative observations.
• Open-ended interviews (telephone or in person): Evaluator asks open questions (i.e., who, what, when, where, why, how) of delivery staff or recipients. Use an interview protocol without preset response options.
• Questionnaire: Written survey with structured questions. May administer in individual, group, or mail format. May be anonymous or confidential.
• Record review: Obtain indicators from intervention records such as patient files, time sheets, telephone logs, registration forms, student charts, sales records, or records specific to the service delivery.
• Structured interviews (telephone or in person): Interviewer asks direct questions using an interview protocol with preset response options.
Sample activity log
This is a common process evaluation methodology because it
systematically records exactly what is happening during
implementation. You may wish to devise a log such as the one
below and alter it to meet your specific needs. Consider
computerizing such a log for efficiency. Your program may
already have existing logs that you can utilize and adapt for
your
evaluation purposes.
Site:
Recorder:
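The workbook suggests computerizing such a log for efficiency. A minimal sketch of what that could look like, appending each entry to a CSV file; the column names and example values are illustrative, not prescribed by the workbook:

```python
import csv
import os
from datetime import date

# Minimal sketch of a computerized activity log. Column names are
# illustrative; adapt them to your program's own log form.
LOG_FIELDS = ["date", "site", "recorder", "activity", "participants"]

def log_activity(path, site, recorder, activity, participants):
    """Append one activity entry; write a header row if the file is new."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "site": site,
            "recorder": recorder,
            "activity": activity,
            "participants": participants,
        })

# Example entry (values are made up).
log_activity("activity_log.csv", "Clinic A", "M. Bliss", "group session", 12)
```

Because the file is plain CSV, staff can keep entering rows in a spreadsheet while the evaluator tallies frequencies and attendance directly from the same file.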
Meeting Minutes
Taking notes at meetings may provide extensive and invaluable
process information that
can later be organized and structured into a comprehensive
report. Minutes may be taken
by program staff or by the evaluator if necessary. You may find
it helpful to use a
structured form, such as the one below that is derived from
Evaluating Collaboratives,
University of Wisconsin-Cooperative Extension, 1998.
Meeting Place: __________________ Start time: ____________
Date: __________________ End time: ____________
Sample observation log
Observation may occur in various methods, but one of the most
common is
hand-recording specific details during a small time period. The following shows several rows from an observation log utilized during an evaluation examining school classrooms.
CLASSROOM OBSERVATIONS (School Environment Scale)
Classroom 1: Grade level _________________ (Goal: 30 minutes of observation)
Time began observation: _________ Time ended observation: _________
Subjects taught during observation period: ___________________

PHYSICAL ENVIRONMENT
1. Number of students
2. Number of adults in room: a. Teachers, b. Para-pros, c. Parents (record each and the total)
3. Desks/tables: a. Number of desks, b. Number of tables for students' use, c. Any other furniture, with number (note the arrangement of desks/tables/other furniture)
4. Number of computers, type
5. How are computers being used?
6. What is the general classroom setup? (walls, windows, mirrors, carpet, rugs, cabinets, curtains, etc.)
7. Other technology (overhead projector, PowerPoint, VCR, etc.)
8. Are books and other materials accessible for students?
9. Is there adequate space for whole-class instruction?
12. What type of lighting is used?
13. Are there animals or fish in the room?
14. Is there background music playing?
15. Rate the classroom condition: Poor / Average / Excellent
16. Are rules/discipline procedures posted? If so, where?
17. Is the classroom noisy or quiet? (Very Quiet to Very Noisy)
Choosing or designing measurement instruments
Consider using a resource panel, advisory panel, or focus group
to offer feedback
regarding your instrument. This group may be composed of any
of the people listed
below. You may also wish to consult with one or more of these
individuals throughout
the development of your overall methodology.
Who should be involved in the design of your instrument(s)
and/or provide feedback?
Program service delivery staff / volunteers
Project director
Recipients of the program
Board of directors
Community leader
Collaborating organizations
Experts on the program or service being evaluated
Evaluation experts
_________________________
_________________________
_________________________
Conduct a pilot study and administer the instrument to a group
of recipients, and then
obtain feedback regarding their experience. This is a critical
component of the
development of your instruments, as it will help ensure clarity
of questions, and reduce
the degree of discomfort or burden that questions or processes
(e.g., intakes or
computerized data entry) elicit.
How can you ensure that you pilot your methods? When will
you do it, and whom will you use
as participants in the study?
Ensure that written materials are at an appropriate reading
level for the population.
Ensure that verbal information is at an appropriate terminology
level for the population.
A third- or sixth-grade reading level is often utilized.
Remember that you are probably collecting data that is
program-specific. This may
increase the difficulty in finding instruments previously
constructed to use for
questionnaires, etc. However, instruments used for conducting
process evaluations of
other programs may provide you with ideas for how to structure
your own instruments.
Linking program components and methods (an example)
Now that you have identified your program components, broad
questions, specific
questions, and possible measures, it is time to link them
together. Let's start with your
program components. Here is an example of 3 program
components of an intervention.
Program Components and Essential Elements:
There are six program components to M2M. There
are essential elements in each component that must
be present for the program to achieve its intended
results and outcomes, and for the program to be
identified as a program of the American Cancer
Society.
Possible Process Measures
1) Man to Man Self-Help and/or Support Groups
The essential elements within this component are:
• Offer information and support to all men
with prostate cancer at all points along the
cancer care continuum
• Directly, or through collaboration and referral, offer community access to prostate cancer self-help and/or support groups
• Provide recruitment and on-going training
and monitoring for M2M leaders and
volunteers
• Monitor, track and report program
activities
• Descriptions of attempts to schedule and advertise
group meetings
• Documented efforts to establish the program
• Documented local needs assessments
• # of meetings held per independent group
• Documented meetings held
• # of people who attended different topics and speakers
• Perceptions of need of survey participants for
additional groups and current satisfaction levels
• # of new and # of continuing group members
• Documented sign-up sheets for group meetings
• Documented attempts to contact program dropouts
• # of referrals to other PC groups documented
• # of times corresponding with other PC groups
• # of training sessions for new leaders
• # of continuing education sessions for experienced
leaders
• # and types of other on-going support activities for
volunteer leaders
• # of volunteers trained as group facilitators
73. • Perceptions of trained volunteers for readiness to
function as group facilitators
2) One-to-One Contacts

The essential elements within this component are:
• Offer one-to-one contact to provide information and support to all men with prostate cancer, including those in the diagnostic process
• Provide recruitment and on-going training and monitoring for M2M leaders and volunteers
• Monitor, track and report program activities

Possible process measures:
• # of contact pairings
• Frequency and duration of contact pairings
• Types of information shared during contact pairings
• # of volunteers trained
• Perception of readiness by trained volunteers
• Documented attempts for recruiting volunteers
• Documented on-going training activities for volunteers
• Documented support activities
3) Community Education and Awareness

The essential elements within this component are:
• Conduct public awareness activities to inform the public about prostate cancer and M2M
• Monitor, track and report program activities

Possible process measures:
• # of screenings provided by various health care providers/agencies over assessment period
• Documented ACS staff and volunteer efforts to publicize the availability and importance of PC and screenings, including health fairs, public service announcements, billboard advertising, etc.
• # of addresses to which newsletters are mailed
• Documented efforts to increase newsletter mailing list
Linking YOUR program components, questions, and methods
Consider each of your program components and questions that
you have devised in an earlier section of this workbook, and the
methods that you checked off on the "types of methods" table.
Now ask yourself, how will I use the information I have
obtained from this question? And, what method is most
appropriate for obtaining this information?
Program Component
Specific questions that go with this component
How will I use this information?
Best method?

Program Component
Specific questions that go with this component
How will I use this information?
Best method?
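As a planning aid, the component-question-method linkage in the worksheet above can also be held in a simple data structure. Below is a minimal sketch in Python; the component names, questions, and methods are illustrative placeholders loosely based on the M2M example, not part of the workbook:

```python
# Map each program component to its evaluation questions, the intended
# use of the answers, and the best-fit data collection method.
# All names below are illustrative placeholders.
evaluation_plan = {
    "Self-Help and/or Support Groups": {
        "questions": ["How many meetings were held per group?",
                      "How satisfied are attendees?"],
        "use": "Document program delivery and identify unmet need",
        "method": "meeting logs and participant surveys",
    },
    "One-to-One Contacts": {
        "questions": ["How many contact pairings were made?"],
        "use": "Track the reach of the one-to-one component",
        "method": "MIS contact records",
    },
}

def method_for(component):
    """Look up the chosen data collection method for a component."""
    return evaluation_plan[component]["method"]
```

Keeping the plan in one structure makes it easy to check that every question has a method before data collection begins.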
Data Collection Plan
Now let's put your data collection activities on one sheet: what you're collecting, how you're doing it, when, your sample, and who will collect it. Laying out the methods you have just picked, your instruments, and your data collection techniques in a structured manner will facilitate this process.
Method | Type of data (questions, briefly indicated) | Instrument used | When implemented | Sample | Who collects

Example:
Method: Patient interviews in health dept clinics
Type of data: Qualitative - what services they are using, length of visit, why came in, how long wait; some quantitative satisfaction ratings
Instrument used: Interview created by evaluation team and piloted with patients
When implemented: Oct-Dec; days and hrs randomly selected
Sample: 10 interviews in each clinic
Who collects: Trained interviewers
Process data is frequently collected through a management
information system (MIS) that
is designed to record characteristics of participants,
participation of participants, and
characteristics of activities and services provided. An MIS is a
computerized record
system that enables service providers and evaluators to
accumulate and display data
quickly and efficiently in various ways.
Will your evaluation be enhanced by periodic data presentations
in tables or other
structured formats? For example, should the evaluation utilize a monthly print-out of services utilized, or monitor service recipient tracking (such as date, time, and length of service)?
YES
NO
Does the agency create monthly (or other periodic) print outs
reflecting
services rendered or clients served?
YES
NO
Will the evaluation be conducted in a more efficient manner if
program
delivery staff enter data on a consistent basis?
YES
NO
Does the agency already have hard copies of files or records
that would be
better utilized if computerized?
YES
NO
Does the agency already have an MIS or a similar computerized
database?
YES
NO
If the answers to any of these questions are YES,
consider using an MIS for your evaluation.
If an MIS does not already exist, you may desire to design a
database in which you can
enter information from records obtained by the agency. This
process decreases missing
data and is generally efficient.
If you do create a database that can be used on an ongoing
basis by the agency, you may
consider offering it to them for future use.
Page 27
Evaluation Expert Session
July 16, 2002
Information to be included in your MIS
Examples include:
Client demographics
Client contacts
Client services
Referrals offered
Client outcomes
Program activities
Staff notes
Jot down the important data you would like to be included in
your MIS.
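If the agency has no MIS, even a small relational database can play this role. Below is a minimal sketch using Python's built-in sqlite3 module; the table layout, field names, and sample records are assumptions for illustration, not a prescribed design:

```python
import sqlite3

# In-memory database for illustration; use a file path in practice.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simplified tables for two of the record types listed above.
cur.execute("""CREATE TABLE clients (
    client_id INTEGER PRIMARY KEY,
    age INTEGER, sex TEXT, zip_code TEXT)""")
cur.execute("""CREATE TABLE services (
    service_id INTEGER PRIMARY KEY,
    client_id INTEGER REFERENCES clients(client_id),
    service_date TEXT, service_type TEXT, minutes INTEGER)""")

# Record one client and one service contact (invented data).
cur.execute("INSERT INTO clients VALUES (1, 62, 'M', '30303')")
cur.execute("INSERT INTO services VALUES (NULL, 1, '2002-07-16', 'support group', 90)")

# A periodic print-out: number of services rendered per month.
cur.execute("""SELECT substr(service_date, 1, 7) AS month, COUNT(*)
               FROM services GROUP BY month""")
monthly = cur.fetchall()
```

The same grouped query can back the monthly print-outs of services rendered discussed earlier.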
Managing your MIS
What software do you wish to utilize to manage your data?
What type of data do you have?
How much information will you need to enter?
How will you ultimately analyze the data? You may wish to create a database directly in the program you will eventually use, such as SPSS.
Will you be utilizing laptops?
If so, will you be taking them onsite and directly entering your
data into them?
How will you download or transfer the information, if
applicable?
What will the impact be on your audience if you have a laptop?
Tips on using an MIS
If service delivery personnel will be collecting and/or entering
information into the MIS
for the evaluator's use, it is generally a good idea to provide
frequent reminders of the
importance of entering the appropriate information in a timely,
consistent, and regular
manner.
For example, if an MIS is dependent upon patient data collected during public health officers' daily activities, the officers should be entering data on at least a daily basis. Otherwise, important data are lost and the database will only reflect what was salient enough to be remembered and entered at the end of the week.
Don't forget that this may be burdensome and/or inconvenient
for the program staff.
Provide them with frequent thank-yous.
Remember that your database is only as good as you make it.
It must be organized and
arranged so that it is most helpful in answering your questions.
If you are collecting from existing records, at what level are the data currently available?
For example, is it state, county, or city information? How is it
defined? Consider whether
adaptations need to be made or additions need to be included for
your evaluation.
Back up your data frequently and in at least one additional
format (e.g., zip, disk, server).
Consider file security. Will you be saving data on a network
server? You may need to
consider password protection.
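One simple way to keep a backup in an additional location is a date-stamped copy made from a script. A minimal sketch in Python; the file names are hypothetical:

```python
import shutil
from datetime import date

def backup_database(src="evaluation.db", dest_dir="."):
    """Copy the data file to a date-stamped backup in another location."""
    dest = f"{dest_dir}/evaluation-backup-{date.today().isoformat()}.db"
    shutil.copyfile(src, dest)
    return dest
```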
Allocate time for data entry and checking.
Allow additional time to contemplate the meaning of the data
before writing the report.
Implement Data Collection and Analysis
Data collection cannot be fully reviewed in this workbook, but
this page offers a few tips
regarding the process.
General reminders:
THANK everyone who helps you, directs you, or participates in any way.
Obtain clear directions and give yourself plenty of time, especially if you are traveling a long distance (e.g., several hours away).
Bring all of your own materials - do not expect the program to
provide you with writing
utensils, paper, a clipboard, etc.
Address each person that you meet with respect and attempt to make your meeting as compatible with their schedule as possible.
Most process evaluation will be in the form of routine record
keeping (e.g., MIS). However, you
may wish to interview clients and staff. If so:
Ensure that you have sufficient time to train evaluation staff,
data collectors, and/or
organization staff who will be collecting data. After they have
been trained in the data
collection materials and procedure, require that they practice
the technique, whether it is
an interview or entering a sample record in an MIS.
If planning to use a tape recorder during interviews or focus
groups, request permission
from participants before beginning. You may need to turn the
tape recorder off on
occasion if it will facilitate increased comfort by participants.
If planning to use laptop computers, attempt to make
consistent eye contact and spend
time establishing rapport before beginning. Some participants
may be uncomfortable with
technology and you may need to provide education regarding
the process of data
collection and how the information will be utilized.
If planning to hand write responses, warn the participant that
you may move slowly and
may need to ask them to repeat themselves. However, prepare
for this process by
developing shorthand specific to the evaluation. A sample
shorthand page follows.
Annual Evaluation Reports
The ultimate aim of all the Branch’s evaluation efforts is to
increase the intelligent use of
information in Branch decision-making in order to improve
health outcomes. Because we
understand that many evaluation efforts fail because the data are
never collected and that even
more fail because the data are collected but never used in
decision-making, we have struggled to
find a way to institutionalize the use of evaluation results in
Branch decision-making.
These reports will serve multiple purposes:
• The need to complete the report will increase the likelihood that evaluation is done and data are collected.
• The need to review reports from lower levels in order to complete one's own report hopefully will cause managers at all levels to consciously consider, at least once a year, the effectiveness of their activities and how evaluation results suggest that effectiveness can be improved.
• The summaries of evaluation findings in the reports should simplify preparation of other reports to funders, including the General Assembly.
Each evaluation report forms the basis of the evaluation report at the next level. The contents and length of the report should be determined by what is most helpful to the manager who is receiving the report. Rather than simply reporting every possible piece of data, these reports should present summary data, summarize important conclusions, and suggest recommendations based on the evaluation findings. A program-level annual evaluation report should be ten pages or less. Many may be less than five pages. Population team and Branch-level annual evaluation reports may be longer than ten pages, depending on how many findings are being reported. However, reports that go beyond ten pages should also contain a shorter Executive Summary, to ensure that those with the power to make decisions actually read the findings.
In particular, the initial reports may reflect formative work and consist primarily of updates on the progress of evaluation planning and implementation. This is fine and to be expected.
However, within a year or two the reports should begin to
include process data, and later actual
outcome findings.
This information was extracted from the FHB Evaluation
Framework developed by Monica Herk and Rebekah Hudgins.
Suggested shorthand - a sample
The list below was derived for a process evaluation regarding
charter schools. Note the use of general shorthand as
well as shorthand derived specifically for the evaluation.
Shorthand specific to the evaluation:
CS = Charter School
Sch = School
Tch = Teacher, teach
P = Principal
VP = Vice Principal
Admin = Administration, administrators
DOE = Dept of Education
BOE = Board of Education
Comm = Community
Stud = Students, pupils
Kids = Students, children, teenagers
K = Kindergarten
Cl = Class
CR = Classroom
W = White
B = Black
AA = African American
SES = Socio-economic status
Lib = Library, librarian
Caf = Cafeteria
Ch = Charter
Conv = Conversion (school)
S-up = Start up school
App = Application, applied
ITBS = Iowa Test of Basic Skills
LA = Language arts
SS = Social Studies
QCC = Quality Core Curriculum
Pol = Policy, politics
Curr = Curriculum
LP = Lesson plans
Disc = Discipline
(symbol) = Girls, women, female
(symbol) = Boys, men, male
F = Father, dad

General shorthand:
mst = Most
b/c = Because
st = Something
b = Be
c = See
r = Are
w/ = When
@ = At
~ = About
= = Is, equals, equivalent
≠ = Does not equal, is not the same
Sone = Someone
# = Number
$ = Money, finances, financial, funding, expenses, etc.
+ = Add, added, in addition
< = Less than
> = Greater/more than
??? = What does this mean? Get more info on, I'm confused…
DWA = Don't worry about (e.g. if you wrote something unnecessary)
Ψ = Psychology, psychologist
∴ = Therefore
∆ = Change, is changing
mm = Movement
↑ = Increases, up, promotes
↓ = Decreases, down, inhibits
X = Times (e.g. many x we laugh)
÷ = Divided (we ÷ up the classrooms)
C = With
(symbol) = Home, house
♥ = Love, adore (e.g. the kids ♥ this)
(symbol) = Church, religious activity
O = No, doesn't, not
1/2 = Half (e.g. we took 1/2)
2 = To
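A shorthand list like the sample above can also serve as a lookup table when typing up field notes afterward. A minimal sketch in Python, using a few entries from the list (the expansion is token by token, so multi-word shorthand would need more handling):

```python
# A few entries from the sample shorthand list, as a lookup table.
SHORTHAND = {
    "CS": "Charter School",
    "Tch": "teacher",
    "P": "principal",
    "b/c": "because",
    "~": "about",
}

def expand_notes(raw):
    """Replace shorthand tokens with their full forms, token by token."""
    return " ".join(SHORTHAND.get(tok, tok) for tok in raw.split())
```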
Appendix A
Logic Model Worksheet
Population Team/Program Name __________________________ Date _______________________
If the following CONDITIONS AND ASSUMPTIONS exist...
And if the following ACTIVITIES are implemented to address these conditions and assumptions...
Then these SHORT-TERM OUTCOMES may be achieved...
Appendix B
Pitfalls To Avoid
Avoid heightening expectations of delivery staff, program recipients, policy makers, or community members. Make clear that feedback will be provided as appropriate, but may or may not be acted upon.
Avoid any implication that you are evaluating the impact or
outcome. Stress that you are
evaluating "what is happening," not how well any one person is
performing or what the
outcomes of the intervention are.
Make sure that the right information gets to the right people. It is most likely to be utilized in a constructive and effective manner if you ensure that your final report does not end up on the desk of someone who has little motivation or interest in utilizing your findings.
Ensure that data collection and entry is managed on a consistent basis. Avoid developing an evaluation design and then having the contract lapse because staff did not enter the data.
Appendix C
References
References used for completion of this workbook and/or that
you may find helpful for
additional information.
Centers for Disease Control and Prevention. 1995. Evaluating
Community Efforts to Prevent
Cardiovascular Diseases. Atlanta, GA.
Centers for Disease Control and Prevention. 2001. Introduction
to Program Evaluation for
Comprehensive Tobacco Control Programs. Atlanta, GA.
Freeman, H. E., Rossi, P. H., Sandefur, G. D. 1993. Workbook
for evaluation: A systematic
approach. Sage Publications: Newbury Park, CA.
Georgia Policy Council for Children and Families; The Family
Connection; Metis Associates,
Inc. 1997. Pathways for assessing change: Strategies for
community partners.
Grembowski, D. 2001. The practice of health program
evaluation. Sage Publications: Thousand
Oaks.
Hawkins, J. D., Nederhood, B. 1987. Handbook for Evaluating
Drug and Alcohol Prevention
Programs. U.S. Department of Health and Human Services;
Public Health Service; Alcohol, Drug Abuse, and Mental Health Administration: Washington, D.C.
Muraskin, L. D. 1993. Understanding evaluation: The way to
better prevention programs.
Westat, Inc.
National Community AIDS Partnership 1993. Evaluating
HIV/AIDS Prevention Programs in
Community-based Organizations. Washington, D.C.
NIMH Overview of Needs Assessment. Chapter 3: Selecting the
needs assessment approach.
Patton, M. Q. 1982. Practical Evaluation. Sage Publications,
Inc.: Beverly Hills, CA.
Posavac, E. J., Carey, R. G. 1980. Program Evaluation: Methods
and Case Studies.
Prentice-Hall, Inc.: Englewood Cliffs, N.J.
Rossi, P. H., Freeman, H. E., Lipsey, M. W. 1999. Evaluation:
A Systematic Approach. (6th
edition). Sage Publications, Inc.: Thousand Oaks, CA.
Scheirer, M. A. 1994. Designing and using process evaluation.
In: J. S. Wholey, H. P. Hatry, &
K. E. Newcomer (eds) Handbook of practical program
evaluation. Jossey-Bass Publishers: San
Francisco.
Taylor-Powell, E., Rossing, B., Geran, J. 1998. Evaluating
Collaboratives: Reaching the
potential. Program Development and Evaluation: Madison, WI.
U.S. Department of Health and Human Services; Administration
for Children and Families;
Office of Community Services. 1994. Evaluation Guidebook:
Demonstration partnership
program projects.
W.K. Kellogg Foundation. 1998. W. K. Kellogg Foundation
Evaluation Handbook.
Websites:
www.cdc.gov/eval/resources
www.eval.org (has online text books)
www.wmich.edu/evalctr (has online checklists)
www.preventiondss.org
When conducting literature reviews or searching for additional
information, consider using
alternative names for "process evaluation," including:
formative evaluation
program fidelity
implementation assessment
implementation evaluation
program monitoring
Resources
Dudley, J. R. (2014). Social work evaluation: Enhancing what
we do. (2nd ed.) Chicago, IL: Lyceum Books. Chapter 8,
“Improving How Programs and Practice Work” (pp. 167–207)
Plummer, S.-B., Makris, S., & Brocksen S. (Eds.). (2014b).
Social work case studies: Concentration year. Baltimore, MD:
Laureate International Universities Publishing. [Vital Source e-
reader]. Read the following section: “Social Work Research:
Qualitative Groups” (pp. 68–69)
Document: Bliss, M. J., & Emshoff, J. G. (2002). Workbook for
designing a process evaluation. Retrieved from
http://beta.roadsafetyevaluation.com/evaluationguides/info/workbook-for-designing-a-process-evaluation.pdf (PDF)
QSR International. (n.d.). NVivo 10. Retrieved October 17,
2013, from
http://www.qsrinternational.com/products_nvivo.aspx. Use this
webpage to view a demonstration of how qualitative data
analysis can be assisted by software. You may explore any of
the demos, but it is recommended that you start with NVivo
eDemo. In order to view this demo, you will need to register,
and download (or enable) the latest Adobe Flash Player.
Boyce, C., & Neale, P. (2006). Conducting in-depth interviews:
A guide for designing and conducting in-depth interviews for
evaluation input. Pathfinder International Tool Series:
Monitoring and Evaluation – 2. Retrieved from
http://www.cpc.unc.edu/measure/training/materials/data-quality-portuguese/m_e_tool_series_indepth_interviews.pdf
Hesselink, A. E., & Harting, J. (2011). Process evaluation of a
multiple risk factor perinatal programme for a hard-to-reach
minority group. Journal of Advanced Nursing, 67(9), 2026–
2037.
Lee, E., Esaki, N., & Greene, R. (2009). Collocation:
Integrating child welfare and substance abuse services. Journal
of Social Work Practice in the Addictions, 9(1), 55–70.
Maxwell, N., Scourfield, J., Holland, S., Featherstone, B., &
Lee, J. (2012). The benefits and challenges of training child
protection social workers in father engagement. Child Abuse
Review, 21(4), 299–310.
Assignment: Drafting a Process Evaluation
The steps for process evaluation outlined by Bliss and Emshoff
(2002) may seem very similar to those for conducting other
types of evaluation that you have learned about in this course;
in fact, it is the purpose and timing of a process evaluation that
most distinguish it from other types of evaluation. A process
evaluation is conducted during the implementation of the
program to evaluate whether the program has been implemented
as intended and how the delivery of a program can be improved.
A process evaluation can also be useful in supporting an
outcome evaluation by helping to determine the reason behind
program outcomes.
There are several reasons for conducting process evaluation
throughout the implementation of a program. Chief among them
is to compare the program that is being delivered to the original
program plan, in order to identify gaps and make improvements.
Therefore, documentation from the planning stage may prove
useful when planning a process evaluation.
For this Assignment, you either build on the work that you completed in Weeks 6, 7, and 8 related to a support group for caregivers, or draw on your knowledge about a program with which you are familiar. Review the resource "Workbook for Designing a Process Evaluation".
Submit a 4- to 5-page plan for a process evaluation. Include the
following minimal information:
· A description of the key program elements
· A description of the strategies that the program uses to
produce change
· A description of the needs of the target population
· An explanation of why a process evaluation is important for
the program
· A plan for building relationships with the staff and
management
· Broad questions to be answered by the process evaluation
· Specific questions to be answered by the process evaluation
· A plan for gathering and analyzing the information
Social Work Research: Qualitative Groups
A focus group was conducted to explore the application of a
cross-system collaboration and its effect on service delivery
outcomes among social service agencies in a large urban county
on the West Coast. The focus group consisted of 10 social
workers and was led by a facilitator from the local office of a
major community support organization (the organization).
Participants in the focus group had diverse experiences working
with children, youth, adults, older adults, and families. They
represented agencies that addressed child welfare, family
services, and community mental health issues. The group
included five males and five females from diverse ethnicities.
The focus group was conducted in a conference room at the
organization’s headquarters. The organization was interested in
exploring options for greater collaboration and less
fragmentation of social services in the local area. Participants in
the group were recruited from local agencies that were either
already receiving or were applying for funding from the
organization. The 2-hour focus group was recorded.
The facilitator explained the objective of the focus group and
encouraged each participant to share personal experiences and
perspectives regarding cross-system collaboration. Eight
questions were asked that explored local examples of cross-
system collaboration and the strengths and barriers found in
using the model. The facilitator tried to achieve maximum
participation by reflecting the answers back to the participants
and maintaining eye contact.
To analyze the data, the researchers carefully transcribed the
entire recorded discussion and utilized a qualitative data
analysis software package issued by StatPac, which offers a
product called Verbatim Blaster. This software focuses on
content coding and word counting to identify the most salient
themes and patterns.
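The word-counting side of such an analysis is straightforward to sketch. The example below illustrates the general technique only; it is not the StatPac/Verbatim Blaster product, and the transcript fragment and stopword list are invented:

```python
from collections import Counter
import re

# Invented transcript fragment for illustration.
transcript = ("Collaboration helped our agency share referrals. "
              "Collaboration across systems reduced duplicate referrals.")

# Words to ignore when looking for salient content terms.
STOPWORDS = {"our", "the", "and", "across", "helped"}

def salient_terms(text, top_n=3):
    """Count content words, ignoring case, punctuation, and stopwords."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 3)
    return counts.most_common(top_n)
```

Real packages add stemming and analyst-defined code categories on top of raw counts like these.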
The focus group was seen by the sponsoring entity as successful
because every participant eventually provided feedback to the
facilitator about cross-system collaboration. It was also seen as
a success because the facilitator remained engaged and
nonjudgmental and strived to have each participant share their
experiences.
In terms of outcomes, the facilitator said that the feedback
obtained was useful in exploring new ways of delivering
services and encouraging greater cooperation. As a result of this
process, the organization decided to add a component to all
agency annual plans and reports that asked them to describe
what types of cross-agency collaboration were occurring and
what additional efforts were planned.
Social Work Research: Single Subject
1. What specific intervention strategies (skills, knowledge, etc.)
did you use to address this client situation?
I utilized basic research knowledge and skills, such as study
design, sampling, data collection, data analysis, writing up
findings, and dissemination.
2. Which theory or theories did you use to guide your practice?
I used basic research knowledge to guide my practice.
3. What local, state, or federal policies could (or did) affect this
situation?
As in any research, federal and other regulations exist regarding
the ethics of the study and how research can and/or should be
conducted. Laws, declarations, and code that may apply include
the U.S. Federal Policy for the Protection of Human Subjects
(also known as the Common Rule), the World Medical
Association’s Declaration of Helsinki, a statement of ethical
principles like the Belmont Report: Ethical Principles and
Guidelines for the Protection of Human Subjects of Research of
the U.S. National Commission for the Protection of Human
Subjects of Biomedical and Behavioral Research, and
Institutional Review Board guidelines of the institution with
which the research is affiliated.
4. Were there any legal or ethical issues present in the case? If
so, what were they and how were they addressed?
Legal and ethical issues in this case centered on informed
consent and the protection of human subjects. When doing
research, all study participants must be fully informed about the
study and the implications for them as participants. All risks
must be identified and minimized. In order to implement the
study, an Institutional Review Board (IRB) was asked to
evaluate the ethical correctness of the study. Only after IRB
approval was obtained could the study be conducted. After
completion of the study, a report was submitted for IRB review
and the study was closed.
5. How can evidence-based practice be integrated into this
situation?
As in any empirical research, the findings from this study can
contribute to evidence-based practice. However, single-subject
designs are not considered very strong when it comes to
generalizability.
6. Describe any additional personal reflections about this case.
Single-subject designs are fairly easy to implement and can
provide very useful information on the case level. While their
empirical strength is often considered weak, their applicability
and usefulness make them a good method for clinical practice
and, if following a multiple baseline design, they can provide
good research data as well.
Peer Review Rubric
Meets or Exceeds Established Assignment Criteria
Grading Criteria
Possible Points
Points Awarded
An introduction with a clear thesis statement
15
14
I think the thesis statement is good but needs some work; it is a bit wordy.
Grading Criteria
Possible Points
Points Awarded
Project includes all required topics:
Community Organization, Community Partnerships, Cross-
Cultural Challenges, Humanitarian Considerations, Volunteers,
Roadblocks, Future Vision, Challenges, and Improvements
15
15
Good job on this portion; all of the criteria were met.
Grading Criteria
Possible Points
Points Awarded
Meets project presentation requirements:
· Paper – 5-7 pages
· Presentation - 8-10 minutes presentation including visual
elements (graphics, pictures, etc.), 150-200 notes per slide, and
recorded voice
· Speech - 8-10 minutes and a written script of the narrative
15
15
All of the criteria in this segment were met; well done.
Clearly Presents Well-Reasoned Ideas and Concepts
Grading Criteria
Possible Points
Points Awarded
Evidence supporting claims cited throughout the project
15
12
There is some evidence supporting your claims, but you need to
add more and be precise.
Grading Criteria
Possible Points
Points Awarded
Used a minimum of 10 reputable sources
15
8
Although you cited eight sources at the end of your paper, you only used four in-text citations. I would work on adding them to the paper so the reader knows where the information is coming from.
Quality of Project
Grading Criteria
Possible Points
Points Awarded
Academic and professional tone and appearance
15
15
Need to fix your page numbers; they are centered instead of at the far right.
Grading Criteria
Possible Points
Points Awarded
Grammar and proofreading
10
8
Paragraph 1: revise "then again" (perhaps "now and again"), and "a formal set up which connects with a school" reads awkwardly.
Check spacing between paragraphs.
Instead of using titles for community organization and partnerships, try having the previous sentence flow into the next.
"Endeavors" was repeated multiple times; instead try "events," "actions," et cetera.
Try to avoid starting sentences with "then again."
Revise word choice for flow and comprehension.
Conclusion
Grading Criteria
Possible Points
Points Awarded
Total
100
87
Overall, I think this paper is on track but still needs some work. I would read back over the paper, correct any APA mistakes, and look for places where you could remove or add words for flow and comprehension. There are some points I highlighted in the paper that should be revised for wording; also, try to end paragraphs with preambles to the next so the paper flows better when read.
The Rough Draft
Nicole Jensen
Grantham University
Introduction
Service learning is an experiential form of training in which learning happens through a progression of defined activity and reflection (Brengle, 1999). In this setting, students work as a team by applying whatever they learn to issues affecting the community and, at the same time, reflecting on their experience during the education process for the good of their locale and their own well-being. Community organizations, on the other hand, refer to a formal setup that connects a school with individuals, associations, and private as well as public sector organizations to provide a specific program, service, or resource with the goal of empowering students to accomplish more.
Whenever communities partner in service learning, they plough back some tangible benefits. Discipline is improved among students because specific standards guide them. The communities likewise develop some management abilities in their interaction with the students and other residents. It is through service activities that real community needs are met and issues affecting the community are addressed. Students may otherwise end up idle and, if not put to work, may wind up doing things that are not viewed as morally upright.
Nonetheless, a few difficulties are likely to affect students during their service learning in the community. Such issues may range from a hostile society, to dissatisfaction with the work, to time constraints.
The World Scout Movement has been endeavoring to deliver a better world through different techniques, for example, empowering or urging young people to join it and introducing scout programs in the communities.
Community organization
Baden-Powell is credited with the establishment and development of the World Organization of the Scout Movement (WOSM). In 1907, he united twenty boys from various backgrounds in an experimental camp held at Brownsea Island, England. One year later, an office for Boy Scouts was opened in London. The World Scout Bureau was later established in 1920, during the first international Scout conference.
The organization strives to create a better world by introducing scout programs in communities and by empowering and enabling young people to join them. It focuses on involvement, recognition, and intergenerational exchange to engage youth and allow them to achieve their full potential. Its mission is "to contribute to the education of young people, through a value system based on the Scout Promise and Law, to help build a better world where people are self-fulfilled as individuals and play a constructive role in society." This means the organization provides educational programs that improve the lives of the young population as well as of their communities.
The organization also has a vision of promoting Scouting to make it the leading educational youth movement across the globe and to enlist more than 100 million young people by 2023. It addresses the needs of communities by providing service learning to the youth and students who engage with members of the community.
Community Partnership
The World Scouts' Association has partnered with various institutions and organizations. Some of its partners include the United Nations Association, the International Boy Scouts, Troop 1, the Boy Scouts of the United Nations, and numerous universities worldwide. These partnerships are valuable in ensuring that the organization's roles are spread and accomplished in different ways, that issues affecting a group of people or the entire community are resolved with ease, and that most people gain knowledge of the organization and its partners and understand their roles. The organization should also partner with bodies such as the WHO to ensure that food security is achieved even in hunger-stricken areas.
Cross-cultural challenges
The most common challenge is language, since people from different communities have different languages and cultures, which hinders communication, making the organization's projects slow and sometimes unachievable. Another issue is difference of opinion, since members of the organization view various matters in different ways, while an incoming group may bring yet another way of thinking about alternative perspectives (Serc.carleton.edu, 2018).
Humanitarian considerations
Today, American youth face various challenges that hinder their path to success, including drugs, materialism, and disparities in education. Because young people are preoccupied with these concerns, they lose sight of their goals in life. Hence, to eradicate this problem from the community, the creation of programs and events brings people from different communities together; in this case, the organization engages in unity work, binding different people together as one and enabling them to cooperate and combine their minds to solve problems in their communities.
Volunteers
The organization relies on volunteers to accomplish most of its tasks. For example, members of the scouting movement are used in ensuring that