Sheryl Ryan, PhD, OTR/L
Week 5
 Program evaluation:
 Definition
 Purposes
 Describe several approaches
 Discuss the process
 Describe the differences between experimental, quasi-experimental, and non-experimental evaluation designs.
 Discuss the appropriate use of qualitative methods in program evaluation.
 Identify the uses of evaluation results.
 Discuss the importance of disseminating the results of program development and evaluation.
 Identify the ethical considerations in designing and conducting evaluation research.
Quiz 1
Vocabulary database
2 Quotes/2 Questions
Case: Ocean Therapy
Program Evaluation
Photo: http://www.surfersway.org/
 Appreciative inquiry approach
 Conceptual, instrumental, and process use
 Contingency perspective
 Efficiency evaluations
 Impact evaluations
 Outcome evaluations
 Process evaluations
 Experimental, non-experimental, and quasi-experimental designs
 Formative and Summative evaluation
 Logic models
 Managerial approach
 Objectives approach
 Participatory approach
 Program efficiency evaluation
 Program evaluation
 Qualitative
 Quantitative
 Stakeholders
 4D Model
 Indicators
Chapter 6
Carly Rogers: Surfing – Infinite Possibilities to Heal
https://www.youtube.com/watch?v=Wfb8tHn8Xv4
http://jimmymillerfoundation.org/ocean-therapy/
Collect data and information
Use it to inform future action
“Evaluators rely on stakeholders to help develop the essential questions that will be answered in the evaluation” (Scaffa & Reitz, 2014, p. 97)
 Evaluation is vital to designing and shaping your practice
 Evaluation of your program should be one of the first things you consider during the development stage
 Evaluation should be continuous throughout the life of your program
https://www.opendemocracy.net/openglobalrights/evaluation-and-human-rights
Formative or Process Evaluations
 Determine whether program activities have been implemented as intended and resulted in certain outputs. The results are used for program improvement.
 Theoretical framework
 Design
 Activities
 Operation
Summative
 The results are used to determine program effects and whether to continue or discontinue the program.
 Outcomes
 Impact
 Effectiveness/Efficiency
https://education.uky.edu/evaluationcenter/
Needs assessment
Program theory
Program implementation
Program outcome – Long-term effects
Program impact – Immediate effects
Program efficiency – Cost/benefit analysis to justify funding (see the worked sketch below)
https://www.isasurf.org/first-ever-isa-world-adaptive-surfing-championship-set-to-be-held-in-la-jolla-california-september-24-27/
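To make the cost/benefit idea concrete, here is a minimal Python sketch of a benefit-cost calculation. Every figure and category name is invented for illustration; a real efficiency evaluation would use documented program costs and defensible monetized estimates of outcomes.

# Minimal benefit-cost sketch (all figures hypothetical, for illustration only)
program_costs = {
    "staff": 42_000,      # annual staffing (assumed)
    "equipment": 8_000,   # boards, wetsuits, adaptive gear (assumed)
    "insurance": 5_000,   # liability coverage (assumed)
}
monetized_benefits = {
    "reduced_health_service_use": 35_000,  # assumed estimate
    "return_to_work_or_school": 50_000,    # assumed estimate
}

total_cost = sum(program_costs.values())
total_benefit = sum(monetized_benefits.values())

print(f"Total cost: ${total_cost:,}")                            # $55,000
print(f"Total benefit: ${total_benefit:,}")                      # $85,000
print(f"Benefit-cost ratio: {total_benefit / total_cost:.2f}")   # 1.55

A ratio above 1.0 suggests that monetized benefits exceed costs, which can help justify funding to stakeholders.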
Objectives Approach
 Collect data by previously agreed-upon methods to determine whether goals and objectives are being met.
 Relatively easy to quantify, but
limited to evaluating program
objectives only.
 Rarely used in isolation.
Managerial Approach
 The goal is to provide managers with useful information to improve their decision making.
 Evaluations examine how the program operates as well as its goals.
 The value of a program is determined by managers and is reflected in the decisions they make based on evaluation results.
Participatory Approach
 A formative/process approach
 Qualitative
 Considers the needs of all stakeholder groups rather than just managers – a 360° approach.
 Determines the critical information each stakeholder group needs to develop program improvement plans.
Utilization-Focused Approach
 A process for including “users” in the
design of the evaluation process.
 Shifts attention from the program to
be evaluated and focuses on those
who will use the program evaluation
results.
 Highly situation- and context-specific.
 Uses mixed qualitative and quantitative methods.
Appreciative Inquiry Approach
 4D Model
 Discovery – appreciate what is good
 Dream – envision what might be
 Design – consider what should be
 Destiny – implementing change
(also known as the 4D Cycle of Appreciative Inquiry)
 5 Principles
 Constructivist – multiple realities exist
 Simultaneity – inquiry is intervention
 Poetic – programs narrate their own stories and can change direction at any time
 Anticipatory – a program’s image of its future guides its current actions
 Positive – focus on positive experiences
increases motivation, inspiration, and
engagement
http://dancumberworth.co.uk/practice/
Identify stakeholders
Develop evaluation questions
Determine data needs
Choose evaluation methods
Utilize evaluation results
http://www.theintrinsicvalue.com/tag/picture-of-a-airplane
 Determine OT-Related Data Needs
 Occupational performance
 Adaptation
 Health and wellness
 Participation
 Prevention
 Quality of life
 Role competence
 Self-advocacy
 Occupational justice
https://www.amazon.com/Occupational-Therapy-Practice-Framework-Process/dp/1569003610
Choosing Evaluation Methods
 Quantitative Designs
 Non-experimental
 Cross-sectional, cohort
 Quasi-experimental
 Compares two groups, one receiving the intervention (see the sketch after this list)
 Experimental
 Randomized controlled trial
 Qualitative Designs
 Provides an in-depth perspective on
phenomena that are not easily
quantifiable.
 Thick description; naturalistic; experience-oriented
 Interviews, focus groups, observation,
document review
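As a concrete illustration of the quasi-experimental comparison named in the list above, here is a minimal Python sketch using hypothetical outcome scores from an intervention group and a non-randomized comparison group. The scores, group sizes, and measure are invented; a real evaluation would use a validated outcome instrument.

# Minimal two-group comparison sketch (hypothetical data, for illustration only)
from statistics import mean
from scipy import stats  # SciPy's independent-samples t-test

intervention = [72, 68, 75, 80, 71, 77, 74]  # hypothetical post-program scores
comparison   = [65, 70, 62, 68, 64, 66, 69]  # hypothetical comparison-group scores

# Welch's t-test compares the two group means without assuming equal variances
t_stat, p_value = stats.ttest_ind(intervention, comparison, equal_var=False)

print(f"Intervention mean: {mean(intervention):.1f}")  # 73.9
print(f"Comparison mean:   {mean(comparison):.1f}")    # 66.3
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

Because participants were not randomly assigned, a significant difference suggests, but cannot prove, a program effect; confounding between the groups remains possible.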
 Instrumental use – to inform future action
 Conceptual use – impact on decision makers’
thought processes
 Process use – cognitive and behavioral changes that result from participating in the evaluation process
 Symbolic use – for political gain
COMMUNICATION!
 Beneficence
 Select appropriate evaluation approach
and outcome measures
 IRB approval
 Use current assessments and follow copyright laws
 Nonmaleficence
 Train evaluators and staff to ensure
competency
 Autonomy/Confidentiality
 Secure evaluation materials and data
 Comply with the evaluation protocol and maintain confidentiality as needed
 Procedural Justice
 IRB approval
 Veracity
 Report accurate results to participants
and stakeholders in a timely manner
 Fidelity
 Avoid conflicts of interest

Editor's Notes

  • #5 Therapeutic Surfing for Special Needs resource list. California: The Jimmy Miller Foundation provides ocean therapy for at-risk children with physical and emotional disabilities; Marin County Spectrum Surf Camp helps children with autism and cerebral palsy learn to surf; THERASurf (Marin County); A Walk on Water. Rhode Island: University of Rhode Island surf program effectiveness research. Hawaii: Surfers Healing (ABILITY Magazine interview); AccesSurf Hawaii. Costa Rica: Ocean Healing Group.
  • #8 12 Minute TEDx UCLA talk
  • #9 Evaluations provide accountability to stakeholders and the community: effects, efficiency. They also support the development and improvement of a program, or further stakeholders’ understanding of the program.
  • #11 Summative example (transportation): A program to develop a system of high-speed trains is initially viewed as a failure because it exceeds its planned budget. However, within a decade ridership is far greater than business plans had anticipated, and the overall impact on the economy, quality of life, and the environment can be demonstrated to be exceedingly positive. Formative example: Process evaluation determines whether program activities have been implemented as intended and resulted in certain outputs. You may conduct process evaluation periodically throughout the life of your program, starting with the activities and output components of the logic model (i.e., the left side). Results of a process evaluation will strengthen your ability to report on your program and use the information to improve future activities. It allows you to track program information related to who, what, when, and where questions: • To whom did you direct program efforts? • What has your program done? • When did your program activities take place? • Where did your program activities take place? • What are the barriers/facilitators to implementation of program activities?
  • #12 Imagine you want to evaluate your adaptive surfing program, or use a student group example (pp. 97-99). Needs assessment – incidence and prevalence of a problem. Were we accurate? Was this thorough or focused enough? Incidence, prevalence, target populations, secondary populations. Program Theory Evaluation – making a logic model that explains the relationships between the parts; evaluate the accuracy and appropriateness of the logic model. Program Implementation Evaluation – did it follow the design and logic models? Was it implemented in a way consistent with the vision and plan? Program Outcome Evaluation – long-term; establishes a cause/effect relationship between the program and desired changes. Program Impact Evaluation – measures achievement of program objectives in the short term: immediate effects on the target population. Program Efficiency Evaluation – cost/benefit assessment to justify funding, secure grants, and maintain accountability to stakeholders.
  • #13 Use a “Contingency Perspective” – choose the appropriate approach depending on the situational needs. There is no need to do all of them, but it is important to justify why the one or two you choose are the most relevant/appropriate. Surf program, fall prevention program, gardening program…
  • #14 Use a participatory approach to plan your program evaluation at your site. Who? What? How? When appropriate? When not? Use a U-FE approach to plan an evaluation at your site. Who? What? How? When appropriate? When not?
  • #15 AI is most useful when fear of evaluation exists, change needs to be accelerated, dialogue is critical, relationships are poor or hopelessness is present, and there is a desire to build a community of practice. Handout: use it to practice on each other – about their CBPs. Image: http://dancumberworth.co.uk/practice/
  • #16 Surfing program example: Your CBP example: Stakeholders – funders, advisory boards, program managers, those who benefit from the program, those disadvantaged by the program in some way, general public with interest in outcomes of the evaluation.
  • #17 Surfing program example: Your CBP example:
  • #19 Examples of each one