This document discusses the construction of implementation science for scaling out interventions. It begins by illustrating the use of team science and practice networks in implementation research. It then discusses adding the concept of scaling out to the traditional research pipeline for implementation, which involves transporting evidence-based interventions to new service delivery systems and contexts. Two options for research and practice involving scaling out are presented: a static model which focuses on strict adherence to programs, and a dynamic model which allows for planned adaptation. The document argues for the dynamic model and discusses approaches like ADAPT-ITT and the dynamic adaptation process for evaluating interventions during the scaling out process. It emphasizes the need to evaluate both implementation and effectiveness when scaling out.
Scaling Out Interventions Across Diverse Settings
1. Construction of an Implementation Science
for Scaling Out Interventions
C Hendricks Brown
Department of Psychiatry and Behavioral Sciences
Department of Preventive Medicine
Institute for Public Health and Medicine
Center for Engineering and Health
Director, Center for Prevention Implementation
Methodology (Ce-PIM)
Director, Prevention Science and Methodology Group
(PSMG)
Hendricks.brown@northwestern.edu
2. Acknowledgments
Funded by National Institute on Drug
Abuse (NIDA) and Office of Behavioral
and Social Science Research (OBSSR)
P30 DA027828
Substance Abuse and Mental Health
Services Administration (SAMHSA)
NIMH Hopkins PRC & Training
Sheppard Kellam, Phil Leaf, Nick Ialongo
3. Co-Authors
• Patti Chamberlain, OSLC
• Lisa Saldana, OSLC
• Courteney Padgett, OSLC
• Gracelyn Cruden, Northwestern
• Wei Wang, USF
• Carlos Gallo, Northwestern
• Car Hai, USC
Hopkins Prevention Research Center May 2014
4. Outline
1. Standard Research Pipeline involving
Implementation
CAL-OH Head-to-Head Randomized
Implementation Trial
2. Adding Scaling-Out to our Research Pipeline
Definition and Illustrations
3. Constructing Implementation Science for Scaling
Out – Off-Label Prevention Implementation
Evaluation Design Options
Automating Fidelity Ratings
4. Conclusions
5. Objectives
•Illustrate Use of Team Science and Practice
Networks
•Make Connections to Major Themes in this
Hopkins Implementation Meeting
Simulation Modeling
Technology:
Continuous Evaluation of Evolving
Interventions and Automated Fidelity
Assessment
•Illustrate elements required for “Scaling-Out”
Alternative Research/Evaluation Designs
6. Acknowledgments: Team Science and Practice in
Implementation
Construction of an implementation
Science
Implementation of Implementation
Science
11. We are making progress in Implementation Science
Example: A Head-to-Head
Randomized Implementation Trial of
Two Alternative Implementation
Strategies for a Single Evidence-
Based Intervention
12. For Implementation, the Program Delivery
System, rather than the Clinical/Preventive
Intervention, is in the Foreground
Clinical/Preventive
Intervention
Multilevel, Program
Delivery System I
Landsverk J, Brown CH, Chamberlain P, Palinkas L, Horwitz SM, Ogihara M.
Design and Analysis in Dissemination Research 2012.
Clinical/Preventive
Intervention
Multilevel, Program
Delivery System II
Different
Same
13. CAL-OH Head-to-Head Randomized Implementation Trial
• Single evidence-based intervention:
− Multidimensional Treatment Foster Care (MTFC)
• 2 alternative strategies of implementing this same program
Standard implementation (Stnd)
Community Development Team (CDT)
• Randomize 51 Counties to implementation strategy*
• Evaluate implementation success using Stages of Implementation
Completion (Saldana IS 2014)
− Implementation should be faster and go farther with CDT:
− More (number of families served)
− Better (fidelity)
*Chamberlain P, et al. (2008). Engaging and Recruiting Counties in an Experiment on
Implementing Evidence–Based Practice in California. Administration and Policy in Mental Health
and Mental Health Services Research, 35(4): 250-260.
14. Randomize 51 Counties in CA and OH to
Implementation Strategy and Time
(Cohort)
Randomized Roll-Out Design*
[Design diagram: 40 CA counties and 11 OH counties randomized to CDT or Standard (Stnd) implementation and to start-time Cohorts 1–4, with wait-listed groups of 26 and 13 counties.]
*Brown, et al. 2009 Ann Rev PH
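The randomized roll-out design above can be sketched in code. This is a minimal illustration: the county names and the simple balancing rule are invented, not the actual CAL-OH allocation procedure.

```python
import random

def rollout_assignment(counties, conditions=("CDT", "Stnd"), n_cohorts=4, seed=1):
    """Randomly assign each county to an implementation condition and a
    start-time cohort, in the spirit of a randomized roll-out design
    (Brown et al., 2009). A simplifying sketch, not the trial's procedure."""
    rng = random.Random(seed)
    shuffled = list(counties)
    rng.shuffle(shuffled)
    plan = {}
    for i, county in enumerate(shuffled):
        plan[county] = {
            "condition": conditions[i % len(conditions)],       # alternate CDT / Stnd
            "cohort": (i // len(conditions)) % n_cohorts + 1,   # stagger start times
        }
    return plan

# 51 hypothetical counties standing in for the 40 CA + 11 OH counties
plan = rollout_assignment([f"county_{k}" for k in range(51)])
```

With the seed fixed, the assignment is reproducible, which is useful when the roll-out must be documented for partner counties in advance.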
15. Summary of Findings (Brown et al., under review)
• Mixed Results
• Evidence that
− CDT increased numbers
of families served
− CDT counties completed
implementation more thoroughly
• No evidence that
− CDT affected rate of adoption
− CDT changed speed of implementation
[Figure 4: EQQ plot comparing placement (number served) quantiles for CDT versus Standard (IND) counties.]
16. Remarks about CAL-OH Randomized
Implementation Trial
•Able to Maintain Randomized County Design even during statewide depression
•Equipoise may really exist
•Expensive, Judiciously Choose Head-
to-Head Implementation Trials
17. But ….
Implementation requires more
scientific and methodologic
frameworks to be developed
19. A 3-Dimensional View of Implementation
Vertical Scaling
Adoption
Increased coverage, range, sustainability of services (Ilott
et al., IS 2013, Elias et al., School Psychol Rev 2003)
Scaling-Up (Large-Scale Implementation)
Horizontal Scaling
Implement Same Intervention across Different Settings
Scaling-Out
Depth
Qual Improvement/Local Evaluation versus Generalizable
Knowledge
Landsverk et al., 2012, Brown et al., 2014, Cheung &
Duan 2013
20. Implications of Moving through the Pipeline
• Determine “Evidence-Based” Programs
Blueprints (www.blueprintsprograms.com)
Evidence from experimental design
Clear findings of positive impact
Multisite replication
Programs deemed ready for widespread use
• Research focus on Implementation, leaving questions of Effectiveness behind
• Consider the Program as Fixed
• Inhibit or permit limited adaptation or modification
• Guidance on where to use?
21. Little Focus on
• How Context Affects a Fixed Program
• How Programs may need to be revised to fit in different contexts
• Improving Programs as they are Sustained (Chambers et al., Imp Sci
2013)
22.
Figure 1. Program drift and voltage drop. Illustrating the concepts of 'program drift,' in which the expected effect of an intervention is presumed to decrease over time as practitioners adapt the delivery of the intervention (A), and 'voltage drop,' in which the effect of an intervention is presumed to decrease as testing moves from Efficacy to Effectiveness to Dissemination and Implementation (D&I) research stages (B).
Chambers et al. Implementation Science 2013 8:117
doi:10.1186/1748-5908-8-117
Chambers et al., 2013
Maximal benefit of any intervention can only be realized
through ongoing development, evaluation and refinement in
diverse populations and systems
23. Dynamic Sustainability Framework (Chambers et al.,
IS 2013)
“Ultimate aim: Quality Improvement of
Interventions, not Quality Assurance of
Interventions”
24. Scaling Out
Is an evidence-based program,
shown to be effective in one or two
settings, also effective in diverse
settings?
25. Scaling Out refers to the transportation of an
existing, evidence-based intervention (or decision
system) into a new service delivery system and
broader ecological system.
• Moving SISTA, an HIV Prevention Program, and Strong African American
Families, an adolescent drug and HIV prevention program from CBOs
into African American churches.
• Embedding Familias Unidas, an Hispanic adolescent drug abuse and HIV
prevention program initially delivered through schools, into Family and
Adolescent Medicine.
• Moving QUIT, a brief intervention for primary care into drug treatment
programs.
• Moving Communities that Care, a community-based decision support
system, and Strong African American Families from rural into urban
settings.
• Moving the Good Behavior Game from urban into rural and pediatric
care systems.
• Integrating Prevention Programs from communities into Juvenile
Justice
• Moving a peer-to-peer suicide prevention program from schools into
the Air Force
26. Working Definition of Scaling-Out (Ce-PIM
Workgroup on Scaling Out, unpub, 2014)
“Scaling out is the deliberate use of strategies to
implement and sustain evidence-based interventions
through or across settings to promote the greatest
public health impact. Scaling out involves both practice
and research perspectives valuing local knowledge and
expertise through collaborations of service settings,
policy-makers, consumers, and researchers. Scaling out
can involve promoting fidelity and appropriate
adaptations that may improve care, outcomes and
public health for local contexts and populations.”
27. Scaling Out Perspective on Implementation Research
[Figure: scaling up moves from local knowledge toward generalized knowledge; scaling out moves across diverse contexts, through the Explore, Adopt/Prepare, Implement, and Sustain stages.]
29. Static Model
1. Conduct years of efficacy and effectiveness
randomized trials to demonstrate that you could get
a program to work within a limited context.
2. Have someone else replicate the trial on a similar
population and setting
3. Have this program identified as an Evidence-Based
Program
4. Promote the Widescale Use of this Program in
diverse settings
5. Believe it works wherever the program is delivered.
6. Resist Adapting the Intervention, “Strict Adherence
to the Model”
30. Dynamic Model –
Planned Adaptation (Aarons et al., 2012)
1. Conduct years of efficacy and effectiveness
randomized trials to demonstrate that you could get
a program to work within a limited context.
2. Have someone else replicate the trial on a similar
population and setting
3. Identify this as Evidence-Based Program
4. Plan and Allow for Planned Adaptations in Program,
Organization, and Surrounding Ecologic System
5. Evaluate both effectiveness and implementation in
diverse settings
31. Scaling Out Implementation Evaluation
[Figure: the same scaling-up/scaling-out diagram, moving from local toward generalized knowledge and across diverse contexts, through the Explore, Adopt/Prepare, Implement, and Sustain stages.]
34. Dynamic Model: Local Adaptation
• Start with existing program
• Adapt program and service delivery context….
Without Evaluation, this is like Off-Label Prescription
Use
Unlike FDA
We don’t have specification of conditions for its use
1/5 of prescriptions are off-label; ¾ of these had little scientific evidence (Radley et al., JAMA Intern Med 2006)
Need a Scientific and Evaluation Approach for Off-Label
Prevention Implementation
35. Two Approaches in the Literature
•ADAPT-ITT
•Dynamic Adaptation Process
36. ADAPT-ITT
Wingood & DiClemente JAIDS
2008
Assess risk in target population, address cultural context while staying with core intervention elements
Evaluation:
1) Pilot test intervention w/ 20 participants, stakeholders and agency staff
2) Pilot randomized trial w/ 3-month outcome data
. . . . .
Conduct randomized effectiveness trial
37. Dynamic Adaptation Process
Aarons et al. Imp Sci, 2012
Allows both program adaptation
and organizational adaptations in a
planned way
Distinguishes core elements and
allowable adaptations
Has been used for a small
evaluation
38. Perspectives on Scaling Out (Ce-PIM Workgroup
2014)
• Ecological Context
Community Epidemiologic /
Cultural / Linguistic / Health Disparities
Policy
• Service Delivery System
Organizational Culture and Climate
Systems Engineering
Economic Analysis
Informatics
Technical Assistance for Implementation
• Program
Core Elements
Program Adaptation
• Measurement, Research Design, and Analysis
Implementation Stages
Fidelity Assessment
• Networks, Partnerships, and Interactions
39. What Should We Test while
Scaling Out?
Was the program implemented?
Is the program effective?
40. How do you know if Scaling Out works as intended?
Minimum Needed to Evaluate
•If you use an “Evidence-Based” Program / Principles, you still require:
participant engagement (attendance, satisfaction)
program fidelity (ratings)
Hybrid Designs -- Type III (Curran et al., 2012): primarily to evaluate implementation, secondarily effectiveness
41. Types of Hybrids
[Figure: spectrum running from Clinical/Prev. Effectiveness Research to Implementation Research, with Hybrid Types I, II, and III in between.]
Hybrid Type I: test clinical/prevention intervention, observe/gather information on implementation
Hybrid Type II: test clinical/prevention intervention, study implementation strategy
Hybrid Type III: test implementation strategy, observe/gather information on clinical/prevention outcomes
42. Effectiveness Trial
[Figure: logic model within a Community Context: Intervention Agency → Intervention Agent → Intervention Fidelity → Target Participation → Proximal Outcome → Distal Outcome.]
43. Evaluation Guided by How a Program Should Work
[Figure: the same community-context logic model (Intervention Agency → Intervention Agent → Intervention Fidelity → Target Participation → Proximal Outcome → Distal Outcome).]
Assess Fidelity, Participation, and Feedback Loops
44. Simplest Evaluations: Quality Improvement
Strategies
•Statistical Control Charts
Monitor
a) the Key Hypothesized Change Factors
b) Whether Feedback is Occurring
45. [Control chart figure: Number of Youth Suicide Deaths from 1988 to 2002 in County, with yearly counts ranging 0–6.]
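For count data like yearly suicide deaths, the control-chart idea reduces to a c-chart: flag any year whose count falls outside three standard deviations of the mean count. A minimal sketch, with invented yearly counts standing in for the county's actual series:

```python
from math import sqrt

def c_chart_limits(counts):
    """Center line and 3-sigma limits for a c-chart of event counts per
    period, in the spirit of Benneyan et al. (2003)."""
    cbar = sum(counts) / len(counts)           # center line: mean count
    ucl = cbar + 3 * sqrt(cbar)                # upper control limit
    lcl = max(0.0, cbar - 3 * sqrt(cbar))      # lower limit, floored at zero
    flagged = [i for i, c in enumerate(counts) if c > ucl or c < lcl]
    return cbar, lcl, ucl, flagged

# invented yearly death counts, for illustration only
deaths = [3, 2, 4, 1, 3, 2, 5, 0, 2, 3, 1, 2, 4, 2, 3]
cbar, lcl, ucl, flagged = c_chart_limits(deaths)
```

Flagged years are candidates for special-cause investigation; years inside the limits are treated as common-cause variation rather than evidence the program changed anything.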
46. Attitudes Changed through QPR Training (Wyman et al., 2008)
Improvements from training and time (effect sizes, rated Null / Low / Med / High):
Knowledge of Warning Signs and QPR behaviors: 0.46
Attitudes about Suicide Prevention: 0.89
Self-Evaluation of Suicide Prevention Knowledge: 1.06
Knowledge of Clinical Resources: 0.99
Efficacy to Perform Gatekeeper Role: 1.22
Reluctance to engage with suicidal students: 0.29
47. [Control chart figure: Self-Efficacy for Gatekeeper Role over time (Benneyan et al., 2003).]
48. Two More Elaborate Potential Research Strategies
for Scaling Out
•Continual Evaluation Designs (Mohr
et al., 2013)
•Simulation with Agent Based
Modeling
49. Continuous Evaluation of
Evolving Behavioral Intervention
Technologies (CEEBIT) Mohr et
al., AJPM 2013
Remove Inferior Interventions
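The core CEEBIT move, removing inferior interventions as outcome data accumulate, can be sketched as a confidence-interval comparison: retire any arm whose interval lies entirely below the best arm's lower bound. This is a simplified illustration, not the published algorithm; the arm names and ratings are invented.

```python
from math import sqrt
from statistics import mean, stdev

def eliminate_inferior(arms, z=1.96):
    """Drop arms whose outcome confidence interval sits entirely below the
    best arm's lower bound. A rough sketch of the CEEBIT idea (Mohr et al.,
    2013); the z-cutoff is a simplifying assumption."""
    ci = {}
    for name, scores in arms.items():
        m = mean(scores)
        half = z * stdev(scores) / sqrt(len(scores))
        ci[name] = (m - half, m + half)
    best_lower = max(lo for lo, hi in ci.values())
    return [name for name, (lo, hi) in ci.items() if hi < best_lower]

# invented user-outcome ratings for two versions of a behavioral app
arms = {
    "app_v1": [4, 5, 3, 4, 5, 4, 3, 5, 4, 4],
    "app_v2": [7, 8, 6, 7, 8, 7, 6, 8, 7, 7],
}
dropped = eliminate_inferior(arms)
```

In a deployed system this comparison would rerun as each batch of outcome data arrives, so clearly inferior intervention versions are retired without waiting for a full trial to end.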
50. Simulation Modeling as a Development Tool
•Predict How a Program will
Interact with its Context
How many peer leaders are
needed?
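A question like "how many peer leaders are needed?" can be explored with a toy agent-based simulation before fielding a program. Everything below is an illustrative assumption: the network size, contact structure, and adoption probability are invented, not calibrated to any real setting.

```python
import random

def simulate_reach(n_agents=200, n_leaders=10, n_contacts=4, p_adopt=0.3,
                   n_rounds=10, seed=7):
    """Toy agent-based diffusion: seed peer leaders spread a program message
    to random contacts; returns the fraction of agents reached."""
    rng = random.Random(seed)
    # each agent talks to a few randomly chosen peers
    contacts = {i: rng.sample(range(n_agents), n_contacts)
                for i in range(n_agents)}
    adopted = set(rng.sample(range(n_agents), n_leaders))  # seed peer leaders
    for _ in range(n_rounds):
        newly = set()
        for agent in adopted:
            for peer in contacts[agent]:
                if peer not in adopted and rng.random() < p_adopt:
                    newly.add(peer)
        adopted |= newly
    return len(adopted) / n_agents

# sweep the number of peer leaders to see how coverage responds
coverage = {k: simulate_reach(n_leaders=k) for k in (5, 10, 20)}
```

Sweeping the seed count this way gives a rough feel for diminishing returns before committing training resources, which is the kind of question simulation modeling can answer cheaply.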
52. Social Informatics as a Way to Reduce Fidelity
Assessment Burden
It can be more efficient to have lots of poor-quality assessments than a small number of high-quality assessments, as long as you can adjust for bias!
Brown, Mohr, Gallo et al. JAIDS 2013
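The bias adjustment can be sketched as a simple calibration: fit the relation between cheap machine scores and a small set of gold-standard human ratings on a double-coded subset, then apply the fit to de-bias the full machine-coded sample. This least-squares sketch uses invented scores; the actual approach in Brown et al. (2013) is more elaborate.

```python
from statistics import mean

def calibrate(machine, gold):
    """Fit gold rating as a linear function of machine score on the
    double-coded subset, returning a function that de-biases new
    machine scores."""
    mx, my = mean(machine), mean(gold)
    sxx = sum((x - mx) ** 2 for x in machine)
    sxy = sum((x - mx) * (y - my) for x, y in zip(machine, gold))
    slope = sxy / sxx
    intercept = my - slope * mx
    return lambda x: intercept + slope * x

# small double-coded subset: machine scores paired with human gold ratings
machine_sub = [2.0, 3.0, 4.0, 5.0, 6.0]
gold_sub = [3.1, 3.9, 5.2, 6.0, 7.1]
adjust = calibrate(machine_sub, gold_sub)

# de-bias the large, cheap machine-coded sample
all_machine = [2.5, 4.4, 5.8, 3.3]
adjusted = [adjust(x) for x in all_machine]
```

The double-coded subset can stay small; its only job is to pin down the systematic bias of the cheap assessments, after which the machine codings carry the statistical weight.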
53. Two Illustrations
•African Talking Drums: Long sequence of
repeated or similar beats
•Good Behavior Game: – Carlos Gallo:
Assessing Teacher’s Emotional Tone through
signal processing
55. Yoruba sentence translated
• Yoruba is a tonal language of Africa.
• Tonality is used to distinguish meaning at the word level. Similar to the
difference between “very” and “berry” except that the difference is
based on the high or low tone.
• This video has a sample of a Yoruba sentence translated into drums
language. (Clip 2)
56. Talking Drums language
• Can you hear the words “spoken” by this drum? (clip 1)
This song was translated from a scale of 7 tones (do, re, ..., si) to a 3-tone scale (do, re, mi). While there is loss of information, there is enough to recognize the song and even recover the lyrics.
57. Good Behavior Game:
Machine can code all teacher verbalizations while
Observer or Coach can only do a small number
[Figure: benefit of machine codings to observer: relative efficiency as a function of the proportion of neutral statements, shown for modest validity data (N=20) and more validity data (N=40).]
58. Conclusions
1. Standard Research Pipeline
Judiciously Select Head-to-Head Trials
2. Adding Scaling-Out to our Research Pipeline
Fit of Program, Service, Ecologic Context
Evaluation: Minimal Design Approach:
Fidelity, Participation
Use of Simulation Modeling
3. Social Informatics
Support Efficient Fidelity Assessment
59. References
Benneyan J, Lloyd R, Plsek P (2003). Statistical quality control as tool for research and healthcare improvement. Qual Safety
Health Care, 12(6), 458-64.
Brown, CH, Ten Have TR, Jo B, Dagne G, Wyman PA, Muthén BO, Gibbons RD. Adaptive Designs in Public Health. Annual Review
Public Health, 30: 17.1-17.25, 2009.
Brown CH, Sloboda Z, Faggiano F, Teasdale B, Keller F, Burkhart G, Vigna-Taglianti F, Howe G, Masyn K, Wang W, Muthén B,
Stephens P, Grey S, Perrino T, and the Prevention Science and Methodology Group. Methods for Synthesizing Findings on
Moderation Effects Across Multiple Randomized Trials. Prevention Science, 14(2): 144-156, 2013.
Brown CH, Kellam SG, Kaupert S, Muthén BO, Wang W, Muthén L, Chamberlain P, PoVey C, Cady R, Valente T, Ogihara M, Prado G,
Pantin H, Szapocznik J, Czaja S, McManus J. Partnerships for the Design, Conduct, and Analysis of Effectiveness, and
Implementation Research: Experiences of the Prevention Science and Methodology Group. Administration and Policy in Mental
Health, 39: 301-316, 2012.
Brown CH, Mason WA, Brown EC (2014). Translating the Intervention Approach into an Appropriate Research Design -- The Next
Generation Designs for Effectiveness and Implementation Research. In Z Sloboda and H Petras (Eds.), Advances in Prevention
Science: Defining Prevention Science, Springer Publishing.
Brown CH, Mohr DC, Gallo CG, Mader C, Palinkas LA, Wingood G, Prado G, Poduska J, Gibbons RD, Kellam SG, Pantin H, McManus
J, Ogihara M, Valente T, Wulczyn F, Czaja S, Sutcliffe G, Villamar J, Jacombs C. A Computational Future for Preventing HIV in
Minority Communities: How Advanced Technology Can Improve Implementation of Effective Programs. JAIDS 63: Supplement 1,
S72-S84, 2013.
Gallo C., Pantin H, Villamar J, Prado G, Tapia M, Ogihara M, Cruden G, Brown CH (Accepted). Blending Qualitative and
Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention.
Accepted for publication in Admin Mental Health Policy.
60. Brown CH, Chamberlain P, Saldana L, Wang W, Cruden G, Padgett C.,(under review,) Evaluation of two implementation
strategies in fifty-one counties in two states: Results of a cluster randomized implementation trial
Chamberlain P, Brown CH, Saldana L, Reid J, Wang W, Marsenich L, Cosna T, Padgett C. Engaging and Recruiting Counties
in an Experiment on Implementing Evidence–Based Practice in California. Administration and Policy in Mental Health and
Mental Health Services Research, 35(4): 250-260, 2008.
Chamberlain, P, Brown, CH, Saldana, L. Observational Measure of Implementation Progress: The Stages of
Implementation Completion (SIC), Implementation Science, 6(116), 1-8, 2011.
Cheung K and Duan N (2013). Design of Implementation Studies for Quality Improvement Programs: An
Effectiveness/Cost Effectiveness Framework. AJPH epub.
Glasgow RE, McKay HG, Piette JD, Reynolds KD (2001). The RE-AIM framework for evaluating interventions: what can it
tell us about approaches to chronic illness management? Patient Ed and Counseling: 44: 119-127.
Hawkins JD, Oesterle S, Brown EC, Abbott RD, Catalano RF (2014). Youth problem behaviors 8 years after implementing
the Communities that Care Prevention System: A Community-Randomized Trial. JAMA Pediatr, 168(2) 122-129.
IOM 2009 Preventing Mental, Emotional, and Behavioral Disorders Among Young People: Progress and Possibilities.
Committee on Prevention of Mental Disorders and Substance Abuse Among Children, Youth, and Young Adults: Research
Advances and Promising Interventions. Mary Ellen O’Connell, Thomas Boat, and Kenneth E. Warner, Editors. Board on
Children, Youth, and Families, Division of Behavioral and Social Sciences and Education. Washington, DC: The National
Academies Press, 2009.
Palinkas LA, Holloway IW, Rice E, Brown CH, Valente TW, Chamberlain P. Influence Network Linkages across Treatment
Conditions in Randomized Controlled Trials. Accepted for publication in Prevention Science.
61. Landsverk J, Brown CH, Chamberlain P, Palinkas L, Horwitz SM, Ogihara M. Design and Analysis in Dissemination and Implementation
Research. (2012). In R Brownson, G Colditz and E Proctor (Eds.), Dissemination and Implementation Research in Health: Translating Science to
Practice, Oxford University Press
Mohr DC, Cheung K, Schueller SM, Brown CH, Duan N Continuous Evaluation of Evolving Behavioral Intervention Technologies. American
Journal of Preventive Medicine, 45(4): 517-523, 2013.
Spoth R, Rohrbach L, Greenberg M, Robertson E, Leaf P, Brown CH, Fagan A, Catalano R, Pentz MA, Sloboda Z, Meyer A, Hawkins D. (2013).
Addressing Challenges for the Next Generation of Type 2 Translation Research: The Translation Science to Population Impact (TSci2PI)
Framework. Prevention Science, 14(4): 319–351.