Interactive Systems Framework
* Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology, 41(3-4), 171-181.
* Green LW, Orleans T, Ottoson JM, Cameron R, Pierce JP, Bettinghaus EP. Inferring strategies for disseminating physical activity policies, programs, and practices from the successes of tobacco control. Am J Prev Med. 2006;31(4)(suppl):S66–S81.
2007-2009: 12 Awarded Sites & 5 Programs

Funded Organizations | Evidence-Based Program
4 Churches | Body and Soul
4 Worksites | Treatwell 5-A-Day
2 Community Coalitions | Parents as Teachers (PAT) High 5 Low Fat Program
Senior Center | Little By Little Nutrition Program
Hospital Diabetes Management Center | Patient-Centered Assessment & Counseling for Exercise (PACE)
Fidelity: "faithfulness" to the implementation of program elements in the way they were intended to be delivered in the original intervention
Core elements*: required components that represent the theory and internal logic of the intervention and most likely produce the intervention’s effectiveness
Key process steps: required implementation or program delivery steps that are conducted to contribute to the intervention’s effectiveness
*Eke, Neumann, Wilkes, Jones. Preparing effective behavioral interventions to be used by prevention providers: the role of researchers during HIV Prevention Research Trials. AIDS Education & Prevention 2006, 18(4 Suppl A):44-58.
3 (of 7) sites in 1st cohort did not conduct all core elements
All 5 sites in 2nd cohort conducted all core elements
Decided not to conduct due to context/climate:
“They thought that if we had some type of event like that [family picnic/party], that would be saying now you’re asking me to take unemployment weeks but...you’re having an event...” - Site coordinator
Contextual Factors (related to implementation)
* = mentioned in both cohorts
Blue text = barrier that prevented completion of core element(s) - 1st cohort
BARRIERS | FACILITATORS
Expanded the program from worksite/coalition to the community
“This project seems to have opened the door for a brand new [obesity] issue that our county had not talked about…all of a sudden the light went on…[the collaborative] said we need to add this to our benchmarks as a group and start working on this.” - Site coordinator
Added physical activity & weight loss events (to nutrition program)
“Because of the nutrition part of it, people began to feel better and they had more energy. So they was able to do more physical activities and wanted to do more as far as looking at weight loss…” - Site coordinator
Map of the Adaptation Process (McKleroy et al., 2006)
Focus on objectives of each key step:
EBP Training Topics (pre-award)

Session titles:
* What Do We Mean By Evidence-Based?
* Needs Assessment and Program Planning
* Finding an Evidence-Based Program
* Selecting a Program That Fits Your Community
* Adapting the Evidence-Based Program with Fidelity
* Implementing an Evidence-Based Program
* Evaluating Your Program
TEACH model: Translating Evidence into Action through Collaboratives for Health

TA Contact | Structured TA Topics (examples) | Stage in Map of Adaptation Process
Pre-award Training | See prior training slide | Assess, Select, Prepare
Kick-Off Training for awarded sites | EBIs, Needs assessment, Organizational readiness, Core elements | Assess, Select, Prepare
Site Visit | Fit, Adaptation, Evaluation planning | Assess, Select, Prepare, Pilot
Conference Call | Implementation Work Plan, Partnerships | Assess, Select, Prepare, Pilot
Ongoing Contact | Overcoming barriers, implementation fidelity, maintenance | Assess, Pilot, Implement, Maintenance
Tools Adapted from: Lesesne, C. A., Lewis, K. M., Moore, C., Fisher, D., Green, D., & Wandersman, A. (2007). Promoting Science-based Approaches to Teen Pregnancy Prevention using Getting To Outcomes: Draft June 2007. Unpublished manual.
Almost all (18) reported that someone from their organization had advocated for using an EBP for the currently funded mini-grant
*Levinger and Bloom, 2000; Weiss et al., 2002; Preskill and Torres, 1998; Caplan, 1971; Kenny and Sofaer, 2000; Schminke et al., 2002

Survey topic areas | Example Measures - Survey Questions
Attitudes about EBPs
(Hannon et al, 2009)
Likert scale: Strongly Disagree to Strongly Agree
EBPs are easy to understand.
EBPs are easy for us to adapt for use in our community.
Skills related to EBPs
(Chinman et al., 2008)
Likert scale: Very Hard to Very Easy
Assess organizational readiness to implement an evidence-based program.
Determine what needs to be changed in an EBP to increase fit to your community.
Likert scale: Strongly Disagree to Strongly Agree
We have appropriate staff skills to achieve our mission.
Staff use data/information to inform their decision-making.
The leadership of the organization fosters respect, trust, inclusiveness, and openness in the organization.
Preliminary Results: Skills Related to EBPs

Tasks (1 = very hard; 5 = very easy) | Mean | SD

Tasks with higher reported ability:
Define goals and objectives for your program. | 3.95 | .83
Discuss the benefits of using evidence-based programs. | 3.90 | .72
Develop an implementation work plan. | 3.80 | .70

Tasks with lower reported ability:
Plan for maintenance of program (e.g., leveraging of resources). | 2.75 | .97
Develop solutions to identified implementation barriers. | 3.00 | 1.03
Describe the steps of the program adaptation process. | 3.20 | .77
Prepare for the implementation of your program (e.g., training of staff, hiring of staff, piloting, partnerships). | 3.20 | 1.15
Attitudes about EBPs
Scores for negatively worded statements were reversed; the higher the mean score, the more positive the attitude toward EBPs. (*Reverse coded)
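The reverse-coding step described above can be sketched as follows. This is a minimal illustration, assuming a 5-point Likert scale; the item wordings and responses below are hypothetical, not drawn from the actual survey instrument.

```python
# Reverse-coding negatively worded Likert items before averaging,
# so that a higher mean always indicates a more positive attitude.
SCALE_MIN, SCALE_MAX = 1, 5  # assumed 5-point Likert scale


def reverse_code(score):
    """Flip a Likert score: 1 becomes 5, 2 becomes 4, and so on."""
    return SCALE_MIN + SCALE_MAX - score


# Hypothetical responses: (item wording, negatively_worded) -> raw scores.
responses = {
    ("EBPs are easy to understand.", False): [4, 5, 3],
    ("EBPs are too rigid for our community.", True): [2, 1, 3],  # hypothetical item
}

for (item, negative), scores in responses.items():
    adjusted = [reverse_code(s) if negative else s for s in scores]
    mean = sum(adjusted) / len(adjusted)
    print(f"{item} mean = {mean:.2f}")
```

The same transform (new score = scale minimum + scale maximum - raw score) applies to any symmetric Likert scale, regardless of the number of points.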
The CPCRN is part of the Prevention Research Centers Program. It is supported by the Centers for Disease Control and Prevention and the National Cancer Institute (Cooperative agreement # 1U48DP0010909-01-1)