
Southwest Georgia Communities Adapting Evidence-Based Programs




1. Title slide: Michelle Carvalho, MPH, CHES; Cam Escoffery, PhD, MPH, CHES; Louise Wrensford, PhD; Michelle Kegler, DrPH, MPH. Georgia Public Health Association Annual Meeting, April 12, 2011.
2. Acknowledgements
   - Selected content adapted from the National Cancer Institute's "Using What Works"
   - Cancer Prevention and Control Research Network (CPCRN)
   - CDC Grant # 1U48DP0010909-01-1
   - Funded by CDC and NCI
3. Presentation Objectives
   Participants will be able to:
   - Discuss the use of mini-grants to disseminate evidence-based programs (EBPs)
   - Describe the EPRC's mini-grants program and the process evaluation questions about adapting an EBP
   - Explain contextual factors that may influence program implementation and adaptation
   - Describe a proactive training and technical assistance process, and its process evaluation, for community organizations implementing EBPs with fidelity
4. Question: What do you think of when you hear the term "evidence-based"?
5. Evidence-Based Programs
   An evidence-based program (EBP) has been:
   - Implemented with a group
   - Evaluated
   - Found to be effective
6. What Is Evidence? (ordered from most objective to most subjective)
   - Systematic reviews of multiple intervention studies
   - Review articles
   - An intervention research study
   - Surveillance data
   - Program evaluation
   - Word of mouth
   - Personal experience
7. Benefits of Using Evidence-Based Programs
   What are the advantages of using evidence-based programs?
   - Effective in the study populations
   - Cost-effective
   - Shorten the time it takes to develop a program
   - Reduce the time it takes to research a community
   - Help narrow the evaluation
8. Mini-Grants as a Strategy for Dissemination of EBPs
   - Mini-grants are common in health promotion initiatives and have potential for creating demand for evidence-based interventions
   - Mini-grants can be combined with dissemination strategies shown to work:
     - Training workshops (Rohrbach, 2006; Elliot, 2004): increase adoption, capacity, fidelity, maintenance
     - Technical assistance (Pentz, 2006; Shepherd, 2008; Rohrbach, 2006): ongoing support, feedback, coaching
     - Incentives (Basen-Engquist, 1994; Glanz, 2002): stipends, equipment, materials
9. Frameworks for Translation of Evidence
   - Passive diffusion is not enough to encourage the adoption of evidence-based interventions.*
   - Frameworks are needed to guide active dissemination strategies that translate evidence into community practice.

   * Pentz, M. A., Jasuja, G. K., Rohrbach, L. A., Sussman, S., & Bardo, M. T. (2006). Translation in tobacco and drug abuse prevention research. Evaluation & the Health Professions, 29(2), 246-271.
10. Interactive Systems Framework
    Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology, 41(3), 171-181.
11. Mini-Grants Program to Disseminate EBPs
    - A "push-pull method" (i.e., funds + TA) increases demand while building capacity*
    - 2 cohorts: 2007 & 2008 (12-18 month period)
    - 12 Southwest Georgia community organizations awarded
    - Received up to $4,000 & technical assistance (TA)
    - Implemented 5 RTIPs programs (nutrition or physical activity)

    * Green LW, Orleans T, Ottoson JM, Cameron R, Pierce JP, Bettinghaus EP. Inferring strategies for disseminating physical activity policies, programs, and practices from the successes of tobacco control. Am J Prev Med. 2006;31(4)(suppl):S66-S81.
13. Map: 16 mini-grant sites funded 2007-2012, including Tifton, Thomasville, Bainbridge, Albany, Blakely, Pelham, Sylvester, Nashville, Adel, Valdosta, and Cordele
14. 2007-2009: 12 Awarded Sites & 5 Programs
    Funded organizations | Evidence-based program
    - 4 Churches | Body and Soul
    - 4 Worksites | Treatwell 5-A-Day
    - 2 Community Coalitions | Parents as Teachers (PAT) High 5 Low Fat Program
    - Senior Center | Little By Little Nutrition Program
    - Hospital Diabetes Management Center | Patient-Centered Assessment & Counseling for Exercise (PACE)
15. Engaging Community Expertise
    Emory PRC Community Advisory Board (CAB) roles:
    - Prioritized behavioral risk factors: nutrition, physical activity, tobacco prevention/cessation
    - Helped to develop the mini-grants and TA process
    - Facilitated promotion of the program to the community
    - Joint EPRC/CAB review committee selected grantees
    - Currently co-authoring presentations and publications
16. Process Evaluation Questions and Methods (* = collected in both cohorts; other tools in 1st cohort only)
    Reach
      Question: What proportion of the intended audience participated in each activity?
      Methods: Project Report Forms*; Demographics form
    Implementation
      Question (Fidelity): To what extent were core elements of the program implemented as described in program materials?
      Methods: Project Report Forms*; Monthly calls; Interviews (coordinators)*
      Question (Adaptation): How and why did sites adapt core elements of the intervention?
      Methods: Project Report Forms*; Monthly calls; Interviews (coordinators)*; Committee focus group
    Context
      Question: What contextual factors may have affected intervention adoption and implementation?
      Methods: Interviews (coordinators)*; Committee focus group; Monthly calls; Mini-grant applications*; Census data
    Maintenance
      Question: What plans has the site made to continue promoting health after the end of the project?
      Methods: Interviews (coordinators)*; Committee focus group
    Resources
      Question: What resources did EPRC provide to support this project?
      Methods: EPRC financial records*; TA log
      Question: To what extent did grantees perceive that EPRC technical assistance helped them to implement the programs with fidelity?
      Methods: Interviews (coordinators)*
17. Program Fidelity
    - Fidelity: "faithfulness" to the implementation of program elements as they were intended to be delivered in the original intervention
    - Core elements*: required components that represent the theory and internal logic of the intervention and most likely produce the intervention's effectiveness
    - Key process steps: required implementation or program delivery steps conducted to contribute to the intervention's effectiveness

    * Eke, Neumann, Wilkes, Jones. Preparing effective behavioral interventions to be used by prevention providers: The role of researchers during HIV prevention research trials. AIDS Education & Prevention. 2006;18(4 Suppl A):44-58.
18. Program Core Elements
    Core elements for each program were identified based on:
    - Underlying theory & process evaluation findings
    - Published articles describing the program
    - Available program materials
    - The program description on NCI's Research-Tested Intervention Programs (RTIPs) website
19. Adaptation is...
    ...making changes, additions, deletions, or substitutions to an evidence-based program in order to make it more suitable for a particular population and/or an organization's capacity.
20. Fidelity Findings
    - 95% of core elements conducted across all sites
    - 9 of 12 (75%) sites conducted all core elements
      - 3 (of 7) sites in 1st cohort did not conduct all core elements
      - All 5 sites in 2nd cohort conducted all core elements
    - Decided not to conduct due to context/climate:
      "They thought that if we had some type of event like that [family picnic/party], that would be saying now you're asking me to take unemployment weeks ... having an event..." - Site coordinator
21. Contextual Factors (Related to Implementation)
    (* = mentioned in both cohorts; in the original slide, blue text marked barriers that prevented completion of core elements in the 1st cohort)
    Barriers:
    - Schedule/time conflicts*
    - Difficulty with recruitment or retention*
    - Lack of resources/funds*
    - Difficulty with changing behavior
    - Staff/leadership transitions
    - Slow economy/worksite financial difficulties
    Facilitators:
    - Leadership support*
    - Staff/volunteers*
    - Print materials/resources*
    - In-kind resources/facilities*
    - Partnerships*
    - Donated resources*
    - Fit with mission
    - Fit with infrastructure/activities
22. Fidelity-Adaptation Continuum (from high fidelity/minor adaptation to low fidelity/major adaptation; major adaptations need evaluation)
    Adaptation examples, from minor to major:
    - Added/customized materials
    - Added activities
    - Shifted primary audience
    - Held concurrent physical activity & weight loss events
    - Changed delivery format/process steps
    - Expanded audience (to community)
    - Shifted focus to other behaviors
    - Did not complete all core elements
23. Adaptation Quotes
    Expanded the program from worksite/coalition to the community:
    "This project seems to have opened the door for a brand new [obesity] issue that our county had not talked about... all of a sudden the light went on... [the collaborative] said we need to add this to our benchmarks as a group and start working on this." - Site coordinator
    Added physical activity & weight loss events (to a nutrition program):
    "Because of the nutrition part of it, people began to feel better and they had more energy. So they was able to do more physical activities and wanted to do more as far as looking at weight loss..." - Site coordinator
24. Reasons for Adaptations
    - Expand program reach (broader community)
    - Generate/maintain engagement
    - Strengthen/reinforce program message
    - Fit the program to the organization's infrastructure/activities
    - Reach specific audiences (esp. underserved)
    Example: added content to reach a specific audience (teen parents):
    "You got to think about being also sensitive to the age of the parent. If you have [a parent] that's maybe 14... give them something that can be kinda fun..." - Site coordinator
25. A Tale of 4 Sites...
    - Body & Soul / Church: minimal adaptation
      - Minor additions (incentives & activities)
    - Body & Soul / Church: major adaptation
      - Shifted focus to physical activity/weight loss
    - Little by Little / Senior Center: intermediate-major adaptation
      - Assisted delivery of CD-ROM & added activities
    - Treatwell 5-a-Day / CBO: major adaptation
      - Shifted audience to the Advisory Board, then the community
      - Newsletters -> monthly local newspaper stories
26. Limitations
    - Small number of sites (n=12) in rural Southwest Georgia
    - Limited measurement of fidelity & implementation quality
    - 12-18 month time span; more time needed to learn about maintenance
    - Self-report/social desirability bias
    - Data reflect information from only 5 intervention programs
    - Data may not be generalizable to other settings, populations, regions & programs
27. 2010-12 Mini-Grants Cohort
    - Mini-grants period will span 2 years
    - 4 sites funded at $8,000 each
    - Structured and proactive TA and training
    - RTIPs programs:
      - CATCH: Coordinated Approach to Child Health
      - Family Matters
      - Body & Soul
    - Process evaluation focused on TA & training
28. Map of the Adaptation Process
    - Developed a structured TA model derived from the Map of the Adaptation Process (McKleroy et al., 2006)
    - Focus on the objectives of each key step
29. EBP Training Topics (Pre-Award)
    Session titles:
    - What Do We Mean by Evidence-Based?
    - Needs Assessment and Program Planning
    - Finding an Evidence-Based Program
    - Selecting a Program That Fits Your Community
    - Adapting the Evidence-Based Program with Fidelity
    - Implementing an Evidence-Based Program
    - Evaluating Your Program
30. TEACH Model: Translating Evidence into Action through Collaboratives for Health
    TA contact | Structured TA topics (examples) | Stage in Map of Adaptation Process
    - Pre-award training | See prior training slide | Assess, Select, Prepare
    - Kick-off training for awarded sites | EBIs, needs assessment, organizational readiness, core elements | Assess, Select, Prepare
    - Site visit | Fit, adaptation, evaluation planning | Assess, Select, Prepare, Pilot
    - Conference call | Implementation work plan, partnerships | Assess, Select, Prepare, Pilot
    - Ongoing contact | Overcoming barriers, implementation fidelity, maintenance | Assess, Pilot, Implement, Maintenance
31. Tools
    Adapted from: Lesesne, C. A., Lewis, K. M., Moore, C., Fisher, D., Green, D., & Wandersman, A. (2007). Promoting science-based approaches to teen pregnancy prevention using Getting To Outcomes: Draft June 2007. Unpublished manual.
32. TEACH Evaluation Questions
    Kept the original evaluation questions and added capacity questions related to the impact of TEACH:
    - Do attitudes toward EBAs become more positive as a result of the TEACH process?
    - Does self-efficacy for EBA behaviors increase as a result of the TEACH process?
    - Does organizational capacity for EBAs increase as a result of the TEACH process?
33. Process Evaluation Plan
    - Baseline survey (n=20): 80 closed-ended items
    - Follow-up at 3 months (n=12): 76 closed-ended + 4 open-ended items
    - Additional follow-up at 24 months
    - TA tracking database
    - Project Report Forms
    - Qualitative interviews with coordinators at 24 months
34. Participant Descriptions
    - Completed baseline surveys (n=20)
    - Included directors, coordinators, and educators
    - 9 (45%) held supervisory or managerial roles
    - 6 (30%) were "front-line staff"
    - 15 (75%) had a bachelor's degree or higher
    - Averaged 9 years at their current organization
    - 6 (33.3%) reported prior experience with EBPs
    - Almost all (18) reported that someone from their organization advocated for using an EBP for the currently funded mini-grant
35. Survey Measures
    Survey topic area | Example measures (survey questions)
    - Attitudes about EBPs: 14 items (Hannon et al., 2009)
      Likert scale: Strongly Disagree -> Strongly Agree
      - EBPs are easy to understand.
      - EBPs are easy for us to adapt for use in our community.
    - Skills related to EBPs: 18 items (Chinman et al., 2008)
      Likert scale: Very Hard -> Very Easy
      - Assess organizational readiness to implement an evidence-based program.
      - Determine what needs to be changed in an EBP to increase fit to your community.
    - Organizational functioning*: 38 items
      Likert scale: Strongly Disagree -> Strongly Agree
      - We have appropriate staff skills to achieve our mission.
      - Staff use data/information to inform their decision-making.
      - The leadership of the organization fosters respect, trust, inclusiveness, and openness in the organization.

    * Levinger and Bloom, 2000; Weiss et al., 2002; Preskill and Torres, 1998; Caplan, 1971; Kenny and Sofaer, 2000; Schminke et al., 2002
36. Preliminary Results: Skills Related to EBPs
    Tasks rated from 1 = very hard to 5 = very easy; reported as (mean, SD)
    Tasks with higher reported ability:
    - Define goals and objectives for your program. (3.95, 0.83)
    - Discuss the benefits of using evidence-based programs. (3.90, 0.72)
    - Develop an implementation work plan. (3.80, 0.70)
    Tasks with lower reported ability:
    - Plan for maintenance of program (e.g., leveraging of resources). (2.75, 0.97)
    - Develop solutions to identified implementation barriers. (3.00, 1.03)
    - Describe the steps of the program adaptation process. (3.20, 0.77)
    - Prepare for the implementation of your program (e.g., training of staff, hiring of staff, piloting, partnerships). (3.20, 1.15)
37. Attitudes About EBPs
    All scores of negative statements were reverse-coded (*); the higher the mean score, the more positive the attitude about EBPs. (Chart data not preserved in this transcript.)
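The reverse-coding step described above can be sketched as follows. This is a minimal illustration only, assuming a 5-point Likert scale; the item texts and scores below are hypothetical examples, not the actual survey data:

```python
# Sketch of reverse-coding negatively worded Likert items before
# computing a mean attitude score (higher = more positive attitude).
# Assumes a 5-point scale; items and scores are hypothetical.

SCALE_MAX = 5  # 1 = Strongly Disagree ... 5 = Strongly Agree

def reverse_code(score, scale_max=SCALE_MAX):
    """Flip a score on a 1..scale_max Likert scale (e.g., 2 -> 4)."""
    return scale_max + 1 - score

# One respondent's raw answers; reverse=True marks negatively worded items.
responses = [
    {"item": "EBPs are easy to understand.", "score": 4, "reverse": False},
    {"item": "EBPs are too rigid to fit our community.", "score": 2, "reverse": True},
]

adjusted = [
    reverse_code(r["score"]) if r["reverse"] else r["score"]
    for r in responses
]
mean_score = sum(adjusted) / len(adjusted)  # higher = more positive
```

With these hypothetical inputs, the negative item's score of 2 becomes 4 after reverse coding, so both adjusted scores point in the same (positive) direction before averaging.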
38. Implications for Practice
    - Using evidence-based strategies and programs can save time and can benefit communities
    - Mini-grants, training, and technical assistance are promising strategies to translate evidence into community practice
    - Evidence-based programs can be strategically adapted to meet the needs of a community
    - More evaluation is needed to determine how best to adapt and implement EBPs with fidelity
39. Acknowledgements
    - Mini-grant sites
    - Sally Honeycutt, Kirsten Rodgers, Karen Glanz, Johanna Hinman, Jenifer Brents, Molly Russ, Yao Shi, JK Veluswamy, Margaret Clawson, Megan Brock, Nidia Banuelos, Alma Nakasone, Amanda Wyatt, Deltavier Frye, Ana Iturbides
    The CPCRN is part of the Prevention Research Centers Program. It is supported by the Centers for Disease Control and Prevention and the National Cancer Institute (Cooperative Agreement # 1U48DP0010909-01-1).
40. Questions?
    Michelle Carvalho, [email_address]