Obesity for steering committee 3-19-12 final

Speaker notes:
  • Reminder of where we are in the process. Reminder of some of the hopes from last time. Brief summary of what was on people's minds and how that was woven into the process (if needed).
  • Review agenda (tight agenda, will keep us moving). Other handouts. Suggest norms.
  • Then, using the literature review typology, we assigned an evidence rating to each strategy. This rating is based on both the type of research conducted and the results of that research; we can't go off the data source alone, we also need to consider the results. Background on the typology: its focus is evidence-based public health, not evidence-based medicine. It is an adaptation of the Healthy People 2020 typology with an added category, "Not Recommended" (to define the "Not Recommended" category, the group borrowed from category I, Insufficient Evidence to Make a Recommendation, and category D, Not Recommended, from the U.S. Preventive Services Task Force (#5), as well as our own original ideas). EPE/PSD began integrating this typology in November 2011. The typology handout has more information about the classifications. Examples: a systematic review that is over 10 years old is not necessarily "proven"; a systematic review that shows there is no good evidence most likely falls in the "emerging" category, because there simply is not enough information about the results.
  • Refer to pages 6-13. The specific information requested is on these pages, broken down by sector. Some of the information gaps may have been filled simply by seeing the overall survey results (i.e., what the state role is); others may not. If you still need more information, please raise your specific question in the next segment, where each sector team will have time for Q&A.
  • Tentative results: a total of 59 strategies for increasing physical activity and healthy eating were identified by the 8 sector teams, with evidence ratings ranging from "not recommended" to "proven." Each sector team will have 5 minutes to answer questions related to strategies within its sector. Determine who will be answering questions. Timekeeping.
Slide transcript:

    1. Prioritizing Obesity Strategies. Obesity Integration Steering Committee meetings: 2.29.12, 3.19.12, 4.2.12
    2. Prioritization Process (When: Purpose/Actions)
        Meeting 1 (2/29): Getting Started; Agreements for Moving Forward
        Between Meetings (e-input): Survey coming tomorrow (due 3/9); Review/Provide Input on Proposed Criteria; Identify State Roles for Strategies
        Meeting 2 (3/19, 9:30am-12:30pm): Review/Finalize Criteria; Clarify Strategies; Clarify State Roles
        Between Meetings (e-input): Prioritization Survey (sent 3/21, due 3/26); Rate Strategies Using Final Criteria; Refine Concise Statement
        Meeting 3 (4/2, 9:30am-12:30pm): Share Results of Prioritization; Gather Additional Input for Ex Committee
    3. Agenda Review: Welcome and Overview; Review/Finalize Criteria; Break; Clarify Strategies and State Roles; Next Steps and Closure
    4. Discussions and Decisions. Discussion: all encouraged to participate. Decisions (for today and the prioritization survey between Meetings 2 and 3): 60% supermajority vote, motioned by a Steering Committee member; one vote per Steering Committee member; Executive Committee members and other observers do not vote.
    5. Steering Committee Voting Members: Sector Team Leads or Designee (8 reps from PSD); Healthy Eating (2 reps from PSD); LHAs (5 total: 1 rep each from El Paso, Weld, Boulder, Pueblo, West-Central Partnership); External Organizations (3 total: 1 rep each from Live Well, Kaiser, Health Foundation)
    6. Review/Finalize Criteria
    7. The "Ask" Regarding Criteria: 11 prioritization criteria were proposed. Respondents were asked to make 2 judgments: RATE each criterion and RANK the top five criteria, with an opportunity to suggest other criteria. 19 people responded.
    8. Criteria Ratings - Results (chart of ratings on a scale from "Not at All Well" to "Very Well")
    9. Rating Takeaways: All criteria would do fairly well or better at helping during the prioritization process. Could potentially combine some criteria: Population Impact and Expected Reach? Community Support, Capacity to Implement, and Opportunity for Leverage? Could consider the following additional criteria: Alignment with National Priorities; Evidence Level.
    10. Criteria Rankings - Results
    11. Ranking Takeaways: Ranking is a more effective way to distinguish preferences among criteria that are all helpful. The "Least Helpful" ranking could be misleading; it only means least helpful among each respondent's 5 favorites. At least 60% of responding members chose the following as one of their top 5 criteria: Likelihood of Population Impact; Capacity to Implement.
    12. Considerations: Evidence is being published every day. The Executive Committee will get the full list of strategies with their ranks from the prioritization survey. Future implementation teams will consider applicability to Colorado and resources needed. PSD will keep you informed.
    13. Demonstration of Prioritization Scoring: There is no single right way; rather, match the scoring to how you want to make the decision. Rate each strategy against the criteria (criteria scale: 1 = "Little" to 5 = "Great"). Create a prioritization score from your ratings: add the ratings on each criterion, divide by the number of criteria, then divide by the number of respondents (raters). Expect ties. (A worked sketch of this scoring appears after the slide transcript.)
    14. Demonstration of Prioritization Scoring ("paint your house" example; rank comparison across the methods on slides 14-17)
        Paint your house | Criteria A: True Blue | Criteria B: Easy to Clean | Sum | Priority Score | Rank (2 criteria) | Rank (3 criteria) | Rank (weighting) | Rank (multiplier)
        Strategy A | 1 | 3 | 4 | 2.0 | 6 | 6 | 6 | 2
        Strategy B | 2 | 3 | 5 | 2.5 | 4 | 2 | 2 | 3
        Strategy C | 3 | 2 | 5 | 2.5 | 4 | 5 | 5 | 3
        Strategy D | 4 | 5 | 9 | 4.5 | 1 | 1 | 1 | 1
        Strategy E | 5 | 3 | 8 | 4.0 | 2 | 2 | 3 | 6
        Strategy F | 3 | 3 | 6 | 3.0 | 3 | 4 | 3 | 5
    15. Demonstration of Prioritization Scoring, with 3 criteria (equal weight of 1 on each criterion)
        Paint your house | Criteria A: True Blue | Criteria B: Easy to Clean | Criteria C: Long-lasting | Sum | Priority Score | Rank (3 criteria)
        Strategy A | 1 | 3 | 1 | 5 | 1.7 | 6
        Strategy B | 2 | 3 | 5 | 10 | 3.3 | 2
        Strategy C | 3 | 2 | 3 | 8 | 2.7 | 5
        Strategy D | 4 | 5 | 5 | 14 | 4.7 | 1
        Strategy E | 5 | 3 | 2 | 10 | 3.3 | 2
        Strategy F | 3 | 3 | 3 | 9 | 3.0 | 4
    16. Demonstration of Prioritization Scoring, with 3 criteria and weighting ("Long-lasting" is listed in two columns, giving it double weight; each column has weight 1)
        Paint your house | Criteria A: True Blue | Criteria B: Easy to Clean | Criteria C: Long-lasting | Criteria C: Long-lasting | Priority Score | Rank
        Strategy A | 1 | 3 | 1 | 1 | 1.5 | 6
        Strategy B | 2 | 3 | 5 | 5 | 3.8 | 2
        Strategy C | 3 | 2 | 3 | 3 | 2.8 | 5
        Strategy D | 4 | 5 | 5 | 5 | 4.8 | 1
        Strategy E | 5 | 3 | 2 | 2 | 3.0 | 3
        Strategy F | 3 | 3 | 3 | 3 | 3.0 | 3
    17. Demonstration of Prioritization Scoring, with a multiplier for evidence level (see the weighting and multiplier sketch after the slide transcript)
        Paint your house | Criteria A: True Blue | Criteria B: Easy to Clean | Priority Score | Evidence (multiplier) | Final Score | Rank
        Strategy A | 1 | 3 | 2.0 | Proven (4) | 8.0 | 2
        Strategy B | 2 | 3 | 2.5 | Likely Effective (3) | 7.5 | 3
        Strategy C | 3 | 2 | 2.5 | Likely Effective (3) | 7.5 | 3
        Strategy D | 4 | 5 | 4.5 | Promising (2) | 9.0 | 1
        Strategy E | 5 | 3 | 4.0 | Emerging (1) | 4.0 | 6
        Strategy F | 3 | 3 | 3.0 | Promising (2) | 6.0 | 5
    18. Discussion & Decision. Discussion: Can some of the potential prioritization criteria be combined based on similarity? Could potentially combine: Population Impact and Expected Reach; Community Support, Capacity to Implement, and Opportunity for Leverage. Decision: vote on the combination(s) suggested.
    19. Discussion of Additional Criteria: Could consider the following additional criteria: Alignment with National Priorities; Evidence Level.
    20. In a Nutshell: Levels of Evidence in PSD
        Proven: systematic or narrative reviews; considers study design and execution, external validity, body of evidence, and results.
        Likely Effective: peer-reviewed articles in the scientific literature; considers study design and execution, external validity, body of evidence, and results.
        Promising: written program evaluation without formal peer review; considers summative evidence of effectiveness, theory, and formative evaluation data.
        Emerging: ongoing work with little evidence so far, but sound theory and evaluation in place.
        Not Recommended: evidence of effectiveness is conflicting and/or of poor quality and/or suggestive of harm.
    21. Discussion & Decision. Discussion: How many criteria should be included in the prioritization rubric? Top 3, top 5, or all? Decision: vote on the number of criteria to be included in the final prioritization rubric.
    22. Discussion & Decision. Discussion: Should any of the prioritization criteria be more heavily weighted than others? Decision: vote on weighting.
    23. Clarify Strategies and State Roles
    24. The "Ask" Regarding Strategies and CDPHE Roles: 58 strategies were presented (only those with some level of evidence were included). Members were asked to select appropriate CDPHE roles and identify information gaps for each strategy, for three purposes: (1) information about the CDPHE role may be helpful at a later stage in the EBPH process; (2) can we eliminate any strategies based on no CDPHE role?; (3) what information is needed before we can move forward? (Pages 4-13)
        Notes: 1. The evidence level was noted inaccurately on the survey for 2 Built Environment strategies: transportation policy/access to transit, and open space preservation. 2. The Diabetes Prevention Program has now been designated as "Proven."
    25. Takeaways: Four roles were most commonly noted for CDPHE: influence state-level policy with regard to this strategy; provide funding to local or state partners to implement this strategy; provide guidance and/or technical assistance to local or state implementation partners; coordinate activities with other state agencies. No strategies met the elimination criteria, but the specific CDPHE role was unclear for 8 strategies (see page 5). Members desire more information about 8 strategies (see pages 6-13).
    26. Literature Review Results (Sector | Evidence Ratings)
        Schools | 7 Likely Effective; 2 Promising; 5 Emerging
        Child Care | 4 Likely Effective; 3 Emerging; 2 Not Recommended
        Food Systems | 2 Likely Effective; 3 Promising
        Health Systems | 2 Proven; 3 Likely Effective; 1 Promising; 1 Emerging
        Worksites | 5 Proven; 1 Likely Effective
        Media | 1 Likely Effective; 2 Promising; 2 Emerging; 1 Not Recommended
        Community | 1 Proven; 5 Likely Effective; 1 Emerging
        Built Environment | 5 Likely Effective; 2 Emerging
    27. Discussion & Decision. Discussion: Should strategies where no clear agreement was reached about a CDPHE role be included in or excluded from the prioritization process? Decision: vote on whether to eliminate the 8 strategies from the final prioritization rubric.
    28. Discussion & Decision. Discussion: Should any strategies not previously identified by the sector teams be included in the prioritization process? For each new strategy discussed, we need: a description; the evidence (individual study? narrative review? systematic review?); and the potential state role. Decision: vote on whether or not there is a state role for each new strategy proposed. If so, the strategy moves to the list to be prioritized.
    29. Final Tally of # of Strategies Moving Forward for Prioritization
    30. Next Steps: Prioritization Process (When: Purpose/Actions)
        Meeting 1 (2/29): Getting Started; Agreements for Moving Forward
        Between Meetings (e-input): Survey coming tomorrow (due 3/9); Review/Provide Input on Proposed Criteria; Identify State Roles for Strategies
        Meeting 2 (3/19, 9:30am-12:30pm): Review/Finalize Criteria; Clarify Strategies; Clarify State Roles
        Between Meetings (e-input): Prioritization Survey (sent 3/21, due 3/26); Rate Strategies Using Final Criteria; Refine Concise Statement
        Meeting 3 (4/2, 9:30am-12:30pm): Share Results of Prioritization; Gather Additional Input for Ex Committee
    31. Thank You and Closure: On the index card provided, let us know what is still on your mind. Leave note sheets on the table before you leave. For those on the phone: email your responses to Laurie.schneider@ucdenver.edu.
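
The following is a minimal Python sketch of the scoring arithmetic described on slide 13, using the 3-criteria "paint your house" ratings from slide 15 and a single rater. The data layout and function name are illustrative assumptions, not part of the committee's materials; only the rating numbers and the sum-then-divide rule come from the slides.

```python
# Sketch of slide 13's rule: add the ratings on each criterion, divide by the
# number of criteria, then divide by the number of raters. Ratings reproduce
# the 3-criteria "paint your house" example on slide 15 (one hypothetical rater).

RATINGS = {
    "Strategy A": {"True Blue": 1, "Easy to clean": 3, "Long-lasting": 1},
    "Strategy B": {"True Blue": 2, "Easy to clean": 3, "Long-lasting": 5},
    "Strategy C": {"True Blue": 3, "Easy to clean": 2, "Long-lasting": 3},
    "Strategy D": {"True Blue": 4, "Easy to clean": 5, "Long-lasting": 5},
    "Strategy E": {"True Blue": 5, "Easy to clean": 3, "Long-lasting": 2},
    "Strategy F": {"True Blue": 3, "Easy to clean": 3, "Long-lasting": 3},
}

def priority_score(per_rater_ratings):
    """Average the 1-5 ratings over criteria, then over raters."""
    num_raters = len(per_rater_ratings)
    num_criteria = len(per_rater_ratings[0])
    total = sum(sum(ratings.values()) for ratings in per_rater_ratings)
    return total / num_criteria / num_raters

scores = {name: priority_score([ratings]) for name, ratings in RATINGS.items()}
for name, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{name}: {score:.1f}")
# Strategy D leads at 4.7, and Strategies B and E tie at 3.3, matching the
# priority scores on slide 15; ties are expected, as slide 13 notes.
```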
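
Slides 16 and 17 extend the example with criterion weighting and an evidence multiplier. The sketch below shows how those two variants appear to be computed from the table values: a weighted average of the ratings, then multiplication by the evidence level's numeric factor (Proven = 4, Likely Effective = 3, Promising = 2, Emerging = 1, as printed on slide 17). The function names and the multiplication step are inferred from the numbers rather than stated outright in the deck.

```python
# Illustrative sketch of the weighting (slide 16) and evidence-multiplier
# (slide 17) variants. Multiplier values are taken from slide 17; the rest of
# the structure is an assumption made for this example.

EVIDENCE_MULTIPLIER = {"Proven": 4, "Likely Effective": 3, "Promising": 2, "Emerging": 1}

def weighted_score(ratings, weights):
    """Weighted average of the 1-5 ratings. Slide 16 gets a weight of 2 on
    "Long-lasting" by simply listing that criterion in two columns."""
    total = sum(ratings[criterion] * weight for criterion, weight in weights.items())
    return total / sum(weights.values())

def final_score(ratings, weights, evidence_level):
    """Slide 17's final score looks like the priority score multiplied by the
    multiplier for the strategy's evidence level."""
    return weighted_score(ratings, weights) * EVIDENCE_MULTIPLIER[evidence_level]

# Strategy D from the example:
ratings_d = {"True Blue": 4, "Easy to clean": 5, "Long-lasting": 5}

# Slide 16: three criteria with "Long-lasting" double-weighted -> 19 / 4 = 4.75 (shown as 4.8)
print(weighted_score(ratings_d, {"True Blue": 1, "Easy to clean": 1, "Long-lasting": 2}))

# Slide 17: two equally weighted criteria, evidence "Promising" -> 4.5 * 2 = 9.0
print(final_score(ratings_d, {"True Blue": 1, "Easy to clean": 1}, "Promising"))
```

Under this scheme, strong evidence can outweigh higher criterion ratings: on slide 17, Strategy A (priority score 2.0, Proven) ranks ahead of Strategy E (priority score 4.0, Emerging).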
