Plenary Day 1 Speaker 1 Hal Sox

  1. Knowing What Works in Health Care: A Roadmap for the Nation. IOM Committee on Reviewing Evidence to Identify Highly Effective Clinical Services, January 2008
  2. Committee charge: to recommend
     - An approach to identifying highly effective services
     - A process to evaluate evidence on clinical effectiveness
     - An organizational framework for using evidence reports to make recommendations
  3. [Diagram: Body of evidence → systematic review (done by Evidence-Based Practice Centers and proprietary firms) → policy (practice guidelines, performance measures, insurance coverage). Organizations shown: USPSTF, ACP, NIH, BC, CMS, ACC/AHA]
  4. Strengths of the system
     - Some excellent role models for transparent guideline development
     - A network of skilled systematic review (SR) teams
     - Some guideline (GL) users have considerable leverage
       - Coverage decisions, performance measures
  5. The problems
     - SRs and GLs often lack scientific rigor
     - The body of evidence is often weak
     - It is difficult for users to connect recommendations to the evidence
       - No standard language for rating the strength of evidence or recommendations
       - No standard process for developing recommendations
       - No expectation of clear explanations
       - Bias and conflict of interest hidden from view
     - Duplicated effort and conflicting recommendations
  6. Policy in the US
     - A work in progress: a national program for clinical effectiveness research
       - Open questions: fund primary research? develop GLs?
       - Everyone is calling for it
       - Legislation is being written
       - An Institute of Medicine committee has produced a high-level blueprint
  7. Comparative clinical effectiveness research
     - Primary investigation
       - RCTs, cohort studies, measures of test performance
       - Head-to-head comparisons of drugs and tests
     - Summative research
       - Systematic reviews and meta-analyses of primary investigations
  8. IOM committee charge
     - Primary investigation
       - RCTs, cohort studies, measures of test performance
       - Head-to-head comparisons of drugs and tests
     - Summative research (the focus of the IOM report)
       - Systematic reviews
       - Meta-analyses
  9. The Committee's recommendations
  10. Recommendation: high level
     - Congress should direct the Secretary of DHHS to designate a single entity (The Program) to produce credible, unbiased information on clinical effectiveness. It would:
       - Set priorities for and fund SRs
       - Develop a common language and standards for SRs and GLs
       - Provide a forum to address conflicting GLs
  11. [Diagram: The Program — setting priorities → systematic review → policy (practice guidelines, performance measures, insurance coverage)]
  12. Recommendation: high level
     - The Program should develop standards to minimize bias due to conflict of interest
       - For priority setting
       - For evidence assessment
       - For recommendations
  13. [Diagram repeated: setting priorities → systematic review → policy]
  14. Recommendation: setting priorities
     - The Program should appoint a standing committee (the Priority Setting Advisory Committee, PSAC)
     - The priority-setting process should be open, transparent, efficient, and timely
  15. Recommendation: setting priorities
     - Priorities should reflect the potential to
       - Improve health outcomes
       - Reduce the burden of disease and health disparities
       - Eliminate undesirable variation
       - Reduce the economic burden of disease
       - Reduce the economic burden of treatment
     - PSAC members should have a broad range of expertise and interests
     - Minimize committee bias due to conflict of interest (COI)
  16. [Diagram repeated: setting priorities → systematic review → policy]
  17. Problems with systematic reviews: the IOM committee's view
     - Current practice falls short of the ideal
     - Methods poorly documented
     - Plans poorly executed
     - Quality of studies not assessed
     - Inappropriate statistical techniques
  18. Assessing evidence: recommendations
     - The Program should develop and require:
       - Evidence-based standards for SRs
       - A common language for stating the strength of evidence
     - The Program should invest in:
       - Developing better methods
       - The professional workforce that conducts SRs
  19. Developing practice guidelines: recommendations
  20. [Diagram: the pipeline (setting priorities → systematic review → policy: practice guidelines, performance measures, insurance coverage), labeled "Done by the Program"]
  21. Hybrid vs. agency model
     [Diagram: each model spans priority setting (PS) → systematic review (SR) → guideline (GL) production. Hybrid model: The Program conducts PS and SR while existing entities produce GLs. Agency model: The Program does all three.]
  22. Hybrid vs. agency model [diagram repeated]
  23. [Diagram: setting priorities and systematic review done by The Program; policy (practice guidelines, performance measures, insurance coverage) done by existing entities]
  24. Rationale for the hybrid model
     - Reduce political opposition to creating The Program
     - Render The Program less vulnerable to political pressure
     - Engender trust through the clinical credibility of professional organizations
     - Avoid duplication of effort by The Program and professional organizations
  25. Won't the hybrid model simply perpetuate the chaotic system we have now?
  26. Won't the hybrid model simply perpetuate the chaotic system we have now?
     Not if all guideline producers adhere to The Program's standards for process and its common language for rating the strength of evidence and recommendations.
  27. How to motivate guideline developers to adhere?
     Tie their credibility to adherence to standards of good practice.
  28. Recommendations: developing guidelines
     - Guideline groups should use The Program's standards, document their adherence, and publish the documentation
     - Users of guidelines should preferentially use recommendations developed according to Program standards
  29. Challenges for The Program
     - Developing consensus on standards for SRs and GLs
       - Language
       - Process
     - Developing brand credibility
       - Meeting The Program's standards becomes a mark of GL quality that GL users require as a matter of corporate integrity
  30. The IOM Committee's recommendations about priority setting
  31. Getting nominations

        Organization     Solicitation                       Who can nominate?
        AHRQ & USPSTF    Federal Register + stakeholders    Anyone
        BCBSA            BCBSA people                       BCBSA people
        Cochrane         Varies                             Anyone
        MCAC             CMS website                        Anyone
        NICE             NHS                                Anyone

     Source: IOM report "Knowing What Works...", 2008
  32. Topic nominations received by AHRQ (2005 and 2006)

        Source                    Number
        Federal agencies          20
        Health plans              3
        Professional societies    16
        Other                     8
        Total                     47

     Source: IOM report "Knowing What Works...", 2008
  33. AHRQ: categories of topics nominated

        Category                   Number
        Prevention                 7
        Diagnosis                  13
        Treatment                  36
        Rehabilitation             1
        Organization and finance   6
        QI and patient safety      8
        Other                      5

     Source: IOM report "Knowing What Works...", 2008
  34. Priority-setting criteria across AHRQ, BC-BSA, OMAR, NICE, and USPSTF

        Impact          used by all five organizations
        Burden          four of the five
        Controversy     four of the five
        New evidence    four of the five
        Cost            two of the five
        Variation       two of the five

     Source: IOM report "Knowing What Works...", 2008
  35. The IOM committee's conclusions
     - We lack evidence to inform a choice between priority-setting methods
     - AHRQ has extensive experience with an open process
     - Consensus exists on the key variables (e.g., variation)
     - The Program needs a fast-track review process with its own priority-setting process
       - Health plans need to decide about new technologies quickly
  36. The IOM committee's conclusions (cont.)
     - The Program needs a timely process for revisiting earlier reviews
       - Another task competing for SR resources
       - The evidence base changes rapidly:
         - Shojania et al. (2007) found that 57% of SRs had a signal that would change the basic conclusion, at a median of 5 years
  37. The IOM committee recommendations
     - Appoint a Priority Setting Advisory Committee
       - The process should be open, transparent, efficient, and timely
       - Key criteria include:
         - Impact on health outcomes across the lifespan
         - Reducing disease burden and disparities
         - Reducing variation
         - Taking costs into account
       - Seek a balanced committee to reduce the impact of bias due to conflict of interest
  38. Principles to guide the priority-setting process
     - Consistency: develop rules and stick to them
     - Efficiency
     - Objectivity: stick to the criteria and the evidence; avoid bias due to conflict of interest
     - Responsiveness: meet the needs of decision makers
     - Transparency: define methods and post them; meetings are open to anyone
  39. The IOM committee's two paramount criteria
     - Priorities should reflect what patients and doctors want to know
     - They should also reflect the potential of the intervention to improve outcomes that are important to patients
  40. Two examples of priority setting
     - The 1992 IOM committee's recommended process
     - AHRQ's process for selecting topics for the Clinical Effectiveness Reviews Program
  41. IOM committee to recommend a priority-setting process (1992)
     - The report described an explicit process for developing a priority list
     - Charles Phelps, PhD, contributed the basic idea
     - The agency for which the process was intended folded, and the method never gained much traction
  42. Priority Score = Σ (W_n × ln S_n), where W_n is the weight for criterion n and S_n is the score for that criterion. Source: IOM Committee on Priority Setting, 1992
  43. Criteria entering the priority score (Priority Score = Σ W_n × ln S_n):
     - Prevalence of condition
     - Burden of illness
     - Cost
     - Variation in rates of use
     - Potential of the technology assessment (TA) to change health outcomes
     - Potential of TA to change costs
     - Potential of TA to change ethical, legal, and social (ELS) issues
  44. Steps in the process (Priority Score = Σ W_n × ln S_n)

        Step                                            Who
        Choose criteria and assign weights              Panel
        Solicit nominations of candidate technologies   Staff
        Winnow the list                                 Panel
        Obtain data for each criterion                  Staff + panel
        Assign a score to each criterion                Model
        Calculate priority score and rank order         Staff
  45. Worked example: criterion weights (W_n) and scores (S_n) for two candidate topics

        Criterion    W_n    Acute surgery          Cardiac condition
                            S_n      W_n·ln(S_n)   S_n      W_n·ln(S_n)
        Prevalence   1.6    30       5.44          100      7.37
        Burden       2.25   4.3      3.28          1.7      1.19
        Cost         1.5    9,000    13.66         1,800    11.24
        Variation    1.2    0.36     -1.23         0.17     -2.13
        Δ outcome    2.0    3.2      2.33          3.7      2.62
        Δ cost       1.5    4.0      2.08          2        1.04
        Δ ELS        1.0    2.0      0.69          1        0

  46. Summing the columns, Priority Score = Σ W_n × ln(S_n): acute surgery = 26.25; cardiac condition = 21.34
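  The arithmetic behind these totals is easy to mechanize. Below is a minimal Python sketch (not part of the original slides; the dictionary layout and criterion keys are illustrative choices) that reproduces the two priority scores in the table above.

  ```python
  import math

  # Weights W_n for each criterion, taken from the worked example above
  weights = {
      "prevalence": 1.6,
      "burden": 2.25,
      "cost": 1.5,
      "variation": 1.2,
      "delta_outcome": 2.0,
      "delta_cost": 1.5,
      "delta_els": 1.0,
  }

  # Scores S_n for the two candidate topics
  scores = {
      "acute surgery": {
          "prevalence": 30, "burden": 4.3, "cost": 9000,
          "variation": 0.36, "delta_outcome": 3.2,
          "delta_cost": 4.0, "delta_els": 2.0,
      },
      "cardiac condition": {
          "prevalence": 100, "burden": 1.7, "cost": 1800,
          "variation": 0.17, "delta_outcome": 3.7,
          "delta_cost": 2.0, "delta_els": 1.0,
      },
  }

  def priority_score(s: dict) -> float:
      """Priority Score = sum over criteria n of W_n * ln(S_n)."""
      return sum(w * math.log(s[name]) for name, w in weights.items())

  # Rank-order the candidates, highest priority first
  for name, s in sorted(scores.items(), key=lambda kv: -priority_score(kv[1])):
      print(f"{name}: {priority_score(s):.2f}")
  # acute surgery: 26.25
  # cardiac condition: 21.34
  ```

  Note the role of the logarithm: ln(S_n) compresses criteria measured on very different scales (cost in dollars versus small subjective ratings), so no single criterion's raw magnitude dominates the weighted sum.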
  47. Priority setting for AHRQ's Clinical Effectiveness Review Program
     - AHRQ invites the public to nominate topics on a public website
     - Quarterly, AHRQ collates the topics
     - The Scientific Resource Center writes a topic summary
  48. AHRQ's five priority-setting criteria
     - Prevalence of a condition
     - Burden of a condition
     - Cost of care of a condition
     - Disproportionate representation of the condition in the Medicare population
     - Potential for impact
     Source: Jean Slutsky, personal communication
