Best practices for data analysis 16 jan12

Presentation given at the Breakthrough Strategies to Boost HEDIS Scores & Quality Management meeting, Key West, January 16, 2012.

Usage Rights

CC Attribution-NonCommercial License

    Presentation Transcript

    • Best Practices for Data Analysis: A Chasing Logic Case Study. Wayne Pan, MD, MBA, Chief Medical Officer, Pacific Partners Management Services, Inc. Breakthrough Strategies to Boost HEDIS Scores & Quality Management • The Reach Resort, Key West • January 16, 2012
    • Best Practices for Data Analysis: A Chasing Logic Case Study? Wayne Pan, MD, MBA, Chief Medical Officer, Pacific Partners Management Services, Inc. Breakthrough Strategies to Boost HEDIS Scores & Quality Management • The Reach Resort, Key West • January 16, 2012
    • to recap....
      Breaking Down the Silos for Risk Adjustment, HEDIS & Care Management: From Streamlining Charts to Capturing & Sharing the Right Data
      A Complementary Duo - Risk Adjustment and HEDIS: How to Ensure a Plan is Effectively Accomplishing Objectives for Both
      Practical Steps to Improve the Integrity, Quality & Timeliness of Data & Supplemental Data
      How Star Rating Measures Correlate with Overall HEDIS Quality
      The State of Health Care Quality – Plans, Providers and Consumers in the New World of Star Ratings and Exchanges
      A Two-Part Discussion on Effective Quality Improvement Partnerships
      A Collaborated Effort: Plans & Providers Working to Improve Clinical Outcomes
      Provider Education & Incentives: Creating Tools & Toolkits for Providers That They Will Use
    • we cannot continue to keep quality, risk adjustment/stratification, cost-effectiveness, medical management, network management, and member engagement in separate silos
    • that’s because....
    • they’re all related!
    • with the common denominator....
    • the patient.
    • some learnings from California’s IHA P4P....
    • theory: combine the various plans’ quality P4P bonus programs into a single coherent program for physicians
    • results: very modest gains in quality outcomes despite more than half a billion dollars in P4P bonuses paid out since 2004
    • what happened?
    • not enough money set aside
      quality not linked to efficiency
      providers always have sicker, less compliant patients
    • the P4P program didn’t change provider or patient behavior
    • the plans didn’t save any money
    • sponsored by IHA P4P
    • two ideas....
    • combine quality & efficiency....
    • Two Strategic P4P Goals for 2011-2015
      Goal #1: Bend the cost trend. Targets:
      • Total cost below risk-adjusted, geography-adjusted average
      • Total cost trend below Consumer Price Index + 1
      Goal #2: Achieve meaningful quality improvements in clinical care and patient experience, and increase meaningful use of health IT
      Copyright © 2011 Integrated Healthcare Association. All rights reserved.
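A minimal sketch of how Goal #1's two cost targets could be checked in code. The dollar amounts, trend, and CPI figures below are invented for illustration; they are not IHA or plan data.

```python
# Hypothetical check of the two Goal #1 cost targets: total cost below the
# risk-adjusted, geography-adjusted average, and cost trend below CPI + 1.
# All figures passed in below are made up for illustration.

def meets_cost_targets(pmpm_cost: float,
                       adjusted_average_pmpm: float,
                       cost_trend_pct: float,
                       cpi_pct: float) -> bool:
    """Return True only if both cost targets are met."""
    below_average = pmpm_cost < adjusted_average_pmpm
    trend_below_cpi_plus_one = cost_trend_pct < cpi_pct + 1.0
    return below_average and trend_below_cpi_plus_one

# Example: $285 PMPM vs. a $300 adjusted average, 3.5% trend vs. 3.0% CPI.
print(meets_cost_targets(285.0, 300.0, 3.5, 3.0))  # True (3.5 < 3.0 + 1.0)
```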
    • be patient-centered....
    • why does that matter?
    • insanity: doing the same thing over and over and expecting different results
    • get out of our rut?
    • 4x4 healthcare
    • 4 processes
    • communication
    • collaboration
    • coordination (Case Managers, Patients, PCPs, Specialists)
    • anticipation
    • 4-dimensional data
    • financial
    • administrative
    • clinical
    • retrospective
    • reactive care
    • $$$$$$
    • + behavioral
    • predictive
    • prediction can lead to....
    • proactive care
    • From: Dan Roam, “American Healthcare: a 4-napkin explanation” www.slideshare.net/danroam/healthcare-napkins-all
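To make the "predictive leads to proactive care" idea above concrete, here is an illustrative toy risk score that combines the four data dimensions (financial, administrative, clinical, behavioral) so that the highest-risk patients can be worked first. The field names and weights are assumptions made up for this sketch, not anything from the deck.

```python
# Illustrative only: a toy risk score combining the four data dimensions
# (financial, administrative, clinical, behavioral) to flag patients for
# proactive outreach. Field names and weights are invented for the sketch.

WEIGHTS = {
    "claims_cost_last_year": 0.000_05,   # financial
    "er_visits_last_year":   0.30,       # administrative (utilization)
    "chronic_conditions":    0.40,       # clinical
    "missed_appointments":   0.25,       # behavioral
}

def risk_score(patient: dict) -> float:
    """Weighted sum of the four dimensions; higher = higher predicted risk."""
    return sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)

patients = [
    {"id": "A", "claims_cost_last_year": 12_000, "er_visits_last_year": 3,
     "chronic_conditions": 4, "missed_appointments": 2},
    {"id": "B", "claims_cost_last_year": 900, "er_visits_last_year": 0,
     "chronic_conditions": 1, "missed_appointments": 0},
]

# Proactive care: work the list from highest predicted risk down.
for p in sorted(patients, key=risk_score, reverse=True):
    print(p["id"], round(risk_score(p), 2))
```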
    • one more thing....
    • all providers are not the same
    • use multiple communication channels and communicate consistently....
    • case study: SCCIPA
    • Santa Clara County · 1,304.01 sq. miles · population 1,781,642 (2010) · $74,335
    • 5 PCPs / 80 specialists · 173 PCPs / 343 specialists · 57 PCPs / 104 specialists · 11 PCPs / 30 specialists
      SCCIPA: founded in 1986 · physician-owned, physician-governed · 800+ physicians (240+ PCPs, 550+ specialists) · all 9 hospitals, including a tertiary care center · 9 health plans (Commercial and Medicare Advantage)
    • people, processes, platform
    • people
      hospitalists
      SNFists
      onsite case managers
      complex case managers
      utilization review staff
    • processes
      P4P / CMS 5-Star dashboard
      incentive bonus based on quality
      Ascender for HCC process
      paper quarterly physician workplan
      hospitalists perform HCC coding
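For illustration, the sketch below shows the kind of calculation a P4P / 5-Star quality dashboard typically rolls up: a HEDIS-style measure rate as numerator over eligible denominator. The measure names and counts are hypothetical, not SCCIPA data.

```python
# Hypothetical dashboard roll-up: a HEDIS-style rate is the share of the
# eligible population (denominator) that received the service (numerator).
# Measure names and counts below are made up for illustration.

def measure_rate(numerator: int, denominator: int) -> float:
    """Rate as a percentage; 0.0 when there are no eligible members."""
    return 100.0 * numerator / denominator if denominator else 0.0

dashboard = {
    "Breast Cancer Screening":     (412, 530),
    "HbA1c Testing":               (388, 460),
    "Colorectal Cancer Screening": (295, 510),
}

for measure, (num, den) in dashboard.items():
    print(f"{measure}: {measure_rate(num, den):.1f}%  ({num}/{den})")
```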
    • platform
      common web-based communication platform
      facilitates administrative functions
      rules-based management of processes
      intuitive user interface
      embed quality reminders into office/provider workflow
      provider feedback
      provide actionable clinical data at point of care
      allow patients to access their own data
      allow patients to provide feedback and enter their own data
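A minimal sketch of how "rules-based management of processes" with embedded quality reminders might work: each rule inspects a patient record and emits a reminder when a care gap is found, so it can surface in the provider's workflow. The rules, thresholds, and field names are assumptions for this example, not the actual platform's logic.

```python
# Illustrative rules engine: each rule inspects a patient record and returns
# a reminder string when a care gap is found. Rules, thresholds, and field
# names are hypothetical.
from datetime import date, timedelta

def hba1c_overdue(patient):
    last = patient.get("last_hba1c")
    if patient.get("diabetic") and (last is None or
                                    date.today() - last > timedelta(days=180)):
        return "Diabetic patient: HbA1c test overdue (more than 6 months)."
    return None

def flu_shot_due(patient):
    if patient.get("age", 0) >= 65 and not patient.get("flu_shot_this_season"):
        return "Age 65+: influenza vaccination not documented this season."
    return None

RULES = [hba1c_overdue, flu_shot_due]

def reminders(patient):
    """Run every rule and keep only the reminders that fire."""
    return [msg for rule in RULES for msg in [rule(patient)] if msg]

print(reminders({"diabetic": True, "last_hba1c": date(2011, 3, 1), "age": 70}))
```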
    • results
    • [Chart] Inpatient Hospital Admissions per 1,000 Enrolled Patients (Commercial): SCCIPA admits per 1,000 vs. 2011 Milliman benchmarks (well, moderately, and loosely managed)
    • [Chart] Bed Days per 1,000 Enrolled Population (Commercial): SCCIPA bed days vs. 2011 Milliman benchmarks (well, moderately, and loosely managed)
    • [Chart] Average Length of Stay (Commercial): SCCIPA average length of stay vs. 2011 Milliman benchmarks (well, moderately, and loosely managed)
    • [Chart] Inpatient Hospital Admissions per 1,000 Enrolled Population (Medicare): SCCIPA admits vs. 2011 Milliman benchmarks (well, moderately, and loosely managed)
    • [Chart] Bed Days per 1,000 Enrolled Population (Medicare): SCCIPA bed days vs. 2011 Milliman benchmarks (well, moderately, and loosely managed)
    • [Chart] Average Length of Stay (Medicare): SCCIPA ALOS vs. 2011 Milliman benchmarks (well, moderately, and loosely managed)
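The charts above compare SCCIPA utilization against the 2011 Milliman well, moderately, and loosely managed benchmarks. A sketch of that kind of comparison is below; the benchmark thresholds and the observed rate are placeholders, since the actual values are not reproduced here.

```python
# Sketch of the benchmark comparison behind the charts above. The numbers
# are placeholders; the 2011 Milliman benchmarks and the actual SCCIPA
# results are not reproduced here.

BENCHMARKS = {  # admits per 1,000 enrolled (commercial), placeholder values
    "well managed": 50.0,
    "moderately managed": 60.0,
    "loosely managed": 70.0,
}

def classify(observed, benchmarks):
    """Name the tightest benchmark band the observed rate falls at or under."""
    for label, threshold in sorted(benchmarks.items(), key=lambda kv: kv[1]):
        if observed <= threshold:
            return f"at or better than the '{label}' benchmark ({threshold})"
    return "worse than the 'loosely managed' benchmark"

print(classify(48.0, BENCHMARKS))  # placeholder SCCIPA rate
```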
    • * * * * * increased or same scores in 21 of 26 measures
    • no
    • When you improve a little bit each day, eventually big things occur. Don’t look for big, quick improvement. Instead, seek small improvement one day at a time. That’s the only way it happens - and when it happens, it lasts. (John Wooden)
    • iteration
    • Virtually nothing comes out right the first time. Failures, repeated failures, are finger posts on the road to achievement. The only time you don’t want to fail is the last time you try something. One fails forward toward success. (Charles F. Kettering)
    • don’t be afraid to FAIL....
    • fail fast
    • combine quality & efficiency.
    • be patient-centered.
    • use multiple communication channels and communicate consistently....
    • welcome to healthcare 2.0
    • Thank you! wpan@ppmsi.com · WWW.SNOOPY.COM