Developing the Dashboard

Notes
  • Wherever we see systematic measurement of results in healthcare, no matter the country, we see those results improve.
  • Data and the resulting information that support our thinking feel good; they actually produce a dopamine rush. However, data that do not support our thinking yield the greatest insight. (New York Times Sunday Review article, 10/19: "Why We Make Bad Decisions.")
  • The biggest challenge is getting leaders on the same page with the overall strategy. The dashboard is a tool to facilitate engagement in the process and in the evaluation of outcomes.
  • Use an iterative approach.
    This allows the audience to make sure the dashboard answers the right questions.
    It also lets us quickly assess feasibility and translate business needs into data and technical requirements.
  • First, we established the baseline measurement period.
    Then we determined metrics, capitalizing on existing reports.
    These metrics were our key performance indicators.
    We felt strongly about including a cost component; describe the metric we used.
  • Talk about the importance of determining the baseline.
    Talk about the process for getting data from multiple sources.
    Talk about the importance of leadership making this work (supplying data, etc.) a priority.
    Talk about the underlying foundation of data integrity: trust in validity and transparency fostered by senior leadership.
  • True needs always become clearer after the dashboard or report is already in use.
  • Grounding in academics
  • Developing the Dashboard

    1. Developing a Dashboard. Measurement and Evaluation: Understanding the Impact of Innovation. October 29, 2013.
    2. Objectives: Describe the dashboard development process in the context of MGH Innovation Unit work. Identify barriers and facilitators in developing, implementing, and sustaining the Innovation Unit Dashboard performance measurement tool.
    3. Agenda: Overview of Innovation Units; Empirical Outcomes; Dashboard Development Rationale; Innovation Unit Dashboard (metric selection, data sources and relevant benchmarks); Using Data to Tell Stories; Future Considerations.
    4. Positioning MGH for the Future (diagram): Care Redesign and Population Management to reduce the trend of healthcare costs; long-term outpatient care, multidisciplinary services, and a large patient population; the Patient Journey; patient affordability for MGH and payers; direct patient care (ED, Periop, Inpatient/Innovation Units); overhead (non-labor costs); incentives, intrinsic and extrinsic; technology application (Partners E-Care, outcomes registries).
    5. Innovating Care at MGH: We are attempting transformational change. Innovation Units are tests of change that will help us quickly identify what works and what does not work to improve the quality of care delivered to our patients. High-performing interdisciplinary teams deliver safe, effective, efficient, timely, equitable care that is patient- and family-centered. Standardization of processes and care reduces variation and introduces a systematic approach to improving quality and safety in the inpatient setting. Identify and prioritize hazards and opportunities for standardization, then implement evidence-based methods to rectify the problem.
    6. Three Key Areas of Focus and Four Desired Outcomes. Focus: (1) new culture through Relationship-Based Care; (2) new role of the Attending Nurse and Domains of Practice; (3) standardized processes (throughput and LOS reduction, technology, controlling variation, implementing evidence-based practice). Outcomes: (1) patient satisfaction, with care that is equitable and patient- and family-focused; (2) clinical quality, to improve quality and make care safer; (3) unit cost reductions, to make care more cost-effective; (4) staff satisfaction, to remain a great place to practice.
    7. "Patient Journey" Framework (diagram). Before: preadmission care. During: admission process (ED, direct admits, transfers) and patient stay (direct patient care, tests, treatments, procedures, clinical support, operational support). After: discharge process and post-discharge care. Support functions: Finance, Information Systems, HR. Goal: high-performing interdisciplinary teams that deliver safe, effective, timely, efficient, and equitable care that is patient- and family-centered. Where are there opportunities to reduce costs across these processes of care?
    8. Innovations in Care Delivery, "Patient Journey" Framework: initial 15 interventions (diagram), mapped before, during, and after the stay: discharge planning (established discharge date, discharge disposition); Domains of Practice; daily interdisciplinary team rounds; electronic unit whiteboards; in-room whiteboards; smart phones; wireless laptop computers/tablets; business cards; hourly rounding; quiet hours; welcome packet (notebook and discharge envelope); relationship-based care; the Attending Nurse role; discharge follow-up call program; hand-over rounding checklist. Goal: high-performing, interdisciplinary teams that deliver safe, effective, timely, efficient, and equitable care that is patient- and family-centered. (Copyright MGH 2012.)
    9. Focus on Empirical Outcomes: Focus on "What difference have you made?" Shift from structure and process to outcomes. Key indicators that paint a picture of the organization.
    10. Donabedian Model (Donabedian, 1966; 1990). © American Nurses Credentialing Center.
    11. Evaluation (table): innovation cluster focus areas, interventions, and evaluation (pre, during, post). Throughout admission: relationship-based care, the Attending Nurse, the handover rounding checklist, patient engagement. Pre-admission: pre-admit data collection, welcome packet. During admission (roles and structures, education, communication): Domains of Practice, interdisciplinary rounds, business cards, quiet hours, hourly rounding, electronic whiteboards, in-room whiteboards, smart phones, handhelds/tablets. Post-discharge: discharge follow-up phone calls and others as identified. Quantitative measures include HCAHPS, Leadership Influence over Professional Practice Environments (LIPPES), LOS, quality indicators, Patients' Perceptions of Feeling Known (PPFKN), readmissions, the Revised Perceptions of Practice Environment Scale (RPPE), cost per case mix, and staff retention; qualitative measures include focus groups (staff, patients, families, etc.), observations, and narratives; also the Survey of the Innovation Unit Expectations (SIUE-pre), the Survey of the Innovation Unit Experiences (SIUE-post), and other measures as identified. The clusters are a lens with which we gain perspective on any particular intervention, and evaluation may apply to any or all three cluster focus areas. (June 2013)
    12. Why a Dashboard: "Rapid improvement in any field requires measuring results…" (Porter and Lee)
    13. Data and Information. Data are individual facts, statistics, or items of information (http://dictionary.com). Information is the result of processing, manipulating, and organizing data in a way that adds to the knowledge of the person receiving it (http://en.wikiquote.org).
    14. Challenge: Create an easy-to-use dashboard tool. Implement quickly. Consolidate key performance indicators (KPIs) from multiple hospital sources. Provide visibility of data across Innovation Units. Use current benchmarks to measure performance. Foster data transparency. Drive improvement through the PDSA cycle: supporting change with data, testing changes, and spreading improvements.
    15. Dashboard strategy: Begin with the end in mind. Know your customers and understand how they use information. Know the questions the dashboard is trying to answer. When thinking about metrics, make sure you can actually collect the data. Develop a draft, engage users, and iterate. Disseminate to users. Refine as needed. Periodically revisit dashboard needs as they relate to the ongoing measurement plan.
    16. Dashboard tactics: Create a shell, a simple mockup (make sure all functions and levels of information are represented, make sure it is feasible to obtain all the data, and figure out who will do the data aggregation and preparation; these steps help determine scope). Fill in the shell with metrics (complete a prototype in Excel; consolidate draft metrics from multiple sources into a single, concise, printable view; highlight performance relative to benchmarks with visual displays; refine structure and design, including time periods for reporting). Demo and pilot the dashboard (helps set expectations with users and validates format and metrics). Document business requirements (including calculation of metrics and descriptions of benchmarks and data sources). Plan for updates.
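As an illustration of the benchmark-highlighting step named on slide 16, here is a minimal sketch in Python with pandas (the actual prototype was built in Excel); the unit names, rates, and benchmark value are hypothetical, and the rule assumes lower values are better, as with fall rates.

```python
import pandas as pd

# Hypothetical per-unit fall rates (events per 1,000 patient days) and an
# NDNQI-style benchmark; for this metric, lower values are better.
rates = pd.DataFrame(
    {"unit": ["Unit A", "Unit B", "Unit C"], "fall_rate": [2.78, 3.15, 0.95]}
)
benchmark = 2.57

def shade(rate: float, benchmark: float) -> str:
    """Mirror the dashboard shading rule: worse (higher) vs. better (lower)."""
    return "worse (red)" if rate > benchmark else "better (green)"

rates["vs_benchmark"] = rates["fall_rate"].apply(lambda r: shade(r, benchmark))
print(rates)
```

The same worse/better rule drives the red and green shading shown on the sample dashboard on the next slide.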
    17. Dashboard Development: Innovation Unit Dashboard sample (screenshot of the "Massachusetts General Hospital - PCS Innovation Unit Dashboard At a Glance"). The quality and safety section lists patient-centered outcome measures by unit, each with a rate and an observed count: falls per 1,000 patient days, falls with injury per 1,000 patient days, hospital-acquired pressure ulcers (total and type II or greater), restraints, peripheral intravenous (PIV) infiltrations (pediatric/neonatal), and central line-associated bloodstream infections (CLABSI) per 1,000 line days. Benchmarks come from NDNQI and NHSN, and cells are color-shaded relative to benchmark: a rate worse (higher) than benchmark versus better (lower) than benchmark. Metric categories: throughput and efficiency, quality and safety, infection control, patient satisfaction, and staff satisfaction. Goal: measure the impact of Innovation Units' interventions. Tactic: reliably store and communicate evaluation data.
    18. NSI Reporting, Examples: pre-Innovation Unit launch (Summer 2011). Shown: a sample unit-level report for the CAUTI metric, a sample Excellence Every Day Portal page, and a sample Shared File Area folder structure. Reporting for select metrics went through the Excellence Every Day Portal; quarterly unit-level data and charts were developed and stored in the Shared File Area for RN leadership (printed color copies delivered to units).
    19. What we did: pre-Innovation Unit launch (sample HPM dashboard excerpt, for demonstration purposes only, showing composite quality metrics such as CHF, PNE, SCIP wound infection, and SCIP-VTE prophylaxis scored against color thresholds across quarters). We identified the need for a robust, comprehensive tool for Nursing-Sensitive Indicator (NSI) reporting. PCS had an initial Executive Committee dashboard in place. We talked with internal experts about Strategic Performance Indicator reporting.
    20. What we did: metric selection. Goal 1, improve patient experience: metrics include nurse communication, quiet at night, responsiveness, cleanliness, pain management, overall rating, and discharge information; interventions include hourly rounding, quiet hours, the pain tiger team, hotel-style cleaning, smart phones and whiteboards, the patient/family notebook, the discharge envelope, discharge phone calls, and a unit-based patient advocate; source: Service Excellence, HCAHPS data. Goal 2, improve quality by decreasing hospital-acquired conditions: metrics include CAUTI, falls with injury, CLABSI/central line infections, pressure ulcers, restraint utilization, and peripheral intravenous (PIV) infiltrations; interventions include relationship-based care, the Attending Nurse, the handover rounding checklist, Domains of Practice, interdisciplinary rounds, business cards, quiet hours, hourly rounding, electronic whiteboards, and smart phones; sources: Infection Control, Patient Care Services Office of Quality and Safety.
    21. What we did: metric selection, continued. Goal 3, reduce costs: metrics include direct cost per case-mix-adjusted discharge, labor expense, hours worked per equivalent patient day, medical supply expense, and total expense; interventions include follow-up phone calls, handovers, whiteboards, smart phones, and the discharge envelope; sources: PHS Finance, MGH Finance. Goal 4, maintain staff satisfaction: metrics include the Professional Practice Environment Staff Perception Survey mean scores and the NDNQI RN Survey Practice Environment Scale (Nursing Work Index) mean score; interventions include relationship-based care, the Attending Nurse role, and Domains of Practice; source: Institute for Patient Care. Optimize efficiency and throughput: metrics include readmissions, ALOS, and admission/ED admit volume; interventions include standardized processes, use of technology, controlling variation, implementing evidence-based practice, and safe handover communication; sources: Finance Department, Admitting Department, Center for Quality and Safety.
    22. What we did: data sources and benchmarks (annotated dashboard screenshot, with individual units listed across the top and color shading relative to benchmark). Sources feeding the dashboard include PHS Finance (EPSI), Admitting (PATCOM), Patient Care Services (NDNQI), the ED Information System (EDIS), Infection Control (CDC/NHSN), Service Excellence (HCAHPS and the pediatric survey), CQS (EPSI readmissions), the PCS Institute for Patient Care (staff perception surveys), and MGH Finance (EPSI/Action OI). Many sources and contacts; many (many) data formats.
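To make the consolidation problem on slide 22 concrete (many sources, many formats, one printable view), here is a hedged Python sketch that merges per-unit metrics from two invented source extracts; the unit names, values, and column names are illustrative only and are not taken from the actual dashboard.

```python
import pandas as pd

# Invented extracts standing in for two of the many source systems
# (e.g., an NDNQI-style quality feed and an HCAHPS-style satisfaction feed).
quality = pd.DataFrame(
    {"unit": ["Unit A", "Unit B"], "fall_rate_per_1000_days": [2.2, 4.5]}
)
satisfaction = pd.DataFrame(
    {"unit": ["Unit A", "Unit B"], "nurse_communication_pct": [81.0, 78.5]}
)

# Consolidate into a single per-unit view, one row per unit, mirroring the
# "single, concise, printable view" goal; units missing from a feed stay NaN.
dashboard = quality.merge(satisfaction, on="unit", how="outer")
print(dashboard)
```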
    23. What we did: dashboard notes. A companion "PCS Innovation Unit Dashboard Notes" sheet defines each metric, its calculation, its benchmark, its data source, the contact, and the reporting frequency. Examples: total fall rate, the number of reported falls per 1,000 patient days (lower is better; benchmark NDNQI; source: incident reports from RL Solutions; contact: Nancy McCarthy; quarterly); falls-with-injury rate, the number of reported falls resulting in patient injury per 1,000 patient days (benchmark NDNQI; same source, contact, and frequency); hospital-acquired pressure ulcer rates, all stages and stage II or greater, collected quarterly in a one-day prevalence study (benchmark NDNQI; quarterly); restraint rate, the percent of patients in restraints (benchmark NDNQI; contact: Office of Quality and Safety; quarterly); peripheral intravenous (PIV) infiltrations, a new metric for pediatric and neonatal populations, calculated as the total number of grade 2-4 PIV infiltrations on the unit divided by the total number of PIV sites on the unit, with a grade 1 infiltration counted as a grade 2 for children under age 10 (point prevalence; quarterly); and CLABSI infection rate, hospital-acquired line infections per 1,000 line days (benchmark: NHSN pooled mean; source: Infection Control; contacts: Paula Wright, Irene Goldenshtein; quarterly). At the time there were no unit-based benchmarks for patient falls, so Partners HPM benchmarks were used; PCS Quality & Safety began submitting data to NDNQI in September 2010, so unit-based benchmarks would soon be available. The notes sheet defines metric, source, contact, and frequency.
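The two calculations below follow the definitions in the dashboard notes (events per 1,000 patient days, and PIV infiltration point prevalence); they are written as a small Python sketch with made-up counts purely to show the arithmetic.

```python
# Worked versions of two metric calculations defined in the dashboard notes,
# using made-up counts purely to show the arithmetic.

def rate_per_1000_patient_days(events: int, patient_days: int) -> float:
    """Fall-style rates: reported events per 1,000 patient days (lower is better)."""
    return events / patient_days * 1000

def piv_infiltration_prevalence(grade2_plus_infiltrations: int, piv_sites: int) -> float:
    """PIV infiltration point prevalence: grade 2-4 infiltrations / total PIV sites."""
    return grade2_plus_infiltrations / piv_sites

# Example: 3 reported falls over 950 patient days, and 1 qualifying infiltration
# among 40 PIV sites counted on the prevalence day (all numbers invented).
print(round(rate_per_1000_patient_days(3, 950), 2))   # 3.16 per 1,000 patient days
print(f"{piv_infiltration_prevalence(1, 40):.1%}")     # 2.5%
```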
    24. Success of the Dashboard Tool. "I post the dashboard on our Communication Board on our unit." "I love the opportunity to be transparent with my staff—it facilitates ownership of the clinical practice and an understanding of the global picture." "It inspires my staff to ask questions about what we could be doing differently." "I used the dashboard as part of the rollout of interventions." "Data are key for having conversations with my staff." The initial dashboard was pushed out when the twelve Innovation Units launched, and it was expanded as other phases rolled out. It is accessed centrally in Shared File Areas and on the intranet.
    25. Using Data to Tell Stories. Quantitative data never tell the full story; project outcomes are also measured with qualitative themes from interviews and observations. Promote a narrative culture. "Not everything that can be counted counts, and not everything that counts can be counted." (Albert Einstein, physicist)
    26. (Image-only slide; no text.)
    27. (Image-only slide; no text.)
    28. Future Considerations: Involve stakeholders in the development process. Review and revise the list of metrics: simplify, and identify "need to know" versus "nice to know" data. Connect the dashboard with trend information. Look beyond red/yellow/green; include graphical and visual representations of data. Provide detailed notes and caveats. Maintain data integrity. Automate to the extent possible.
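As one way to read "connect the dashboard with trend information" and "look beyond red/yellow/green" on slide 28, here is a small Python/matplotlib sketch that plots a unit's quarterly rate against a flat benchmark; the quarterly values and benchmark are invented for illustration and do not come from the deck.

```python
import matplotlib.pyplot as plt

# Invented quarterly fall rates for one unit plus a flat NDNQI-style benchmark,
# showing a trend line rather than a single red/yellow/green cell.
quarters = ["Q1", "Q2", "Q3", "Q4"]
unit_rate = [3.2, 2.9, 2.4, 2.1]   # falls per 1,000 patient days (made up)
benchmark = 2.57

plt.plot(quarters, unit_rate, marker="o", label="Unit fall rate")
plt.axhline(benchmark, linestyle="--", color="gray", label="Benchmark")
plt.ylabel("Falls per 1,000 patient days")
plt.title("Trend vs. benchmark (illustrative data)")
plt.legend()
plt.show()
```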
    29. Critical Success Factors: shared vision, leadership engagement, agreement on metrics and data definitions, and accountability.
    30. Resources: Edward Tufte, www.edwardtufte.com
    31. Questions? "Discovery consists in seeing what everyone else has seen and thinking what no one else has thought." (Albert Szent-Györgyi, Hungarian biochemist, 1937 Nobel Prize winner)
