Improving the Performance of the Public Mental Health System

Transcript

  • 1. Improving the Performance of the Public Mental Health System: Quality Assessment and Improvement
  • 2. Berwick’s Question
    • How can one tell whether or not a healthcare organization is really serious about improving its quality, instead of simply engaging in defensive measurement to protect itself against the demands of outsiders for information?
      • Don Berwick, The Basic Concepts of Quality Improvement, unpublished paper, 1987
  • 3. Quality Assessment and Improvement
    • Definitions
    • The Quality Environment
    • The Quality Assessment Component
    • DMAI—An Ongoing Process
    • Fundamental Principles of Quality Management
  • 4. Definitions
    • Performance measurement is the “regular collection and reporting of data to track work produced and results achieved”
    • Performance measure is “the specific quantitative representation of capacity, process, or outcome deemed relevant to the assessment of performance”
    • Performance standard is “a generally accepted, objective standard of measurement such as a rule or guideline against which an organization’s level of performance can be compared”
  • 5. Definitions
    • Performance management is “the use of performance measurement information to help set agreed-upon performance goals, allocate and prioritize resources, inform managers to either confirm or change current policy or program directions to meet those goals, and report on the success in meeting those goals”
    • Performance measurement is “NOT punishment”
    • Guidebook for Performance Measurement, Turning Point Project
  • 6. Definitions
    • Two of the primary uses for the results of performance measurement are for:
      • Making comparisons of performance levels
      • Improving the quality of the processes and outcomes of the organization
    • The American College of Mental Health Administration (ACMHA) applied these distinctions between comparison and quality improvement in the proposed Consensus Set of Indicators for Behavioral Health
  • 7. Definitions
    • In the ACMHA project, five national accreditation entities reached consensus on a set of performance measures (CARF [the Rehabilitation Accreditation Commission], the Council on Accreditation, the Council on Quality and Leadership in Support of Persons with Disabilities, JCAHO and NCQA) but not on the specifications for measurement
    • They concluded that it was “important to recognize that selecting appropriate measures depends on the purpose of assessing performance”
    • They designated measures as either a comparison measure or a quality improvement measure to clarify the intended use of each measure and its data set
  • 8. Definitions
    • For comparison purposes, the standards and measures should provide sufficiently valid and reliable quantification such that comparison across the system’s programs and departments can be made. By identifying the highest level of performance or outcome (the benchmark), an organization can duplicate those work processes to achieve higher performance overall.
    • For improving quality , some standards and measures lend themselves more to internal monitoring of performance and local accountability and are most suitable for supporting the improvement of the organization rather than for comparability among organizations.
  • 9. Definitions
    • Balanced Budget Act (BBA) of 1997 was a substantial rewrite of the Medicaid and Medicare program rules. Final rules were passed on 6/14/02; protocols and checklists then rolled out. Details in the protocols and checklists are critical for an understanding of BBA impact.
    • External Quality Review Organization (EQRO) is an independent entity that meets competence criteria for conducting Medicaid EQR activities; EQROs are being selected through state procurement processes to review the operations of risk bearing organizations contracting with state Medicaid agencies.
  • 10. Definitions
    • Managed Care Organization (MCO) is Medicaid’s term for a health plan that provides health care services to Medicaid enrollees
      • Examples include Group Health Cooperative, Community Health Plan of Washington and Molina
    • Prepaid Inpatient Health Plan (PIHP) is Medicaid’s term for a health plan that provides a more limited range of services than an MCO, for specialty services such as mental health
      • Examples include the Washington State Regional Support Networks (RSNs) and Oregon’s Mental Health Organizations (MHOs)
  • 11. The Quality Environment
    • Crossing the Quality Chasm: A New Health System for the 21st Century, Institute of Medicine (IOM), 2001
      • Redesign of the health care system based on 10 new rules
      • Build organizational supports for change, including the incorporation of care process and outcome measures into daily work and revising financial methods to support quality work
    • Priority Areas for National Action: Transforming Health Care Quality, IOM 2003—20 priority areas selected, including major depression (screening and treatment) and severe and persistent mental illness (focus on treatment in the public sector)
  • 12. The Quality Environment
    • Currently an IOM Committee is studying how to adapt the Quality Chasm recommendations to Mental Health and Addictive Disorders
    • In December 2004, a meeting was co-hosted by the National Council for Community Behavioral Healthcare (NCCBH), the RWJ Center for Health Care Strategies, and SAMHSA to frame a National Initiative for Behavioral Health Care Quality Improvement (CMS participated in this effort, an opportunity to build its support and to foster relationships)
  • 13. The Quality Environment
    • The Institute for Healthcare Improvement (IHI) and Don Berwick, MD, have led the healthcare dialogue from its early beginnings
    • IHI is a healthcare industry focal point through National Forums, trainings, and Breakthrough Series that target reducing adverse drug events, medical errors, and delays and waiting times throughout the system
    • IHI partnered with Health Resources and Services Administration (HRSA) in developing and staffing the Health Disparities Collaboratives for Federally Qualified Health Centers (asthma, diabetes, depression)
  • 14. The Quality Environment
    • National Committee for Quality Assurance (NCQA), created jointly by healthcare purchasers and HMOs to assess, measure and report on the quality of care provided by managed care organizations
      • Measures performance through HEDIS®, a standardized measurement system for MCOs
      • Accredits MCOs using standards grounded in QI—accreditation is based on a combination of accreditation survey scores and scores on HEDIS® measures (33 of 100 points)
    • MAA incorporates most of the NCQA quality standards in MCO contracts, collects selected HEDIS® performance measures, and uses NCQA accreditation for a major part of the EQRO review
  • 15. The Quality Environment
    • Joint Commission on Accreditation of Healthcare Organizations (JCAHO) accreditation process has shifted from “survey preparation and scores to continuous operational improvement in support of safe, high-quality care”
    • ORYX® core measure data are used to continually assess key performance areas, and eventually will be incorporated into the organization’s performance report as core measures are adopted for programs
    • JCAHO prepared the CMS protocols for BBA EQROs to use in review of Medicaid MCOs and PIHPs
  • 16. The Quality Environment
    • BBA requires EQROs to operate within specific protocols:
      • Determine Compliance with Federal Medicaid Managed Care Regulations
      • Validate performance measures and methods of calculating measures of performance
      • Validate Performance Improvement Projects (PIPs) and methods of conducting a PIP
      • Conduct an Information Systems Capabilities Assessment (ISCA)
  • 17. The Quality Environment
    • BBA rules require that the MHD and PIHPs have a Quality Assessment and Performance Improvement Program (QAPI) that includes mechanisms to detect both under-utilization and over-utilization
    • MHD is “expected to continuously and consistently monitor the appropriateness and quality of the consumer care delivery system” in PIHPs
    • MHD infrastructure is charged with reviewing statewide mental health data, recommending system improvements, and designing and implementing quality improvement projects and processes
  • 18. The Quality Environment
    • Sample questions from the EQRO protocol include:
      • Have any recent QAPI activities been implemented to monitor compliance with established standards for timeliness of access to care and member services?
      • What types of information does the program provide to support recredentialing of providers?
      • How does your PIHP detect over- and under-utilization? Provide examples.
      • How are enrollee and provider data from all components of your network used in your QAPI?
  • 19. The Quality Environment
    • So we have a QAPI and PIPs—is it all about the bureaucracy? No: the work must have relevance to the organizational vision, mission and goals—it is about achieving your purpose and serving your consumers
    • Requires leadership commitment and a deep understanding of the vision and mission of the system and/or organization
    • If you cannot tell 1) how a project specifically relates to your agency’s vision and mission, or 2) (worse) if you cannot tell how your agency’s mission and vision relate to quality, the project should be sidelined until you can…
    • Hayes and Nelson, A Handbook Of Quality Change Implementation For Behavioral Health
  • 20. The Quality Assessment Component
    • Are your system decisions made in a “data-free environment”?
    • How do you know if your agency is achieving its goals?
    • How do you know when you should initiate a PIP?
    • How will you decide on implementing practice guidelines?
    • How will you know if (and why or why not) the PIP or practice guideline is successful?
  • 21. The Quality Assessment Component
    • JCAHO ORYX® Core Measurement Sets:
      • Data Quality Principles (Handout 1)
      • May vary by setting or by key issue
      • Relate to the basic principles of care, process oriented
    • Performance measure categories considered useful in the accreditation process
      • Clinical
      • Health status
      • Perception of care/service
    • Categories not considered useful include:
      • Financial measures
      • Utilization measures, unless related to a standard of quality
  • 22. The Quality Assessment Component
    • For JCAHO Behavioral Health, requirements differ by type of organization
    • Organizations providing 24 hour care:
      • Select a minimum of six clinical, health status or perception of care measures from the set of JCAHO approved measures
      • Measures must focus on the clients that receive 24 hour services
    • Organizations providing non-24 hour care and/or 24 hour care for an ADC of less than 10:
      • Select at least six measures from any relevant source
      • Share data, analytic conclusions and actions taken with surveyor
      • In the future, they will be expected to select and enroll in a listed performance measurement system once core measures relevant to their services are identified
  • 23. The Quality Assessment Component
    • NCQA HEDIS® BH Measurement Sets (Handout 2); a rate-calculation sketch follows this list
      • Follow-up after hospitalization for mental illness
      • Antidepressant medication management
      • Mental health utilization—inpatient discharges and average length of stay
      • Mental health utilization—percentage of members receiving services
      • Chemical dependency utilization—inpatient discharges and average length of stay
      • Initiation and engagement of AOD dependence treatment
      • Identification of AOD services
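    A minimal sketch of the kind of rate calculation behind a follow-up measure is below. It is illustrative only, not the official HEDIS specification: the record layout, the 7- and 30-day windows, and the sample data are all assumptions.

      from datetime import date

      # Illustrative only; not the official HEDIS specification.
      # Hypothetical records: hospital discharges and outpatient mental health visits.
      discharges = [
          {"member": 1, "discharge": date(2005, 3, 1)},
          {"member": 2, "discharge": date(2005, 3, 10)},
      ]
      visits = [
          {"member": 1, "visit": date(2005, 3, 5)},    # 4 days after discharge
          {"member": 2, "visit": date(2005, 4, 20)},   # 41 days after discharge
      ]

      def follow_up_rate(discharges, visits, window_days):
          """Share of discharges followed by an outpatient visit within window_days."""
          met = 0
          for d in discharges:
              met += any(
                  v["member"] == d["member"]
                  and 0 <= (v["visit"] - d["discharge"]).days <= window_days
                  for v in visits
              )
          return met / len(discharges) if discharges else 0.0

      print(f"7-day follow-up rate:  {follow_up_rate(discharges, visits, 7):.0%}")   # 50%
      print(f"30-day follow-up rate: {follow_up_rate(discharges, visits, 30):.0%}")  # 50%

    The same denominator/numerator pattern underlies most of the utilization and engagement measures in the list; only the qualifying events and windows change.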
  • 24. The Quality Assessment Component
    • PIHP Measurement Sets
      • Verity examples (Handout 3)
    • ACMHA Indicators (Handout 4)
    • PIHP Master Calendar (Handout 5)
    • Pilot Measurement of Initiation and Engagement (Handout 6)
  • 25. The Quality Assessment Component
  • 26.  
  • 27. The Quality Assessment Component
    • The quality assessment component requires that a group of selected indicators be regularly tracked and reported
    • The data should be regularly analyzed through the use of control charts and comparison charts (Stay tuned for details!)
    • These indicators should tell you if you are achieving your agency goals and objectives
    • These indicators can provide the basis for deciding when a PIP might be indicated and the baseline information for measuring the future impact of PIPs
  • 28. The Quality Assessment Component
    • The quality organization does not wait to be told (via regulations or requirements) what processes, procedures, or programs to implement. Instead the quality organization proactively implements a program that it recognizes it may have to alter as standards or regulations are developed…
    • Consider the ACMHA list of indicators as a starting place—it includes measures of what quality service means to consumers
    From Hayes and Nelson, A Handbook Of Quality Change Implementation For Behavioral Health
  • 29. The Quality Assessment Component
    • Quality assessment is an absolutely necessary, but not sufficient, step to change from a “data-free environment” to a “culture of measurement”
  • 30. DMAI—An Ongoing Process
    • Living in the “plan-do, plan-do” world?
    • Too busy fighting fires to “close the loop”?
    • Quality assessment indicators must have relevance to the organizational vision, mission, goals and objectives
    • PIPs based on goals and objectives must use a Design/Measure/Analyze/Improve cycle
  • 31. DMAI—An Ongoing Process
    • MHD Implementation and Design Group has designated a standard methodology for the infrastructure to use in assessing, choosing, developing, monitoring and evaluating QI opportunities and outcomes
    • The Design, Measure, Analyze and Improve (DMAI) model is:
      • Congruent with and supported by JCAHO
      • Data driven—integrates trending, tracking, analysis and action into day-to-day processes
      • Already used by numerous providers and at least one RSN
  • 32. DMAI—An Ongoing Process. From Hayes and Nelson, A Handbook Of Quality Change Implementation For Behavioral Health
  • 33. DMAI—An Ongoing Process
    • Design—two steps at the beginning
      • Establish objectives of the project
      • Establish the processes used to meet the objectives
    • Measure
      • Establish the specific outcome and process measures the project will use for baseline and post-implementation measurement
  • 34. DMAI—An Ongoing Process
    • Analyze—two types of analysis
      • Use statistical and numerical methods
      • Use comparative methods
    • Improve
      • Implement revised processes until analysis of measures indicates that the objectives have been met (a schematic sketch of one full cycle follows)
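    The toy cycle below is only a schematic of “closing the loop”: Measure, Analyze and Improve repeat until the Design objective is met. The objective, the simulated measurement, and the assumed effect of each process revision are invented for illustration.

      import random

      random.seed(1)

      objective = 0.70             # Design: target rate for the chosen measure
      true_process_rate = 0.55     # Design: assumed performance of the current process

      def measure(rate, n=200):
          """Measure: an observed rate from n simulated cases."""
          return sum(random.random() < rate for _ in range(n)) / n

      def analyze(observed, target):
          """Analyze: compare the observed rate with the objective."""
          return observed >= target

      def improve(rate):
          """Improve: a revised process assumed to add about five points."""
          return min(rate + 0.05, 1.0)

      cycles = 0
      while not analyze(measure(true_process_rate), objective):
          true_process_rate = improve(true_process_rate)
          cycles += 1
      print(f"Objective met after {cycles} improvement cycle(s)")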
  • 35. DMAI—An Ongoing Process
    • DMAI adds clarity about objectives to the cycle—objectives are implied, but not specified, in PDCA or PDSA; DMAI also specifies two types of analysis
  • 36. Fundamental Principles of Quality Management
    • Know your customers and what they need
    • Focus on processes
    • Use data for making decisions
    • Understand variation in processes
    • Use teamwork to improve work
    • Make quality improvement continuous
    • Demonstrate leadership commitment
  • 37. Know Your Customers
    • Identify “customers” and their needs—in healthcare there are usually two sets of customers
      • The people who use your services are the primary customers
      • The purchasers of your services also have requirements
    • Set goals based on their needs and DMAI objectives based on the goals
    • Monitor performance and satisfaction to target performance improvement opportunities
    • Improve or redesign how work is done
    Design • Improve
  • 38. Focus on Processes
    • 85% of poor quality is a result of poor work processes, not of staff doing a bad job
    • When things go wrong, it is often at the point of the “handoff” in the process
    • Attend to improving the overall design, not just one part—some of the most complex and poor quality processes are the result of “improving” and creating “work arounds” at some steps instead of redesigning the entire process
    Design • Improve
  • 39. Focus on Processes
    • Advice from NCQA, JCAHO and others—measure processes that:
      • Are high-risk
      • Are high-volume
      • Are problem prone
      • Can be tracked and reported as summary or aggregate statistics
      • Are being selected by other organizations, to allow statistically valid comparisons to be made (for purposes of benchmarking)
    Design • Improve
  • 40. Use Data to Make Decisions
    • Use performance assessment data to target improvement
    • Use data analysis tools to develop information
    • Analyze data to identify root cause
    • Use data to monitor performance outcomes
    DMAI
  • 41. Use Data to Make Decisions
    • Collection of data on clinical outcome alone does not provide useful information about what led to the outcome, or how it can be replicated or improved
    • Pairing collection of outcome data with data on key process performance measures associated with the outcome will provide information on the consistency of the process of care
    • Statistical analysis of these sets of data tells an organization whether it is improving performance on outcomes while improving consistency
    Measure • Analyze
  • 42. Use Data to Make Decisions (Measure • Analyze)
  • 43. Use Data to Make Decisions
    • “Symptom” is the indication of a problem, but not a statement of cause
    • “Theory” is the preliminary diagnosis about the cause
    • “Analysis” includes data that confirms or rules out theories
    • “Solution” is the change that will best address the cause
    • “Information” is data that confirms whether the solution is having the expected impact
    DMAI
  • 44. Use Data to Make Decisions (DMAI). From Hayes and Nelson, A Handbook Of Quality Change Implementation For Behavioral Health
  • 45. Use Data to Make Decisions
    • Numerical tools:
      • Check Sheet
      • Bar Chart
      • Histogram
      • Pareto Chart
      • Control Chart
      • Run Chart
    • Conceptual tools:
      • Affinity Diagram
      • Brainstorming
      • Process Flow Chart
      • Interrelationship Digraph
      • Matrix Diagram
      • Tree Diagram
      • Cause and Effect Diagram
    DMAI
  • 46. Use Data to Make Decisions
    • Conceptual tools support theory generation regarding root causes, a key step in the PIP process
    • Root causes:
      • In the logical chain of causes
      • Directly and economically controllable
      • Can be considered a constant part of (or deficiency in) the process under study
      • If eliminated, the problem disappears or is drastically reduced (the Pareto Principle or 80-20 rule)
    • Initiating a PIP that defines a desired solution, rather than a process to be studied, can be hazardous to your QAPI’s health!
    Design
  • 47. Use Data to Make Decisions
    • Brainstorming for root causes—theory generation thrives on divergent thinking, so no idea is a bad one…
      • What can go wrong in the process we are studying?
        • Problems in hand-offs between steps
        • Problems in execution within steps
      • Look at machines, materials, methods, measurements, and people
    • Cause-effect or Fishbone diagram (Handout 7)
      • Organizes and displays theories
      • Encourages divergent thinking
      • Demonstrates the complexity of the problem
      • Encourages scientific analysis (rule-out)
    • Failure to use a cause-effect diagram, or use of an incomplete one, can be hazardous to a PIP’s health! (A small sketch of grouping candidate theories by cause category follows this slide.)
    Design
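    The sketch below simply groups brainstormed theories under the cause categories named above, as a first step toward a fishbone diagram; every theory listed is a hypothetical example.

      # Hypothetical theories for a missed-follow-up problem, grouped by cause category.
      theories = {
          "Methods":      ["no standard procedure for scheduling the appointment at discharge"],
          "Measurements": ["follow-up visits recorded inconsistently across providers"],
          "People":       ["discharge staff unaware of the 7-day expectation"],
          "Materials":    ["appointment reminder letters mailed late"],
          "Machines":     ["scheduling system cannot flag recent discharges"],
      }

      for category, items in theories.items():
          print(category)
          for theory in items:
              print("  -", theory)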
  • 48. Use Data to Make Decisions
    • Numerical tools support analysis of PIP theories, measurement of PIP implementation and ongoing assessment
    • Specific theories are needed to drive data collection and analysis
    • Data collection and analysis leads to “the vital few” root causes by narrowing the competing theories of cause
      • Look for clusters of causes that can be tested together
      • Use stratifying variables to localize the problem and identify likely causes
      • Do Pareto analysis of symptoms and theories
    Measure • Analyze
  • 49. Use Data to Make Decisions
    • From Methods and Tools of Quality Improvement, Institute for Healthcare Improvement:
      • Simple percentage or magnitude comparisons: use bar charts, pie charts or summary statistics (data needed: simple tallies by category, at least 30 cases)
      • Trend: use line graphs (data needed: time-ordered measurements, at least 12 data points)
      • Distributions: use histograms (data needed: forty or more measurements)
      • Correlations: use scatter diagrams (data needed: forty or more paired measurements)
    DMAI
  • 50. Use Data to Make Decisions
    • The Four Dimensions of Variability
      • Center: average, median or mode
      • Spread: range or standard deviation
      • Shape
      • Sequence: trend
    From Methods and Tools of Quality Improvement, Institute for Healthcare Improvement
    Measure • Analyze
  • 51. Use Data to Make Decisions
    • The average by itself is not a good summary of data; use a variety of numerical summaries (Handout 8; a short sketch follows this slide)
    • Measures of center include:
      • Average/Mean: the total data values divided by the total number of observations
      • Median: the middle value in the data set; half of the data values lie above the median and half lie below
      • Mode: the most frequently occurring values in the set of data
    • Use histograms to look at overall variation patterns
    • Use line graphs to look at patterns over time
    Measure • Analyze
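    A short sketch of these summaries, using an invented set of days-to-first-appointment values, is below; the skewed data show why the mean alone can mislead.

      import statistics

      # Hypothetical days from request to first appointment for twelve consumers.
      days = [3, 4, 4, 5, 6, 6, 6, 7, 9, 14, 21, 30]

      print("mean:  ", round(statistics.mean(days), 1))   # 9.6
      print("median:", statistics.median(days))           # 6.0
      print("mode:  ", statistics.mode(days))             # 6

      # A quick text histogram shows the long right tail that the mean hides.
      for lo in range(0, 31, 5):
          count = sum(lo <= d < lo + 5 for d in days)
          print(f"{lo:2d}-{lo + 4:2d} | " + "#" * count)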
  • 52. Use Data to Make Decisions
    • Pareto Principle
      • In any group of things that contribute to a common effect, a relative few contributors will account for the majority of the effect
      • These few contributors are called the “vital few,” while the many other contributors are called the “useful many”
      • The “vital few” hold the greatest potential gain from quality improvement efforts
    • Pareto Diagram—a fact-based tool for priority setting in quality improvement efforts (Handout 9; a tally sketch follows this slide)
    Measure • Analyze
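    A minimal Pareto tally is sketched below with invented categories and counts; a charting tool would normally draw the bars and the cumulative-percentage line.

      from collections import Counter

      # Hypothetical tally of reasons that 7-day follow-up appointments were missed.
      reasons = Counter({
          "no appointment scheduled at discharge": 46,
          "client could not be reached": 21,
          "transportation problems": 12,
          "appointment offered beyond 7 days": 9,
          "other": 7,
      })

      total = sum(reasons.values())
      cumulative = 0
      print(f"{'reason':<40}{'count':>7}{'cum %':>8}")
      for reason, count in reasons.most_common():
          cumulative += count
          print(f"{reason:<40}{count:>7}{cumulative / total:>8.0%}")

    In this invented tally the first two categories account for roughly 70 percent of missed appointments: the “vital few” to work on first.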
  • 53. Use Data to Make Decisions
    • Control charts (Handout 9)
      • Variation in performance data is the result of a complex system of causes
      • Variation in this system of causes has characteristics of random variation
      • Used for ongoing quality assessment, control charts can help decide when to take action on the process based on the data
      • Statistics provide “standard” distributions and mathematical methods for testing common and special cause variation
    Measure • Analyze
  • 54. Understand Variation
    • Sources of variation include: machines, materials, methods, measurements, people, environment
    • Control charts are pictures of trend data with an extra feature—the range of variation built into the system (a minimal limit calculation follows this slide)
    • Common cause variation occurs if the process is stable— variation in data points will be random and obey a mathematical law—it is said to be in statistical control, with a large number of small sources of variation
    • If an organization reacts to random variation in a process that is stable/in statistical control, it is called tampering and leads to further complexity, increasing variation and mistakes
    Measure • Analyze
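    A minimal individuals-style control limit calculation is sketched below; the monthly rates are invented, and the simple standard-deviation estimate stands in for the sampling-model formulas a real p-chart or u-chart would use.

      import statistics

      # Hypothetical monthly 7-day follow-up rates for a stable process.
      rates = [0.54, 0.58, 0.52, 0.56, 0.55, 0.57, 0.53, 0.59, 0.56, 0.54, 0.55, 0.57]

      center = statistics.mean(rates)
      sigma = statistics.stdev(rates)
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      print(f"center {center:.3f}, control limits {lcl:.3f} to {ucl:.3f}")
      for month, r in enumerate(rates, start=1):
          note = "  <-- special cause?" if not lcl <= r <= ucl else ""
          print(f"month {month:2d}: {r:.2f}{note}")

    With all points inside the limits and no unusual patterns, reacting to month-to-month wiggles in these data would be tampering.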
  • 55. Understand Variation
    • Special cause variation arises because of specific circumstances which are not part of the process all the time and may or may not ever recur—if the recurrence is periodic, clues to the root cause may emerge
    • A process is not in statistical control when any of the following occur (the sketch after this slide checks these tests):
      • One data point above or below the upper/lower control limit (three standard deviations)
      • Two out of three consecutive data points beyond two standard deviations
      • Four out of five consecutive data points on the same side of the mean and beyond one standard deviation
      • Eight consecutive data points on the same side of the mean
    • Investigate special cause variation before drawing any conclusions about performance level
    Measure • Analyze
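    The sketch below checks the four tests listed above against a series of monthly rates; the data are invented, and the trailing-window logic is a simplified reading of the rules rather than a full control-chart package.

      import statistics

      def special_cause_signals(points, center, sigma):
          """Flag points where one of the four special-cause tests above fires."""
          z = [(p - center) / sigma for p in points]
          signals = []
          for i in range(len(z)):
              if abs(z[i]) > 3:
                  signals.append((i, "single point beyond 3 sigma"))
              w3 = z[max(0, i - 2): i + 1]
              if len(w3) == 3 and (sum(v > 2 for v in w3) >= 2 or sum(v < -2 for v in w3) >= 2):
                  signals.append((i, "2 of 3 beyond 2 sigma on one side"))
              w5 = z[max(0, i - 4): i + 1]
              if len(w5) == 5 and (sum(v > 1 for v in w5) >= 4 or sum(v < -1 for v in w5) >= 4):
                  signals.append((i, "4 of 5 beyond 1 sigma on one side"))
              w8 = z[max(0, i - 7): i + 1]
              if len(w8) == 8 and (all(v > 0 for v in w8) or all(v < 0 for v in w8)):
                  signals.append((i, "8 consecutive points on one side of the mean"))
          return signals

      rates = [0.55, 0.56, 0.54, 0.57, 0.55, 0.56, 0.54, 0.55, 0.72, 0.56, 0.55, 0.57]
      center, sigma = statistics.mean(rates), statistics.stdev(rates)
      for index, rule in special_cause_signals(rates, center, sigma):
          print(f"month {index + 1}: {rule}")

    Here the ninth month is flagged as a single point beyond three sigma: investigate that specific circumstance rather than redesigning the whole process.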
  • 56. Understand Variation
    • Don’t redesign an entire process (a PIP) when there is special cause variation, because there is not a consistent process to improve or stable baseline data to measure the impact of PIP implementation
    • A sentinel event is a special cause variation requiring root cause analysis
    • Examine specific incident(s) of special cause variation and make changes to a single element only after very careful analysis
    • Failure to distinguish between common and special cause variation can be hazardous to organizational performance!
    Measure • Analyze
  • 57. Understand Variation
    • Control chart analysis is done before comparison analysis to ensure a given process is stable before evaluation of relative performance level
    • Comparison charts are based on multiple organizations’ performance data (or on standards/benchmarks that have been adopted) and are used to evaluate relative performance level (a simple benchmark comparison is sketched after this slide)
    • If the process is stable, the only way to make improvements is to fundamentally change some aspect of the process—through a redesign of the process or PIP
    • Use benchmark data to create a new control chart that “raises the bar” on consistency of expected average performance and control limits
    Measure • Analyze
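    One simple way to judge local performance against an adopted benchmark is sketched below using a normal approximation; the benchmark, the counts, and the 3-sigma cutoff are illustrative assumptions, not the specific comparison-chart method an EQRO or JCAHO prescribes.

      import math

      benchmark_rate = 0.70           # adopted benchmark, e.g. the best rate in the comparison group
      local_met, local_n = 118, 200   # this quarter's local result (hypothetical counts)

      local_rate = local_met / local_n
      se = math.sqrt(benchmark_rate * (1 - benchmark_rate) / local_n)
      z = (local_rate - benchmark_rate) / se

      print(f"local {local_rate:.0%} vs benchmark {benchmark_rate:.0%}, z = {z:.1f}")
      print("differs from the benchmark" if abs(z) > 3 else "within expected variation around the benchmark")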
  • 58. Use Teamwork
    • PIPs need buy-in from all stakeholders
    • Process being studied is stable, but complex
    • Creative ideas are needed
    • Division of labor is needed
    • Process often crosses functions
    • Solution generally affects many
    Improve • Design
  • 59. Use Teamwork
    • Those who do the work
      • Have theories about the cause
      • Have the detailed knowledge needed for conceptual analysis tools such as fishbone diagrams
      • Have the clinical and intuitive judgment needed for design work
      • Have ideas about improving the processes
    • Improving processes means change—involvement in planning change and having staff that are seen as leaders for the change will be critical to successful implementation
    • Open, safe communication is critical for improving processes (Handout 10)
    Design • Improve
  • 60. Use Teamwork
    • Provide every team with a clear charge and support resources
    • Teams should adopt working agreements (everything from cell phone rules to decision procedures)
    • Teams should have assigned roles of facilitators and recorders
    • The team process has some predictable stages that it is useful to keep in mind:
      • Forming
      • Storming
      • Norming
      • Performing
    Improve • Design
  • 61. Make QI Continuous
    DMAI. From Hayes and Nelson, A Handbook Of Quality Change Implementation For Behavioral Health
  • 62. Make QI Continuous
    • QI is a system-wide approach to assessing and continuously improving quality of the processes and services over time
      • See inter-relationships, not parts
      • Understand the flow of work, not the one-time snapshot
      • Detail the work processes
      • Determine cause and effect relationships
      • Identify points of highest leverage
      • Improve and innovate, not just change for change’s sake
    • A way of doing business, not the exclusive responsibility of one individual or a committee
    DMAI
  • 63. Make QI Continuous
    • Use quality assessment to identify areas for improvement
    • Charge PIP team and provide support
      • Provide DMAI training
      • Use tools to understand root causes
      • Use data for baseline and analysis
      • Design process improvement to address root causes
    • Train, train, train staff on the newly designed process improvement
    • Evaluate the impact of process improvements
    • If you don’t get the results you expected, use assessment to understand why, revise accordingly, and try again
    DMAI
  • 64. Make QI Continuous
    • Measure improvement over time and against benchmarks
      • Health of the people served
      • Customer satisfaction
      • Cycle time
      • Accuracy/consistent features
      • Financial performance
    • Quality assessment is critical to measure the impact of PIPs and practice guidelines (a baseline-versus-remeasurement sketch follows this slide)
      • Measurement of baseline rates
      • Initial and second remeasurement after implementation
      • Plan to ensure demonstrated improvement can be maintained over time
    • EQROs are looking for performance measurement over time
    DMAI
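    A baseline-versus-remeasurement check can be as simple as the two-proportion comparison sketched below; the counts are invented and the 1.96 cutoff is just the conventional 95 percent threshold, not a prescribed EQRO test.

      import math

      # Hypothetical PIP indicator: baseline and first remeasurement.
      baseline_met, baseline_n = 104, 200      # 52% at baseline
      remeasure_met, remeasure_n = 132, 200    # 66% after implementation

      p1, p2 = baseline_met / baseline_n, remeasure_met / remeasure_n
      pooled = (baseline_met + remeasure_met) / (baseline_n + remeasure_n)
      se = math.sqrt(pooled * (1 - pooled) * (1 / baseline_n + 1 / remeasure_n))
      z = (p2 - p1) / se

      print(f"baseline {p1:.0%}, remeasurement {p2:.0%}, z = {z:.2f}")
      print("improvement unlikely to be chance alone" if z > 1.96 else "difference within chance variation")

    A second remeasurement on the same indicator then shows whether the demonstrated improvement is being maintained over time.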
  • 65. Demonstrate Leadership Commitment
    • Build a QAPI culture
    • Connect the organization’s strategic plan to performance improvement
    • Know and use quality principles
    • Encourage all staff to use quality improvement in daily work
    • Reward improvements
    • Assure adequate QAPI infrastructure for quality assessment and improvement activities
    DMAI
  • 66. Demonstrate Leadership Commitment: QAPI Culture
    • Clearly stated and enacted constancy of purpose—a deep understanding of the vision and mission
    • Regular review of key indicator data
    • Decisions made on data rather than hunches or opinions
    • Long range view supports search for root causes and permanent solutions rather than quick fixes
    • Focus on systems rather than individuals
    • Continued identification of improvement opportunities
    • Publicize successes (Handout 11)
    • Clear communication agency-wide regarding the commitment to quality and the change processes necessary to implement improvement
    DMAI
  • 67. Demonstrate Leadership Commitment: QAPI Infrastructure
    • Governance
      • Oversight and accountability
    • Program structure
      • Who will do what when, with what processes for recommending or deciding
    • Staff
      • Support for ongoing monitoring and analysis, for training and facilitating improvement activities
    • Data system
      • Collect data and report in a user friendly way
    DMAI
  • 68. Demonstrate Leadership Commitment: QAPI Description
    • Goals
    • Organizational structures, responsibilities, and flow of information
      • Quality council/committee
      • Method for selecting PIP projects, charging PIP work teams
    • Scope
      • Programs/services/staff included
      • Processes included
    • Process for using quality assessment results to plan changes
    DMAI
  • 69. Demonstrate Leadership Commitment: QAPI Work Plan
    • Goals
    • Important aspects of care/services
      • Activities that involve high volume, a high degree of risk, and/or a tendency to produce problems for patients or staff
    • Monitoring activities associated with important aspects of care/services
      • Methods of measurement, frequency, timelines for reporting
    • Consumer satisfaction monitoring
    DMAI
  • 70. Demonstrate Leadership Commitment: QAPI Work Plan
    • Planned PIPs (in process, new) and timelines
    • Evaluation of PIPs now implemented and timelines
    • Annual evaluation of QAPI workplan and program description, with proposed revisions
    DMAI
  • 71. Berwick’s Answer
    • Quality improvement is grounded in values,
    • It begins with a commitment at the top of the organization,
    • It takes money,
    • It has mechanisms for horizontal integration of quality measurement and control up and down the line of management and is relevant to front line staff,
    • It requires statistical sophistication,
    • The focus is on design, not simply on performance,
    • Management is responsive and looks for ways to remove obstacles to improvement, and
    • There is a strategy to drive out fear