Value Summary 2.0
Standardized Improvement Framework
1
[Framework cycle diagram: Team & Project Vision → Goals → Baseline Analysis & Investigation → Improvement Design & Implement → Monitoring]
Use Improvement Science
Why Use the Value Summary
2
Concise – avoids death by PowerPoint
Methodology – promote improvement that works
Measurement – track work at project & enterprise level
Transparency – self-service visibility to value work
Communication – standardize review of value work from director to staff
[Diagram: the UUHC Value Methodology / Value Improvement Framework draws on Lean, Six Sigma (6s), and PDSA across the steps Project Definition, Baseline Analysis, Investigation, Improvement Design, Implement, and Monitoring]
[Diagram: the UUHC Value Methodology in the Value Summary – the steps Project Definition + Goals, Baseline Analysis, Investigation, Improvement Design, Implement, and Monitoring map onto the Value Summary's five numbered sections (1–5)]
5 Sections of the Value Summary
5
6
1
Project Definition
Team & Project Vision
7
Project Definition
Engage the People – Team Elements
Ask often: “Do we have the right team?”
Representation of all roles
Upstream / Downstream
Experts who do the work
Team Member Roles
8
Team member roles are not only a great way to identify individual roles within a project; they are also key to creating reliable reporting of project work to the appropriate department leaders.
9
Project Definition
Why & How Elements
Why is this an important issue?
Why are you working on this now?
Internal / external drivers
How does this benefit the patient/customer?
How does this benefit the team?
10
Problem & Goals
Specific, Measurable, Attainable, Relevant, & Time-bound
2
11
GOALS
12
Problem & Goals
SMART Goals
Specific – How specific to be is a judgment call. “Poor communication” and “inefficiency” are not specific; “readmission rates for ileostomy patients” is specific enough.
Measurable – Define the goal with an actual number. “Some,” “more,” and “many” are not numbers; “20% increase” is a number you can track concretely.
Attainable – Is your goal realistic? Chasing unrealistic goals is demoralizing.
Relevant – This is another judgment call.
Time-bound – Set the date by which you want the goal met.
Source: http://healthsciences.utah.edu/accelerate/blog/2017/01/the-smart-way-to-keep-your-new-years-resolutions.php
13
14
15
Problem & Goals
Goal Type
Process – Action to get to the outcome, e.g. removed the Foley before 48 hours; performed preventive maintenance within 96 hours.
Outcome – Output from the process, e.g. urinary tract infection rate; equipment failure rate.
Balancing – Unintended consequences, e.g. reducing length of stay but increasing readmissions is not an acceptable trade-off.
Strive for a mix. Implementation of a solution is not a goal type.
16
Problem & Goals
Measure Elements
When measurement is used effectively, teams can design, implement, and sustain improvements. Elements of effective and actionable measures are:
Numerator & Denominator
Local
Meaningful
Transparent
Source: Becoming a Value Driven Organization. Value Collaboration October 2015.
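To make the numerator & denominator element concrete, here is a minimal sketch in Python of how a process measure and an outcome measure from the goal-type examples might be computed. All counts, the rate() helper, and the per-1,000-catheter-days scaling are hypothetical illustrations, not part of the Value Summary template.

```python
def rate(numerator, denominator, per=100):
    """Generic rate measure: numerator / denominator, scaled for readability."""
    return numerator / denominator * per

# Hypothetical monthly counts, for illustration only.
foleys_removed_within_48h = 42   # process-measure numerator
foleys_placed = 50               # process-measure denominator
uti_events = 3                   # outcome-measure numerator
catheter_days = 1180             # outcome-measure denominator

process_measure = rate(foleys_removed_within_48h, foleys_placed)   # percent compliance
outcome_measure = rate(uti_events, catheter_days, per=1000)        # per 1,000 catheter days

print(f"Foley removed before 48 hours: {process_measure:.1f}% of placements")
print(f"Urinary tract infection rate: {outcome_measure:.2f} per 1,000 catheter days")
```

A defined denominator is what keeps the measure comparable over time and across units of different size, which supports the local, meaningful, and transparent elements above.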
17
Baseline Analysis & Investigation
3
18
Baseline Analysis & Investigation
Methods / Tools
Examine & Document Baseline Process
Benchmark to Peers
Analyze Data
19
Baseline Analysis & Investigation
Tools to Examine + Document Process
What does the process tell you? Describe your major findings from each tool. Attach related documents.
20
Baseline Analysis & Investigation
Tools to Analyze Data
What does the data tell you? Describe major findings from each analysis.
Data collection can be:
• Manual, e.g. tally sheet, survey
• Automated, e.g. data warehouse
Attach related documents (no VDO/cost data).
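Where the tally is manual, a quick chart often surfaces the biggest contributors. Below is a minimal Pareto-chart sketch (one of the analysis tools named in the editor's notes) using Python and matplotlib; the delay-reason categories and counts are invented for illustration and are not from any UUHC data.

```python
import matplotlib.pyplot as plt

# Hypothetical tally-sheet counts of reasons for delayed Foley removal (illustration only).
tally = {
    "No removal order": 18,
    "Order not seen": 9,
    "Patient condition": 6,
    "Supply/equipment": 3,
    "Other": 2,
}

# Sort categories by count and compute the cumulative percentage for the Pareto line.
labels, counts = zip(*sorted(tally.items(), key=lambda kv: kv[1], reverse=True))
total = sum(counts)
cumulative = [100 * sum(counts[: i + 1]) / total for i in range(len(counts))]

fig, ax = plt.subplots()
ax.bar(labels, counts)                      # bars: raw counts per category
ax.set_ylabel("Count")
ax.tick_params(axis="x", labelrotation=30)

ax2 = ax.twinx()                            # second y-axis for the cumulative line
ax2.plot(labels, cumulative, marker="o", color="tab:red")
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)

ax.set_title("Pareto of delay reasons (hypothetical baseline tally)")
fig.tight_layout()
plt.show()
```

The sorted bars plus cumulative line make it easy to see which few causes account for most of the problem before targeting opportunities in the next step.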
21
Baseline Analysis & Investigation
Tools to Benchmark
What did you learn from others? Describe what best practices you learned from peers. Attach any related documents.
22
Baseline Analysis & Investigation
Summary
ROOT CAUSE
What did you learn?
Synthesize the information you’ve collected to target & prioritize opportunities for improvement.
One method to identify the root cause is to ask ‘why’ 5 times. The reason a problem exists usually goes deeper; keep going until you feel comfortable you’ve identified the real reason(s).
23
Baseline Analysis & Investigation
Check your Goals
Process – Action to get to the outcome, e.g. removed the Foley before 48 hours.
Outcome – Output from the process, e.g. urinary tract infection rate.
Balancing – Unintended consequences, e.g. reducing length of stay but increasing readmissions.
Now that you have a better understanding of your problem and what changes will be made to your process, add/adjust SMART goals.
24
Improvement Design & Implement
4
25
Improvement Design & Implementation
How to Improve a Process
• Make it Reliable, e.g. Standard Work
• Make it Simple, e.g. Workplace Organization
• Make it Visible, e.g. Visual Management
• Make it Flow, e.g. Eliminate Waste
There is no one-size-fits-all solution; find what works for your team.
26
Improvement Design & Implementation
Elements for Success
Design Changes to Process / Workflow
Communication Plan for the improved design
Forcing Functions to guide use of the improved design
27
Improvement Design & Implementation
Changes to Process / Workflow
Provide information in the value summary such that others can understand and potentially replicate it.
What are your process change(s)? Who (which role) is accountable? When and where does it happen in the process?
What major findings does the improvement design address?
28
Improvement Design & Implementation
Customer / Patient Elements
Don’t forget about your customer! Improvements should be:
Convenient
Empathetic
Coordinated
Reliable
29
Improvement Design & Implementation
Communication plan
Plan to communicate the improved design:
• Policy (re)written
• Communication campaign
• Education, internal
• Education, patient/customer
Plan a communication strategy for anyone affected by the process, upstream & downstream. Attach related documents.
30
Improvement Design & Implementation
Create a Reliable Process
Tools to ensure the improved design is followed:
MANUAL TOOLS – Person will be expected to fill out and check/monitor their work. E.g. paper checklist, nursing whiteboard.
COMPUTER / AUTOMATION – The step is automatically performed or resides in a trackable system. E.g. EMR order set, telemetry monitor.
PHYSICAL MECHANISMS – The new process or step will happen on its own, or the error can’t happen because of design. E.g. barcodes, RFID.
VISUAL REMINDERS – Person will be expected to notice the reminder and take additional steps as needed. E.g. poster, best practice alert.
31
Improvement Design & Implementation
Forcing Functions
Forcing functions ensure that the right step is done right every time. The more automated, the more effective it is at preventing errors. Automation (system) is not always practical; determine your needs by considering the severity, likelihood, and detectability of the error.
MANUAL TOOLS – e.g. paper checklist, nursing whiteboard
COMPUTER / AUTOMATION – e.g. EMR order set, telemetry monitor
PHYSICAL MECHANISMS – e.g. barcodes, RFID
VISUAL REMINDERS – e.g. poster, best practice alert
Effectiveness increases as the design moves from people-focused toward system-focused (automated) mechanisms.
Source: http://healthsciences.utah.edu/accelerate/blog/2017/01/sepsis-using-emr-as-a-forcing-function.php
32
Monitoring & Impact
5
33
Monitoring & Impact
What Gets Measured Gets Managed
Monitor data continuously
Monitor process (Goals/Gemba)
Reflect on effectiveness & adjust design, if needed.
At least 1 year of monitoring is recommended; 2-3 years to ensure sustainability.
Is it working?
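A simple way to “monitor data continuously” is a run chart with the implementation date marked, so the team can see whether the measure shifted and whether the shift holds. The sketch below uses Python/matplotlib with entirely hypothetical weekly values and an assumed week-12 go-live; it is an illustration, not UUHC data.

```python
import matplotlib.pyplot as plt

# Hypothetical weekly outcome-measure values for illustration only
# (e.g. a rate per 1,000), with the improved design implemented at week 12.
weeks = list(range(1, 25))
rates = [5.2, 4.8, 5.5, 5.1, 4.9, 5.3, 5.0, 5.4, 4.7, 5.2, 5.1, 4.9,   # baseline
         3.9, 3.6, 3.8, 3.4, 3.7, 3.5, 3.3, 3.6, 3.4, 3.5, 3.2, 3.4]   # post-implementation
implementation_week = 12

fig, ax = plt.subplots()
ax.plot(weeks, rates, marker="o")                       # run chart of the measure over time
ax.axvline(implementation_week, linestyle="--", color="gray")
ax.annotate("improved design implemented",
            xy=(implementation_week, max(rates)),
            rotation=90, va="top", ha="right")
ax.set_xlabel("Week")
ax.set_ylabel("Rate (hypothetical)")
ax.set_title("Run chart: monitoring an outcome measure over time")
fig.tight_layout()
plt.show()
```

Reviewing the chart at regular intervals (and at Gemba) supports the reflect-and-adjust step above and makes drift back to baseline visible early.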
34
Monitoring & Impact
Measure Elements
Numerator & Denominator
Local
Meaningful
Transparent
Providing results to individuals can engage team members in their ability to contribute to the improvement. This is often done outside of the Value Summary reporting & monitoring.
Value Summary 2.0
http://pulse.utah.edu/go/valuesummary
35
http://healthsciences.utah.edu/accelerate
36
The Smart Way to Keep your New Year's Resolutions
Can improvement science help you keep your new year’s resolutions? Every year, Chrissy Daniels coaches leaders throughout the system as they set goals. She knows what works.
Lean Guard Rails: Using the EMR as a Forcing Function
This post is about the Sepsis project’s technical achievement using a process improvement principle. Our system taught Epic, Utah’s electronic medical record (EMR), how to provide urgent, life-saving information to clinicians.
WHAT IMPROVEMENT (REALLY) LOOKS LIKE
[Diagram: a winding path from START to END – identify problem, JK, AHA! found the real problem, keep going!, set goals, assemble the team, now we have the right team, analysis & investigation, design improvement, implement]

Editor's Notes

  • #4 Measure with VDO
  • #5 Measure with VDO
  • #12 http://healthsciences.utah.edu/accelerate/blog/2017/01/the-smart-way-to-keep-your-new-years-resolutions.php
  • #13 http://healthsciences.utah.edu/accelerate/blog/2017/01/the-smart-way-to-keep-your-new-years-resolutions.php
• #16 Process measures: Processes are the actions we take. Examples of process measures: “removed the Foley before 48 hours,” or “performed preventive maintenance within 96 hours of the due date.” Outcome measures: Outcomes are the things that happen, in part because of our actions and in part due to factors out of our control. Examples of outcome measures corresponding to the process examples: “urinary tract infection rate,” or “equipment failure rate.” Other common outcome measures include length of stay, costs, and mortality. Balance measures: What gets measured gets managed. If a team is focused only on the measures most directly impacted, they run the risk of unintentionally degrading other important measures. For example, value work that reduces length of stay but increases readmissions is not an acceptable trade-off, so readmissions would be a sensible balance measure.
• #19 Planned methods/tools to analyze my problem: Examine the current state (Gemba walk; Voice of the customer analysis – team, patient, etc.). Document the current state (Process map; Spaghetti diagram; Value-added test; Categorical brainstorming / fishbone diagram; Voice of the customer analysis; Other). Analyze current state data (Descriptive statistics; Histogram; Run charts; Pareto; Other table/graph). Benchmarking (Literature review; Performance benchmarking; Process benchmarking).
• #21 Your baseline state isn’t complete until you have data. This is a 2-part discussion: how to get the data (manual – tally sheets, work study; or automated – data warehouse, etc.).
• #23 There are various tools and methods to conduct a root cause analysis. A popular method in the lean community to discover a root cause in less complex problems is to ask WHY 5 times. Usually the reason a problem exists goes deeper than asking it once or twice. Keep going until you feel comfortable you have identified the real reasons.
• #26 Make it visible (Visual Management); Make it reliable (Standard Work); Make it simple (Workplace Organization); Make it flow (Eliminate Waste); Make it …
• #28 The goal of a detailed improvement design is to identify the steps necessary to implement the pilot as intended – it becomes the standard work for implementation. Does your intervention DIRECTLY address the root cause? It should. Don’t get caught up in throwing solutions at the problem because they sound nice. Stick to the plan. Improve on the root cause.
• #32 When possible, introduce forcing functions in your process. Just like standard work, forcing functions (error proofing) help you ensure that the right step is done every time.
• #36 Problem solving is not a form you fill out, but value summaries help us gather our thoughts. There is a method in this “madness” of process improvement: Define the problem (vision). Decide what you want the performance of the process to be, realistically (goal). Find out why the process isn’t performing this way now (baseline). Implement a “fix” that, based on your hypothesis, will make the process perform that way (pilot implementation). Did it work? Why? Why not? (monitor).