Improving Care: More Method, Less Uncertainty, Impact summit 30 October 2013

Improving Care: More Method, Less Uncertainty – Impact Summit, the second full-day event in the Measurement Masterclass series, took place at Central Hall Westminster in London on 30 October. The event was opened by Professor Sir Bruce Keogh and NHS IQ’s own Professor Moira Livingston, and included contributions from experts from across England and a virtual appearance by Dr Bob Lloyd.

This series for senior clinical leaders was developed to help increase understanding of the principles of measurement for improvement. Designed to stimulate and challenge, it supports clinical leads in holding influential discussions with policy makers and data collectors.

To take the series forward and promote measurement for improvement more widely, NHS Improving Quality is setting up an advisory group to design and develop more learning resources for senior clinicians and their teams.

More information: http://www.nhsiq.nhs.uk/capacity-capability/measurement-masterclass.aspx

Slide notes

  • Poll Title: Tell us where you think you are on the journey through measurement for improvement:

    http://www.polleverywhere.com/multiple_choice_polls/IriLN7as9lA4v2o


  • Poll Title: What are the 3 reasons for measurement?

    http://www.polleverywhere.com/multiple_choice_polls/Bfm432nVHsWmPLa
  • Maxine will describe driver diagrams


  • Poll Title: How confident are you now in using driver diagrams to address the messiness of life?

    http://www.polleverywhere.com/multiple_choice_polls/0tK65pO461iwq7O


  • Poll Title: A good measure

    http://www.polleverywhere.com/multiple_choice_polls/h66EoN6rffAJXjI
  • Operational definition. Simple exercise to bring home the point – how many are wearing red?
  • This is a simple example using DNA (did not attend) as a measure. It is sufficiently generic to appeal to a wide range of projects and delegates. Go through each section but focus on the calculation. Explain that the definition needs to be comprehensive enough to avoid ambiguity. (A minimal worked sketch of the DNA-rate calculation appears after the slide transcript.)


  • Poll Title: Thinking back to the Checklist exercise, how much has that changed your thinking about the definition and collection of your chosen measure?

    http://www.polleverywhere.com/multiple_choice_polls/kwxwrAJxIwvxidq


  • Poll Title: What is an operational definition?

    http://www.polleverywhere.com/multiple_choice_polls/OkbekndyzWBtIbl


  • Poll Title: Run and control charts are used to track progress over time because they allow us to identify common and special cause variation. How are you using them in your work?

    http://www.polleverywhere.com/multiple_choice_polls/z7Ax0OdBmlcYjXd


  • Poll Title: Why is it important to identify which type of variation we have in our data?

    http://www.polleverywhere.com/multiple_choice_polls/FMXovzDO3BYQKDB


  • Poll Title: Share thoughts and reflections during the afternoon session

    http://www.polleverywhere.com/free_text_polls/7yl1qrB7kxLYqsB
  • Slide transcript – Improving Care: More Method, Less Uncertainty, Impact summit 30 October 2013

    1. 1. Improving Care: More Method, Less Uncertainty, Impact summit 30th October 2013
    2. 2. Professor Moira Livingston Clinical Director of Improvement Capability NHS Improving Quality
    3. 3. Housekeeping
    4. 4. Starting the journey
    5. 5. The journey so far… EVENT, Friday 6th September – Improving Care: More Method, Less Uncertainty, the first in a series of measurement masterclasses for senior clinicians: Dr Bob Lloyd (Institute for Healthcare Improvement, US), Professor Moira Livingston (NHS Improving Quality), Professor Sir Bruce Keogh (NHS England), Julian Hartley (NHS Improving Quality), Dr Maxine Power (Salford Royal NHS Foundation Trust). WEBINAR, Thursday 10th Oct – Different national approaches: how to use national data to drive improvement at all levels: Dr Veena Raleigh (The King’s Fund), Göran Henriks (Jönköping County Council, Sweden), Prof Jonathon Gray and Dr Mataroria Lyndon (Counties Manukau Health, New Zealand). WEBINAR, Thursday 17th Oct – Different national approaches: mortality, exploring how to use complex indicators to drive improvement: Dr Bob Lloyd (Institute for Healthcare Improvement, US), Dr Anna Trinks (Jönköping County Council, Sweden); 19 delegates. WEBINAR, Wednesday 23rd Oct – Different national approaches: improvement and transparency: Dr Carol Peden (Royal United Hospital Bath), Alide Chase and Diane Waite (Kaiser Permanente, US).
    6. 6. Shape of the day: 0930-0945 Welcome, introductions and overview of the day – Professor Moira Livingston, Clinical Director of Improvement Capability, NHS Improving Quality; 0945-1000 View from the top – Professor Sir Bruce Keogh, National Medical Director, NHS England; 1000-1100 The strategic measurement for improvement journey: choosing the right measures – Mike Davidge with Dr Bob Lloyd (15 min video), Dr Maxine Power and Dr Veena Raleigh; 1115-1130 Break; 1130-1230 The strategic measurement for improvement journey: collecting good data, making sense of data – Mike Davidge with Dr Maxine Power and Dr Veena Raleigh; 1230-1310 Lunch; 1310-1430 Knowledge Exchange: making it happen (details on your desks) – Mark Outhwaite; 1430-1550 Steering the measurement journey: what next? – Mark Outhwaite; 1550-1600 Summary and closing – Professor Sir Bruce Keogh.
    7. 7. Purpose of the impact summit The key aims: • Reflect and review learning and implications from the masterclass series so far • Build depth of knowledge • Discuss and identify how to make improvements in our measurement systems, based on better and more informed decision making • Promote understanding of the difference between measurement for improvement and measurement for other purposes • Share and embed practical techniques for choosing, applying and interpreting measures We will do this by: • Case studies of real-world examples, with opportunity to discuss and question • Interactive sessions to work through some personal measurement challenges and identify actions and next steps • Creating the opportunity to identify further support needed to take forward a measurement for improvement system, culture and practices Note: this course will be eligible for CPD points; information to be circulated after the event
    8. 8. Speakers for this morning Professor Sir Bruce Keogh National Medical Director, NHS England Mike Davidge Director (Measurement), NHS Elect Veena S Raleigh PhD Senior Fellow, The King’s Fund Maxine Power PhD, MPH Director of Innovation and Improvement Science, Salford Royal NHS Foundation Trust and Managing Director of Haelo
    9. 9. Knowledge Exchange Speakers • Mel Varvel, Improvement Manager, NHS Improving Quality • Preventing People from Dying Prematurely: GRASPing the Measurement Nettle • Dr Frances Healey, RGN, RMN, PhD, Senior Head of Patient Safety Intelligence, NHS England & Matthew Foggarty, Patient Safety, NHS England • The genie is out of the bottle: when Measurement for Improvement is used for other purposes • Clare Howard, MRPharmS, Deputy Chief Pharmaceutical Officer NHS England • Developing metrics for safer medication practice • Dr Carol Peden, Quality Improvement Fellow-Health Foundation and Consultant in Anaesthesia and Critical Care Medicine, Royal United Hospital Bath • Mortality Reviews • Martin McShane, Director (Domain 2) Improving the quality of life for people with Long Term Conditions, NHS England & Professor Alistair Burns, National Clinical Director for Dementia, NHS England and The University of Manchester • Dementia
    10. 10. Professor Sir Bruce Keogh National Medical Director NHS England
    11. 11. Mike Davidge Director (Measurement) NHS Elect
    12. 12. Using Poll Everywhere – live feedback and polling. Either text mfimp to 07624806527 to link your phone to the session, then send poll responses to that number as a normal SMS/text (will not work if you withhold your number); or point your smartphone/tablet browser at www.pollev.com/mfimp to participate in the polls. Wifi: MMCNHSIQ – no password. No premium costs – just contained within your normal contract rates.
    13. 13. Question 1
    14. 14. Question 2
    15. 15. A word from our teacher • Bob Lloyd reminds us briefly what he covered on 6th September • We will be revisiting some of these points this morning with practical exercises
    16. 16. Be clear why you are measuring and the messiness of life CHOOSING THE RIGHT MEASURES
    17. 17. Choosing indicators Veena Raleigh Senior Fellow 30 October 2013
    18. 18. Precursors of measurement: clarity about... Who (audience): providers, commissioners, patients etc Why (aim): - quality improvement, judgement, research What (content): - dimension of quality, efficiency - population, service/sector, pathway - unit of measurement How (process): - definition, data sources - statistical methods - interpretation
    19. 19. Audience for measurement (1) parliament / government the NHS: - commissioners - managers - professional staff patients, families, carers the public regulators, auditors researchers the media The appropriate content and presentation formats of indicators for these audiences differ
    20. 20. Audience for measurement (2) For example: clinicians need disaggregated, risk-adjusted information at small unit level, benchmarked against peers, and showing trends over time commissioners want information on outcomes, and quality linked to cost-effectiveness patients, public want information that is simply constructed, clearly presented, and easy to interpret ie good vs bad
    21. 21. Aim of measurement • Judgement: - performance assessment/management - incentivising quality improvement (P4P eg QOF, CQUIN, quality premiums) - supporting patient choice - public accountability assumes unambiguous evidence of performance, designed for EXTERNAL accountability or • Quality improvement: - internal use - benchmarking against peers for feedback and learning assumes indicators are 'tin openers' for INTERNAL use, designed to prompt further investigation and appropriate action
    22. 22. Indicators for judgement vs indicators for improvement: unambiguous interpretation vs variable interpretation possible; unambiguous attribution vs ambiguity tolerable; definitive marker of quality vs screening tool; good data quality vs ‘good enough’ data quality; good risk-adjustment vs partial risk-adjustment tolerable; statistical reliability vs preferred but not essential; cross-sectional vs time trends (SPCs, run charts etc); punishment/reward vs learning, change in practice; external control vs internal control; data for public use vs data for internal use; stand-alone vs allowance for context; risk of unintended consequences vs low risk
    23. 23. Content of measurement (1) dimension of quality: effectiveness, patient experience, safety ……….. timely, access, equity, VfM, care coordination and integration population group, condition, service structure, process and outcome indicators: S + P = O unit of measurement eg commissioner and/or provider
    24. 24. Content of measurement (2) Indicators for commissioners (CCGs, LAs): - population based Indicators for providers: - Primary care - Community care - Out-of-hours care - Hospital care (emergency and planned) - Tertiary and specialist care - Mental health care - Palliative care - Social care (residential & home care) Indicators by population group, condition
    25. 25. Example: cancer NHSOF / COIS domain 1 indicators: cancer mortality < 75 cancer survival reducing cancer mortality depends on: reducing cancer incidence AND improving cancer survival these outcomes require improvement in the underlying drivers eg: cancer incidence: preventive measures eg smoking cessation services (process measure) cancer survival: screening, timely referral, treatment rates (process measures), staff capacity/skills and surgical volumes (structure measures)
    26. 26. Cancer (example indicators) Inequalities PRIMARY OUTCOME MEASURES Cancer mortality O Cancer incidence O Risk factors and prevention Rates of: - incidence O - smoking prevalence, diet etc IO - population awareness P - no of smoking cessation clinics S - smoking quitters O Key S=structure measure P=process measures IO=intermediate outcome measure O=outcome measures Cancer survival O Diagnosis, treatment, end-of-life care Rates of: - screening P - referrals, diagnostic tests, time to results P - detection rates O - stage at diagnosis O - access, waiting times P - cancers detected at emergency presentation P - surgical volumes S - treatment (surgery, radiotherapy) rates P - information for patients P - length of stay, readmission, mortality rates O - one-year survival: proxy for late diagnosis O - management by a multidisciplinary team P - staff skills, training S - adherence to guidelines P - access to end-of-life care P - patient experience and wellbeing O - cancer deaths by place of death O - participation in national clinical audits S
    27. 27. Aims exercise: if you were in a lift with the rest of your table group, could you clearly and briefly describe your aim in a sentence – i.e. in the time it takes to travel from one floor to the next? Write your aim statement down. Share with your table.
    28. 28. Driver Diagrams
    29. 29. Aim Measurement Drivers (changes)
    30. 30. What is a Driver Diagram? • Reinforces the aim statement as the goal • Clarifies the big picture • Identifies primary system components • Identifies projects which will influence • Aids in development of measurement Most importantly: helps to articulate the overall aim and avoid missing important system components
    31. 31. What are driver diagrams used for? • Personal improvement projects • Clarification in complex tasks • Project / Programme Management • Strategy, design and execution
    32. 32. Primary Drivers • Push conceptual thinking • Avoid focus on one area alone • Usually categorical • Abstract • Removal reduces likelihood of success • Projects wrap into them
    33. 33. Secondary Drivers • Projects • Tasks • Actions • Focus Areas • Aid allocation of workload • Ensure clarity and focus for testing
    34. 34. My driver diagram for weight loss. Aim: lose 2 stone by March 2014 and prevent avoidable complications. Healthy Eating: regular shopping; more fresh fruit; 3 meals per day; no food after 6pm; 2 litres of water per day. Measurement & feedback: weekly weight; measure inches; pictures on the fridge; regular support; weight record chart updated showing trend. Lifestyle: plan for eating out / weekends; beer & wine – develop a plan; know your weaknesses; habits and patterns; avoid bad influencers; encourage contact with supportive people. Exercise: daily exercise for a minimum of 20 mins; measure progress; identify barriers; build distractions to help; add something nice – sauna / jacuzzi; search for an exercise that suits.
    35. 35. Aim: develop & test a measurement instrument for harm free care from pressure ulcers, falls, catheters and VTE by September 2011. Agree operational definitions: evidence review; expert debate / in; grey areas agreed; practical use. Develop technical capability: design characteristics; local, regional, national; universal platform; guidelines for use. Determine how the instrument is used: who collects & when?; from where?; what happens after?; how are data used? Determine the level of user: local users – feedback; data leads – feedback.
    36. 36. Outcome 1: rate of patients harmed by falls
    37. 37. Process 2: training in falls
    38. 38. Cancer (example indicators) Inequalities PRIMARY OUTCOME MEASURES Cancer mortality O Cancer incidence O Risk factors and prevention Rates of: - incidence O - smoking prevalence, diet etc IO - population awareness P - no of smoking cessation clinics S - smoking quitters O Key S=structure measure P=process measures IO=intermediate outcome measure O=outcome measures Cancer survival O Diagnosis, treatment, end-of-life care Rates of: - screening P - referrals, diagnostic tests, time to results P - detection rates O - stage at diagnosis O - access, waiting times P - cancers detected at emergency presentation P - surgical volumes S - treatment (surgery, radiotherapy) rates P - information for patients P - length of stay, readmission, mortality rates O - one-year survival: proxy for late diagnosis O - management by a multidisciplinary team P - staff skills, training S - adherence to guidelines P - access to end-of-life care P - patient experience and wellbeing O - cancer deaths by place of death O - participation in national clinical audits S
    39. 39. Aim: reduce mortality from cancer in England by XX% by March 2016. PRIMARY PREVENTION: lifestyle; genetics; campaigns; social determinants. SECONDARY PREVENTION: screening; primary care; access to L2/3 service; lifestyle change; medicines optimisation. SERVICE OPTIMISATION: value driven; quality greater than cost; equity in access; excellent experience. END OF LIFE AND SOCIAL CARE: cross sector working; hospice & faith; seven day HSC service; equipment; pain management.
    40. 40. Cascading drivers (diagram). (An illustrative data-structure sketch of cascading drivers appears after the slide transcript.)
    41. 41. Limitations of driver diagrams • Not a perfect science • Two dimensional & simplistic • Working schematic – requires amendment • Interplay between drivers • Contribution of each driver is not equal
    42. 42. Question 3
    43. 43. Question 4
    44. 44. Please take only 15 minutes COFFEE
    45. 45. The measurement journey COLLECTING GOOD DATA
    46. 46. Introducing the Measures checklist
    47. 47. Define measures An operational definition is a description, in quantifiable terms, of what to measure and the steps to follow to measure it consistently
    48. 48. Example definition. Measure name: DNA rate for clinic A. Why is it important? (Provides justification and any links to organisation strategy): we need to ensure that the clinic is not disrupted by having unexpected gaps in the clinic schedule. The policy for this clinic is to offer another appointment, which means that other patients may be disadvantaged if we have too many patients being rescheduled. Who owns this measure? (Person responsible for making it happen): the outpatient clinic manager. Measure definition – What is the definition? (Spell it out very clearly in words): the percentage of patients booked to attend clinic A who did not attend for their appointment and no warning was received at the clinic before it started. What data items do you need? The number of patients booked to attend clinic (B) and the number of patients who failed to attend without warning (F). What is the calculation? 100 x DNA patients (F) / Booked patients (B). Which patient groups are to be covered? Do you need to stratify? (For example, are there differences by shift, time of day, day of week, severity etc): all patients booked into clinic.
    49. 49. Collecting data • What – All patients, a portion or a sample? • Who – collects the data? • When – is it collected – real time or retrospective? • Where – is it collected? • How – is it obtained – Computer system or audit? You need a plan which you test using PDSA cycles
    50. 50. Checklist exercise • Complete page one and collect on page two of the measures checklist provided for a measure that you are using or are planning to use • Share with your colleagues You have 15 minutes
    51. 51. Question 5
    52. 52. Question 6
    53. 53. Variation MAKING SENSE OF DATA
    54. 54. Variation exercise • Using the materials provided make the best paper aeroplane you can • Put your initials on it You have 15 minutes When instructed - throw your planes!
    55. 55. Fishbone diagram – problem: aeroplanes fly different distances. Causes by category – Equipment: some tables had scissors, rulers to help. People: skills / ideas; throwing styles. Procedures: no clear instructions provided. Environment: air / wind. Materials: types of paper e.g. card, tracing paper.
    56. 56. Classifying variation. Common cause – stable in time and therefore relatively predictable: the paper used; person’s technique; design of the plane. Special cause – irregular in time and therefore unpredictable: Mike’s plane; water spill. (An illustrative control-chart sketch for separating common and special cause variation appears after the slide transcript.)
    57. 57. Why classify variation? “There are different improvement strategies depending on which type of variation is present (common cause or special cause), so it is important for a team to know the difference.” Michael George, Chairman and CEO of George Group Consulting
    58. 58. Question 7
    59. 59. Question 8
    60. 60. Instructions for the afternoon session
    61. 61. The Knowledge Exchange Carousel • After lunch you will be directed to a Knowledge Exchange Carousel ‘Pod’ with the same number as your table number • You will rotate through 3 ‘Pods’ at 25 minute intervals • In each Pod you will discuss a case study presented by a speaker • After the third Knowledge Exchange session you will remain in the Pod for the next task
    62. 62. Knowledge Exchange Speakers • Mel Varvel, Improvement Manager, NHS Improving Quality • Preventing People from Dying Prematurely: GRASPing the Measurement Nettle • Dr Frances Healey, RGN, RMN, PhD, Senior Head of Patient Safety Intelligence, NHS England & Matthew Foggarty, Patient Safety, NHS England • The genie is out of the bottle: when Measurement for Improvement is used for other purposes • Clare Howard, MRPharmS, Deputy Chief Pharmaceutical Officer NHS England • Developing metrics for safer medication practice • Dr Carol Peden, Quality Improvement Fellow-Health Foundation and Consultant in Anaesthesia and Critical Care Medicine, Royal United Hospital Bath • Mortality Reviews • Martin McShane, Director (Domain 2) Improving the quality of life for people with Long Term Conditions, NHS England & Professor Alistair Burns, National Clinical Director for Dementia, NHS England and The University of Manchester • Dementia
    63. 63. Sharing your learning • At the end of the Knowledge Exchange you will remain in your last Pod • Using the A0 poster template rapidly brainstorm the Barriers and Drivers in the current environment for each step in the measurement process • Identify your top 2 Barriers and top 2 Drivers (dot vote if necessary) • Transfer them to your Action Planner Driver Diagram
    64. 64. Action Planning • Identify the actions you could take collectively as a senior leadership cadre to address the barrier or driver Or • The support you need as a senior leadership cadre to address the barrier or driver
    65. 65. Feedback • One barrier or driver and the associated actions • One headline – if a journalist had been in the Pod with you, what headline would they have written?
    66. 66. Personal Action Planner
    67. 67. Afternoon thoughts and reflections
    68. 68. Lunch 1230 - 1310
    69. 69. Knowledge Exchange
    70. 70. Feedback
    71. 71. Professor Sir Bruce Keogh National Medical Director NHS England
    72. 72. The Improving Care: More Method, Less Uncertainty, Impact summit. Further details about the webinar series: www.nhsiq.nhs.uk
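
The slides above introduce driver diagrams: an aim broken into primary and secondary drivers, which can cascade into more detailed diagrams (slides 28–41). As a purely illustrative sketch, not NHS IQ material, the Python fragment below shows one way such a diagram could be held as a simple data structure, using a cut-down version of the weight-loss example from slide 34; the class and field names are assumptions made for this example.

```python
# Illustrative sketch only (not from the slides): a driver diagram held as a
# nested data structure. Class and field names are assumptions for this example.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Driver:
    name: str
    secondary_drivers: List[str] = field(default_factory=list)


@dataclass
class DriverDiagram:
    aim: str
    primary_drivers: List[Driver] = field(default_factory=list)


# Cut-down version of the weight-loss diagram from slide 34.
weight_loss = DriverDiagram(
    aim="Lose 2 stone by March 2014 and prevent avoidable complications",
    primary_drivers=[
        Driver("Healthy eating", ["Regular shopping", "More fresh fruit", "3 meals per day"]),
        Driver("Measurement & feedback", ["Weekly weight", "Weight record chart showing trend"]),
        Driver("Exercise", ["Daily exercise for a minimum of 20 mins", "Measure progress"]),
    ],
)

# 'Cascading' drivers (slide 40): a secondary driver can become the aim of its
# own, more detailed diagram at the next level down.
exercise_detail = DriverDiagram(
    aim="Daily exercise for a minimum of 20 mins",
    primary_drivers=[
        Driver("Build distractions to help"),
        Driver("Search for an exercise that suits"),
    ],
)

for driver in weight_loss.primary_drivers:
    print(f"{driver.name}: {', '.join(driver.secondary_drivers)}")
```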

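Slide 48’s example operational definition specifies the calculation for the DNA (did not attend) rate as 100 × DNA patients (F) / booked patients (B). A minimal sketch of that calculation follows; the function name, type hints and sample figures are illustrative assumptions rather than anything from the event.

```python
# Illustrative sketch of the DNA-rate calculation from slide 48:
# DNA rate (%) = 100 * F / B, where B = patients booked to attend clinic and
# F = patients who failed to attend without warning. Names and sample figures
# are assumptions for this example.

def dna_rate(booked: int, failed_to_attend: int) -> float:
    """Return the DNA rate (%) for a clinic, per the slide's operational definition."""
    if booked <= 0:
        raise ValueError("booked must be a positive count")
    return 100 * failed_to_attend / booked


# Example: 4 of 52 booked patients did not attend and gave no warning.
print(f"DNA rate: {dna_rate(booked=52, failed_to_attend=4):.1f}%")  # -> DNA rate: 7.7%
```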
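
Slides 53–59 turn to making sense of data, using run and control charts to separate common cause from special cause variation. The sketch below is a hedged illustration of that idea, not a method presented at the summit: it computes the limits of a simple individuals (XmR) control chart using the conventional 2.66 × mean-moving-range factor and flags points falling outside them. The weekly DNA-rate figures are invented, and only the basic “point outside the control limits” rule is applied; in practice further run-chart and control-chart rules (shifts, trends) would also be used.

```python
# Illustrative sketch only: flagging possible special cause variation with a
# simple individuals (XmR) control chart. Data are invented; 2.66 is the
# standard XmR constant applied to the mean moving range.

def xmr_limits(values):
    """Return (mean, lower control limit, upper control limit) for an XmR chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar


weekly_dna_rate = [7.1, 6.8, 7.4, 7.0, 6.9, 7.3, 7.2, 6.7, 7.5, 7.0, 12.4, 7.1]  # invented
mean, lcl, ucl = xmr_limits(weekly_dna_rate)

for week, rate in enumerate(weekly_dna_rate, start=1):
    verdict = "special cause?" if rate > ucl or rate < lcl else "common cause"
    print(f"week {week:2d}: {rate:4.1f}  ({verdict})")
```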