A Case for Patient Safety Identifying Medical Error Root Causes ...


  • Show of hands— 9 of your classmates have prior experience in patient safety or quality improvement in health care (10% of your class)
  • OR: error as the ‘flip side’ of human performance. For example, anaphylaxis following administration of ampicillin is an adverse event. If there had been no prior knowledge of the allergy, the adverse event would still have occurred, but it would not have been preventable (or, technically, an error). If this were to happen in a patient known to be allergic to penicillins, it would be classified as an adverse event caused by error. Maybe the allergy was never documented, or the medical record wasn’t available, or the pharmacy dispensed the wrong medication. The point is that the event would not have occurred if there had not been an error somewhere along the way. Just as some adverse events are preventable and some aren’t, some errors lead to adverse events and some do not. When an error occurs but no adverse event happens, we refer to it as a near miss. In other words, a near miss is an error that might otherwise cause harm, but harm was averted by some lucky or timely intervention. A sentinel event is a predefined event that results in death or major loss of function not related to the person’s illness or underlying condition. Since 1996, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has mandated that hospitals investigate sentinel events in their institutions. They often use Root Cause Analysis to guide the investigation.
  • Let’s consider the evidence…
  • RETROSPECTIVE STUDIES. The Harvard Medical Practice Study reviewed over 30,000 medical records from acute care hospitals in New York in 1984. It found that adverse events occurred in 3.7 percent of hospitalizations, and that 13.6 percent of these led to death. (Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med 1991;324:370-6.) Considered a classic, although retrospective, study of the epidemiology of mishaps that occur in a wide variety of inpatient settings. Some of the analysis and recommendations fall short in applying human factors principles and techniques, but the authors have since become more aware. A second study reviewed 15,000 acute care discharge records in Colorado and Utah, and determined that adverse events occurred in 2.9 percent of hospitalizations, and that 6.6 percent of these led to death. (Thomas EJ, Studdert DM, Burstin HR, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care 2000;38:261-71.) The Quality in Australian Health Care Study was a population-based review of 14,170 admissions to 28 hospitals in New South Wales and South Australia in 1995. (Wilson RM, Runciman WB, Gibberd RW, Harrison BT, Newby L, Hamilton JD. The quality in Australian health care study. Med J Aust 1995;163:458-71.) It found adverse events in 16.6% of hospitalizations, with 13.7% causing permanent disability and 4.9% causing death; 51% were caused by errors.
  • Andrews LB, Stocking C, Krizek T, Gottlieb L, Krizek C, Vargish T, et al. An alternative strategy for studying adverse events in medical care. Lancet 1997;349:309-13. Donchin Y, Gopher D, Olin M, Badihi Y, et al. A look into the nature and causes of human errors in the intensive care unit. Crit Care Med 1995;23:294-300. Excellent paper on a prospective study of adverse events and close calls in an Israeli hospital ICU: direct observation, self-report, failure modes analysis. CROSS-SECTIONAL: Ely JW, Levinson W, Elder NC, Mainous AG, Vinson DC. Perceived causes of family physicians’ errors. J Fam Pract 1995;40(4):337-44. Survey (about 50% response) of family practice docs with 2-20 years’ experience. A very telling and chilling subjective, retrospective study of how well-intentioned professionals can sometimes (often?) make mistakes that lead to injury or death of a patient.
  • At this slide, go back and remind them that Brennan found 2-4% rate of medical errors. Comment that 2-4% may sound trivial, but then have them consider the numbers on this slide, which are for a 0.1% error rate! Then have them consider the math of what they would translate to at 2-4%!
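The scaling the note asks students to do can be sketched directly. This is an illustrative back-of-the-envelope only: the per-hour/per-week figures come from the slide, and the 20-40x factors are simply 2%/0.1% and 4%/0.1%.

```python
# Scale the slide's 0.1%-failure-rate examples up to the 2-4% adverse-event
# rates reported in the retrospective studies. Figures are the slide's own;
# nothing here is recomputed from raw data.

examples_at_0_1_percent = {
    "pieces of mail lost every hour": 16_000,
    "checks deducted from the wrong bank account each hour": 22_000,
    "incorrect operations each week": 500,
}

for rate in (2.0, 4.0):          # adverse-event rate, in percent
    factor = rate / 0.1          # 20x at 2%, 40x at 4%
    print(f"At a {rate}% rate (x{factor:.0f}):")
    for label, count in examples_at_0_1_percent.items():
        print(f"  {count * factor:,.0f} {label}")
```

Even the smallest line item becomes alarming once multiplied by 20-40, which is the point the note wants the class to feel.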
  • The Institute of Medicine (IOM) report made headlines with its estimate that medical errors account for between 44,000 and 98,000 deaths in the United States per year. (Institute of Medicine. To Err is Human: Building a Safer Health System. Edited by Kohn L, Corrigan J, Donaldson M. Washington, D.C.: National Academy Press, 2000.) The report focused on the two large hospital-based retrospective studies as the basis for its estimate. The IOM mortality figures have been challenged for several reasons. One is the hindsight bias inherent in retrospective review of medical records, which some contend overestimates the number of deaths. Another is the lack of control groups in the studies, and therefore the absence of information about the baseline risk of death among the patients whose records were reviewed. Yet another reason is the variability of reviewers’ ratings. It is important to note, however, that the conclusions reported by the IOM were based on more than thirty published studies in the peer-reviewed literature, including, but not limited to, the studies conducted in New York, Colorado, and Utah. Indeed, some maintain that the IOM report actually underestimates the mortality due to errors.
  • (National Center for Health Statistics) This slide is nice because it places deaths due to medical errors in hospitals in perspective. Note it would rank between the 5th and 9th leading causes of death for all ages in the US! **Estimated cost: $17 - $50 billion (IOM, 2000)
  • Despite what you hear advertised… Be sure to note that the numbers of deaths due to medical errors are ONLY for hospitals (not ambulatory care, not long-term care, etc.) so is probably just the tip of the iceberg.
  • Sound like anybody you know? Sound like your PBL cases?
  • Survey method: questionnaires mailed to 254 house officers in three internal medicine residencies. Anonymous; a separate postcard indicated participation. Questionnaire based on a literature review and two stages of pre-testing. Four-point Likert-type and categorical response formats. Albert Wu, MD, MPH; Susan Folkman, PhD; Stephen J. McPhee, MD; Bernard Lo, MD. JAMA, April 24, 1991;265(16):2089-2094.
  • These were the types of errors the residents reported
  • See http://faculty.washington.edu/chudler/words.html#seffect for an online version of this and other similar demonstrations. J. Ridley Stroop (1935). Studies of Interference in Serial Verbal Reactions. Journal of Experimental Psychology, vol 18, 643-662.
  • Conflicting inputs (label and color) cause confusion.
  • Making a policy of always using yellow adaptors to connect tubing to CO2 is a weaker remedy; the best solution is to purchase and have available only clear adaptors.
  • Bridger RS, Poluta MA. Ergonomics: Introducing the human factor into the clinical setting. J of Clinical Engineering 1998; 23: 180-188.   Nguyen NT, Ho HS, Smith WD, Philipps C, Lewis C, De Vera RM, Berguer R. An ergonomic evaluation of surgeons' axial skeletal and upper extremity movements during laparoscopic and open surgery. Am J Surg. 2001 Dec;182(6):720-4. Wears RL, Perry SJ. Human factors and ergonomics in the emergency department. Ann Emerg Med. 2002;40:206-212. Johnston WK 3rd, Low RK, Das S. Image converter eliminates mirror imaging during laparoscopy. J Endourol . 2003 Jun;17(5):327-31.
  • Aggregate bias: belief that aggregate data (like that compiled to create guidelines) doesn’t apply to their individual (‘special’) patient
Anchoring: locking into initial data too early and failing to change once more data become available
Availability bias: focus on diagnoses more readily (or recently) seen
Confirmation bias: looking for data to confirm the dx rather than disprove it
Diagnosis momentum: once a dx is mentioned, it sticks!
Gambler’s fallacy: if 10/10 flips are heads, the next one seems more likely to be tails
Sutton’s slip: why do you rob banks? Because that is where the money is! Focus on the obvious only.
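The gambler’s fallacy in the list above is easy to demonstrate numerically: after a long run of heads, the next fair-coin flip is still 50/50. This is a standalone simulation sketch; the streak length and sample size are arbitrary choices, not figures from the lecture.

```python
import random

# Simulate a long sequence of fair coin flips, then look only at flips that
# immediately follow a run of 10 consecutive heads. The gambler's fallacy
# predicts these should favor tails; independence says they stay near 50/50.

random.seed(42)
flips = [random.random() < 0.5 for _ in range(2_000_000)]  # True = heads

next_after_streak = []
streak = 0  # consecutive heads ending at the previous flip
for flip in flips:
    if streak >= 10:
        next_after_streak.append(flip)  # this flip follows 10 straight heads
    streak = streak + 1 if flip else 0

p = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads | previous 10 were heads) ~ {p:.3f}")  # stays near 0.5
```

With ~2 million flips there are roughly a couple thousand such streaks, enough that the estimate lands close to 0.5 rather than drifting toward tails.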
  • Shame is not the same as fear of blame or being sued. Since it is internal, the first place most people have to start is with themselves
  • Not unlike other educational processes… Craig Nelson, educator, Indiana University. 1. Multiple-choice exams, PBL cases: “just give me the objectives.” Authoritarian instruction, the clinical king. Doubts rarely raised; uncertainty is downplayed. “That’s how it’s been done in the past.” 2. All references are equal. Too much info out there! 3. EBM vs. status/authority. What I don’t know vs. what isn’t known (confusing). 4. Balance. Increased tolerance of uncertainty… Many answers, many problems. Truth may not exist.
  • 1. Denial: negation of ‘error’ by defining medicine as an art with gray areas; repression of actual mistakes by forgetting them; redefinition of mistakes as non-mistakes. 2. Discounting: externalization of blame so that mistakes were due to things beyond their control: management, superiors, subordinates, the patient, the disease process. 3. Distancing: “I did all I could.” Not all that different from what you all find in the psycho-social objectives about the patient experience!!
  • Survey of residents: 254 residents in IM programs. Each wrote a paragraph describing a mistake, then answered questions about: the degree to which they accepted responsibility for the mistake; their emotional response to the mistake; discussions about the mistake with others; the institutional response to the mistake; and changes in practice due to making the mistake. 54% discussed it with an attending, 24% discussed it with the patient, 54% (on a scale) accepted responsibility, and 71% (on a scale) reported distress.
  • 98% reported constructive changes 18% reported defensive changes constructive changes in practice were significantly associated with female gender, serious outcome, inexperience, case complexity as cause of mistake, accepting responsibility and extent of discussion
  • Defensive changes were significantly associated with job overload and judgmental institutions. Multiple regression found that constructive changes were associated with faulty judgment, inexperience, acceptance of responsibility, and more discussion. Defensive changes were more likely if there was a judgmental institutional response.
  • However, patients do not necessarily think of safety as a key dimension of quality. They take it for granted. Many assume that health care is always safe, just as they assume that every airplane they board will take off, fly, and land safely. This erroneous assumption has far-reaching implications for physicians. It leads some patients to have unrealistic expectations of the health care system in general, and of physicians in particular. Unrealistic expectations can also set physicians up to feel ashamed if an error occurs in practice that has adverse effects on the patient’s well-being. Just as our “culture of blame” fuels litigation, and needs to be remedied, a “culture of shame” based on unrealistic expectations fuels the tendency for practitioners to disregard or hide errors. The fact is we all are human, and we all err at times. The work, time, or trouble involved has led people to avoid confronting errors, even though confronting them saves time and trouble later on.
  • This is a good slide at which to explain the problem of just blaming or finding fault. I really believe that the analogy of a play is effective. Have the residents imagine a play or a musical. Explain that if you fire or replace one of the actors, but do not change the set, props, or the script, the outcome of the play will be the same. The same holds true for patient safety or medical errors. If you blame, replace, or fire a doctor or nurse but do not change the technology, policies, environment, or method of care (in other words, you don’t change the system), then no matter who the new doctor or nurse is, the outcome of the system will most likely be the same. That is why it is so important to analyze the system, identify hazards, and control them. This is also a good place to introduce the concept that although most medical errors are the result of someone doing something, that does NOT mean it was the person’s “fault”. Rather, human behavior is simply the final manifestation of a system error. Fix the system, and you prevent the behavior. If you just re-train the person or replace the person with someone else, the same system problem will again manifest itself in unsafe behavior.
  • This new slide is here for discussion of what patient safety is. Up until this point the students probably don’t really know what “patient safety” means, and this slide provides discussion of several possible definitions. The 2nd bullet is there because this notion is prevalent; this way the trainer can explain the fault of this logic. The 3rd, 4th, and 5th bullets are the best answer (in my humble opinion) and should be emphasized.
  • The IOM issued its second report on quality of care, Crossing the Quality Chasm, in 2001. This report expanded on the concepts presented in the first IOM report and identified six dimensions of quality health care. Everyone deserves quality health care that is: Safe: this is what’s embodied in the oath we’ve all taken to do no harm. Effective: we achieve the desired clinical outcomes by using proven methods and applying our clinical expertise. Patient-centered: this encompasses responsiveness to patients’ expressed needs and cultural diversity, along with qualities such as compassion and empathy. Timely: this dimension touches on the need to reduce delays in the office and the hospital, in responding to inquiries from patients or colleagues, and in doing what we can to avoid delays in diagnosis and treatment. Efficient: this, quite simply, is a mandate to reduce waste. Overuse of certain services is one factor that drives up the costs of health care, and errors are another. After all, errors drain resources and productivity when they result in an increased length of stay, hospitalization that would otherwise be unnecessary, or preventable morbidity or mortality. Equitable: the aim here is to assure that health care is accessible to everyone, and delivered with sensitivity to patients’ needs and cultural diversity.
  • Safety is certainly a key dimension of quality. But if we consider what it really takes to assure safety — looking at other complex enterprises, such as aviation, as models of what it takes — we see that a systems approach to safety improvement is the way to go. Trying harder will not work. Even if we all do the best we can as individuals, the systems in which we work have to be engineered for safety, so to speak, to assure that we can deliver health care that does no harm. What can improve patient safety? Stepwise correction of problems in health care systems is the key to success. A preoccupation with safety by everyone involved in the delivery of health care is needed.
  • The Chasm report included ten simple rules for health care in the twenty-first century, and here we touch on five rules that contrast the current view with a view of the future. Our oath to do no harm gives us individual responsibility, and, of course, we still have that responsibility as professionals… but we’re not in it alone. Health care technology and health care systems are incredibly complex now, and this means safety is a system property. Health care systems have traditionally treated information as a record of what happened during an office visit or a hospital stay. But looking forward — given the technology that’s available and the emphasis on patient empowerment and consumer choice — the new view is that knowledge must be shared and information must flow freely wherever it’s needed. However, our current health care system saddles us with burdens related to reporting, utilization review, litigation, and blame. These factors set us up to behave as if secrecy is necessary. Of course, we have to preserve patient confidentiality, but we also have to accept that transparency and free flow of information is a key to accountability. This includes both clinical and financial accountability. Our current health care system tends to be reactive rather than proactive. We call upon the available resources when the need arises, and sometimes we overreact. We now have the technology and a growing body of evidence for what works best. We can anticipate people’s needs and draw people in to a health care system that’s ready to deliver. Related to this anticipation of needs is a shift away from the view of variability being driven by professional autonomy, with each of us doing what we’re most comfortable doing. We now have evidence for what works best with a given type of patient, and variability now ought to be driven by what each patient needs clinically and prefers individually. 
If twenty-first century systems are to improve the safety of health care, we have to consider how we get from where we are now to where we want to be in the future.
  • BLAME AND SHAME: more emotionally satisfying than targeting institutions. If something goes wrong, it seems obvious that an individual is responsible. But most unsafe acts are blameless. Some of the best people make errors, and those errors are recurrent because they are related to the system. Attribution of adverse events solely to human error is like swatting mosquitoes one at a time. If you want to solve the problem, don’t swat the mosquitoes; drain the swamp. The fact is that complex systems harbor many latent failures, with elements that can operate in an unintended or undesirable manner.
  • In an ideal world, each defensive layer would be intact. In reality, they are like slices of Swiss cheese, having many holes. The holes can be both active and latent factors. Any one hole in a slice doesn’t necessarily cause a bad outcome. Only when the holes in many layers line up does an adverse event occur. Pros of the Swiss cheese model: it depicts complexity well (many causes linked together resulting in an adverse event) and it broadens the discussion beyond blame and shame. Cons: all barriers look the same, but aren’t; it makes “plugging” each hole appear equivalent, but some barriers are stronger and some weaker; it doesn’t consider unintended consequences of plugging holes; and it doesn’t consider the effect of a square plug in a round hole.
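The “holes lining up” idea can be made quantitative with a toy model: if the layers fail independently, the chance an error reaches the patient is the product of the per-layer hole probabilities. This is a hedged illustration only; the four layer probabilities below are made-up numbers, not data from the lecture.

```python
import random

# Toy Monte Carlo of the Swiss cheese model: an error causes harm only if it
# slips through EVERY defensive layer. Layer "hole" probabilities are
# illustrative assumptions.

random.seed(0)
hole_probs = [0.10, 0.05, 0.20, 0.08]  # chance each layer fails to stop the error

def error_reaches_patient() -> bool:
    """One trial: does the error pass through all four layers?"""
    return all(random.random() < p for p in hole_probs)

trials = 1_000_000
harmed = sum(error_reaches_patient() for _ in range(trials))

analytic = 1.0
for p in hole_probs:
    analytic *= p  # independent layers: harm probability is the product

print(f"simulated: {harmed / trials:.5f}, analytic: {analytic:.5f}")
# 0.10 * 0.05 * 0.20 * 0.08 = 0.00008: each layer is leaky on its own, yet
# together the defenses stop the error far more than 99.9% of the time.
```

The same sketch also shows one of the model’s listed cons: the product treats every layer as interchangeable, while in reality some barriers are much stronger than others.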
  • A system is a set of interdependent elements interacting to achieve a common aim. All systems are perfectly designed to produce the results they produce. If a change in output is desired, the system itself must be changed. Latency.
  • Health care systems are highly complex to begin with, and safety can be improved when care processes are simplified. Taking measures to reduce hand-offs and improve physical features of the workplace are two examples. Also, improvements in safety can be achieved through measures that reduce variations in care from case to case. This can be accomplished by standardizing processes and taking measures to reduce reliance on memory and vigilance. Inclusion of “forcing” functions (design features that make it impossible to do the wrong thing) and automation are two ways to reduce variation. Finally, safety can be improved when the system is designed to make the best use of all available resources — including people. Collaboration among members of the health care team and improved communication go a long way toward achieving this goal.
  • Reporting to encourage further institutional learning/interventions Redundancy to ensure double checks of key processes Communication to ensure vital information is shared with all necessary personnel on a timely basis Re-engineering to remove opportunities for error
  • Improving Patient Safety. Harvard Health Policy Review Archives. Fall 2000; vol1, No1.
  • 1. Hand-off communication: providers are required to provide pertinent information on patient condition when transferring from one level of care to another and when shift-change handoff occurs. 2. Hand hygiene: 100% compliance required when you go into any patient room; foaming in and foaming out regardless of whether you plan to touch the patient or not. 3. Medication reconciliation. 4. Critical test reporting: the lab and radiology send alerts to providers for test results with critical values or critical findings (new findings on x-ray, etc.). These must be documented in the patient record, reflecting any action taken. 5. Core measures: following and using all pre-printed order sets, indicating the care parameters followed and the reasons why any are not indicated for a particular patient. Pre-printed discharge instructions are also a requirement for this patient population (AMI, CHF, pneumonia, and surgical site infection prevention). Since using these pre-printed orders and discharge instructions, UCH compliance has exceeded national benchmarks. 6. Signing, timing, and dating all orders and progress note documentation. 7. Universal Protocol: a time-out before a procedure, including marking the site, confirming the correct consent for the correct procedure planned, confirming the right patient, and assuring a current H&P is available and reviewed prior to the procedure. 8. Central line infection prevention: requires full barrier precautions and completion of a checklist documenting that all Universal Protocol measures are checked. Procedural checklists have been developed that walk providers through these required elements. The procedure form also assures appropriate billing for a procedure.
Since we introduced these checklists and requirements, our infection rates have dropped significantly. We refer to it as the MET, “Medical Emergency Team.” Kristin Paston, our Code Team Leader, and Sheryle Reichenbauh, Manager of the SICU, lead this effort, and we have data to support the positive effect on reducing transfers to the ICUs. We do have a Stroke Team and have been certified by the Joint Commission for our positive efforts with managing our stroke outcomes; Kerry Brega is our physician champion on this. Diabetes patient management is led by Carol Lee Whitehill. Both of these collaborative physician and nursing teams have had positive results.
  • Plenty of examples of quality/safety processes in each clinical block, both positive and negative Anesthesia protocols Transitions in care project Weight-based pediatric dosing Communication Teamwork
  • “Pre-code” example
  • -Stories here are worth more than in-depth theory discussions. -Weeks WB, Bagian JP Developing a culture of safety in the Veterans Health Administration. Eff Clin Pract. 2000 Nov-Dec;3(6):270-6.
  • Equipment (design, availability and maintenance) Environment (staffing levels and skills, workload and shift patterns, administrative and managerial support, physical plant) Teamwork (verbal and written communication, supervision and assistance) Staff (knowledge and skills/training, competence, physical and mental health) Institutional Context (economic and regulatory situation, availability and use of protocols, availability and accuracy of tests) Organization/Management (financial resources and constraints, organizational structure, policy standards and goals, safety culture and priorities) Patient (complexity and seriousness of condition, language and communication, personality and social factors)

    1. 1. Medical Students and Medical Errors ICC 7001 April 20, 2009 Wendy Madigosky MD, MSPH Shelly Dierking, CEO Patient Safety Education Partnership
    2. 2. Objectives <ul><li>1) Have an advanced understanding of the occurrence of medical error in the clinical environment </li></ul><ul><li>2) Have an appreciation for the personal impact of medical errors </li></ul><ul><li>3) Be aware of the student role in improving patient safety </li></ul><ul><li>4) Be familiar with local hospital efforts to reduce error and improve quality of care </li></ul>
    3. 3. Of course, we want patients to be safe! <ul><li>Implicit in providing quality health care is ensuring the care is safe. </li></ul><ul><li>No health care provider sets out in the morning to see if they can make the care they provide more dangerous. </li></ul><ul><li>However, the statistics suggest that despite the intrinsic role of safety in quality care, we fall short of the mark. </li></ul>
    4. 4. <ul><li>How many of you have experienced a medical error—as a patient or family member? </li></ul><ul><li>In your training thus far, how many of you have seen something that shouldn’t happen again? </li></ul>
    5. 5. Working definitions… Near Miss: An event or situation that could have resulted in an accident, injury, or illness, but did not, either by chance or through timely intervention. Also referred to as a close call. Sentinel Event: An unexpected occurrence or variation involving death or serious physical or psychological injury, or the risk thereof. Adverse Event: An injury caused by medical management rather than the underlying condition of the patient. Error: Failure of a planned action to be completed as intended (e.g. error of execution) or the use of a wrong plan to achieve an aim (e.g. error of planning). Institute of Medicine. To Err is Human: Building a Safer Health System. Washington, D.C.: National Academy Press, 1999.
    6. 6. Epidemiology of Medical Errors
    7. 7. Adverse Events in Retrospective Studies <ul><li>New York State, 1984 </li></ul><ul><ul><li>3.7% of hospitalizations </li></ul></ul><ul><ul><li>69% caused by errors </li></ul></ul><ul><li>Colorado and Utah, 1992 </li></ul><ul><ul><li>2.9% of hospitalizations </li></ul></ul><ul><ul><li>6.6% mortality </li></ul></ul>
    8. 8. Adverse Events in Observational Studies <ul><li>Chicago teaching hospital, 1997 </li></ul><ul><ul><li>45.8% patients on general surgical units </li></ul></ul><ul><ul><li>18% produced disability </li></ul></ul><ul><li>Israeli medical-surgical ICU, 1995 </li></ul><ul><ul><li>1.7 errors/patient/day </li></ul></ul>
    9. 9. What does a 2-4% adverse event rate mean? <ul><li>0.1% Rate: </li></ul><ul><ul><li>1 hour of unsafe drinking water every month </li></ul></ul><ul><ul><li>2 unsafe plane landings per day at O’Hare Airport in Chicago </li></ul></ul><ul><ul><li>16,000 pieces of mail lost every hour </li></ul></ul><ul><ul><li>22,000 checks deducted from the wrong bank account each hour </li></ul></ul><ul><ul><li>20,000 incorrect prescriptions every year </li></ul></ul><ul><ul><li>500 incorrect operations each week </li></ul></ul><ul><ul><li>50 babies dropped at birth every day </li></ul></ul><ul><li>Multiply by 20-40 to reflect a 2-4% error rate! </li></ul>
    10. 10. November 1999 <ul><li>33.6 million admissions to U.S. hospitals in 1997 </li></ul><ul><li>44,000 - 98,000 deaths per year as a result of medical errors </li></ul>
    11. 11. Top Causes of Death in US: 2006 <ul><li>Heart disease: 631,636 </li></ul><ul><li>Malignant neoplasm: 559,888 </li></ul><ul><li>Cerebrovascular disease: 137,119 </li></ul><ul><li>Chronic, lower respiratory disease: 124,583 </li></ul><ul><li>All accidents: 121,599 </li></ul><ul><li>Diabetes: 72,449 </li></ul><ul><li>Alzheimer’s: 72,432 </li></ul><ul><li>Influenza and pneumonia: 56,326 </li></ul><ul><li>Nephritis/nephrosis: 45,344 </li></ul><ul><li>Septicemia: 34,234 </li></ul>www.cdc.gov/nchs
    12. 12. Deaths from Adverse Events <ul><li>More common than: </li></ul><ul><ul><li>Breast Cancer </li></ul></ul><ul><ul><li>Motor Vehicle Accidents </li></ul></ul><ul><ul><li>AIDS </li></ul></ul><ul><li>44,000-98,000 estimate does NOT include deaths from ambulatory sites (nursing homes, home-health, office-based practices) </li></ul>
    13. 13. What does this have to do with me? <ul><li>Medical errors are a significant cause of morbidity and mortality </li></ul><ul><li>You are going to make mistakes, witness errors, and participate in unsafe care </li></ul>
    14. 14. Everyone makes mistakes, but… <ul><li>Errors more common if: </li></ul><ul><ul><li>Inexperienced providers </li></ul></ul><ul><ul><li>New techniques used </li></ul></ul><ul><li>Adverse events more common if: </li></ul><ul><ul><li>Patient age >64 </li></ul></ul><ul><ul><li>Invasive procedures </li></ul></ul><ul><ul><li>Complex illnesses </li></ul></ul><ul><ul><li>Longer hospitalization </li></ul></ul>Weingart SN, Wilson RM, Gibberd RW, Harrison B. Epidemiology of medical error. BMJ 2000;320:774-777.
    15. 15. Intern and Resident Mistakes <ul><li>114 respondents (36% interns, 64% residents) </li></ul><ul><li>Types of errors respondents admitted to </li></ul><ul><ul><li>Diagnosis (33%) </li></ul></ul><ul><ul><li>Prescribing and dosing (29%) </li></ul></ul><ul><ul><li>Evaluation and treatment (21%) </li></ul></ul><ul><li>Outcomes </li></ul><ul><ul><li>90% reported significant adverse patient outcomes, including death </li></ul></ul>Wu A, Folkman S, McPhee SJ, Lo B. Do House Officers Learn From Their Mistakes? JAMA 1991;265:2089-2094.
    16. 16. Types of Error Causes of Errors <ul><li>Diagnosis </li></ul><ul><li>Evaluation </li></ul><ul><li>Treatment </li></ul><ul><li>Prescribing </li></ul><ul><li>Procedures </li></ul><ul><li>Communication </li></ul><ul><li>Factual ignorance </li></ul><ul><li>Faulty judgment </li></ul><ul><li>Hesitation </li></ul><ul><li>Breaks in concentration </li></ul><ul><li>Inexperience </li></ul><ul><li>Job overload </li></ul><ul><li>Fatigue </li></ul><ul><li>SYSTEM FLAWS </li></ul>
    17. 17. Basic Science of Medical Errors <ul><li>Medical knowledge </li></ul><ul><li>Communication </li></ul><ul><li>Teamwork </li></ul><ul><li>Human factors engineering </li></ul><ul><li>Cognitive science </li></ul><ul><li>Quality Improvement </li></ul>
    18. 18. Demonstration: Stroop Effect Row 1 Row 2 Row 3
    19. 19. Now, State the Color of the Text as Fast as You Can… Red Red Red Blue Blue Blue Yellow Yellow Yellow Green Green Green Row 1 Row 2 Row 3
    20. 20. Again, State the Color of the Text as Fast as You Can… Red Red Red Blue Blue Blue Yellow Yellow Yellow Green Green Green Row 1 Row 2 Row 3
    21. 21. “Tell the nursing student to attach the oxygen mask and tubing to the green spigot” For further info, see http://faculty.washington.edu/chudler/words.html#seffect J. Ridley Stroop (1935) Studies of Interference in Serial Verbal Reactions. Journal of Experimental Psychology, vol 18, 643-662 Patient Safety Correlation
    22. 22. Weaker vs. Stronger Remedy Make sure to use the correct color Adaptor!? Better Communication Teamwork
    23. 23. Human Factors Engineering and Your World <ul><li>Anesthesiology </li></ul><ul><ul><li>Design of alarms, monitors, and safety systems </li></ul></ul><ul><li>Emergency Medicine </li></ul><ul><ul><li>Design of decision-making tools and monitoring </li></ul></ul><ul><li>Surgery </li></ul><ul><ul><li>Design of hand tools and visualization devices (laparoscopy) </li></ul></ul>
    24. 24. Video Demo <ul><li>Count the number of passes made between basketball players wearing white T-shirts </li></ul><ul><li>Write down your answer (quietly – not a group effort) </li></ul><ul><li>At the end, I will ask for answers </li></ul>
    25. 25. Cognitive theory <ul><li>Cognition is how people reason and make decisions </li></ul><ul><li>Providers may use deduction, induction or intuition to solve problems </li></ul><ul><li>Novices lean toward deduction and exhaustive work-ups </li></ul><ul><li>Experts have more knowledge and use logic, probability and especially intuition </li></ul><ul><li>Coderre Med Ed 2003 </li></ul>
    26. 26. Diagnostic Cognitive Errors/Solutions <ul><li>Aggregate bias </li></ul><ul><li>Anchoring </li></ul><ul><li>Availability bias </li></ul><ul><li>Confirmation bias </li></ul><ul><li>Diagnosis momentum </li></ul><ul><li>Gambler’s fallacy </li></ul><ul><li>Sutton’s slip </li></ul><ul><li>Develop insight </li></ul><ul><li>Consider alternatives </li></ul><ul><li>Metacognition </li></ul><ul><li>Decrease reliance on memory </li></ul><ul><li>Simulation </li></ul><ul><li>Minimize time pressures </li></ul>Croskerry: Acad Med, Volume 78(8).August 2003.775–780
    27. Where do I learn more about this? <ul><li>Piecemeal within curriculum </li></ul><ul><li>Self-guided study </li></ul><ul><li>IHI Open School </li></ul><ul><ul><li>Free courses in patient safety, human factors engineering, quality improvement, teamwork/communication </li></ul></ul><ul><ul><li>http://www.ihi.org/IHI/Programs/IHIOpenSchool/ </li></ul></ul><ul><ul><li>UCD Chapter now formed; if interested in helping to lead within AMC, contact Dr. Madigosky </li></ul></ul><ul><li>AHRQ web M&M: www.webmm.ahrq.gov </li></ul><ul><ul><li>Web-based medical journal showcasing patient safety lessons drawn from actual cases involving medical errors </li></ul></ul><ul><ul><li>5 cases per month from medicine, surgery-anesthesia, OB/GYN, pediatrics, psychiatry </li></ul></ul><ul><ul><li>Commentaries from experts </li></ul></ul>
    28. “Bad Apple” Theory <ul><li>Our systems are good and would be safe were it not for the actions of a few people who behave erratically. </li></ul><ul><li>If an error occurs, the task is to find out who did it and to take the necessary steps so they do not do it again. </li></ul>S. Dekker, The Field Guide to Human Error Investigations
    29. “New” View of Human Error <ul><li>An error is a symptom of systemic factors in the environment that create the circumstances for an error to happen </li></ul>S. Dekker, The Field Guide to Human Error Investigations
    30. Two views of human error <ul><li>Old </li></ul><ul><li>Human error is a cause of accidents </li></ul><ul><li>To explain failure you must seek failure </li></ul><ul><li>You must find people’s inaccurate assessments, wrong decisions and bad judgments </li></ul><ul><li>New </li></ul><ul><li>Human error is a symptom of deeper problems inside a system </li></ul><ul><li>To explain failure do not seek where people went wrong </li></ul><ul><li>Instead, find how people’s assessments and actions made sense at the time, given the circumstances that surrounded them. </li></ul>S. Dekker, The Field Guide to Human Error Investigations
    31. Our Medical Culture <ul><li>Taught in an authoritarian manner with a sense of absolute right/wrong </li></ul><ul><li>Medicine is infallible; we should be perfect </li></ul><ul><li>There is always one right answer </li></ul><ul><li>Confidence equals competence </li></ul><ul><li>Error equals incompetence, negligence or laziness </li></ul><ul><li>Error carries shame </li></ul>Pilpel D, Schor R, Benbassat J. Barriers to acceptance of medical error: the case for a teaching programme. Med Educ. 1998;32(1):3-7.
    32. Awareness and Shame May be Largest Hurdles <ul><li>1999 Survey at VA and Private Healthcare Organizations </li></ul><ul><ul><li>Only 27% Agreed that Errors were a Serious Problem </li></ul></ul><ul><ul><li>49% “Ashamed” by Error </li></ul></ul><ul><li>Blendon et al. (2003) in NEJM </li></ul><ul><ul><li>A majority of surveyed physicians thought that individual health care providers were more likely to be responsible for medical errors than hospitals </li></ul></ul>
    33. Medical Socialization <ul><li>The ‘truth’ </li></ul><ul><li>Baskin Robbins </li></ul><ul><li>Discipline specific games </li></ul><ul><li>Critical thinking </li></ul>
    34. Typical Responses <ul><li>Denial </li></ul><ul><li>Discounting </li></ul><ul><li>Distancing </li></ul><ul><ul><ul><ul><ul><li>Mizrahi T. Soc Sci Med 1984;10(2):135-146. </li></ul></ul></ul></ul></ul><ul><li>Guilt, fear, anger, embarrassment, humiliation, anxiety, depression, self-doubt, rumination about event, excessive concern, overwork, anguish </li></ul><ul><ul><ul><ul><ul><li>Christensen JF et al. JGIM 1992;7:424-431. </li></ul></ul></ul></ul></ul>
    35. Resident Responses <ul><li>Remorse </li></ul><ul><li>Anger at selves </li></ul><ul><li>Guilt </li></ul><ul><li>Inadequacy </li></ul><ul><li>Fear </li></ul><ul><li>Psychological impact </li></ul><ul><li>Wu A, et al. JAMA, April 24, 1991;265(16):2089-2094 </li></ul>
    36. Resident Coping Strategies <ul><li>Problem focused </li></ul><ul><ul><li>Acceptance of responsibility </li></ul></ul><ul><ul><li>Consultation to understand the nature of the mistake </li></ul></ul><ul><ul><li>Consultation to correct the mistake </li></ul></ul><ul><ul><li>Planned problem solving (extra-training) </li></ul></ul>
    37. Resident Coping Strategies <ul><li>Emotion-focused </li></ul><ul><ul><li>Obtaining social support </li></ul></ul><ul><ul><li>Disclosure to colleague, friend or spouse </li></ul></ul><ul><ul><li>Disclosure to patient </li></ul></ul><ul><ul><li>Reframing mistake </li></ul></ul>
    38. Resident Changes in Practice <ul><li>Constructive </li></ul><ul><ul><li>Increased information seeking </li></ul></ul><ul><ul><li>Increased vigilance </li></ul></ul><ul><ul><li>Improved self-pacing </li></ul></ul><ul><ul><li>Improved communication </li></ul></ul><ul><ul><li>Supervising others closely </li></ul></ul>
    39. Resident Changes in Practice <ul><li>Defensive </li></ul><ul><ul><li>Avoiding similar patients </li></ul></ul><ul><ul><li>Being unwilling to discuss the error </li></ul></ul><ul><ul><li>Ordering additional but unnecessary tests </li></ul></ul>
    40. Bottom Line <ul><li>To have constructive responses to medical errors: </li></ul><ul><ul><li>Accept responsibility for the error </li></ul></ul><ul><ul><li>Know that it may be emotionally stressful </li></ul></ul><ul><ul><li>Disclose the error to others </li></ul></ul><ul><ul><li>Use the error as an educational tool </li></ul></ul>
    41. Beyond Blame
    42. Barriers to Patient Safety <ul><li>Medicine views errors as failings that deserve: </li></ul><ul><ul><li>Blame and shame </li></ul></ul><ul><ul><li>Corrective actions focusing on individuals </li></ul></ul><ul><li>Lack of awareness </li></ul><ul><li>“No blood, no foul” philosophy </li></ul><ul><ul><li>Many in health care ignore or downplay near misses, resulting in a missed learning opportunity </li></ul></ul>
    43. Should there be a blame-free environment? <ul><li>Not necessarily: </li></ul><ul><ul><li>In the VA, intentionally unsafe acts are excluded from the non-punitive patient safety process </li></ul></ul><ul><ul><li>Without individual accountability you cannot have safety or quality </li></ul></ul><ul><li>However, the system should be analyzed to look for problems before concluding that it was the “fault” of an individual </li></ul>
    44. What is patient safety? <ul><li>Patient safety is often a euphemism for medical error. </li></ul><ul><li>Patient safety is the prevention of harm or injury to patients. </li></ul><ul><li>But does a lack of harm equal safety? </li></ul><ul><li>Patient safety is the identification and control of things (i.e., hazards) that could cause harm to patients. </li></ul><ul><li>Patient safety is what allows you to pursue quality. In other words, without basic safety you can’t have quality. </li></ul>
    45. The IOM’s Six Aims (March 2001) <ul><li>SAFE </li></ul><ul><li>Effective </li></ul><ul><li>Patient-centered </li></ul><ul><li>Timely </li></ul><ul><li>Efficient </li></ul><ul><li>Equitable </li></ul>
    46. <ul><li>Safety is a key dimension of quality </li></ul><ul><li>Systems approach to safety improvement </li></ul><ul><ul><li>Simply trying harder will not work </li></ul></ul><ul><ul><li>Stepwise correction of problems in the system is the key to success </li></ul></ul><ul><ul><li>Overcome the culture of blame and shame: </li></ul></ul><ul><ul><ul><li>… Human error is to be expected! </li></ul></ul></ul>Crossing the Quality Chasm Source: Institute of Medicine 2001.
    47. A Few Simple Rules for Health Care in the 21st Century <ul><li>Current Approach </li></ul><ul><li>Do no harm is an individual responsibility </li></ul><ul><li>Information is a record </li></ul><ul><li>Secrecy is necessary </li></ul><ul><li>The system reacts to needs </li></ul><ul><li>Professional autonomy drives variability </li></ul><ul><li>New Approach </li></ul><ul><li>Safety is a system property </li></ul><ul><li>Knowledge is shared and information flows freely </li></ul><ul><li>Transparency is necessary </li></ul><ul><li>Needs are anticipated </li></ul><ul><li>Decision-making is evidence-based </li></ul>
    48. Person vs. System Approaches <ul><li>Person approach </li></ul><ul><ul><li>Focus on individuals </li></ul></ul><ul><ul><li>Blaming individuals for forgetfulness, inattention, carelessness, or poor productivity </li></ul></ul><ul><ul><li>Methods: disciplinary measures, threat of litigation, retraining, blaming and shaming </li></ul></ul><ul><ul><li>Target: Individuals </li></ul></ul><ul><li>System approach </li></ul><ul><ul><li>Focus on the conditions under which individuals work </li></ul></ul><ul><ul><li>Building defenses to avert errors/poor productivity or mitigate their effects </li></ul></ul><ul><ul><li>Methods: creating better systems </li></ul></ul><ul><ul><li>Targets: System (team, tasks, workplace, organization) </li></ul></ul>Reason J. Human error: models and management. BMJ 2000;320:768-770.
    49. Identifying System Issues <ul><li>Communication Issues </li></ul><ul><ul><li>Handoffs </li></ul></ul><ul><ul><li>Standardization of communication </li></ul></ul><ul><ul><li>Methods of documentation </li></ul></ul><ul><ul><li>Communication between disciplines or across power gradients </li></ul></ul><ul><li>Education or Training Issues </li></ul><ul><li>Equipment Issues </li></ul><ul><li>Staffing Issues </li></ul><ul><li>Fatigue or Scheduling Issues </li></ul><ul><li>Policy Issues </li></ul>
    50. The Swiss Cheese Model (Reason, 1991) <ul><li>Successive layers of defenses (policies/procedures, environmental, individual, team, profession) each contain holes: lack of procedures, punitive policies, mixed messages, production pressures, zero fault tolerance, sporadic training, attention distractions, clumsy technology, deferred maintenance, equipment triggers. An adverse event occurs when the holes in every defensive layer line up. </li></ul>
    51. Systems Thinking: Principles and Concepts <ul><li>Interdependencies </li></ul><ul><li>Structure drives behavior </li></ul><ul><li>Cause & effect are separated by time & space </li></ul><ul><li>Any change in a system has unintended consequences </li></ul>
    52. Designing Systems for Safety <ul><li>Simplify processes </li></ul><ul><ul><li>Reduce hand-offs </li></ul></ul><ul><ul><li>Make workplace user-friendly </li></ul></ul><ul><li>Reduce variation </li></ul><ul><ul><li>Standardize processes </li></ul></ul><ul><ul><li>Reduce reliance on memory and vigilance </li></ul></ul><ul><li>Collaborate and improve communication </li></ul><ul><ul><li>Physicians, nurses, NPs, PAs, pharmacists... </li></ul></ul><ul><ul><li>Patients and their families </li></ul></ul>
    53. Safe Care <ul><li>Culture promotes systemic change rather than individual blame </li></ul><ul><li>Mechanisms to report near misses/errors </li></ul><ul><li>Redundancy within system </li></ul><ul><li>Well-developed communication systems </li></ul><ul><li>Re-engineering of work-flow and equipment </li></ul>
    54. “Screaming at a system is a very interesting comment on the screamer, but tells us nothing at all about the system.” Donald Berwick, MD, MPP
    55. Why Do We Think the Systems Approach Will Work? <ul><li>Aviation Experience – a fourfold reduction in aircraft accidents through the use of root cause analyses and crew safety training. </li></ul><ul><li>We are now using aviators to train health care workers in patient safety. </li></ul>
    56. Why Do We Think the Systems Approach Will Work? <ul><li>Anesthesia Experience – Over the last 20 years, anesthesia deaths have been reduced to 1/20th the prior rate. Interventions include: </li></ul><ul><ul><li>Standardization of anesthesia machines; re-engineering to prevent O2 cut-offs </li></ul></ul><ul><ul><li>Reduced resident work hours </li></ul></ul><ul><ul><li>End-tidal CO2 monitors </li></ul></ul><ul><ul><li>Pulse oximetry </li></ul></ul>Gawande, A. Complications; Metropolitan Books, 2002
    57. The traditional view of medicine and medical education on errors <ul><li>M&M (morbidity and mortality) conferences </li></ul><ul><ul><li>Let’s identify where things went wrong, what the right way should have been, and talk about how we can avoid making this mistake again. </li></ul></ul><ul><li>Medical Education </li></ul><ul><ul><li>Focus on the individual learner. When mistakes happen, consider what the gaps in learning were and how to remedy them. </li></ul></ul>
    58. The new view of errors from the hospital/health system perspective <ul><li>The public is paying increasing attention to just how safe they are when they come for care </li></ul><ul><li>Traditional methods of identifying and dealing with mistakes don’t seem to work well </li></ul><ul><li>Borrowing from other industries and from fields like human factors engineering </li></ul><ul><li>Mandates to use “new” view methods by JCAHO and others </li></ul>
    59. Right now both views of error co-exist in many hospital and practice settings, not necessarily happily.
    60. Hospitals are now mandated to do “new view” analyses of adverse events <ul><li>Tools include: </li></ul><ul><ul><li>Mandatory adverse event reporting </li></ul></ul><ul><ul><li>Root Cause Analysis (RCA) </li></ul></ul><ul><ul><li>Failure Modes and Effects Analysis (FMEA) </li></ul></ul><ul><ul><li>Reporting of near misses through voluntary reporting systems (e.g., Patient Safety Net) </li></ul></ul>
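Of the tools listed above, FMEA is the one that reduces to simple arithmetic: each potential failure mode is scored for severity, occurrence, and detectability (commonly on 1–10 scales), and the product of the three, the risk priority number (RPN), ranks which hazards to address first. The sketch below illustrates that scoring; the failure modes and the scores assigned to them are hypothetical examples, not drawn from any real analysis:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One row of a (hypothetical) FMEA worksheet."""
    name: str
    severity: int       # 1 (negligible harm) .. 10 (catastrophic harm)
    occurrence: int     # 1 (rare) .. 10 (almost certain)
    detectability: int  # 1 (almost always caught) .. 10 (almost never caught)

    @property
    def rpn(self) -> int:
        # Risk priority number: the standard FMEA product of the three scores
        return self.severity * self.occurrence * self.detectability

def rank(modes):
    """Order failure modes from highest to lowest RPN (fix the top ones first)."""
    return sorted(modes, key=lambda m: m.rpn, reverse=True)

# Illustrative, made-up medication-process failure modes and scores
modes = [
    FailureMode("Wrong-dose order entry", severity=9, occurrence=4, detectability=3),
    FailureMode("Look-alike vial selected", severity=8, occurrence=3, detectability=7),
    FailureMode("Allergy not documented", severity=10, occurrence=2, detectability=5),
]

for m in rank(modes):
    print(m.name, m.rpn)
# Look-alike vial selected 168
# Wrong-dose order entry 108
# Allergy not documented 100
```

Note how detectability drives the ranking here: the look-alike vial is less severe than the undocumented allergy, but because it is far less likely to be caught before reaching the patient, it scores highest. That is exactly the kind of counter-intuitive prioritization FMEA is meant to surface before an adverse event occurs.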
    61. UCH Safety/Quality Initiatives <ul><li>Hand Hygiene (100% compliance of foaming in/out) </li></ul><ul><li>Hand Off Communication (during transitions) </li></ul><ul><li>Medication Reconciliation (every visit and transitions) </li></ul><ul><li>Critical Test Reporting (alerts to providers) </li></ul><ul><li>Core Measures (pre-printed order sets) </li></ul><ul><li>Pre-printed discharge instructions (AMI, CHF, Pneumonia, Surgical Site Infections) </li></ul><ul><li>Signing, timing and dating all orders and progress notes </li></ul><ul><li>Universal Protocol (‘time outs’, H&P available/reviewed) </li></ul><ul><li>Central Line Infection Prevention (full barrier precautions, checklist completion) </li></ul><ul><li>MET = Medical Emergency Team (‘rapid response team’) </li></ul><ul><li>Disease-specific inter-professional teams: Stroke team, Diabetes, etc. </li></ul>Sue West, RN, Assistant Vice Chancellor, Professional Risk Management; Director, UCH Clinical Excellence & Patient Safety; Director, Infection Control
    62. UCH Safety Culture <ul><li>On-going improvement </li></ul><ul><ul><li>2009 AHRQ Safety Culture survey </li></ul></ul><ul><li>Areas to work on </li></ul><ul><ul><li>Patient Safety Net reporting </li></ul></ul><ul><ul><li>Staffing/workload concerns </li></ul></ul><ul><ul><li>Punitive culture concerns </li></ul></ul><ul><ul><li>Disruptive behavior issues </li></ul></ul><ul><ul><li>Physician accountability </li></ul></ul>UCH Insider: Volume 2, Issue 21 Through April 27, 2009
    63. Website for Additional Information on UCH Quality Services: http://iamaze.uch.ad.pvt/quality/index.htm
    64. University of Colorado Hospital <ul><li>Professional Risk Management Department </li></ul><ul><ul><li>– Sue West, Assistant Vice Chancellor </li></ul></ul><ul><li>To report a “high level” adverse event call: (303) 724-7475 </li></ul><ul><li>Use Patient Safety Net at UCH for everything else. Icon on UCH desktop. </li></ul><ul><li>Residents/fellows serve on the Risk Management Committee at UCH and on the board of the School’s Self-Insurance Trust. </li></ul>
    65. The Children’s Hospital <ul><li>Patient Safety Leadership </li></ul><ul><ul><li>Teresa Fisher, Patient Safety Specialist </li></ul></ul><ul><ul><li>Daniel Hyman, Chief Quality Officer </li></ul></ul><ul><ul><li>Jeanne Crane, Risk Manager </li></ul></ul><ul><li>To report an adverse event: </li></ul><ul><ul><li>QSRS (voluntary reporting system) </li></ul></ul><ul><ul><li>Icon on TCH intranet </li></ul></ul><ul><li>Residents are involved with RCAs when they were involved in the care of a patient with an adverse outcome. All serious outcomes and JCAHO sentinel events get a formal RCA. </li></ul>
    66. VA Hospital <ul><li>Patient Safety Officer </li></ul><ul><ul><li>– Jeriann Ascione </li></ul></ul><ul><li>To report an adverse event: (303) 393-5223 </li></ul><ul><li>National voluntary reporting system. </li></ul><ul><li>RCAs are done by ad hoc committees. Supportive of residents and fellows being involved, but scheduling is a problem due to the commitment over several weeks. </li></ul>
    67. Denver Health <ul><li>Risk Management Department </li></ul><ul><ul><li>Dave Kvapil, Director </li></ul></ul><ul><li>To report an adverse event: (303) 436-7075 </li></ul><ul><li>Risk Management staff do RCAs of all reportable adverse events and JCAHO sentinel events </li></ul><ul><li>Follow up on Patient Safety Net reports of near misses as time permits </li></ul><ul><li>No professionals-in-training involved in the RCA process </li></ul>
    68. So what can I do as a medical student? <ul><li>Observe </li></ul><ul><li>Ask </li></ul><ul><li>Advocate </li></ul><ul><li>Report </li></ul><ul><li>Reflect </li></ul>
    69. Take Home Points <ul><li>Lapses in safety include errors, adverse events and near misses </li></ul><ul><li>Medical errors are frequent and significant threats to safe and quality health care </li></ul><ul><li>A systems approach is more desirable than the blame/shame approach in improving safety </li></ul><ul><li>The culture of medicine has led to barriers in improving patient safety but hospitals are working hard to implement safety/quality processes </li></ul><ul><li>Medical students have a dual role: to learn about safety/quality and to be a part of safety culture and improvement activities </li></ul>
    70. Acknowledgements <ul><li>University of Missouri-Columbia </li></ul><ul><ul><li>Quality and Patient Safety Education Group </li></ul></ul><ul><li>John Gosbee, MD MS </li></ul><ul><ul><li>VA National Center for Patient Safety </li></ul></ul><ul><ul><li>Patient Safety Curriculum Group </li></ul></ul>
    71. Fun Patient Safety Resource… <ul><li>www.webmm.ahrq.gov </li></ul><ul><ul><li>Web-based medical journal showcasing patient safety lessons drawn from actual cases involving medical errors </li></ul></ul><ul><ul><li>5 cases per month from medicine, surgery-anesthesia, OB/GYN, pediatrics, psychiatry </li></ul></ul><ul><ul><li>Commentaries from experts </li></ul></ul>
    72. Why Focus on the Near Miss? <ul><li>10-100 times as frequent as adverse events </li></ul><ul><li>People more willing to talk & help evaluate </li></ul><ul><li>Easier to highlight the system failures rather than individual failures </li></ul><ul><li>Giant step towards prospective patient safety measures </li></ul><ul><ul><li>“Experience is the best teacher, but if we wait for adverse events, who pays the tuition? The patient!” - Jim Bagian (NCPS) </li></ul></ul>
    73. “Culture of Safety” and “High Reliability Organizations” <ul><li>Safety is always on the “agenda” – especially for top management </li></ul><ul><li>Embrace information from near misses and hazard analysis </li></ul><ul><li>Communication up and down the “food chain” regardless of hierarchy in organizational structure </li></ul><ul><li>If you are not sure it is safe, then it is not safe </li></ul>
    74. System factors contributing to errors <ul><li>Equipment </li></ul><ul><li>Environment </li></ul><ul><li>Teamwork </li></ul><ul><li>Staff </li></ul><ul><li>Institutional Context </li></ul><ul><li>Organization/Management </li></ul><ul><li>Patient </li></ul>