
Evidence-Based Decision Making for Hospital Administrators


Evidence-Based Decision Making for Hospital Administrators.
If doctors can do it, administrators can do it?
Presentation for the NVZD Voorjaarscongres, 4 June 2015, Nyenrode.


Evidence-Based Decision Making for Hospital Administrators

  1. Evidence-Based Decision Making. If doctors can do it... administrators can do it? NVZD Voorjaarscongres – 4 June 2015 – Nyenrode
  2. Exercise. Think about a decision you have been involved in making. This decision should be one which: was reasonably important for your organization; involved spending significant resources; involved several or more people; was made over a period of time (i.e. weeks or months); and did not have an easy ‘answer’.
  3. Exercise. Discuss with your neighbor (1 min): What exactly was the problem (or opportunity)? How many alternative decision options were considered? How much evidence was used, and from which sources (scientific, organizational, experience, crystal ball)? Was any attempt made to explicitly evaluate its quality or trustworthiness?
  4. Evidence-based decision-making: What is it?
  5. Evidence-based decision making. Central premise: decisions should be based on a combination of critical thinking and the ‘best available evidence’.
  6. Evidence? Outcomes of scientific research, organizational facts & figures, benchmarking, best practices, personal experience.
  7. All managers and leaders base their decisions on ‘evidence’.
  8. But… many managers and leaders pay little or no attention to the quality of the evidence they base their decisions on.
  9. “Trust me, 20 years of management experience.”
  10. SO ...
  11. Teach managers/leaders how to critically evaluate the validity and generalizability of the evidence, and help them find ‘the best available’ evidence.
  12. Evidence-based decision model: four sources of evidence (professional experience and judgment; organizational data, facts and figures; stakeholders’ values and concerns; scientific research outcomes) applied through six steps: Ask, Acquire, Appraise, Aggregate, Apply, Assess (diagnosis, intervention).
  13. Evidence-based practice: Where does it come from?
  14. Medicine, founding fathers: David Sackett and Gordon Guyatt, McMaster University Medical School, Canada.
  15. How it all started
  16. The 5 steps of evidence-based medicine: 1. Ask: translate a practical issue into an answerable question. 2. Acquire: systematically search for and retrieve the evidence. 3. Appraise: critically judge the trustworthiness of the evidence. 4. Apply: incorporate the evidence into the decision-making process. 5. Assess: evaluate the outcome of the decision taken.
  17. Evidence-Based Practice timeline: 1991 medicine; 1998 education; 2000 social care, public policy, nursing, criminal justice, policing, architecture, conservation; 2010 management.
  18. Evidence-Based Practice
  19. Evidence-Based Practice
  20. Evidence-Based Practice
  21. Evidence-Based Practice
  22. Evidence-based decision-making = the use of evidence from multiple sources to increase the likelihood of a favourable outcome. Focus on the decision-making process. Think in terms of probability.
  23. Evidence-Based Decision-Making: Why do we need it?
  24. True or false? 1. Incompetent people benefit more from feedback than highly competent people. 2. Task conflict improves work group performance while relational conflict harms it. 3. Encouraging employees to participate in decision making is more effective for improving organizational performance than setting performance goals.
  25. How evidence-based is your HR director? 959 (US) + 626 (Dutch) HR professionals; 35 statements, based on an extensive body of evidence; answer options: true / false / uncertain. HR professionals’ beliefs about effective human resource practices: correspondence between research and practice (Rynes et al., 2002; Sanders et al., 2008).
  26. Outcome: not better than random chance.
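One way to read “not better than random chance” is as a statistical check. The sketch below is purely illustrative, with made-up numbers rather than the survey’s actual data, and assumes the ‘uncertain’ answers are set aside so that guessing on a true/false item succeeds about half the time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Entirely hypothetical numbers -- the real survey scores are not reproduced here.
# Suppose 100 professionals each answered 35 true/false items.
correct = rng.binomial(n=35, p=0.52, size=100)   # items answered correctly
accuracy = correct / 35

# Compare mean accuracy with the 50% expected from guessing on true/false items.
chance = 0.50
z = (accuracy.mean() - chance) / (accuracy.std(ddof=1) / np.sqrt(len(accuracy)))
print(f"mean accuracy = {accuracy.mean():.2%}, z = {z:.2f}")
# |z| below roughly 2 means the answers are statistically indistinguishable
# from random guessing -- the kind of pattern the survey reports.
```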
  27. Evidence-based decision making: professional experience and judgment; organizational data, facts and figures; stakeholders’ values and concerns; scientific research outcomes. Ask, Acquire, Appraise, Aggregate, Apply, Assess.
  28. Thinking critically about professional experience and judgment
  29. Discuss with your neighbor (1 min): Why is a physician’s clinical experience, as a rule, more trustworthy than a manager’s professional experience?
  30. Developing expertise requires: 1. a sufficiently regular, predictable environment; 2. opportunities to learn regularities through prolonged practice and feedback. The management domain is not highly favorable to expertise!
  31. Bounded rationality
  32. Bounded rationality / prospect theory. System 1: fast; intuitive, associative; heuristics & biases. System 2: slow (lazy); deliberate; reasoning; rational.
  33. System 1: short cuts
  34. Shepard’s tables. System 1: short cuts
  35. System 1: necessary to survive 95%
  36. System 1: cognitive errors. Seeing order in randomness, overconfidence bias, halo effect, false consensus effect, groupthink, self-serving attribution bias, sunk cost fallacy, cognitive dissonance reduction, confirmation bias, authority bias, small numbers fallacy, in-group bias, recall bias, anchoring bias, availability bias.
  37. Cognitive errors: 1. pattern recognition; 2. confirmation bias; 3. groupthink.
  38. Error 1: pattern recognition. We are predisposed to see order, pattern and causal relations in the world. Patternicity: the tendency to find meaningful patterns in both meaningful and meaningless noise.
  39. Bias 1: pattern recognition. We are pattern-seeking primates: association learning.
  40. Points of impact of V-1 bombs in London
  41. Points of impact of V-1 bombs in London
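The map slides make a statistical point: even purely random impacts look clustered. The small simulation below is an illustrative sketch (not from the deck), echoing the grid counts of R. D. Clarke’s well-known 1946 actuarial analysis of the V-1 data; uniform randomness already produces the ‘hot spots’ people were tempted to explain.

```python
import math
from collections import Counter

import numpy as np

rng = np.random.default_rng(1)

# Scatter 537 impact points uniformly at random over 576 grid squares
# (numbers chosen to echo Clarke's classic 1946 setup).
n_points, n_cells = 537, 576
cells = rng.integers(0, n_cells, size=n_points)
hits_per_cell = Counter(cells.tolist())
counts = [hits_per_cell.get(c, 0) for c in range(n_cells)]

lam = n_points / n_cells          # mean hits per square
observed = Counter(counts)        # how many squares received 0, 1, 2, ... hits
for k in range(5):
    expected = n_cells * math.exp(-lam) * lam**k / math.factorial(k)
    print(f"{k} hits: observed {observed.get(k, 0):3d}, Poisson expects {expected:6.1f}")
# A handful of squares collect 3 or more hits purely by chance; the apparent
# "clusters" on the London map required no aiming at all.
```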
  42. Error 1: pattern recognition. A Type I error, or false positive, is believing a pattern is real when it is not (finding a non-existent pattern). A Type II error, or false negative, is not believing a pattern is real when it is (not recognizing a real pattern). Dr. Michael Shermer (Director of the Skeptics Society).
  43. Error 1: pattern recognition. A Type I error, or false positive: believing that the rustle in the grass is a dangerous predator when it is just the wind (low cost).
  44. Error 1: pattern recognition. A Type II error, or false negative: believing that the rustle in the grass is just the wind when it is a dangerous predator (high cost).
  45. Error 1: pattern recognition. A Type I error, or false positive: believing that the rustle in the grass is a dangerous predator when it is just the wind (low cost). A Type II error, or false negative: believing that the rustle in the grass is just the wind when it is a dangerous predator (high cost).
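Slides 42–45 describe the trade-off in words; the toy simulation below (an illustrative sketch with invented parameters, not part of the deck) makes it concrete: lowering the alarm threshold buys fewer costly Type II misses at the price of more cheap Type I false alarms.

```python
import random

random.seed(42)

def rustle_loudness(predator: bool) -> float:
    """Wind alone gives a faint rustle; a predator adds extra noise."""
    return random.gauss(1.0, 0.5) + (1.5 if predator else 0.0)

def error_rates(threshold: float, trials: int = 10_000):
    false_alarms = misses = predators = 0
    for _ in range(trials):
        predator = random.random() < 0.05              # predators are rare
        alarm = rustle_loudness(predator) > threshold  # "believe it is a predator"
        predators += predator
        if alarm and not predator:
            false_alarms += 1   # Type I: fled from the wind (cheap)
        elif not alarm and predator:
            misses += 1         # Type II: ignored a predator (potentially fatal)
    return false_alarms / (trials - predators), misses / max(predators, 1)

for t in (1.0, 1.5, 2.0):
    fp, miss = error_rates(t)
    print(f"threshold {t:.1f}: Type I rate {fp:.2f}, Type II rate {miss:.2f}")
# A jumpier (lower) threshold trades false alarms for fewer costly misses --
# one story for why System 1 leans toward Type I errors.
```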
  46. Error 1: pattern recognition. Superstitious rituals: more stress = more prone to Type I errors.
  47. Error 1: pattern recognition
  48. Cognitive errors: 1. pattern recognition; 2. confirmation bias; 3. groupthink.
  49. Error 2: confirmation bias. We are predisposed to selectively search for or interpret information in ways that confirm our existing beliefs, expectations and assumptions, and to ignore information to the contrary. In other words, we “see what we want to see”.
  50. Error 2: confirmation bias. Example: you may believe that astrology actually works. As a result of confirmation bias you’ll remember only those instances when the prediction in the astrology column came true and forget the majority of cases when the prediction was very wrong. As a result you will continue to believe astrology has some basis in reality.
  51. Error 2: confirmation bias. Confirmation bias + pattern recognition.
  52. McKinsey (1997 case study / 2001 book)
  53. McKinsey: case study
  54. War for Talent
  55. Errors: 1. pattern recognition; 2. confirmation bias; 3. groupthink.
  56. Error 3: Groupthink. Groupthink is a psychological phenomenon that occurs within a group of people, in which the desire for harmony or conformity in the group results in an incorrect or irrational decision.
  57. Bias 3: Groupthink
  58. Error 3: Groupthink
  59. Groupthink? Lean management / Lean Six Sigma; self-steering / autonomous teams; agile working / New World of Working; value-based management / health care; talent management; employee engagement.
  60. Bounded rationality. “I’ve been studying judgment for 45 years, and I’m no better than when I started. I make extreme predictions. I’m overconfident. I fall for every one of the biases.”
  61. Evidence-based decision: professional experience and judgment; organizational data, facts and figures; stakeholders’ values and concerns; scientific research outcomes. Ask, Acquire, Appraise, Aggregate, Apply, Assess (diagnosis, intervention).
  62. Evidence-based decision making: professional experience and judgment; stakeholders’ values and concerns; scientific research outcomes; organizational data, facts and figures. Ask, Acquire, Appraise, Aggregate, Apply, Assess.
  63. Organizational data. “People operate with beliefs & biases. To the extent you can reduce both and replace them with data, you gain a clear competitive advantage.” Laszlo Bock (CHRO, Google)
  64. Types of organizational evidence: 1. financial data (cash flow, solvency); 2. business outcomes (ROI, market share); 3. customer/client impact (customer satisfaction); 4. performance indicators (occupancy rate, failure frequency); 5. HR metrics (absenteeism, employee engagement); 6. marketing intelligence (brand awareness, customer feedback); 7. ‘soft’ data (organizational culture, trust in senior management, leadership style, commitment); 8. data from benchmarking.
  65. Organizational facts and figures
  66. Examples
  67. Can your organization correlate/regress: level of education + years of experience + productivity + customer satisfaction + failure frequency + employee satisfaction + employee turnover + absenteeism?
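A minimal sketch of what answering that question could look like in practice, using hypothetical column names and fabricated data (pandas and NumPy assumed available): a correlation matrix across the variables, plus a simple one-variable regression.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 200  # hypothetical employees or units

# Entirely fabricated illustration data -- swap in your own HR and operational extracts.
df = pd.DataFrame({
    "education_years":  rng.normal(16, 2, n),
    "experience_years": rng.normal(8, 4, n).clip(min=0),
    "absenteeism_days": rng.poisson(5, n).astype(float),
})
df["productivity"] = (
    0.3 * df["experience_years"] - 0.5 * df["absenteeism_days"] + rng.normal(0, 2, n)
)

# 1) Which variables move together at all?
print(df.corr().round(2))

# 2) A one-variable regression: productivity as a function of experience.
slope, intercept = np.polyfit(df["experience_years"], df["productivity"], deg=1)
print(f"productivity ~ {intercept:.2f} + {slope:.2f} * experience_years")
# Correlation is not causation, but even this level of analysis is a step up
# from deciding on anecdote alone.
```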
  68. Trends
  69. “Where the evidence is strong, we should act on it. Where the evidence is suggestive, we should consider it. Where the evidence is weak, we should build the knowledge to support better decisions in the future.” Jeffrey Zients, acting director of the Office of Management and Budget and President Obama’s economic advisor.
  70. In the coming weeks, before you make a decision, ask yourself: What exactly is the problem? What evidence is available? Was any attempt made to explicitly evaluate its trustworthiness?
