  1. Blame Culture, No-Blame Culture and Just Culture
     Keith Grint & Clare Holt
  2. “The operator of an aircraft, the surgeon performing an operation, must all foresee that their acts might cause death; but we should not describe them as reckless unless the risk taken was unjustifiable.” (Smith & Hogan, 1975, Criminal Law)
     According to the World Health Organization (WHO):
     • You have a 1 in 10,000,000 chance of dying in a plane crash.
     • You have a 1 in 300 chance of dying from a healthcare error in hospital.
     (The Times, 22/7/11)
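Taken at face value, the gap between those two odds is easy to quantify. A back-of-the-envelope check in Python, using only the figures quoted above:

```python
p_plane = 1 / 10_000_000  # chance of dying in a plane crash (as quoted)
p_hospital = 1 / 300      # chance of dying from a healthcare error (as quoted)

# On these figures, the hospital risk is more than 30,000 times greater.
print(f"Relative risk: {p_hospital / p_plane:,.0f}x")  # -> 33,333x
```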
  3. What happens when it all goes pear-shaped?
  4. CLASSIFICATION OF ERRORS (from James Reason, Human Error, 1990/2009, p. 207)
     Unsafe acts divide by intention:
     • Unintended action:
       – Slip: attentional failures (intrusion, omission, mistiming, etc.)
       – Lapse: memory failures (forgetting, omission, place-losing)
     • Intended action:
       – Mistake: rule-based (misapplication of a good rule, application of a bad rule) or knowledge-based (many variables, untested process)
       – Violation: routine violations, exceptional violations, acts of sabotage
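Reason's table is essentially a two-level classification of unsafe acts. A minimal sketch in Python; the enum names mirror the slide, but the classifying function is an illustrative reading of the table, not Reason's own formalism:

```python
from enum import Enum

class UnsafeAct(Enum):
    SLIP = "slip: attentional failure (intrusion, omission, mistiming)"
    LAPSE = "lapse: memory failure (forgetting, omission, place-losing)"
    MISTAKE = "mistake: rule-based or knowledge-based planning failure"
    VIOLATION = "violation: routine, exceptional, or sabotage"

def classify(intended: bool, deliberate_deviation: bool, memory_failure: bool) -> UnsafeAct:
    """Unintended actions split into slips vs lapses; intended actions
    split into mistakes (flawed plan) vs violations (deliberate deviation)."""
    if not intended:
        return UnsafeAct.LAPSE if memory_failure else UnsafeAct.SLIP
    return UnsafeAct.VIOLATION if deliberate_deviation else UnsafeAct.MISTAKE

# A step forgotten under time pressure is a lapse, not a violation:
print(classify(intended=False, deliberate_deviation=False, memory_failure=True))
```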
  5. BLAME CULTURE: the 'sweep it under the carpet' school of management
     You've made a mistake. Will it show?
     • No: bury it – problem 'avoided'.
     • Yes: can you hide it? If so, conceal it before somebody else finds out.
     • If not: can you blame someone else, special circumstances or a difficult client? If so, get in first with your version of events.
     • If not: could an admission damage your career prospects? If so, sit tight and hope the problem goes away.
     Whichever route is taken, the result is the same: personal responsibility is avoided, the organization continues to fail, and no-one seems to know why.
  6. NO-BLAME CULTURE:
     You've made a mistake. Will it show?
     • No: ignore it.
     • Yes: no need to hide it – it wasn't your fault, it was probably the fault of the system. Admit it.
     When you make another mistake, the same script plays out. No learning! Personal responsibility is avoided, the organization continues to fail, and no-one seems to know why.
  7. JUST CULTURE:
     You've made a mistake. Whether or not it will show, you don't need to hide it: it could be partly your fault, but it's likely that other factors are also involved, and you have a responsibility to prevent it happening again.
     Admit it and report it through the appropriate channels.
     The report is investigated, organizational learning occurs, and the information is fed back to the individual as well as the organization.
     Personal responsibility is taken; the organization continues to improve, and everyone knows why.
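The difference between the three cultures is, in effect, a difference in decision flow. A minimal sketch of the just-culture path (all names and the stub investigation are illustrative assumptions, not from the slides):

```python
from dataclasses import dataclass, field

@dataclass
class Incident:
    description: str
    factors: list[str] = field(default_factory=list)  # contributing factors found
    lessons: list[str] = field(default_factory=list)  # what the organization learns

def investigate(incident: Incident) -> list[str]:
    """Stub: a real investigation examines the individual's actions AND
    the system factors (design, procedures, resourcing) together."""
    return ["system factor", "individual factor"]

def handle_mistake(incident: Incident) -> Incident:
    # Whether or not the mistake "will show", the path is the same:
    # admit it and report it through the appropriate channels.
    print(f"Reported: {incident.description}")
    # The investigation looks for multiple causes, not a single culprit.
    incident.factors = investigate(incident)
    # Lessons are fed back to the individual and the organization, so
    # learning occurs instead of concealment (blame culture) or a
    # shrug at "the system" (no-blame culture).
    incident.lessons = [f"prevent recurrence of: {f}" for f in incident.factors]
    return incident

print(handle_mistake(Incident("wrong drug dose dispensed")).lessons)
```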
  8. Just Culture: a brief theoretical overview from 'accident' theory
     In the beginning...
     1. Human error (first-story accounts)
     2. Sequence of events
     3. Systems (second-story accounts; tight/loose coupling, icebergs, hard shell/soft shell, process/SOPs)
        a. Latent failure / Swiss cheese model
        b. Normal accidents
        c. Just culture
  9. 1. HUMAN ERROR (first-story accounts – the initial assumption)
     The biggest personnel problem for the US military was getting the right people into the right jobs – a problem of selection, fixed through a competency framework.
     1943: P-47s and B-17s keep crashing – wheels are retracted on landing instead of flaps.
     It cannot be the planes – look at how robust they are. (Photos: a B-17 before and after a mission; B-17G-80BO 43-38172, 8th AF, 398th BG, 601st BS, damaged on a bombing mission over Cologne, Germany.)
     So it must be HUMAN ERROR – what's wrong with our pilots?
  10. Alphonse Chapanis: how come the P-47 pilots make the same error but the C-47 pilots don't?
  11. 1. HUMAN ERROR: the P-47 Thunderbolt's flap and wheel controls (photo)
  12. 1. HUMAN ERROR: the C-47/DC-3. C-47s don't have side-by-side wheel and flap controls with identical levers & coloured toggle switches.
  13. 1. HUMAN ERROR: Chapanis's fix – mark the wheel lever with a wheel and the flap lever with a triangle. Result: a significant reduction in landing 'accidents'.
     HUMAN ERROR is just one possible explanation: a first-story account. Such mistakes are likely to recur because of the connection between the human and the system.
  14. 1. HUMAN ERROR: the folk myth – systems are 100% reliable as long as they are protected from human error.
     Reification: the system is treated as an objective, stable & predictable 'thing', not a moving mass of stuff.
     Response: eliminate human error, especially in high-risk organizations.
     Consequence: the system becomes more calcified and brittle, and allows less, not more, learning.
  15. HARD SHELL vs SOFT SHELL: hard-shell (exogenous) vs soft-shell (endogenous) organizations.
     • Hard shell: externally strong, process-driven but brittle – a system designed to prevent error.
     • Soft shell: externally weak but flexible – resilience is built in via the capacity to learn from & rectify error.
     Is the safety system hard or soft – prevention or recovery?
  16. 2. SEQUENCE OF EVENTS model (Heinrich, 1931) – the domino-run model.
     Events preceding an accident occur in a linear, fixed order, with the accident the last in the sequence.
     Solution: a sequence of barriers to reduce the hazard, absorb energy & prevent the accident.
     Space Shuttle Columbia, 2003: a piece of foam strikes the wing on launch, breaching the thermal protection; on re-entry, superheated air melts the wing, which breaks off.
     The sequence-of-events solution: reinforce the wing.
  17. Sequence of events model: the hindsight problem.
     The investigator's view is a single line from a safe present to a critical future – 'a map that shows only those forks in the road that we decided to take' (Lubar, 1993: 1168, History from Things, Smithsonian Institution).
  18. The view from the decision-maker: from the present, many possible futures branch out, and it is not obvious which fork leads to the critical one.
  19. Nietzschean anxiety over determining causation: if we cannot determine a cause, then the 'problem' is potentially irresolvable.
     Scott Snook, on the accidental shootdown of US Black Hawks over Iraq, found 'no bad guy... no smoking gun, no culprit.'
     Wrong answer – find a cause: Durkheim's scapegoat.
  20. 2. Sequence of events model, revisited: the hole in the wing was produced not simply by debris but by holes in organizational decision-making.
  21. 3. SYSTEMS APPROACHES / SECOND STORIES
  22. Second-story accounts:
     • Human 'errors' are the products – symptoms – of system complexity. First-story accounts stop at the error; second-story accounts recognize that there are usually multiple causes.
     • Safety or success is less the consequence of perfect process and more the consequence of people's operational practice.
  23. Second-story accounts:
     • Sharp end: practitioners who directly interact with a hazardous process.
     • Blunt end: regulators, administrators & managers who provide the resources & constraints that practitioners have to integrate.
     • Success & failure result from how sharp-end practitioners cope with complexity, and from how their actions are shaped by the resources & constraints set at the blunt end.
  24. Bricoleurs (Lévi-Strauss): people who achieve success by stitching together whatever is at hand, whatever needs stitching together to ensure practical success.
     Bricoleurs & the possibility of rescue: first responders to the flooding in New Orleans (Kroll-Smith et al., 2007, Journal of Public Management & Social Policy, Fall).
     The CPR (cardiopulmonary resuscitation) paradox: five trainee paramedics and one experienced paramedic are filmed performing CPR, and the film is shown to three groups, who are asked to identify the experienced one. Experienced paramedics get it right 90% of the time; students, 50%; instructors, 30%.
     Why? Instructors follow the training protocols; experienced paramedics know that the protocols don't always work.
     Training vs education? Bricoleurs can be undermined by over-reliance on protocols.
     First responders in New Orleans were left to their own devices.
  25. St Claude Bridge: people sheltered on the bridge, but the water rose rapidly. A police officer went to the National Guard base near the bridge and asked a colonel for buses to rescue the people. The colonel refused but said he would ask his general – though he wasn't sure where the general was. No buses left the depot.
  26. One ambulance driver carried 42 people in one go. A police officer commandeered (stole) a refrigerator truck and siphoned (stole) diesel from abandoned vehicles to keep it running, feeding 100 people for days.
  27. Second-story accounts, continued:
     • All complex systems contain weaknesses, but these are usually transcended – stopped – by the safety-seeking actions of individuals.
     • Multiple weaknesses exist in all complex systems, but failure tends to occur when the weaknesses occur simultaneously.
     • The search for a single cause inhibits our understanding.
     • To understand failure you must first understand success: how people at the sharp end learn & adapt to create safety or success in a world fraught with hazards, trade-offs & multiple goals.
  28. Iceberg model: beneath every 1 accident sit roughly 10 incidents, 30 near misses and 600 unsafe acts.
     Implication: reduce the unsafe acts to reduce the accidents.
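Read literally, the iceberg is a fixed ratio, and the 'reduce the unsafe acts' implication follows directly; a toy calculation (the ratio is the model's assumption, not established fact):

```python
# Iceberg ratio from the slide: 600 unsafe acts : 30 near misses :
# 10 incidents : 1 accident.
UNSAFE_ACTS_PER_ACCIDENT = 600

def expected_accidents(unsafe_acts: float) -> float:
    """If the ratio held exactly, accidents would scale linearly with
    unsafe acts -- the logic behind reducing unsafe acts."""
    return unsafe_acts / UNSAFE_ACTS_PER_ACCIDENT

print(expected_accidents(6000))  # 10.0 expected accidents
print(expected_accidents(3000))  # halve the unsafe acts -> 5.0
```

The next slide shows why this linear reading is suspect.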
  29. But US air data suggests that the airlines with the most incidents & near misses have the lowest number of accidents. There are around 30,000 near-miss/trivial reports per annum in US aviation, and almost no catastrophic crashes.
     The ability to learn is critical to safety, because you cannot build a completely safe system: a passenger would have to fly for 19,000 years to die in a plane crash.
  30. 3a. LATENT FAILURE model – the Swiss cheese model (Reason, 1990).
     Some of the factors that contribute to disaster are latent – present before the disaster: 'hidden pathogens' (Reason).
     • Active failures: unsafe acts by people at the sharp end; the errors are quickly apparent.
     • Latent failures: features that lie dormant & only become evident when they combine and are triggered – people at the blunt end.
     'People at the sharp end – operators – are not usually the cause of the accident but the inheritors of system defects created by poor design, incorrect installation, faulty maintenance & bad management decisions' (Reason, 1990: 173, Human Error).
     Safety-critical systems have a series of barriers to prevent, limit or absorb danger, but each barrier has holes in it – imperfections. When all the holes line up and are penetrated, disaster occurs.
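The model's arithmetic is worth seeing: if the barriers fail independently, each extra layer multiplies the risk down, but latent holes mean it never reaches zero. A toy simulation (the barrier count and hole probabilities are made-up illustrative numbers):

```python
import random

# One probability of a "hole" per defensive barrier (illustrative values).
HOLE_PROBABILITIES = [0.1, 0.05, 0.2, 0.1]

def hazard_penetrates() -> bool:
    """One trial: disaster only if the hazard finds a hole in EVERY barrier."""
    return all(random.random() < p for p in HOLE_PROBABILITIES)

trials = 1_000_000
disasters = sum(hazard_penetrates() for _ in range(trials))
print(f"Simulated disaster rate: {disasters / trials:.6f}")
# Analytically: 0.1 * 0.05 * 0.2 * 0.1 = 1e-4. Aligned holes are rare --
# which is exactly why the latent ones go unnoticed until they combine.
```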
  31. Build an error-tolerant system with a long recovery interval. If the elimination of error is impossible, you must build a system that enhances error recovery. How good is the system at recognizing & responding to disturbances?
  32. 3b. NORMAL ACCIDENT theory (Perrow).
     Multiple safety systems add complexity and increase opacity: when things start to go wrong, it is difficult to see what is happening or to act appropriately.
     The system is implicated – not a single component failure, or a set of them, but the unanticipated interaction of a multitude of events in a complex system.
     Accidents are not unusual events but normal ones, given the complexity & tight coupling of the system.
  33. 3c. What is meant by 'just'? (Dekker, 2007): balancing safety with accountability.
     A JUST CULTURE satisfies the demands for accountability while contributing to learning and improvement.
     'Unintended' errors that are part of the professional role of the individual are not punished, BUT intentional violations and destruction are not tolerated.
  34. What is just? What isn't just? Where do you draw the line? Who draws the line? What is the line? It runs from the unintentional (slips, lapses) to the intentional (violations).
     'It's not obvious, but it needs to be a judgement by the organisation looking at politics, power and populism' (Dekker, 2011) – i.e. individuals should be included in the decision!
     There needs to be some trust to encourage honesty, but what is acceptable and unacceptable needs to be clear. Some organisations are subject to regulatory bodies; some set up their own safety boards and ethics committees.
  35. SAFETY CULTURE = JUST + OPEN.
     Violations can be linked to culture: 'I'll get away with it! Everyone does it; they'll just turn a blind eye.'
     A 'no blame' culture is neither feasible, nor desirable, nor accountability-free. You need to look ahead to improve (accountability), not blame the past.
     To encourage a 'safety culture' and hold individuals accountable, they must be given an appropriate level of discretion – 'a culture of balance'.
  36. OPEN REPORTING.
     Openness in reporting means providing an environment in which individuals will report even the trivial near misses – these are what can cascade into latent system failures that fester and can have catastrophic consequences!
     Front-line professionals are best placed to help with future prevention – help that is forfeited if they are treated like criminals.
  37. Open reporting requires (see the sketch below):
     • honest disclosure and transparency;
     • reports that are easily submitted, with some immunity for the reporter;
     • reports that are seen to be actioned, to build confidence;
     • lessons learned (training, changes to SOPs, etc.);
     • actions and lessons disseminated – if possible, across an industry.
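A minimal sketch of what an open-reporting record satisfying those requirements might capture; all field names are hypothetical, not from any real reporting system:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class NearMissReport:
    reported_on: date
    description: str                       # honest, transparent disclosure
    reporter_protected: bool = True        # some immunity for the reporter
    actions_taken: list[str] = field(default_factory=list)    # seen to be actioned
    lessons: list[str] = field(default_factory=list)          # training, SOP changes
    disseminated_to: list[str] = field(default_factory=list)  # ideally industry-wide

report = NearMissReport(
    reported_on=date(2011, 7, 22),
    description="Similar-looking vials stored side by side; near-miss dispensing error",
)
report.actions_taken.append("Separate storage; relabel vials")
report.lessons.append("Update the dispensing SOP")
report.disseminated_to.append("All pharmacies in the region")
```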
  38. Things to bear in mind:
     Health care, aviation, petrochemical and nuclear professionals, etc. all have a STRONG SAFETY ETHIC.
     Criminalization of an unintended error can hamper a safe & just culture – it only encourages people to 'hide' their mistake(s).
     'Dispensing mistakes [in healthcare] happen. And even with the introduction of robots and SOPs, the Utopian ideal of a world without errors is closer to fantasy than reality.' (Chapman, 2009, 'A criminal mistake?')
  39. Conclusions:
     • Stop looking for psychological error mechanisms (first-story accounts) – stop blaming HUMAN ERROR.
     • Systematic features of the environment can trigger predictable actions that lead to 'error'.
     • Safety is less a feature of the system and better understood as something created by people in complex systems.
     • Are systems safe, and therefore in need of protection from unreliable humans? Or does the elimination of human 'unreliability' make the system more brittle, so that the sources of resilience are eliminated?
     'The enemy of safety is not the human: it is complexity' (Woods et al., 2010: 1, Behind Human Error, Ashgate).
  40. A safety, just and learning culture can be striven towards but is rarely attained. It is the process that is important!
