ALIAS Conference 14-15 June 2012, EUI - Florence (Italy)


Air disasters as organizational
  errors: the case of Linate




                 Prof. Maurizio Catino
           University of Milan - Bicocca (Italy)
               maurizio.catino@unimib.it
8.10.2001: the second most serious ground collision in aviation history

[Diagram: the SAS MD-87 and the Cessna at Linate airport]
The accident dynamic

[Diagram: airport layout showing the tower (TWR), the MD-87 on the runway, and the Cessna at the R5/R6 taxiway intersection]
"The sierra four …"

[Diagram: radio exchanges between the tower and the Cessna (D-VX), timed from 8.08.23 to 8.08.40, ending with the controller's "Roger, … hold position"]
The accident dynamic

[Diagram: positions of the MD-87 and the Cessna at 8.09.19, 8.09.28, 8.09.37 and 8.09.38]
Why? Who is to blame?
• The Cessna pilots' mistake
• The ground controller's error
• The inadequate condition of signs and signals
• The absence of a ground radar
• The airport management's negligence
• Tragic fate
• …
The Error of Human Error…
“... ‘human error’ is not a well defined category of
human performance. Attributing error to the actions
of some person, team, or organisation is
fundamentally a social and psychological process
and not an objective, technical one.”
(Woods et al., 1994)



Assume that the source of failure is "human error"
   → analyse events to find where a person is involved
   → stop the analysis when one is found
A multilevel model for the analysis of accidents

• Inter-organizational level: integration, coordination, …
• Organizational level: defences, managerial decisions, error-inducing conditions, …
• Individual level: errors, violations, mistakes, decisions

These levels interact, pass through the system's defences, and can produce an accident.

(Catino 2010)
1. Individual Level
• The Cessna and its two pilots were not qualified and certified to operate in low-visibility conditions (landing and take-off) such as those of that day (violation)

• The Cessna crew took the wrong taxiway (error) and entered the runway without a specific clearance (violation)

• There were communication failures between the tower and the Cessna pilots: the ground controller did not realize that the Cessna was on taxiway R6 (error), and he issued a clearance to taxi towards the main apron although he could not make sense of the reported position "S4"
2. Organizational Failures
Failed defences
• No surface movement radar (out of service since November 1999)
• Runway-incursion prevention equipment installed at the R6 intersection, but deactivated
• Taxiway lights
• Stop bars

Error-inducing conditions
• The ground markings were not clearly visible (runway holding position markings)
• Signs, signals and lights were inadequate and misleading (not compliant with ICAO standards)
• Official documentation failed to report the presence of unpublished markings (S4, S5, etc.)

Latent failures
• No learning from near misses
• Best practices not applied
• No functional Safety Management System
3. The bigger picture: Linate

• ENAC (civil aviation authority)
• ENAV (air navigation service provider)
• SEA (airport operator)
Individual failures and the organizational and inter-organizational failures behind them

• Individual: the Cessna crew took the wrong taxiway (error) and entered the runway without a specific clearance (violation). Behind it: markings and signs not in accordance with ICAO standards; red bars and taxiway lights not controllable by ATC; deficiencies in the implementation and maintenance of standard airport signage; official documentation failing to report the presence of unpublished markings (S4); no equipment to prevent runway incursions.

• Individual: communication failures between the tower and the Cessna pilots. Behind it: no surface movement radar; runway-incursion prevention equipment at the R6 intersection deactivated; markings and signs not in accordance with ICAO standards; deficiencies in the implementation and maintenance of standard airport signage; non-compliance with international standards on markings, lights and signs; high traffic volume; lack of visual aids.

• Individual: the Cessna and its two pilots not qualified and certified to operate in low-visibility conditions (landing and take-off) such as those of that day (violation). Behind it: lack of coordination among the airport authorities; weaknesses in the control system.
Failure Levels

Inter-organizational level
• Cost/safety trade-offs
• Failures of integration and coordination
• Bureaucratic safety culture
• No Safety Management System
• …

Organizational level
• No ground radar
• No international safety standards
• Weak defenses
• Lack of visual aids
• No learning from near misses
• …

Individual level
• Errors
• Violations
• Communication misunderstandings
Active versus Latent Failures

• Inter-organizational factors (latent conditions): coordination neglect; inadequate safety policies
• Organizational factors (latent conditions): no ground radar; no international standards; no learning from near misses; …
• Preconditions for unsafe acts (latent conditions): poor visibility of the R5/R6 signs; mental fatigue; S4 marking unknown to the controller; …
• Unsafe acts (active failures): the Cessna crew took the wrong taxiway and entered the runway; communication failures
• Failed or absent defenses → accident and injury

(Adapted from Reason, 1997)
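Reason's layered-defenses logic above can be sketched as a toy simulation: an unsafe act becomes an accident only when every defensive layer fails at once, so restoring even one latent-failed layer sharply lowers the joint probability. The layer names and failure probabilities below are illustrative assumptions, not figures from the Linate investigation.

```python
import random

# Hypothetical defensive layers with illustrative failure probabilities.
LAYERS = {
    "surface movement radar": 0.05,
    "runway stop bars": 0.10,
    "ATC communication": 0.02,
    "crew procedures": 0.08,
}

def accident_occurs(rng: random.Random) -> bool:
    """An unsafe act propagates to an accident only if *every* layer fails."""
    return all(rng.random() < p for p in LAYERS.values())

def estimate_accident_rate(trials: int = 100_000, seed: int = 1) -> float:
    """Monte Carlo estimate of the per-event accident probability."""
    rng = random.Random(seed)
    return sum(accident_occurs(rng) for _ in range(trials)) / trials

# With all four layers intact, the joint failure probability is the product
# 0.05 * 0.10 * 0.02 * 0.08, i.e. about 8 in a million events.
```

With these made-up numbers the estimated rate is vanishingly small; deactivating a layer (as with the R6 equipment) is equivalent to setting its failure probability to 1, which multiplies the accident probability by a factor of 1/p for that layer.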
Conclusions

• If we focus too closely on the unsafe acts at the sharp end, we risk missing the fact that the accident was the result of an organizational error
• It is important to take a system perspective
• Communication and organization problems of many kinds were crucial factors in this and other disasters
Two ways of looking at accidents

[Diagram: errors and accidents viewed through an Individual Blame Logic versus an Organizational Function Logic]
Vicious Circle

Individual Blame Logic → blame culture → search for the guilty → hidden errors → organizational inertia and defensive behavior
Defensive Medicine?
• Defensive medicine takes place when healthcare personnel prescribe unnecessary treatments, or avoid high-risk procedures, with the goal of reducing their exposure to malpractice litigation
• Doctors in particular may:
   • prescribe unnecessary tests, procedures or specialist visits (positive defensive medicine),
   • or, alternatively, avoid high-risk patients or procedures (negative defensive medicine).
Defensive Medicine

Study                          Year   Country   Result (% of defensive behaviours)
Tancredi                       1978   US        70%
Studdert et al.                1995   US        93%
Summerton                      2000   UK        90%
Hiyama                         2006   Japan     98%
Jackson Healthcare             2008   US        72%
Massachusetts Medical Society  2009   US        83%
Positive Defensive Medicine / Negative Defensive Medicine

[Illustrative images]
The side effects of defensive medicine

• The threat of legal investigation does not make the medical system more careful and attentive toward the patient
• An individual blame logic does not improve patient safety
• Organizations need to develop the capacity to learn from errors and system failures in order to become more resilient and reliable
• To achieve this, a profound cultural and juridical transformation is required
• A different culture must be promoted to reduce defensive medicine and to foster learning from error
Virtuous Circle

Organizational Function Logic → just culture → search for organizational criticalities → reporting of close calls and errors → removal of latent factors and organizational learning
Getting the balance right

• Person model: proximal factors, individual responsibility
• System model: remote factors, collective responsibility

Both extremes have their pitfalls.
(Reason, 1997)
A spectrum: blame-free culture ← just culture → punitive culture

• Blame-free culture: all errors are attributed to system failure; no individual is held accountable
• Punitive culture: individuals are blamed for all mistakes
Just culture

Roughly 10% blame, 90% no blame.
Establishing a Just Culture

• Human error: inadvertent action (slips, lapses, mistakes) → reassure
• At-risk behavior: a choice, with the risk not recognized or believed justified → coach
• Reckless behavior: conscious disregard of an unreasonable risk → punish
• Malicious behavior: violations, gross negligence, criminal offences → punish

Unintentional conduct is not blamed; deliberate conduct is culpable.
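The decision logic of the table above can be written out as a small sketch. The category and response names follow the slide; the function itself is an illustration of the mapping, not an operational tool.

```python
from enum import Enum

class Behavior(Enum):
    HUMAN_ERROR = "human error"       # inadvertent slips, lapses, mistakes
    AT_RISK = "at-risk behavior"      # risk not recognized or believed justified
    RECKLESS = "reckless behavior"    # conscious disregard of unreasonable risk
    MALICIOUS = "malicious behavior"  # violations, gross negligence, offences

def just_culture_response(behavior: Behavior) -> str:
    """Map a behavior category to the organizational response on the slide."""
    if behavior is Behavior.HUMAN_ERROR:
        return "reassure"  # unintentional: no blame
    if behavior is Behavior.AT_RISK:
        return "coach"     # still unintentional, but the risky choice needs correcting
    return "punish"        # reckless or malicious: deliberate, culpable
```

The point of making the mapping explicit is that only the last two categories involve deliberate conduct: a just culture draws the blame line there, not at the mere occurrence of an error.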
The Case of the Italian Air Force

• 20 flight divisions; 1,000 pilots
• 1990: the accident of "Casalecchio di Reno": 12 people died
• New organization, new culture
New risk and safety policy
• The promotion of a new vision of risk management and safety
• The promotion of methods for the identification, analysis and prevention of risks (critical latent factors)
• A database for incident reporting (voluntary and anonymous reporting to the centre)
• Ongoing training and education about safety and the perception of errors, in order to learn from them
• The implementation of a just culture
Two different strategies: compliance vs. deterrence

A deterrent strategy (blame culture):
• is backward-looking
• is implemented after the accident happens
• is punitive: sanctions are directed towards the individuals or organizations responsible for an error or accident

A compliance strategy (ITAF just culture):
• is forward-looking and preventive
• aims at the early identification of errors and latent factors
Just culture at ITAF
(extracts from interviews)

• "For each event we look for the reason why it happened. We do not talk about blame and responsibility. We do not want to know who the guilty person was, but why the event happened and what we can do to avoid it in the future."

• "Error is a mechanism for learning (…); there are some errors that, if analyzed, can help prevent future errors."

• "The more people I inform about my error, the less they risk repeating it."

• "The organization does not put pressure on people who commit an error. Nobody is afraid of being punished. The debriefings are a training activity to talk and improve our work. The exchange between experts and newcomers is a good occasion for both, as it helps to see things from different points of view."
Reporting of Incident and Flight Safety Occurrences, 1991-2009 (rate per 10,000 flying hours)

[Chart: reports rose sharply over the period, from 22 in 1991 to 1,922 in 2009; series shown: Human Factors and Total]

(Source: ITAF Flight Safety Inspectorate)
Major accidents, 1990-2010

[Chart: major accidents per year fell from around 19-24 in the early 1990s to around 3-6 in the late 2000s]

(Source: ITAF Flight Safety Inspectorate)
Number of accidents, 1980-2010

Period     Number   Rate   Deaths
1980-89    87       0.59   61
1990-99    51       0.38   43
2000-10    33       0.32   22

(Source: ITAF Flight Safety Inspectorate)
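The "Rate" column is an accident count normalized by exposure. Assuming it is accidents per 10,000 flying hours (the unit used for the reporting chart), the arithmetic is a one-liner; the numbers in the example are made up for illustration, not ITAF data.

```python
def accident_rate(accidents: int, flying_hours: float) -> float:
    """Accidents per 10,000 flying hours (assumed normalization)."""
    return accidents / (flying_hours / 10_000)

# Hypothetical example: 50 accidents over 1,000,000 flying hours
rate = accident_rate(50, 1_000_000)  # -> 0.5 per 10,000 hours
```

Normalizing by exposure is what makes the decades comparable: a falling raw count could simply reflect less flying, while a falling rate reflects safer flying.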
Conclusion

• Either organizations manage human errors, by learning from them…
• …or human errors will manage organizations.

To achieve the first, it is fundamental to develop a just culture.

 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 

Air disasters as organisational errors: the case of Linate by M. Catino

  • 1. ALIAS Conference, 14-15 June 2012, EUI, Florence (Italy). Air disasters as organizational errors: the case of Linate. Prof. Maurizio Catino, University of Milan - Bicocca (Italy), maurizio.catino@unimib.it
  • 2. 8.10.2001: the second most serious ground collision in aviation history, involving an SAS MD87 and a Cessna.
  • 3. The accident dynamic. [Diagram: the control tower (TWR), the MD87 on the runway, and the Cessna approaching via taxiways R5/R6]
  • 6. The sierra four … [Diagram: radio exchanges between the tower, the MD87 and the Cessna, timestamped from 8.08.23 to 8.08.40; the Cessna (D-VX) replies "Roger, … hold position"]
  • 7. The accident dynamic. [Diagram: positions of the MD87 and the Cessna, timestamped from 8.09.19 to 8.09.38]
  • 10. Why? Who is to blame? The Cessna pilots' mistake; the ground controller's error; inadequate signage conditions; the absence of a ground radar; airport management negligence; a tragic twist of fate; …
  • 11. The Error of Human Error… "... 'human error' is not a well defined category of human performance. Attributing error to the actions of some person, team, or organisation is fundamentally a social and psychological process and not an objective, technical one." (Woods et al., 1994). The usual pattern: assume that the source of failure is "human error"; analyse events to find where a person is involved; stop the analysis when one is found.
  • 12. A multilevel model for the analysis of accidents (Catino, 2010): the individual level (errors, violations, mistakes, decisions), the organizational level (defences, managerial decisions, error-inducing conditions, …) and the inter-organizational level (integration, coordination, defences, …) all bear on an accident.
  • 13. 1. Individual level. The Cessna and its two pilots were not qualified and certified to operate in the low-visibility conditions (landing and take-off) of that day (violation). The Cessna crew took the wrong taxiway (error) and entered the runway without specific clearance (violation). There were communication failures between the tower and the Cessna pilots: the ground controller did not realize that the Cessna was on taxiway R6 (error), and he issued a clearance to taxi towards the main apron although he could not make sense of the reported position S4.
  • 14. 2. Organizational failures. Failed defences: no surface movement radar (out of service since November 1999); the installed equipment for preventing runway incursions at the R6 intersection (taxiway lights, stop bars) deactivated. Error-inducing conditions: the ground markings (runway holding position markings) were not clearly visible; signs, signals and lights were inadequate and misleading (outside ICAO standards); official documentation failed to report the presence of unpublished markings (S4, S5, etc.). Latent failures: no learning from near misses; best practices not applied; no functional Safety Management System.
  • 15. 3. The bigger picture, Linate: ENAC (the civil aviation authority and regulator), ENAV (the air navigation service provider) and SEA (the airport management company).
  • 16. Individual failures mapped to organizational and inter-organizational failures:
    - The Cessna crew took the wrong taxiway (error) and entered the runway without specific clearance (violation) → markings and signs not in accordance with ICAO standards; red bars and taxiway lights not controllable by ATC; deficiencies in the implementation and maintenance of standard airport signage; official documentation failed to report the presence of unpublished markings (S4); no equipment to prevent runway incursions.
    - Communication failures between the tower and the Cessna pilots → no surface movement radar; installed equipment for preventing runway incursions at the R6 intersection deactivated; markings and signs not in accordance with ICAO standards; deficiencies in the implementation and maintenance of standard airport signage; non-compliance with international standards on markings, lights and signs; high traffic volume; lack of visual aids.
    - The Cessna and its two pilots not qualified and certified to operate in the low-visibility conditions (landing and take-off) of that day (violation) → lack of coordination among the airport authorities; weaknesses in the control system.
  • 17. Failure levels. Individual level: errors; violations; communication misunderstandings. Organizational level: no ground radar; no international safety standards; weak defences; lack of visual aids; no learning from near misses. Inter-organizational level: cost/safety trade-offs; failures of integration and coordination; bureaucratic safety culture; no Safety Management System; …
  • 18. Active versus latent failures (adapted from Reason, 1997). Inter-organizational factors (latent conditions): coordination neglect; inadequate safety policies. Organizational factors (latent conditions): no ground radar; no international standards; no learning from near misses; … Preconditions for unsafe acts (latent conditions): poor visibility of the R5/R6 signs; mental fatigue; the S4 marking unknown to the controller; … Unsafe acts (active conditions): the Cessna crew took the wrong taxiway and entered the runway; communication failures. Failed or absent defences → accident and injury.
  • 19. Conclusions. If we focus too closely on the unsafe acts at the sharp end, we are in danger of missing the fact that the accident was the result of an organizational error. It is important to take a system perspective: communication and organization problems of many kinds were crucial factors in this and other disasters.
  • 20. Two ways of looking at errors and accidents: the individual blame logic versus the organizational function logic.
  • 21. The vicious circle of the individual blame logic: blame culture → search for the guilty → defensive behavior → hidden errors → organizational inertia.
  • 22. Defensive Medicine? • Defensive medicine takes place when healthcare personnel prescribe unnecessary treatments, or avoid high-risk procedures, with the goal of reducing their exposure to malpractice litigation • Doctors in particular may: • prescribe unnecessary tests, procedures or specialist visits (positive defensive medicine), • or, alternatively, avoid high-risk patients or procedures (negative defensive medicine). 22
  • 23. Defensive medicine studies (% of defensive behaviours): Tancredi, 1978, US: 70%; Studdert et al., 1995, US: 93%; Summerton, 2000, UK: 90%; Hymaia, 2006, Japan: 98%; Jackson Healthcare, 2008, US: 72%; Massachusetts Medical Society, 2009, US: 83%.
  • 26. The side effects of defensive medicine. The threat of legal investigation does not make the medical system more careful and attentive toward the patient, and individual blame logic does not improve patient safety. Instead, develop the capacity to learn from errors and system failures in order to become more resilient and reliable. To achieve this, a profound cultural and juridical transformation is required: promote a different culture that reduces defensive medicine and fosters a process of learning from error.
  • 27. The virtuous circle of the organizational function logic: just culture → search for organizational criticality → reporting of close calls and errors → removing latent factors → organizational learning.
  • 28. Getting the balance right (Reason, 1997). Person model: proximal factors, individual responsibility. System model: remote factors, collective responsibility. Both extremes have their pitfalls.
  • 29. Between a blame-free culture (all errors attributed to system failure; no individual is to be held accountable) and a punitive culture (individuals are blamed for all mistakes) lies the just culture.
  • 30. Just culture: roughly 90% of events warrant no blame; about 10% warrant blame.
  • 31. Establishing a just culture, on a spectrum from unintentional (no blame) to deliberate (culpable). Human error (inadvertent action: slips, lapses, mistakes): reassure. At-risk behavior (a choice: risk not recognized, or believed justified): coach. Reckless behavior (conscious disregard of an unreasonable risk): punish. Malicious behavior (gross negligence, criminal offences): punish.
  • 32. The case of the Italian Air Force: 20 flight divisions, 1,000 pilots. 1990: the accident of Casalecchio di Reno, in which 12 people died. A new organization, a new culture.
  • 33. The new risk and safety policy: the promotion of a new vision of risk management and safety; methods for the identification, analysis and prevention of risks (critical latent factors); a database for incident reporting (voluntary and anonymous for the centre); ongoing training and education about safety and the perception of errors, in order to learn from them; the implementation of a just culture.
  • 34. Two different strategies: compliance vs. deterrence. A deterrent strategy (blame culture) is backward-looking, implemented after the accident happens, and punitive, with sanctions directed towards the individuals or organizations responsible for an error or accident. A compliance strategy (the ITAF just culture) is forward-looking and preventive, aiming at the early identification of errors and latent factors.
  • 35. Just culture at the ITAF (extracts from interviews). "For each event we look for the reason why it happens. We do not talk about blame and responsibility. We do not want to know who the guilty person was, but why the event happened and what we can do to avoid it in the future." "Error is a mechanism for learning: there are errors that, if analyzed, can help prevent future errors." "The more people I inform about my error, the less they risk repeating it." "The organization does not put pressure on people committing an error. Nobody is afraid of being punished. The debriefings are a training activity to talk and improve our work. The exchange between experts and newcomers is a good occasion for both, as it helps to see things from different points of view."
  • 36. [Chart: reporting of incidents and flight safety occurrences, 1991-2009, rate per 10,000 hours of flying, with two series (human factors and total); reporting rose sharply over the period. Source: ITAF Flight Safety Inspectorate]
  • 37. [Chart: major accidents per year, 1990-2010, declining from the low twenties at the start of the 1990s to single digits in the 2000s. Source: ITAF Flight Safety Inspectorate]
  • 38. Number of accidents, 1980-2010 (source: ITAF Flight Safety Inspectorate): 1980-89: 87 accidents, rate 0.59, 61 deaths; 1990-99: 51 accidents, rate 0.38, 43 deaths; 2000-10: 33 accidents, rate 0.32, 22 deaths.
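The decade-level rates above can be read with the same normalization the ITAF reporting chart uses, accidents per 10,000 hours of flying. A minimal sketch of that arithmetic; note that the flight-hour figure below is back-derived from the table (87 accidents at rate 0.59), not a published number:

```python
# Sketch: reading the accident rate as accidents per 10,000 flying hours.
# Assumption: same normalization as the ITAF reporting chart; the
# flight-hour exposure is inferred from the table, not an official figure.

def rate_per_10k_hours(accidents: int, flight_hours: float) -> float:
    """Accidents normalized per 10,000 hours of flying."""
    return accidents / (flight_hours / 10_000)

def implied_flight_hours(accidents: int, rate: float) -> float:
    """Invert the rate to recover the implied flight-hour exposure."""
    return accidents / rate * 10_000

# 1980-89: 87 accidents at rate 0.59 implies roughly 1.47 million hours.
hours_1980s = implied_flight_hours(87, 0.59)
print(round(rate_per_10k_hours(87, hours_1980s), 2))  # 0.59
```

The same inversion applied to the later decades shows that exposure stayed broadly comparable while both the count and the rate fell, which is what makes the rate column the fairer comparison across decades.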
  • 39. Conclusion: either organizations manage human errors by learning from them, or human errors will manage organizations. To achieve the first, it is fundamental to develop a just culture.