HFG Conference, RAeS, 15 October 2003

Error Management: Achievements and Challenges
(Have we made a difference?)

James Reason
Once upon a time . . .
Now: A complex system
Economic & political climate
Cascading influences




                                 Top-level management decisions
                       Culture




                                                                     Culture
                                 Line management implementation
                                 Error-producing conditions in the
                                        team and workplace
                                     Unsafe acts at sharp end
                                          Exceedances
                                     Incidents & near misses
                                            Accidents
Errors need to be managed at all levels of the system

Everyone’s blunt end is someone else’s sharp end.
(Karlene Roberts)
Reaching ever higher for the fruit

[Diagram: fruit at three heights, highest to lowest]
Systemic factors
Social factors
Individual factors
Milestones
• From 1917: Psychometric testing
• 1940s: Cambridge Cockpit; Applied
  Psychology Unit; centres at Ohio State &
  University of Illinois; ERS (UK)
• 1950s: HFS (US); ‘Human Factors in Air
  Transportation’ (Ross McFarland)
• 1960s: Manned space flight; cockpit
  ergonomics; command instruments
• 1970s: ALPA accident investigation course;
  IATA human factors committee; SHEL(L)
• 1980s: CRM; ASRS; cognitive and systemic
  factors; interaction of many causal factors
• 1990s: Organizational and cultural factors
Sentinel events
• Tenerife runway collision
• Mt Erebus and the Mahon Report
• Manchester runway fire
• Dryden and the Moshansky Report
• BASI reports on the Monarch and
  Seaview accidents
• NTSB Report on Embraer 120 accident
  at Eagle Lake, Texas (Lauber dissent)
• Challenger (Vaughan) and Columbia
  Accident Investigation Board Report
Individual factors
• Pilot aptitude measures
• Psychomotor performance
• Sensory and perceptual factors
• Fatigue and stress
• Vigilance decrement
• Cockpit ergonomics
• ‘Ironies of automation’
• Cognitive issues
Predictive value of WW2 AAF test battery
(from Ross McFarland, 1953)

[Chart caption: decrease in elimination rates with increase in stanine scores indicates the value of a properly weighted battery of tests]
Social and team factors
• Crew resource management
• LOFT and behavioural markers
• Cabin evacuation studies
• Maintenance teams
• Air traffic controllers
• Ramp workers
• Naturalistic decision making
• Procedural non-compliance
The high-hanging fruit
• Targeting error traps and recurrent
  accidents (e.g. CFIT, maintenance
  omissions, etc.)
• Resolving goal conflicts: production vs
  protection
• Combating the ‘normalization of
  deviance’
• Striving for system resilience (high
  reliability)
ICAO Annex 13 (8th Ed., 1994)


1.17. Management information. Accident reports
should include pertinent information concerning
the organisations and their management involved
in influencing the operation of the aircraft. The
organisations include . . . the operator, air traffic
services, airway, aerodrome and weather service
agencies; and the regulatory authority. Information
could include organisational structure and functions,
resources, economic status, management policies
and practices . . .
Ever-widening search for the ‘upstream’ factors

Individuals → Workplace → Organisation → Regulators → Society at large
Echoed in many hazardous domains

Piper Alpha, Challenger, Young (NSW), Dryden, Barings, Zeebrugge, King’s Cross, Chernobyl, Clapham, Columbia
CAIB Report (August, 2003)
‘In our view, the NASA organizational
culture had as much to do with this
accident as the foam.’

‘When the determinations of the causal
chain are limited to the technical flaw
and individual failure, typically the actions
taken to prevent a similar event in the
future are also limited . . .’
But has the pendulum swung too far?

[Diagram: a pendulum swinging between collective responsibility and individual responsibility]
Mr Justice Moshansky on
       the Dryden F-28 crash

Had the system operated effectively,
each of the (causal) factors might have been
identified and corrected before it took on
significance . . . this accident was the result of
a failure of the air transportation system as a
whole.
Academician Valeri Legasov
  on the Chernobyl disaster

After being at Chernobyl, I drew the
unequivocal conclusion that the Chernobyl
accident was . . . the summit of all the
incorrect running of the economy which
had been going on in our country for
many years.
                       (pre-suicide tapes, 1988)
CAIB Report (Ch. 5)

‘The causal roots of the accident can
be traced, in part, to the turbulent post-
Cold War policy environment in which
NASA functioned during most of the
years between the destruction of
Challenger and the loss of Columbia.’
Remote factors: some concerns
• They have little causal specificity.
• They are outside the control of system
  managers, and mostly intractable.
• Their impact is shared by many systems.
• The more exhaustive the inquiry, the more
  likely it is to identify remote factors.
• Their presence does not discriminate
  between normal states and accidents; only
  more proximal factors do that.
Revisiting Poisson
• The classic data: counts of cavalrymen killed by
  horse kicks over a given period (von Bortkiewicz’s
  Prussian cavalry data, analysed with Poisson’s model).
• The model gives the chance probability of a low
  frequency/high opportunity event among people sharing
  equal exposure to a hazard.
• How many people would one expect to have
  0, 1, 2, 3, 4, 5, etc. events over a given
  period when there is no known reason why
  one person should have more than any
  other?
Unequal liability: common finding

[Histogram: number of exceedances sustained by fleet pilots in a given period (John Savage); x-axis 0–8 events, y-axis N pilots]

More people have zero events than predicted, and a few people have more events than would be expected by chance alone.
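
To make that chance baseline concrete, here is a minimal Python sketch of the Poisson expectation. The numbers are purely illustrative and not from the talk: a hypothetical fleet of 200 pilots averaging 0.8 exceedances each. An observed distribution with more zeros and a longer tail than these expected head-counts is the unequal-liability signature shown above.

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of exactly k events when the mean rate is lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Illustrative assumptions (not from the talk):
n_pilots = 200    # pilots sharing equal exposure to the hazard
mean_rate = 0.8   # average exceedances per pilot in the period

# Expected number of pilots with 0, 1, 2, ... 8 events under equal liability.
for k in range(9):
    expected = n_pilots * poisson_pmf(k, mean_rate)
    print(f"{k} events: {expected:6.1f} pilots expected")
```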
Interpreting pilot-related data
• Repeated events are associated with particular
  conditions. Suggests the need for specific retraining.
• Repeated events are not associated with particular
  conditions (see the sketch below):
   – Bunched in a given time period. Suggests
     influence of local life events. Counselling?
   – Scattered over time. Suggests some enduring
     problem. Promote to management?
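
Purely as a reading aid, the branching above can be written as a small decision function. The function name and its two boolean cues are hypothetical; they mirror only what the slide states.

```python
def suggested_follow_up(tied_to_conditions: bool, bunched_in_time: bool) -> str:
    """Sketch of the slide's heuristic for pilots with repeated events."""
    if tied_to_conditions:
        # Events share particular conditions: points to a training gap.
        return "specific retraining for the associated conditions"
    if bunched_in_time:
        # Events cluster in one period: likely local life events.
        return "counselling"
    # Events scattered over time: some enduring individual problem.
    return "closer review of an enduring problem"

# Example: no shared conditions, events bunched in one period.
print(suggested_follow_up(tied_to_conditions=False, bunched_in_time=True))
```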
End-of-century grades

[Chart: three grades shown: C, B+ and A]
Conclusions

• Widening the search for error-shaping factors
  has brought great benefits in understanding
  accidents.
• But maybe we are reaching the point of
  diminishing returns with regard to prevention.
• Perhaps we should revisit the individual (the
  heroic as well as the hazardous acts).
• History shows we did that rather well.
