Cascading influences (figure): economic & political climate → top-level management decisions → culture → line management implementation → error-producing conditions in the team and workplace → unsafe acts at the sharp end → exceedances → incidents & near misses → accidents
Errors need to be managed at all levels of the system
'Everyone's blunt end is someone else's sharp end.' (Karlene Roberts)
Reaching ever higher for the fruit (figure): systemic factors, social factors, individual factors
Milestones
• From 1917: Psychometric testing
• 1940s: Cambridge Cockpit; Applied Psychology Unit; centres at Ohio State & University of Illinois; ERS (UK)
• 1950s: HFS (US); 'Human Factors in Air Transportation' (Ross McFarland)
• 1960s: Manned space flight; cockpit ergonomics; command instruments
• 1970s: ALPA accident investigation course; IATA human factors committee; SHEL(L)
• 1980s: CRM; ASRS; cognitive and systemic factors; interaction of many causal factors
• 1990s: Organizational and cultural factors
Sentinel events
• Tenerife runway collision
• Mt Erebus and the Mahon Report
• Manchester runway fire
• Dryden and the Moshansky Report
• BASI reports on the Monarch and Seaview accidents
• NTSB report on the Embraer 120 accident at Eagle Lake, Texas (Lauber dissent)
• Challenger (Vaughan) and the Columbia Accident Investigation Board Report
Individual factors
• Pilot aptitude measures
• Psychomotor performance
• Sensory and perceptual factors
• Fatigue and stress
• Vigilance decrement
• Cockpit ergonomics
• 'Ironies of automation'
• Cognitive issues
Predictive value of WW2 AAF test battery (figure, from Ross McFarland, 1953): the decrease in elimination rates with increasing stanine scores indicates the value of a properly weighted battery of tests.
Social and team factors
• Crew resource management
• LOFT and behavioural markers
• Cabin evacuation studies
• Maintenance teams
• Air traffic controllers
• Ramp workers
• Naturalistic decision making
• Procedural non-compliance
The high-hanging fruit
• Targeting error traps and recurrent accident types (e.g. CFIT, maintenance omissions)
• Resolving goal conflicts: production vs protection
• Combating the 'normalization of deviance'
• Striving for system resilience (high reliability)
ICAO Annex 13 (8th ed., 1994)
1.17. Management information. Accident reports should include pertinent information concerning the organisations and their management involved in influencing the operation of the aircraft. The organisations include . . . the operator, air traffic services, airway, aerodrome and weather service agencies; and the regulatory authority. Information could include organisational structure and functions, resources, economic status, management policies and practices . . .
Ever-widening search for the 'upstream' factors (figure): individuals → workplace → organisation → regulators → society at large
Echoed in many hazardous domains: Piper Alpha, Challenger, Young (NSW), Dryden, Barings, Zeebrugge, King's X, Chernobyl, Clapham, Columbia
CAIB Report (August 2003)
'In our view, the NASA organizational culture had as much to do with this accident as the foam.'
'When the determinations of the causal chain are limited to the technical flaw and individual failure, typically the actions taken to prevent a similar event in the future are also limited . . .'
But has the pendulum swung too far? (figure: pendulum between collective responsibility and individual responsibility)
Mr Justice Moshansky on the Dryden F-28 crash
'Had the system operated effectively, each of the (causal) factors might have been identified and corrected before it took on significance . . . this accident was the result of a failure of the air transportation system as a whole.'
Academician Valeri Legasov on the Chernobyl disaster
'After being at Chernobyl, I drew the unequivocal conclusion that the Chernobyl accident was . . . the summit of all the incorrect running of the economy which had been going on in our country for many years.' (pre-suicide tapes, 1988)
CAIB Report (Ch. 5)
'The causal roots of the accident can be traced, in part, to the turbulent post-Cold War policy environment in which NASA functioned during most of the years between the destruction of Challenger and the loss of Columbia.'
Remote factors: some concerns
• They have little causal specificity.
• They are outside the control of system managers, and mostly intractable.
• Their impact is shared by many systems.
• The more exhaustive the inquiry, the more likely it is to identify remote factors.
• Their presence does not discriminate between normal states and accidents; only more proximal factors do that.
Revisiting Poisson
• Poisson counted the number of kicks received by cavalrymen over a given period.
• He developed a model for determining the chance probability of a low-frequency/high-opportunity event among people sharing equal exposure to hazard.
• How many people would one expect to have 0, 1, 2, 3, 4, 5, etc. events over a given period when there is no known reason why one person should have more than any other?
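That chance-expectation question can be sketched numerically with the Poisson probability mass function. The data below are made up purely for illustration (they are not any fleet's figures), and `poisson_expected` is a hypothetical helper, not an established routine:

```python
import math

def poisson_expected(counts):
    """Expected number of people with 0, 1, 2, ... events under a
    Poisson model of equal exposure. `counts` holds one entry per
    person (that person's number of events)."""
    n = len(counts)
    lam = sum(counts) / n                     # mean events per person
    # P(k events) = exp(-lam) * lam**k / k!
    return [n * math.exp(-lam) * lam**k / math.factorial(k)
            for k in range(max(counts) + 1)]

# Illustrative data: 100 people, most with no events, a few with
# several -- the 'unequal liability' pattern.
observed = [0]*78 + [1]*12 + [2]*5 + [3]*2 + [4]*2 + [6]*1
expected = poisson_expected(observed)
for k, e in enumerate(expected):
    print(f"{k} events: observed {observed.count(k):3d}, "
          f"Poisson expects {e:6.2f}")
```

Run on such data, the model shows both halves of the common finding: more zero-event people than chance predicts, and a small tail with far more events than chance alone would allow.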
Unequal liability: common finding
(figure: number of exceedances by fleet pilots, from John Savage; x-axis: number of events sustained in a given period, 0–8; y-axis: N. More people have zero events than predicted, and a few people have more events than would be expected by chance alone.)
Interpreting pilot-related data
• Repeated events are associated with particular conditions: suggests the need for specific retraining.
• Repeated events are not associated with particular conditions:
  – Bunched in a given time period: suggests the influence of local life events. Counselling?
  – Scattered over time: suggests some enduring problem. Promote to management?
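Those decision rules can be sketched as a small classifier. Everything here is an assumption for illustration: the `PilotRecord` fields, the two-event threshold, and the six-month bunching window are not from the slide:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PilotRecord:
    events: List[float]        # event times, in months from period start
    condition_linked: bool     # all events tied to one particular condition?

def interpret(rec: PilotRecord, bunch_window: float = 6.0) -> str:
    """Map the slide's decision rules onto a repeated-event record.
    Thresholds (>= 2 events, 6-month window) are illustrative only."""
    if len(rec.events) < 2:
        return "no repeated events"
    if rec.condition_linked:
        return "specific retraining"              # particular conditions
    span = max(rec.events) - min(rec.events)
    if span <= bunch_window:
        return "counselling (local life events?)" # bunched in time
    return "enduring problem"                     # scattered over time
```

For example, two events three months apart with no common condition would fall in the 'bunched' branch and suggest counselling, while the same two events nineteen months apart would suggest an enduring problem.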
Conclusions
• Widening the search for error-shaping factors has brought great benefits in understanding accidents.
• But maybe we are reaching the point of diminishing returns with regard to prevention.
• Perhaps we should revisit the individual (the heroic as well as the hazardous acts).
• History shows we did that rather well.