HUMAN ERROR
José Luis García-Chico (firstname.lastname@example.org)
San Jose State University, ISE 105, Spring 2006
April 24, 2006
"To err is human…" (Cicero, 1st century BC)
"…to understand the reasons why humans err is science" (Hollnagel, 1993)
Due to a failure, temperature in the reactor increased rapidly. The emergency cooling system should have come into operation, but maintenance staff had left two valves closed, which blocked the flow. A relief valve opened to relieve temperature and pressure, but then stuck open. Radioactive water poured into the containment area and basement for over two hours.
Operators failed to detect the stuck-open valve. The indicator in the control room showed only that the valve had been commanded to shut, not the actual status of the valve.
A small amount of radioactivity was released to the environment.
A B757 and a Tu-154 collided in German airspace, under Zurich control. 71 people were killed.
Only one controller was in charge of two positions during a night shift (two separate displays). The telephone system and the short-term conflict alert (STCA) were under maintenance.
ATC detected the conflict between the two aircraft late, and instructed the Tu-154 to descend. The TCAS units on board the Tu-154 and the B757 instructed the pilots to climb and descend, respectively. The Tu-154 pilot opted to obey the controller's order and began a descent to FL 350, where it collided with the B757, which had followed its own TCAS advisory to descend.
Error will be taken as a generic term to encompass all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some change agency. (Reason, 1990)
Human error occurrences are defined by the behavior of the total man-task system (Rasmussen, 1987).
Actions by human operators can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate; or the plan can be satisfactory, but the performance can still be deficient (Hollnagel, 1993).
Accidents are a great source of lessons to be learned, not of faults to blame.
Careful considerations to keep in mind:
Most people involved in accidents are neither stupid nor reckless. They may simply be blind to the consequences of their actions.
Be aware of possible influencing situational factors.
Be aware of the hindsight bias of the retrospective analyst.
Hindsight bias: possession of outcome knowledge profoundly influences the way we analyze and judge past events. It can impose on the observer a deterministic logic about the unfolding events that the individual at the time of the incident would not have had.
Nine steps to move forward from error (Woods & Cook, 2002):
Pursue second stories beneath the surface to discover multiple contributors.
Escape the hindsight bias
Understand work as performed at the sharp end of the system
Search for systemic vulnerabilities
Study how practice creates safety
Search for underlying patterns
Examine how changes create new vulnerabilities
Use new technology to support and enhance human expertise
Tame complexity through new forms of feedback
A case study: HUMAN FACTORS ANALYSIS OF OPERATIONAL ERRORS IN AIR TRAFFIC CONTROL. José Luis García-Chico, San Jose State University, Master's Thesis in Human Factors and Ergonomics.
Besnard, D., Greathead, D., & Baxter, G. (2004). When mental models go wrong: Co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117-128.
Dekker, S. W. A. (2002). Reconstructing human contributions to accidents: The new view on error and performance. Journal of Safety Research, 33, 371-385.
Hollnagel, E. (1993). The phenotype of erroneous actions. International Journal of Man-Machine Studies, 39, 1-32.
Norman, D. A. (1981). Categorization of action slips. Psychological Review, 88(1), 1-15.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 30(3), 286-297.
Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4, 311-333.
Rasmussen, J. (1987). The definition of human error and a taxonomy for technical system design. In Rasmussen, J., Duncan, K., & Leplat, J. (Eds.), New technology and human error (pp. 23-30). New York, NY: John Wiley & Sons.
Reason, J. T. (1990). Human error. Cambridge, England: Cambridge University Press.
Reason, J. T. (1997). Managing the risks of organizational accidents. Aldershot, England: Ashgate Publishing Company.
Woods, D. D., & Cook, R. I. (2002). Nine steps to move forward from error. Cognition, Technology, and Work, 4, 137-144.