HUMAN ERROR

Presentation Transcript

  • HUMAN ERROR. José Luis García-Chico (jgarciac@email.sjsu.edu), San Jose State University, ISE 105, Spring 2006, April 24, 2006. “To err is human…” (Cicero, 1st century BC); “…to understand the reasons why humans err is science” (Hollnagel, 1993)
  • What is important to know about human error?
    • Human error is in our nature
      • It can happen to anyone, at any time, in any context
    • Some errors are preventable through procedures, system design and automation.
      • But be careful: these measures may introduce new opportunities for error.
      • Emphasis should be placed on error-tolerant systems: error recovery rather than mere prevention of erroneous actions.
    • Human error may not be an accident cause in itself… it is often the result of multiple contributing factors
      • Do not simply blame the last human operator in the chain.
  • Human error in nuclear power plants
    • Three Mile Island (1979)
    • Following a failure, the temperature in the reactor increased rapidly. The emergency cooling system should have come into operation, but maintenance staff had left two valves closed, which blocked the flow. A relief valve opened to relieve temperature and pressure, but then stuck open. Radioactive water poured into the containment area and basement for two hours.
    • Operators failed to detect the stuck-open valve. The indicator showed only that the valve had been commanded to shut, not the actual status of the valve.
    • A small amount of radioactivity was released into the environment.
  • Human error in aviation
    • Überlingen (2002)
    • A B757 and a Tu-154 collided in German airspace, under Zurich control. 71 people were killed.
    • Only one controller was in charge of two positions during a night shift (two separate displays). The telephone system and the STCA were under maintenance.
    • ATC detected the conflict between the two aircraft late and instructed the Tu-154 to descend. The TCAS units on board the Tu-154 and the B757 instructed the pilots to climb and descend, respectively. The Tu-154 pilot opted to obey the controller's instruction and began a descent to FL 350, where it collided with the B757, which had followed its own TCAS advisory to descend.
  • Definition of Human Error
    • Error will be taken as a generic term to encompass all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency. (Reason, 1990)
    • Human error occurrences are defined by the behavior of the total man-task system (Rasmussen, 1987).
    • Actions by human operators can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate, or the plan can be satisfactory, but the performance can still be deficient (Hollnagel, 1993)
  • Human error performance (Norman, 1983)
  • Human error taxonomies
    • Errors of omission (not doing the required thing)
      • Forgetting to do it
      • Deliberately choosing not to do it
    • Errors of commission (doing the wrong thing)
      • Slips, in which the operator has the correct motivation or intention but carries out the wrong execution
        • Sequence: wrong order of execution
        • Timing: too fast/slow
      • Errors based on erroneous expectations and schemata (see the sketch after this list)
        • (Schemata are sensory-motor knowledge structures stored in memory and used to guide behavior: efficient and low-effort.)
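
As a rough illustration only (not from the slides), the omission/commission and slip/mistake distinctions above could be encoded as a small classification sketch in Python; every class and field name here is hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto

class ErrorType(Enum):
    OMISSION_FORGOTTEN = auto()    # required action not done: forgotten
    OMISSION_DELIBERATE = auto()   # required action deliberately skipped
    COMMISSION_SLIP = auto()       # right intention, wrong execution (sequence/timing)
    COMMISSION_MISTAKE = auto()    # action driven by a faulty expectation or schema

@dataclass
class ObservedAction:
    required: bool           # was this action required by the task?
    performed: bool          # was it actually carried out?
    intention_correct: bool  # was the operator's plan/intention adequate?
    deliberate_skip: bool = False

def classify(action: ObservedAction) -> ErrorType | None:
    """Classify an action using the omission/commission split from the slide."""
    if action.required and not action.performed:
        # error of omission: the required thing was not done
        return (ErrorType.OMISSION_DELIBERATE if action.deliberate_skip
                else ErrorType.OMISSION_FORGOTTEN)
    if action.performed and not action.required:
        # error of commission: the wrong thing was done
        return (ErrorType.COMMISSION_SLIP if action.intention_correct
                else ErrorType.COMMISSION_MISTAKE)
    return None  # required-and-done, or not-required-and-not-done: no error here
```

The only point of the sketch is that omissions are defined by what was required but not done, while the slip/mistake split within commissions hinges on whether the intention behind the action was adequate.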
  • Human error taxonomies: SRK model of behavior (Rasmussen, 1982)
    • Errors depend on behavior:
      • Skill-based
      • Rule-based
      • Knowledge-based
  • Error distinctions
  • Generic Error Modeling System (GEMS) (Reason, 1990)
  • Human Error Distribution
    • Humans are prone to slips and lapses in familiar tasks:
      • 61% of errors are skill-based
    • Humans are prone to mistakes when tasks become difficult.
      • 28% of errors are rule-based
      • 11% of errors are knowledge-based, arising in tasks that require novel reasoning from first principles.
      • Approximate figures (Reason, 1990), obtained by averaging three studies.
  • Humans are error prone, but… is that all?
    • Human operators often appear to be responsible for system disasters simply because they are the last and most visible contributors to system performance.
    • Distinction between:
      • Active errors: errors associated with the performance of front-line operators, e.g. pilots, air traffic controllers, control room crews.
      • Latent errors: errors related to activities removed in time and space from the direct control interface, e.g. designers, managers, maintenance personnel, supervisors.
  • Model of Human Error Causation (Reason, 1990), adapted from Shappell (2000)
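
Reason's causation model is commonly pictured as successive layers of defenses whose latent weaknesses ("holes") must line up with an active failure for an accident to occur. As a loose sketch only, assuming the four HFACS-style layers from Shappell's adaptation and using entirely hypothetical condition names:

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    """One defensive layer; 'holes' are the latent or active failure conditions present in it."""
    name: str
    holes: set[str] = field(default_factory=set)

def accident_possible(layers: list[Layer], scenario: set[str]) -> bool:
    """An accident trajectory exists only if every layer has at least one hole
    that the scenario's conditions pass through (the holes 'line up')."""
    return all(layer.holes & scenario for layer in layers)

# Hypothetical example: latent organizational and supervisory weaknesses
# combined with an active error at the sharp end.
layers = [
    Layer("Organizational influences", {"production pressure"}),
    Layer("Unsafe supervision", {"inadequate oversight"}),
    Layer("Preconditions for unsafe acts", {"fatigue", "poor display design"}),
    Layer("Unsafe acts (active errors)", {"missed readback", "wrong valve left closed"}),
]
scenario = {"production pressure", "inadequate oversight", "fatigue", "wrong valve left closed"}
print(accident_possible(layers, scenario))  # True: every layer is breached
```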
  • Building solutions
    • Each system will require a particular instantiation of the approach, but some general recommendations include the following (see the sketch after this list):
      • Prevent errors: procedures, training, safety awareness, UI design (allow only valid choices)
      • Tolerate error: UI design (constraints on inputs), decision support tools
      • Recover error: undo capability, confirmation
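
To make the prevent / tolerate / recover split concrete, here is a minimal sketch applied to a hypothetical flight-level entry field; the valid range, function names, and class are all illustrative assumptions, not part of the original material:

```python
VALID_FLIGHT_LEVELS = range(100, 460, 10)   # prevent: only valid choices are accepted at all

def parse_flight_level(text: str) -> int | None:
    """Tolerate: constrain and normalize free-text input instead of acting on garbage."""
    digits = "".join(ch for ch in text if ch.isdigit())
    if not digits:
        return None
    level = int(digits)
    return level if level in VALID_FLIGHT_LEVELS else None

class ClearanceEditor:
    """Recover: keep a history so an erroneous entry can be undone before it takes effect."""

    def __init__(self) -> None:
        self._history: list[int] = []

    def enter(self, text: str) -> bool:
        level = parse_flight_level(text)
        if level is None:
            return False             # rejected; the operator is asked to correct or confirm
        self._history.append(level)
        return True

    def undo(self) -> None:
        if self._history:
            self._history.pop()

    @property
    def current(self) -> int | None:
        return self._history[-1] if self._history else None

editor = ClearanceEditor()
editor.enter("FL350")   # accepted: 350 is a valid level
editor.enter("FL353")   # rejected: not a valid choice
editor.undo()           # operator recovers from an earlier entry if it was wrong
```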
  • Learning from past accidents and incidents
    • A great source of lessons to be learned… not of facts for assigning blame.
    • Careful considerations to keep in mind:
      • Most people involved in accidents are neither stupid nor reckless. They may simply have been blind to the consequences of their actions.
      • Be aware of possible influencing situational factors.
      • Be aware of the hindsight bias of the retrospective analyst.
      • Hindsight bias: knowing the outcome profoundly influences the way we analyze and judge past events. It may impose on the observer a deterministic logic about the unfolding events that the individual at the time of the incident would not have had.
  • Nine steps to move forward from error: Woods & Cook (2002)
    • Pursue second stories beneath the surface to discover multiple contributors.
    • Escape the hindsight bias
    • Understand work as performed at the sharp end of the system
    • Search for systemic vulnerabilities
    • Study how practice creates safety
    • Search for underlying patterns
    • Examine how changes create new vulnerabilities
    • Use new technology to support and enhance human expertise
    • Tame complexity through new forms of feedback
  • A case study: HUMAN FACTOR ANALYSIS OF OPERATIONAL ERRORS IN AIR TRAFFIC CONTROL. José Luis García-Chico, San Jose State University, Master's Thesis in Human Factors and Ergonomics
  • Motivation of the study
    • Some figures - Air Traffic in the USA 2004 (FAA, 2005)
      • 46,752,000 aircraft handled in en-route operations
      • 46,873,000 movements in tower operations
      • 1,216 operational errors (OEs)
    • The OE rate has been increasing in recent years (FAA, 2005; a rough rate-computation sketch follows this slide):
      • 0.66%* in 2002
      • 0.78% in 2003
      • 0.79% in 2004
    • Analysis of errors based on initial Air Traffic Controller Reports:
      • 539 reports (Jan-Jun 2004)
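
The "%" unit on the slide is likely shorthand for a normalized rate rather than a literal percentage of operations. As a rough sketch only, assuming the rate is expressed per 100,000 facility activities and using a placeholder denominator rather than a figure from the slide:

```python
def oe_rate(oe_count: int, activities: int, per: int = 100_000) -> float:
    """Operational-error rate normalized per a fixed number of facility activities.
    The normalization basis (per 100,000 activities) is an assumption, not from the slide."""
    return oe_count / activities * per

# Hypothetical example using the 2004 OE count from the slide and a PLACEHOLDER
# denominator; it is not meant to reproduce the FAA's published rates.
print(round(oe_rate(1216, 150_000_000), 2))
```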
  • Taxonomic study: Initial Results
  • Top-10 OEs
  • Proximity of encounters: OE output
  • Concurrent and contextual factors
  • Taxonomic study: Initial Results (continued)
  • Proximity in OEs
  • Co-occurrence of OEs
  • D-side presence/absence
  • Time on Position
  • Further Reading
    • Besnard, D., Greathead, D., & Baxter, G. (2004). When mental models go wrong: Co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117-128.
    • Dekker, S. W. A. (2002). Reconstructing human contributions to accidents: The new view on error and performance. Journal of Safety Research, 33, 371-385.
    • Hollnagel, E. (1993). The phenotype of erroneous actions. International Journal of Man-Machine Studies, 39, 1-32.
    • Norman, D. A. (1981). Categorization of action slips. Psychological Review, 88(1), 1-15.
    • Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 30(3), 286-297.
    • Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4, 311-333.
    • Rasmussen, J. (1987). The definition of human error and a taxonomy for technical system design. In J. Rasmussen, K. Duncan, & J. Leplat (Eds.), New Technology and Human Error (pp. 23-30). New York, NY: John Wiley & Sons.
    • Reason, J. T. (1990). Human error. Cambridge, England: Cambridge University Press.
    • Reason, J. T. (1997). Managing the risks of organizational accidents. Aldershot, England: Ashgate Publishing Company.
    • Woods, D.D. & Cook, R.I. (2002). Nine steps to move forward from error. Cognition, Technology, and Work, 4, 137-144.