Chrisman Comer Homko Sheehy At587 Book Report3

Book report on "A Human Error Approach to Aviation Accident Analysis" by Douglas Wiegmann and Scott Shappell for Purdue University graduate course AT 573 Managing the Risks of Organizational Accidents.



  • 1. Book Report: Chapters 2-4 By Chris Homko
  • 2. Summary
    • 6 Human Error Perspectives
    • Human Factors Analysis and Classification System (HFACS)
    • 3 Accident Case studies using HFACS
  • 3. Cognitive Perspective
    • “… assumption that the pilot’s mind can be conceptualized as essentially an information processing system” (Wiegmann, Shappell, p. 21).
  • 4. Cognitive Perspective: Strengths
    • “… provide insight into why errors are committed, and why accidents happen” (Wiegmann, Shappell, p. 23).
    • “… attempt to go beyond simply classifying “what” the aircrew did wrong…to addressing the underlying causes of human error…” (Wiegmann, Shappell, p. 23).
    • “As a result these cognitive models allow seemingly unrelated errors to be analyzed based on fundamental cognitive failures and scientific principles” (Wiegmann, Shappell, p. 24).
  • 5. Cognitive Perspective: Weaknesses
    • “… many cognitive theories are quite academic and difficult to translate into the applied world of error analysis and accident investigation” (Wiegmann, Shappell, p. 25).
    • “… application of these theoretical approaches often remains nebulous and requires analysts and investigators to rely as much on speculation and intuition as they do on objective methods” (Wiegmann, Shappell, pp. 25-26).
    • Most importantly, “… supervisory and other organizational factors that often impact performance are also overlooked by traditional cognitive methods” (Wiegmann, Shappell, p. 26).
    • Leads to placing the blame on “pilot error” (Wiegmann, Shappell, 2003).
  • 6. Ergonomic Perspective
    • “… human is rarely, if ever, the sole cause of an error or accident. Rather, human performance involves a complex interaction of several factors including ‘the inseparable tie between individuals, their tools and machines, and their work environment’ (Heinrich, et al., 1980, p. 51)” (Wiegmann, Shappell, p. 26).
    • SHEL model (Software, Hardware, Environment, Liveware)
  • 7. Ergonomic Perspective: Strengths
    • “… considers a variety of contextual and task-related factors that affect operator performance, including equipment design” (Wiegmann, Shappell, p. 28).
    • “… discourages analysts and investigators from focusing solely on the operator as the source or cause of errors” (Wiegmann, Shappell, p. 28).
    • “… greater varieties of error prevention models are available, including the possibility of designing systems that are more ‘error tolerant’” (Wiegmann, Shappell, p. 28).
    • Well known, easy to understand, and more complete (Wiegmann, Shappell, 2003).
  • 8. Ergonomic Perspective: Weaknesses
    • “… most system models lack any real sophistication when it comes to analyzing the human component” (Wiegmann, Shappell, p. 30).
    • “… emphasis is placed exclusively on the design aspects of the man-machine interface” (Wiegmann, Shappell, p. 30).
    • “… a possible mismatch between the anthropometric requirements of the task and human characteristics” (Wiegmann, Shappell, p. 30) (ergonomics).
    • “… tends to promulgate the notion that all errors and accidents are design-induced and can therefore be engineered out of the system” (Wiegmann, Shappell, p. 30).
  • 9. Behavioral Perspective
    • “… behaviorists believe that performance is guided by the drive to obtain rewards and avoid unpleasant consequences or punishments (Skinner, 1974)” (Wiegmann, Shappell, p. 30).
  • 10. Behavioral Perspective: Strengths
    • “… have contributed greatly to our understanding of how factors such as motivation, rewards, and past experience affect performance and safety” (Wiegmann, Shappell, p. 32).
  • 11. Behavioral Perspective: Weaknesses
    • “… motivation and ability cannot fully explain how people behave” (Wiegmann, Shappell, p. 30).
    • “… the consequences of unsafe acts are often fatal… it is hard to believe that someone would not be motivated to perform their best” (Wiegmann, Shappell, p. 32).
  • 12. Aeromedical Perspective
    • “… errors are merely the symptoms of an underlying mental or psychological condition such as illness or fatigue” (Wiegmann, Shappell, p. 32).
  • 13. Aeromedical Perspective: Strengths
    • “… highlights the crucial role that the psychological state of the pilot (i.e., the host) plays in safe performance and flight operations” (Wiegmann, Shappell, p. 33).
    • “… has taken the lead in shaping both the military’s and industry’s view of fatigue and has helped form policies on such contentious issues as work scheduling, shift-rotations, and crew rest requirements” (Wiegmann, Shappell, p. 34).
  • 14. Aeromedical Perspective: Weaknesses
    • “… some view pilot physiology and factors that influence it as relatively unimportant in the big picture of flight safety” (Wiegmann, Shappell, p. 34).
    • “Or even more unclear is how fatigued, self-medicated, or disoriented a pilot has to be before he or she commits an error that fatally jeopardizes the safety of flight” (Wiegmann, Shappell, p. 34).
    • “… determining whether these factors ‘caused’ an error or accident is another matter entirely” (Wiegmann, Shappell, p. 34).
  • 15. Psychological Perspective
    • Humanistic approach (Wiegmann, Shappell, 2003).
    • “… view flight operations as a social endeavor that involves interactions among a variety of individuals…” (Wiegmann, Shappell, p. 34).
  • 16. Psychological Perspective: Strengths
    • Good at analyzing personality conflicts (Wiegmann, Shappell, 2003).
    • Studies found that, as of 1987, over 70% of all commercial aviation accidents had been due to aircrew communication and coordination problems (Wiegmann, Shappell, 2003).
    • Responsible for identifying Crew Resource Management as a viable mitigation tool (Wiegmann, Shappell, 2003).
  • 17. Psychological Perspective: Weaknesses
    • “… many early approaches focused largely on personality variables rather than on crew coordination and communications issues that most contemporary approaches do” (Wiegmann, Shappell, p. 36).
    • Some models are based on Freudian views and are far outside the mainstream (Wiegmann, Shappell, 2003).
    • “… little work has been done to empirically test predictions derived from psychological models of human error” (Wiegmann, Shappell, p. 37).
    • Models are too narrowly focused on human interaction and tend to blame everything on a breakdown of Cockpit Resource Management (Wiegmann, Shappell, 2003).
  • 18. Organizational Perspective
    • Emphasizes the role that the organization plays in causing accidents (Wiegmann, Shappell, 2003).
  • 19. Organizational Perspective: Strengths
    • Has gained wide acceptance and has much to offer (Wiegmann, Shappell, 2003).
    • “… views all human error as something to be managed within the context of risk” (Wiegmann, Shappell, p. 42).
    • “… allows the importance of specific errors to be determined objectively based upon the relative amount of risk they impose on safe operations” (Wiegmann, Shappell, p. 42).
  • 20. Organizational Perspective: Weaknesses
    • “… some have criticized that the ‘organizational causes’ of operator error are often several times removed, both physically and temporally, from the context in which the error is committed” (Wiegmann, Shappell, p. 43).
    • “… tends to be a great difficulty linking organizational factors to operator or aircrew errors, particularly during accident investigations” (Wiegmann, Shappell, p. 43).
    • “… little is known about the types of organizational variables that actually cause specific types of errors in the cockpit” (Wiegmann, Shappell, p. 43).
    • “… tend to focus almost exclusively on a single type of causal-factor…” (Wiegmann, Shappell, p. 43).
    • “… tend to foster the extreme view that ‘every accident, no matter how minor, is a failure of the organization’ or that ‘…an accident is a reflection on management’s ability to manage…even minor incidents are symptoms of management incompetence that may result in a major loss’ (Ferry, 1988)” (Wiegmann, Shappell, p. 43).
  • 21. Human Factors Analysis and Classification System (HFACS)
    • Developed by Wiegmann and Shappell (1997-2001).
    • An attempt at “defining the holes” in the Swiss cheese model (Wiegmann, Shappell, 2003).
  • 22. James Reason: The Productive System and Its Breakdown
    • Productive System: Production vs. Protection
    • Breakdown: Swiss cheese model
  • 23. Active Failures and Latent Conditions
    • Active Failures
      • Have direct impact on the safety of the system
      • Adverse effects are immediately noticed
      • Committed at the “sharp end” of the system
    • Latent Conditions
      • Go beyond the scope of the individual
      • May include poor design, lack of supervision, shortfalls in training
      • Result of top-level decisions
      • Can lie dormant for many days, months, or years
  • 24. Swiss Cheese: Strengths
    • “… integrates the human error perspectives described in chapter 2 into a single unified framework” (Wiegmann, Shappell, p. 49).
  • 25. Swiss Cheese: Weaknesses
    • “… fails to identify the exact nature of the ‘holes’ in the cheese” (Wiegmann, Shappell, p. 49).
    • “… model was geared toward academics rather than practitioners” (Wiegmann, Shappell, p. 49).
  • 26. HFACS
    • Attempts to identify the Swiss cheese “holes”
    • “… developed and refined by analyzing hundreds of accident reports containing thousands of human causal factors” (Wiegmann, Shappell, p. 50).
    • Four levels of failure:
      • Unsafe Acts
      • Preconditions for Unsafe Acts
      • Unsafe Supervision
      • Organizational Influences
  • 27. 1. Unsafe Acts
    • VIOLATIONS
      • Routine
      • Exceptional
    • ERRORS
      • Skill-Based Errors
      • Decision Errors
      • Perceptual Errors
  • 28. 2. Preconditions for Unsafe Acts
    • Condition of Operators
      • Adverse Mental States
      • Adverse Physiological States
      • Physical/Mental Limitations
    • Personnel Factors
      • Crew Resource Management (CRM)
      • Personal Readiness
    • Environmental Factors
      • Physical Environment
      • Technological Environment
  • 29. 3. Unsafe Supervision
    • Inadequate Supervision
    • Planned Inappropriate Operations
    • Failure to Correct a Known Problem
    • Supervisory Violations
  • 30. 4. Organizational Influences
    • Resource Management
    • Organizational Climate (Safety Culture)
    • Organizational Process (Decisions and Rules)
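The four-level taxonomy on slides 27-30 is essentially a lookup table: every causal factor is classified into a category, and every category belongs to one level. A minimal, hypothetical Python sketch (category names are taken from the slides above; this is not any published HFACS tool):

```python
# Hypothetical sketch of the HFACS taxonomy as a lookup table.
# Level and category names follow slides 27-30.
HFACS = {
    "Unsafe Acts": [
        "Skill-Based Errors", "Decision Errors", "Perceptual Errors",
        "Routine Violations", "Exceptional Violations",
    ],
    "Preconditions for Unsafe Acts": [
        "Adverse Mental States", "Adverse Physiological States",
        "Physical/Mental Limitations", "Crew Resource Management",
        "Personal Readiness", "Physical Environment",
        "Technological Environment",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision", "Planned Inappropriate Operations",
        "Failure to Correct a Known Problem", "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Management", "Organizational Climate",
        "Organizational Process",
    ],
}

def level_of(category: str) -> str:
    """Return the HFACS level of failure that contains a given category."""
    for level, categories in HFACS.items():
        if category in categories:
            return level
    raise KeyError(category)

print(level_of("Decision Errors"))      # -> Unsafe Acts
print(level_of("Resource Management"))  # -> Organizational Influences
```

Classifying each causal factor this way is what lets an investigator see which layers of the “Swiss cheese” had holes in a given accident.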
  • 31. HFACS Model
  • 32. The Three Cases
    • 1995 DC-8 crash in Kansas City.
    • 1994 Learjet 25D crash at Washington DC Dulles Airport.
    • 1992 Beech Model E18S crash into Mount Haleakala.
    Photo credits: Mellon, James. “1648434-N8711H”. Photograph. 22 May 2006. Web. 28 March 2010. Zeljeznjak, Tony. “0255785-XA-TIE”. Photograph. 26 July 2002. Web. 28 March 2010. Schnichels, Helmut. “1673600-N865F”. Photograph. 19 March 2010. Web. 28 March 2010. Note: Photographs are of the same aircraft types, but not of the actual accident aircraft.
  • 33. DC8: Summary
    • Decision Error: Captain continued takeoff
    • Skill-Based Error: Lack of practice with 3 engine takeoff procedure
    • Exceptional Violation: Flight Engineer controlled the throttles for the 2nd takeoff
    • Crew Resource Management Error: Crew discussions were misdirected toward rudder authority
  • 34. DC8: Summary (Cont.)
    • Skill-Based Error: Flight Engineer used degrees F instead of degrees C to calculate Vmcg.
    • Adverse Mental State: Pressure to conduct flight before a landing curfew at destination and Captain’s lack of continuous sleep.
    • Planned Inappropriate Operations: Unsafe supervision – other airlines used specially trained crews to conduct 3 engine takeoffs.
  • 35. DC8: Summary (Cont.)
    • Resource Management: Company chose to use an inexperienced crew.
    • Outside Influences: FAA knew of loophole in flight and duty time regulations that permitted a substantially reduced rest time for non-revenue flights and lacked oversight of their operation.
    • Technological Environment: Flight simulator used for training lacked realistic yaw in engine out operations and the procedure in the company manual for 3 engine takeoff was confusing.
  • 36. Lear 25D: Summary
    • Skill-Based Error: Failure of crew to properly fly an ILS approach.
    • Violation: Pilot who was ILS Category 1 certified performed approach in Category 3 weather conditions.
    • Decision Error: Crew continued the approach instead of electing to hold.
    • Adverse Mental State: Crew was fatigued due to time zone change.
  • 37. Lear 25D: Summary (Cont.)
    • Physical/Mental Condition: Captain was known to not perform well under stress and had low flight time.
    • Resource Management: The company’s decision to upgrade the pilot to Captain resulted from a miscommunication with the training company.
    • Operational Processes: Ops Specifications of the Company did not adequately address RVR value precedence in weather reports.
    • Resource Management: It was suggested that a GPWS would have prevented the accident.
  • 38. Beech Model E18S: Summary
    • Violation: Pilot flew into IMC when flights were to be conducted under VFR.
    • Decision Error: Captain did not understand the effects of an up-sloping cloud layer produced by orographic lifting.
    • Resource Management: Captain did not actively use navigational charts in flight.
    • Skill-Based Error: Habit may have caused him to tune the wrong radial on the VOR.
  • 39. Beech Model E18S: Summary (Cont.)
    • Skill-Based Error: Captain did not cross-check his tuning error, compounded by his failure to use navigational charts.
    • Personal Readiness: Pilot did not possess the flight hours required by the company to act as pilot in command.
    • Resource Management: Company did not detect the fact that the pilot falsified his flight hours on his employment application.
    • Outside Influence: FAA at the time did not require operators to conduct extensive background checks.
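To show how HFACS spreads causation across the system, the Beech E18S factors listed on slides 38-39 can be tallied by level. A hypothetical sketch (mapping “Outside Influence” to Organizational Influences is my reading, since the slides list it separately from the four levels):

```python
from collections import Counter

# Causal factors from the Beech E18S slides, paired with the HFACS level
# each factor category sits in. "Outside Influence" -> Organizational
# Influences is an assumption, not from the slides.
beech_factors = [
    ("Violation", "Unsafe Acts"),
    ("Decision Error", "Unsafe Acts"),
    ("Resource Management", "Organizational Influences"),
    ("Skill-Based Error", "Unsafe Acts"),
    ("Skill-Based Error", "Unsafe Acts"),
    ("Personal Readiness", "Preconditions for Unsafe Acts"),
    ("Resource Management", "Organizational Influences"),
    ("Outside Influence", "Organizational Influences"),
]

counts = Counter(level for _, level in beech_factors)
for level, n in counts.most_common():
    print(f"{level}: {n}")
```

The tally (four unsafe acts, three organizational factors, one precondition) makes the deck’s closing point concrete: the accident was not just “pilot error”.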
  • 40. Lessons Learned
    • HFACS seems to be the most comprehensive approach to accident investigation.
    • Accidents are the result of many underlying factors and not just “pilot error”.
  • 41. Works Cited
    • Mellon, James. “1648434-N8711H”. Photograph. 22 May 2006. Web. 28 March 2010.
    • Schnichels, Helmut. “1673600-N865F”. Photograph. 19 March 2010. Web. 28 March 2010.
    • Wiegmann, Douglas A., and Scott A. Shappell. “A Human Error Approach to Aviation Accident Analysis”. Burlington, VT: Ashgate, 2003. Print.
    • Zeljeznjak, Tony. “0255785-XA-TIE”. Photograph. 26 July 2002. Web. 28 March 2010.