This is a presentation I gave at a staff meeting of the Boeing Shared Services International Accounting group on a topic related to safety. I chose the topic after reading about it in an article on business ethics from the Atlantic Monthly that someone had left in the microwave area. Although that article was mainly about ethics, I believe the subject of Normalization of Deviance is at least as relevant to workplace safety.
Slide 2
Original source
• The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, by Diane Vaughan, Professor of Sociology at Boston College, 1996
Slide 3
Definition
“The gradual process through which unacceptable practice or standards become [treated as] acceptable. As the deviant behavior is repeated without catastrophic results, it becomes the social norm for the organization.”
Slide 7
Challenger Space Shuttle Disaster, 1986
Engineers repeatedly observed defects in the rocket booster O-rings, but after repeated successful launches, and largely due to schedule pressure, the defects came to be treated as an “acceptable risk.”
Launch day was unusually cold. Engineers initially issued an unprecedented “no-launch” recommendation, but were unable to persuade NASA to cancel the launch.
One component suffered a failure of both the primary and backup O-rings, which led to the disintegration of the booster rocket and then the shuttle itself.
Slide 8
And as if that wasn’t bad enough…
NASA came to accept foam strikes on shuttle heat shields as “normalized deviance” as well.
Slide 9
Gulfstream Business Jet Crash, 2014
The jet failed to achieve liftoff and ran off the end of the runway.
The gust lock was engaged.
“the pilots had neglected to perform complete flight control checks before 98% of their previous 175 takeoffs in the airplane… it is likely that they decided to skip the [flight control] check at some point in the past and that doing so had become their accepted practice.” – NTSB accident report
One source concluded that the pilots had likely adopted a pattern of neglecting more and more checks over time. None of the standard checks had been performed prior to the accident takeoff.
Slide 10
Carbide Industries, 2011
• Manufacturing furnace explosion at the Louisville, KY plant resulted in fatalities
• The US Chemical Safety Board incident report included an entire section on “Normalization of Deviance” as a cause
• “…because Carbide did not thoroughly determine the root causes of the blows [over-pressure incidents that occurred in 1991 and 2004] and eliminate them, the occurrence became normalized in the day-to-day operations of the facility…CSB interviews verified that furnace blows were considered normal”
Slide 11
Causes (of Normalization of Deviant Practices)
A belief that “rules are stupid and inefficient”
A belief that work goals are best met by breaking the rule(s)
Imperfect knowledge of the standards
Fear of speaking up
Source: Banja, J., “The normalization of deviance in healthcare delivery,” 2010
Slide 12
What Can We Do About It?
(Mullane)
• Recognize your vulnerability -- “If it can happen to NASA, it can happen to anyone.”
• “Plan the work and work the plan.”
• Listen to the people closest to the issue.
• Archive and periodically review near-misses and disasters so the corporate “safety” memory never fades.
Editor's Notes
The Quilley quote is especially relevant when results can be catastrophic (e.g. major workplace injuries), but such occurrences are rare.
i.e., when repeated violations don’t result in catastrophe, success ends up being redefined
What they may have actually done is made a 1-in-100,000 chance into a 1-in-1,000 or even 1-in-100 chance
This graph shows deviation going in the outward direction. Might have been appropriate to show it inward to show a “whirlpool effect”
Deviation = deviation from established rules or results
If no failure results, then (over time) the “new normal” shifts one level out from the center
New Normal 3 appears only a small step from previous normal (New Normal 2) to the perpetrators, but is actually much larger (from center)
Key element here is a gradual and continuous drift away from true normal
A case of successful outcomes deceiving engineers and NASA into believing that O-ring damage was much less dangerous than it actually was.
Because it was new technology and experience, engineers developed a high level of risk tolerance. When shuttle launches were successful in spite of deviations and defects, those deviations and defects became increasingly accepted as normal.
Once you go to a new norm, it’s hard to get back to the old one
Very similar behavior occurred with the Columbia with regard to damage from foam strikes on heat shield tiles
Columbia shuttle disaster 2003 – shuttle broke up on re-entry after a foam strike damaged it during launch
A gust lock locks various controls on an aircraft to prevent undesired movements due to wind gusts while the plane is parked. The plane cannot fly if those controls are locked.
These were experienced pilots
Not quite the model of “gradual and continuous drift away from true normal”
Imperfect knowledge may be of the job or of the rules. People may not realize that a common practice is actually a deviation. Even experienced people can get the mistaken belief that they know everything. Thus the “justification” for breaking a rule can be merely apparent.
Although in “The Emperor’s New Clothes” it was the little boy who pointed out that the emperor had nothing on, new/inexperienced workers are more likely to think “Who am I to say that the emperor has nothing on?” For example, in the ethics recommitment scene, a new worker in the group was told that it was standard practice in the group to share a badge for the machine login, even though procedure said to use your own badge.
The video example is also an example of “having justifications.”
Mike Mullane is a retired shuttle astronaut who has become a speaker on safety in hazardous environments. On one of his shuttle missions, the shuttle experienced a “near-miss” where an O-ring failed and allowed fuel leakage but the shuttle was saved by the backup O-ring
Mullane himself would probably understand that it isn’t easy for the new kid on the block to speak up -- he speaks about one of his own experiences as a trainee on a flight when he realized that the pilot had exceeded the aircraft’s fuel range but said nothing, because he was sure the experienced pilot must have known what he was doing. The plane ran out of gas and crashed, and the two pilots survived only by ejecting from the aircraft.
“Plan the work and work the plan.” – “Train at the best practice level and make sure leaders maintain best practice standards.”
“Listen to people closest to the issue.” – i.e. don’t do what NASA did