CASE 3-1 YOU CAN’T GET THERE FROM HERE: UBER
SLOW ON DIVERSITY
Established in 2009, Uber provides an alternative to taxicab
service in 460 cities and nearly 60 countries worldwide. The
trick? Its mobile application for smartphones allows riders to
arrange for transportation with drivers who operate their
personal vehicles. A dual rating system (drivers and customers
rate each other) serves as a quality control device keeping Uber
standards high. (1)
As an international technology firm, Uber has been challenged,
along with other tech giants like Google and Twitter, to
demonstrate that it is attuned to the specific needs of its
employees, particularly people of color and women. In
Uber’s own words:
At Uber, we want to create a workplace that is inclusive and
reflects the diversity of the cities we serve: where everyone can
be their authentic self, and where that authenticity is celebrated
as a strength. By creating an environment where people from
every background can thrive, we’ll make Uber a better
company—not just for our employees but for our customers,
too. (2)
Yet actions speak louder than words. Uber employees describe
the firm's work environment under some managers as
Machiavellian and merciless. Many blame Travis Kalanick,
Uber's founder and former chief executive, for establishing
such a negative culture. Uber's fast-growth approach to the
market has rewarded employees and managers who have
aggressively pushed for greater revenues and fatter profits at the
seeming cost of human dignity.
For example, Uber has had its share of troubles addressing
issues of sexual misconduct and workforce diversity. These
issues came to light when a former employee, Susan Fowler,
reported in her personal blog that she was being sexually
harassed by her manager and that human resources had been
informed of these infractions. (3) Susan Fowler said in her blog:
On my first official day rotating on the team, my new manager
sent me a string of messages over company chat. He was in an
open relationship, he said, and his girlfriend was having an easy
time finding new partners, but he wasn’t. He was trying to stay
out of trouble at work, he said, but he couldn’t help getting in
trouble, because he was looking for women to have sex with. It
was clear that he was trying to get me to have sex with him, and
it was so clearly out of line that I immediately took screenshots
of these chat messages and reported him to HR. (4)
Uber's first reaction was to call Ms. Fowler's accusations
"abhorrent and against everything Uber stands for and believes
in." (5) Ms. Fowler asserted that her manager was not punished
because he "was a high performer"; yet other female employees
reported similar incidents with the same manager, leading Ms.
Fowler to believe that HR was covering up for him.
Uber was in trouble as more and more scandals emerged, and it
quickly took the following actions: (a) apologized for some
of its managers' actions, (b) had a board member and several
female executives provide testimonials on the firm's positive
work environment, and (c) began to probe workplace policies
and procedures. Arianna Huffington, a board member,
repeatedly stated that there would be no room at the firm for
"brilliant jerks." (6) Huffington said that this investigation
would be different because Eric H. Holder Jr., the former
United States Attorney General, along with others, had been
hired to conduct it.
Uber released its first diversity report on March 28, 2017, one
month after these allegations. The report indicated that women
and nonwhite employees are underrepresented at the firm, which
is not overly dissimilar from other technology-based firms. Some
of the most egregious statistics include: (a) racial composition:
6% Hispanic, 9% black, 50% white; and (b) 85% of all
technology jobs are held by men, while women make up a mere
36% of the total workforce. (7)
Liane Hornsey, Uber’s chief human resource officer,
acknowledged, “We need to do better and have much more work
to do.” (8) Here are Uber’s next steps:
We’re dedicating $3 million over the next three years to support
organizations working to bring more women and
underrepresented people into tech. This year, our recruiting
team is also embarking on a college tour to recruit talented
students at colleges across the country, including a number of
Historically Black Colleges and Universities (HBCUs) and
Hispanic Serving Institutions (HSIs). Our employee resource
groups play a huge role in all our recruiting events that are
focused on hiring women and people of color at Uber.
In recruiting, we’ve updated our job descriptions to remove
potentially exclusionary language, and we are running interview
training to make our hiring processes more inclusive for women
in tech. We’re also rolling out training to educate and empower
employees, covering topics like “why diversity and inclusion
matters,” “how to be an ally,” and “building inclusive teams.”
Training is not a panacea but educating employees on the right
behaviors is an important step in the right direction.
This is just the beginning of our efforts. Whether you’re a
veteran returning from service or a person with a disability and
regardless of your religious beliefs, your sexual orientation,
your gender identity, or the country you call home, at Uber, we
want to create an environment where you can be yourself. By
deepening our commitment to diversity, we will strengthen our
business and better serve our customers in over 450 cities in
more than 70 countries. (9)
Only time will tell whether this fast-growth firm can manage its
aggressive culture and its diversity as it continues to expand
into new marketplaces with differing cultures.
Questions
1. Susan Fowler’s complaint of being the target of sexual
harassment by her manager would be categorized as falling
under which employment law?
2. Which type(s) of harassment was Ms. Fowler exposed to?
3. What actions, if any, has Uber taken to limit its liability
relative to sexual harassment charges?
4. Uber’s diversity report indicates that 36 percent of Uber’s
workforce is made up of women (15% in technical jobs); 50% of
Uber’s employees in the United States are white, while 9% are
black and 6% are Hispanic. Is Uber in violation of any EEOC
and affirmative action laws?
5. Why does diversity matter in general and more specifically
to Uber?
6. What benefits and challenges does Uber derive from a more
diverse workforce?
References
(1) Anderson, A. (n.d.). Uber International C.V. Hoovers.
Retrieved April 4, 2017, from
http://0-subscriber.hoovers.com.liucat.lib.liu.edu/H/company360/fulldescription.html?companyId=163401000000000
(2) Uber. (n.d.). How do we want Uber to look and feel?
Retrieved April 4, 2017, from https://www.uber.com/diversity/
(3) Fowler, S. (2017, February 19). Reflecting on one very,
very strange year at Uber. Retrieved April 12, 2017, from
https://www.susanjfowler.com/blog/2017/2/19/reflecting-on-one-very-strange-year-at-uber
(4) Ibid.
(5) Patnaik, S. (2017, February 21). Uber hires ex-US Attorney
General Holder to probe sexual harassment. Reuters. Retrieved
April 4, 2017, from
http://www.reuters.com/article/us-uber-tech-sexual-harassment-idUSKBN160041
(6) Isaac, M. (2017, March 28). Uber releases diversity report
and repudiates its "hard-charging attitude." The New York
Times. Retrieved April 4, 2017, from
http://www.cnbc.com/2017/03/28/uber-releases-diversity-report-and-repudiates-its-hard-charging-attitude.html
(7) Ibid.
(8) Uber slow on diversity. (2017, March 29). AM New York,
p. A2.
(9) Uber. (n.d.). How do we want Uber to look and feel?
Retrieved April 4, 2017, from https://www.uber.com/diversity/
Case written by Herbert Sherman, Long Island University
Unit II Case Study
Read “Case 3-1, You Can’t Get There From Here: Uber Slow
On Diversity” on page 108 of your textbook. After you have
read the case study, write an analysis of the case study.
Write an introduction to give context to your paper by
explaining what the paper will cover. Then, divide the body of
your paper using the seven headers below. Address the points
within each section, as indicated under its header.
Employment Law
Identify what employment law Susan Fowler’s sexual
harassment claim would be characterized as. Be sure to develop
your answer to include your rationale.
Type of Harassment
Identify the type(s) of harassment to which Ms. Fowler was
exposed. Be sure to develop your answer to include your
rationale.
Uber’s Actions
Identify actions Uber has taken to limit its liability relative to
sexual harassment charges. Be sure to develop your answer to
include your rationale.
EEOC and Affirmative Action
After reviewing Uber’s diversity report, does it appear Uber is
in violation of any EEOC and affirmative action laws? Be sure
to develop your answer to include your rationale.
Diversity Matters
Explain why diversity matters in general and more specifically
to Uber. Be sure to develop your answer to include your
rationale.
Benefits/Challenges of a Diverse Workforce
Identify and explain the benefits and challenges Uber derives
from a more diverse workforce. Be sure to develop your answer
to include your rationale.
Legal Provisions of Uber Case
Write a summary that identifies legal provisions or
considerations covered within this case study as it relates to a
human resource management (HRM) perspective.
Conclude with an analysis presenting your thoughts on how ethics and
HRM professional standards are framed by legal provisions
within a specific organization or industry (e.g., business, health
care).
Your assignment should be two pages in length, not counting
the title or reference pages. Adhere to APA style when
constructing this assignment, including in-text citations and
references for all sources that are used. Please note that no
abstract is needed.
Reviewing
Heinrich
Dislodging Two Myths
From the Practice of Safety
By Fred A. Manuele
In The Standardization of Error, Stefansson
(1928) makes the case that people are willing to
accept as fact what is written or spoken with-
out adequate supporting evidence. When studies
show that a supposed fact is not true, dislodging
it is difficult because that belief has become deeply
embedded in the minds of people and, thereby,
standardized.
Stefansson pleads for a
mind-set that accepts as
knowledge only that which
can be proven and which
cannot be logically contradicted. He states that
his theme applies to all
fields of endeavor except
for mathematics. Safety is
a professional specialty in
which myths have become
standardized and deeply
embedded. This article ex-
amines two myths that
should be dislodged from
the practice of safety:
1) Unsafe acts of workers
are the principal causes of
occupational accidents.
2) Reducing accident fre-
quency will equivalently re-
duce severe injuries.
These myths arise from
the work of H.W. Heinrich
(1931; 1941; 1950; 1959).
They can be found in the four editions of Indus-
trial Accident Prevention: A Scientific Approach.
Although some safety practitioners may not rec-
ognize Heinrich’s name, his misleading prem-
ises are perpetuated as they are frequently cited
in speeches and papers.
Analytical evidence indicates that these prem-
ises are not soundly based, supportable or valid,
and, therefore, must be dislodged. Although this
article questions the validity of the work of an au-
thor whose writings have been the foundation of
safety-related teaching and practice for many de-
cades, it is appropriate to recognize the positive ef-
fects of his work as well.
This article was written as a result of encourage-
ment from several colleagues who encountered
situations in which these premises were cited as
fact, with the resulting recommended preventive
actions being inappropriate and ineffective. Safety
professionals must do more to inform about and
refute these myths so that they may be dislodged.
Recognition: Heinrich’s Achievements
Heinrich was a pioneer in the field of accident
prevention and must be given his due. Publica-
tion of his book’s four editions spanned nearly
30 years. From the 1930s to today, Heinrich likely
has had more influence than any other individual
on the work of occupational safety practitioners.
In retrospect, knowing the good done by him in
promoting greater attention to occupational safety
and health should be balanced with an awareness
of the misdirection that has resulted from applying
some of his premises.
Heinrich's Sources Unavailable
Attempts were made to locate Heinrich's research, without
success. Dan Petersen, who with Nestor Roos authored a fifth
edition of Industrial Accident Prevention, was asked whether
they had located Heinrich's research. Petersen said that they had
relied entirely on the previous editions of Heinrich's books as
resources. Thus, the only data that can be reviewed are contained
in Heinrich's books. His information-gathering methods, the
survey documents that may have been used, the quality of the
information gathered and the analytical systems used cannot be
examined.
IN BRIEF
•This article identifies two myths
derived from the work of H.W. Heinrich
that should be dislodged from the prac-
tice of safety: 1) unsafe acts of workers
are the principal causes of occupational
accidents; and 2) reducing accident
frequency will equivalently reduce
severe injuries.
•As knowledge has evolved about
how accidents occur and their causal
factors, the emphasis is now correctly
placed on improving the work system,
rather than on worker behavior. Hein-
rich’s premises are not compatible with
current thinking.
•A call is issued to safety profession-
als to stop using and promoting these
premises; to dispel these premises in
presentations, writings and discussions;
and to apply current methods that look
beyond Heinrich’s myths to determine
true causal factors of incidents.
Fred A. Manuele, P.E., CSP, is president of Hazards Limited,
which he formed after retiring from Marsh & McLennan, where he
was a managing director and manager of M&M Protection
Consultants. His books include Advanced Safety Management:
Focusing on Z10 and Serious Injury Prevention, On the Practice
of Safety, Innovations in Safety Management: Addressing Career
Knowledge Needs, and Heinrich Revisited: Truisms or Myths. A
professional member of ASSE's Northeastern Illinois Chapter and
an ASSE Fellow, Manuele is a former board member of ASSE, NSC
and BCSP.
Two items of note for this article: Citations from
Heinrich’s texts are numbered H-1, H-2, etc., and
correspond to the chart in Table 1, which indicates
the page numbers and editions in which each ci-
tation appears. All other citations appear as in-text
references in the journal’s standard style.
Furthermore, in today’s social climate, some of
Heinrich’s terminology would be considered sex-
ist. He uses phrases such as man failure, the foreman
and he is responsible. Consider the time in which he
wrote. The fourth edition was published in 1959.
Psychology & Safety
Applied psychology dominates Heinrich’s work
with respect to selecting causal factors and is given
great importance in safety-related problem resolu-
tion. Consider the following:
1) Heinrich expresses the belief that “psy-
chology in accident prevention is a fundamen-
tal of great importance” (H-1).
2) His premise is that “psychology lies
at the root of sequence of accident causes”
(H-2).
3) In the fourth edition, Heinrich states that he
envisions “the more general acceptance by man-
agement of the idea that an industrial psycholo-
gist be included as a member of the plant staff as a
physician is already so included” (H-3).
4) The focus of applied psychology is on the employee, as in the
following quotation:
Indeed, safety psychology is as fairly appli-
cable to the employer as to the employee.
The initiative and the chief burden of ac-
tivity in accident prevention rest upon the
employer; however the practical field of
effort for prevention through psychology
is confined to the employee, but through
management and supervision. (H-4)
Note that the focus of applied psychology is on
the worker as are other Heinrichean premises. Since
application of practical psychology is confined to
the worker, who reports to a supervisor, the psy-
chology applier is the supervisor. With due respect
to managers, supervisors and safety practitioners, it
is doubtful that many could knowledgeably apply
psychology “as a fundamental of great importance”
in their accident prevention efforts.
Table 1
Pages Cited by Edition
Heinrich’s Causation Theory: The 88-10-2 Ratio
Heinrich professes that among the direct and
proximate causes of industrial accidents:
•88% are unsafe acts of persons;
•10% are unsafe mechanical or physical condi-
tions;
•2% are unpreventable (H-5).
According to Heinrich, man failure is the problem
and psychology is an important element in correct-
ing it. In his discussion of the relation of psychology
to accident prevention, Heinrich advocates identi-
fying the first proximate and most easily prevented
cause in the selection of remedies. He says:
Selection of remedies is based on practical
cause-analysis that stops at the selection of
the first proximate and most easily prevented
cause (such procedure is advocated in this
book) and considers psychology when re-
sults are not produced by simpler analysis.
(H-6)
Note that the first proximate and most easily
prevented cause is to be selected (88% of the time
a human error). That concept permeates Hein-
rich’s work. It does not encompass what has been
learned subsequently about the complexity of ac-
cident causation or that other causal factors may
be more significant than the first proximate cause.
For example, the Columbia Accident Investiga-
tion Board (NASA, 2003) notes the need to con-
sider the complexity of incident causation:
Many accident investigations do not go far
enough. They identify the technical cause of
the accident, and then connect it to a vari-
ant of “operator error.” But this is seldom the
entire issue. When the determinations of the
causal chain are limited to the technical flaw
and individual failure, typically the actions
taken to prevent a similar event in the future
are also limited: fix the technical problem and
replace or retrain the individual responsible.
Putting these corrections in place leads to
another mistake: The belief that the problem
is solved. Too often, accident investigations
blame a failure only on the last step in a com-
plex process, when a more comprehensive
understanding of that process could reveal
that earlier steps might be equally or even
more culpable.
A recent example of the complexity of accident
causation appears in this excerpt from the report
prepared by BP personnel following the April 20,
2010, Deepwater Horizon explosion in the Gulf of
Mexico (BP, 2010):
The team did not identify any single action or
inaction that caused this incident. Rather, a
complex and interlinked series of mechanical
failures, human judgments, engineering de-
sign, operational implementation and team
interfaces came together to allow the initia-
tion and escalation of the accident.
Consider another real-world situation in which a
fatality resulted from multiple causal factors:
An operation produces an odorless, color-
less highly toxic gas in an enclosed area. The
two-level gas detection and alarm system
has deteriorated over many years of use,
and the system often leaks gas. An internal
auditor recommends it be replaced with a
three-level system, the accepted practice in
the industry for that type of gas. The auditor
also recommends that maintenance give the
existing system high priority.
Management puts high profits above
safety and tolerates excessive risk taking.
That defines culture problems. Management
decides not to replace the system, and fur-
thermore begins a cost-cutting initiative that
reduces maintenance staff by one-third. The
gas detection and alarm systems continue to
deteriorate, and maintenance staff cannot
keep up with the frequent calls for repair and
adjustment.
A procedure is installed that requires
employees to test for gas before entering
the enclosed area. But, supervisors condone
employees entering the area without making
the required test. Both detection and alarm
systems fail. Gas accumulates. An employee
enters the area without testing for gas. The
result is a toxic gas fatality.
Causal factor determination would com-
mence with the deficiencies in the organiza-
tion’s culture whereby: resources were not
provided to replace a defective detection and
alarm system in a critical area; staffing deci-
sions resulted in inadequate maintenance;
and excessive risk taking was condoned.
The employee's violation of the established
procedure was a contributing factor, but not
the principal one among several factors.
Heinrich’s theory that an unsafe act is the sole
cause of an accident is not supported in the cited
examples. Also, note that Heinrich’s focus on man
failure is singular in the following citation: “In the
occurrence of accidental injury, it is apparent that
man failure is the heart of the problem; equally ap-
parent is the conclusion that methods of control
must be directed toward man failure” (H-7). [Note:
Heinrich does not define man failure. In making
the case to support directing efforts toward con-
trolling man failure, he cites personal factors such
as unsafe acts, using unsafe tools and willful disre-
gard of instruction.]
A directly opposite view is expressed by Deming
(1986). Deming is known for his work in quality
principles, which this author finds comparable to
the principles required to achieve superior results
in safety.
The supposition is prevalent throughout the
world that there would be no problems in
production or service if only our production
workers would do their jobs in the way that
we taught. Pleasant dreams. The workers are
handicapped by the system, and the system
belongs to the management. (p. 134)
Of all Heinrich’s concepts, his thoughts on ac-
cident causation, expressed as the 88-10-2 ratios,
have had a significant effect on the practice of
safety, and have resulted in the most misdirection.
Why is this so? Because, on the premise that man failure causes
most accidents, preventive efforts are directed at the worker
rather than toward the operating system in which the work is
performed.
Many safety practitioners operate on the belief
that the 88-10-2 ratios are soundly based and, as
a result, focus their efforts on reducing so-called
man failure rather than attempting to improve the
system. This belief also persists because it is
the path of least resistance for an organization. It is
easier for supervisors and managers to be satisfied
with taking superficial preventive action, such as
retraining a worker, reinstructing the work group
or reposting the standard operating procedure,
than it is to try to correct system problems.
Certainly, operator errors may be causal factors
for accidents. However, consider Ferry’s (1981)
comments on this subject:
We cannot argue with the thought that when
an operator commits an unsafe act, leading
to a mishap, there is an element of human
or operator error. We are, however, decades
past the place where we once stopped in our
search for causes.
Whenever an act is considered unsafe we
must ask why. Why was the unsafe act com-
mitted? When this question is answered in
depth it will lead us on a trail seldom of the
operator’s own conscious choosing. (p. 56)
If, during an accident investigation, a professional
search is made for causal factors beyond an unsafe
act, such as through the five-why method, one will
likely find that the causal factors built into work sys-
tems may be of greater importance than an employ-
ee’s unsafe act. Fortunately, a body of literature has
emerged that recognizes the significance of causal
factors which originate from decisions made above
the worker level. Several are cited here.
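Before turning to that literature, a brief illustration may help. The following sketch applies five-why questioning to the toxic-gas fatality described earlier; it is an illustration of the method only, not an analysis drawn from any cited source.

# Five-why questioning applied to the toxic-gas fatality described earlier.
# Illustrative only; the chain below paraphrases the scenario in this article.

five_whys = [
    ("Why was the employee overcome by gas?",
     "Gas had accumulated and the employee entered without testing."),
    ("Why was the required test skipped?",
     "Supervisors condoned entering the area without making the test."),
    ("Why did the detection and alarm systems give no warning?",
     "Both systems had deteriorated and failed."),
    ("Why were the systems not repaired or replaced?",
     "Replacement was rejected and maintenance staffing was cut by one-third."),
    ("Why were those decisions made?",
     "Management put high profits above safety and tolerated excessive risk."),
]

for question, answer in five_whys:
    print(question)
    print("  ->", answer)

# The trail ends at decisions made above the worker level, not at the unsafe act.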
Human Errors Above the Worker Level
Much has been written about human error. Par-
ticular attention is given to the Guidelines for Pre-
venting Human Error in Process Safety (CCPS, 1994).
Although process safety appears in the title, the first
two chapters provide an easily read primer on hu-
man error reduction. The content of those chapters
was largely influenced by personnel with plant- or
corporate-level safety management experience.
Safety practitioners should view the following
highlights as generic and broadly applicable. They
advise on where human errors occur, who commits
them and at what level, the effect of organizational
culture and where attention is needed to reduce
the occurrence of human errors. These highlights
apply to organizations of all types and sizes.
•It is readily acknowledged that human errors at
the operational level are a primary contributor to
the failure of systems. It is often not recognized,
however, that these errors frequently arise from
failures at the management, design or technical ex-
pert levels of the company (p. xiii).
•A systems perspective is taken that views error
as a natural consequence of a mismatch between
human capabilities and demands, and an inappro-
priate organizational culture. From this perspec-
tive, the factors that directly influence error are
ultimately controllable by management (p. 3).
•Almost all major accident investigations in re-
cent years have shown that human error was a
significant causal factor at the level of design, op-
erations, maintenance or the management process
(p. 5).
•One central principle presented in this book is
the need to consider the organizational factors that
create the preconditions for errors, as well as the
immediate causes (p. 5).
•Attitudes toward blame will determine whether
an organization develops a blame culture, which
attributes error to causes such as lack of motivation
or deliberate unsafe behavior (p. 5).
•Factors such as the degree of participation that
is encouraged in an organization, and the quality
of the communication between different levels of
management and the workforce, will have a major
effect on the safety culture (p. 5).
Since “failures at the management, design or
technical expert levels of the company” affect the
design of the workplace and the work methods—
that is, the operating system—it is logical to suggest
that safety professionals should focus on system
improvement to attain acceptable risk levels rather
than principally on affecting worker behavior.
Reason’s (1997) book, Managing the Risks of
Organizational Accidents, is a must-read for safety
professionals who want an education in human er-
ror reduction. It has had five additional printings
since 1997. Reason writes about how the effects of
decisions accumulate over time and become the
causal factors for incidents resulting in serious in-
juries or major damage when all the circumstances
necessary for the occurrence of a major event fit
together. This book stresses the need to focus on
decision making above the worker level to prevent
major accidents. Reason states:
Latent conditions, such as poor design, gaps
in supervision, undetected manufacturing
defects or maintenance failures, unworkable
procedures, clumsy automation, shortfalls in
training, less than adequate tools and equip-
ment, may be present for many years before
they combine with local circumstances and
active failures to penetrate the system’s lay-
ers of defenses.
They arise from strategic and other top-
level decisions made by governments,
regulators, manufacturers, designers and or-
ganizational managers. The impact of these
decisions spreads throughout the organiza-
tion, shaping a distinctive corporate culture
and creating error-producing factors within
the individual workplaces. (p. 10)
The traditional occupational safety ap-
proach alone, directed largely at the unsafe
acts of persons, has limited value with re-
spect to the “insidious accumulation of la-
tent conditions [that he notes are] typically
present when organizational accidents occur.
(pp. 224, 239)
If the decisions made by management and oth-
ers have a negative effect on an organization’s
culture and create error-producing factors in the
workplace, focusing on reducing human errors at
the worker level—the unsafe acts—will not ad-
dress the problems.
Deming achieved world renown in quality assur-
ance. The principle embodied in what is referred to
as Deming’s 85-15 rule also applies to safety. The
rule supports the premise that prevention efforts
should be focused on the system rather than on the
worker. This author draws a comparable conclu-
sion as a result of reviewing more than 1,700 inci-
dent investigation reports. This is the rule, as cited
by Walton (1986): “The rule holds that 85% of the
problems in any operation are within the system
and are the responsibility of management, while only
15% lie with the worker” (p. 242).
In 2010, ASSE sponsored the symposium, Re-
think Safety: A New View of Human Error and
Workplace Safety. Several speakers proposed that
the first course of action to prevent human errors
is to examine the design of the work system and
work methods. Those statements support Dem-
ing’s 85-15 rule. Consider this statement by a hu-
man error specialist [from this author’s notes]:
When errors occur, they expose weakness-
es in the defenses designed into systems,
processes, procedures and the culture. It is
management’s responsibility to anticipate
errors and to have systems and work meth-
ods designed so as to reduce error potential
and to minimize sever-
ity of injury potential
when errors occur.
Since most problems in an
operation are systemic, safety
efforts should be directed to-
ward improving the system.
Unfortunately, the use of the
terms unsafe acts and unsafe
conditions focuses attention
on a worker or a condition,
and diverts attention from the
root-causal factors built into
an operation.
Allied to Deming’s view is
the work of Chapanis, who
was prominent in the field of
ergonomics and human fac-
tors engineering. Represen-
tative of Chapanis’s writings
is “The Error-Provocative
Situation,” a chapter in The
Measurement of Safety Perfor-
mance (Tarrants, 1980). Cha-
panis’s message is that if the design of the work
is error-provocative, one can be certain that errors
will occur in the form of accident causal factors. It
is illogical to conclude in an incident investigation
that the principal causal factor is the worker’s un-
safe act if the design of the workplace or the work
methods is error-inviting. In such cases, the error-
producing aspects of the work (e.g., design, layout,
equipment, operations, the system) should be con-
sidered primary.
The U.S. Department of Energy (1994) describes the
management oversight and risk tree (MORT) as a
“comprehensive analytical procedure that provides
a disciplined method for determining the systemic
causes and contributing factors of accidents.” The
following reference to “performance errors” is of
particular interest.
It should be pointed out that the kinds of
questions raised by MORT are directed at
systemic and procedural problems. The ex-
perience, to date, shows there are a few “un-
safe acts” in the sense of blameful work level
employee failures. Assignment of “unsafe
act” responsibility to a work-level employee
should not be made unless or until the pre-
ventive steps of 1) hazard analysis; 2) man-
agement or supervisory direction; and
3) procedures safety review have been shown
to be adequate. (p. 19)
Each of these more recent publications refutes
the premise that unsafe acts are the primary causes
of occupational accidents.
Heinrich’s Data Gathering & Analytical Method
Heinrich recognized that other studies on acci-
dent causation identified both unsafe acts and un-
safe conditions as causal factors with almost equal
frequency. Those studies produced results different
from the 88-10-2 ratios.

Figure 1
Foundation of a Major Injury
Note. Adapted from Industrial Accident Prevention: A Scientific
Approach (1st ed., p. 91; 2nd ed., p. 27; 3rd ed., p. 24; 4th ed.,
p. 27), by H.W. Heinrich, 1931, 1941, 1950, 1959, New York:
McGraw-Hill.

For example, the Accident Prevention Manual for Industrial
Operations: Administration and Programs, 8th edition (NSC, 1980)
contains these statements about studies of accident causation:
Two historical studies are usually cited to
pinpoint the contributing factor(s) to an ac-
cident. Both emphasize that most accidents
have multiple causes.
•A study of 91,773 cases reported in Penn-
sylvania in 1953 showed 92% of all nonfatal
injuries and 94% of all fatal injuries were due
to hazardous mechanical or physical condi-
tions. In turn, unsafe acts reported in work
injury accidents accounted for 93% of the
nonfatal injuries and 97% of the fatalities.
•In almost 80,000 work injuries re-
ported in that same state in 1960, unsafe
condition(s) was identified as a contributing
factor in 98.4% of the nonfatal manufactur-
ing cases, and unsafe act(s) was identified as
a contributing factor in 98.2% of the nonfatal
cases. (p. 241)
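The overlap implied by those percentages can be made explicit: because the shares attributed to unsafe conditions and to unsafe acts each approach 100%, most cases must have involved both, which is the multiple-cause point the manual emphasizes. A minimal sketch of the arithmetic (an illustration, not part of the NSC text):

# Lower bound on the share of cases involving BOTH an unsafe condition and
# an unsafe act, given the shares citing each factor (inclusion-exclusion):
# P(both) >= P(condition) + P(act) - 1

def overlap_lower_bound(p_condition, p_act):
    return max(0.0, p_condition + p_act - 1.0)

# 1953 Pennsylvania study, nonfatal injuries: 92% conditions, 93% unsafe acts
print(overlap_lower_bound(0.92, 0.93))    # ~0.85 -> at least 85% involved both
# 1960 Pennsylvania study, nonfatal manufacturing cases: 98.4% and 98.2%
print(overlap_lower_bound(0.984, 0.982))  # ~0.966 -> at least ~97% involved both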
Although aware that others studying accident
causation had recognized the multifactorial nature
of causes, Heinrich continued to justify selecting a
single causal factor in his analytical process. Hein-
rich’s data-gathering methods force the accident
cause determination into a singular and narrow
categorization. The following paragraph is found
in the second through fourth editions. It follows an
explanation of the study resulting in the formula-
tion of the 88-10-2 ratios. “In this research, major
responsibility for each accident was assigned either
to the unsafe act of a person or to an unsafe me-
chanical condition, but in no case were both per-
sonal and mechanical causes charged” (H-8).
Heinrich’s study resulting in the 88-10-2 ratios
was made in the late 1920s. Both the relation-
ship of a study made then to the work world as
it now exists and the methods used in producing
it are questionable and unknown. As to the study
methods, consider the following paragraph, which
appears in the first edition; minor revisions were
made in later editions.
Twelve thousand cases were taken at random
from closed-claim-file insurance records.
They covered a wide spread of territory and
a great variety of industrial classifications.
Sixty-three thousand other cases were taken
from the records of plant owners. (H-9)
The source of the data was insurance claims files
and records of plant owners, which cannot provide
reliable accident causal data because they rarely
include causal factors. Nor are accident investiga-
tion reports completed by supervisors adequate re-
sources for causal data. When this author provided
counsel to clients in the early stages of developing
computer-based incident analysis systems, insur-
ance claims reports and supervisors’ investigation
reports were examined as possible sources for
causal data. It was rare for insurance claims reports
to include provisions to enter causal data.
This author has examined more than 1,700 in-
cident investigation reports completed by super-
visors and investigation teams. In approximately
80% of those reports, causal factor information was
inadequate. These reports are not a sound base
from which to analyze and conclude with respect
to the reality of causal factors.
Summation on the 88-10-2 Ratios
Heinrich’s data collection and analytical meth-
ods in developing the 88-10-2 ratios are unsup-
portable. Heinrich’s premise, that unsafe acts are
the primary causes of occupational accidents, can-
not be sustained. The myth represented by those
ratios must be dislodged and actively refuted by
safety professionals.
An interesting message of support with respect
to avoiding use of the 88-10-2 ratios comes from
Krause (2005), a major player in worker-focused
behavior-based safety:
Many in the safety community believe a high
percentage of incidents, perhaps 80% to
90%, result from behavioral causes, while the
remainder relate to equipment and facilities.
We made this statement in our first book in
1990. However, we now recognize that this
dichotomy of causes, while ingrained in our
culture generally and in large parts of the
safety community, is not useful, and in fact
can be harmful. (p. 10)
The Foundation of a Major Injury: The 300-29-1 Ratios
Heinrich’s conclusion with respect to the ratios
of incidents that result in no injuries, minor injuries
and a major lost-time case was the base on which
educators taught and many safety practitioners
came to believe that reducing accident frequency
will achieve equivalent reduction in injury sever-
ity. The following statement appears in all four edi-
tions of his text: “The natural conclusion follows,
moreover, that in the largest injury group—the
minor injuries—lies the most valuable clues to ac-
cident causes” (H-10).
The following discussion and statistics establish
that the ratios upon which the foregoing citation is
based are questionable and that reducing incident
frequency does not necessarily achieve an equiva-
lent reduction in injury severity. Heinrich’s 300-29-1
ratios have been depicted as a triangle or a pyramid
(Figure 1). In his first edition, Heinrich writes:
Analysis proves that for every mishap re-
sulting in an injury there are many other ac-
cidents in industry which cause no injuries
whatever. From data now available concern-
ing the frequency of potential-injury acci-
dents, it is estimated that in a unit group of
330 accidents, 300 result in no injuries, 29 in
minor injuries, and 1 in a major or lost-time
case. (H-11)
In the second edition, “similar” was added to the
citation: “Analysis proves that for every mishap,
there are many other similar accidents in industry
. . .” (H-12).
Within a chart displaying the 300-29-1 ratios in
the first edition, Heinrich writes, “The total of 330
accidents all have the same cause.” Note that cause
is singular (H-13). This statement, that all 330 in-
cidents have the same cause, challenges credulity.
Also, note that the sentence quoted in this para-
graph appears only in the first edition. It does not
appear in later editions (H-14).
For background data, Heinrich says in the first,
second and third editions:
The determination of this no-injury accident
frequency followed a most interesting and ab-
sorbing study [italics added]. The difficulties
can be readily imagined. There were few ex-
isting data on minor injuries—to say nothing
of no-injury accidents. (H-15)
In the fourth edition, published 28 years after the
first edition, the source of the data is more specifi-
cally stated:
The determination of this no-injury accident
frequency followed a study of over 5,000 cases
[italics added]. The difficulties can be readily
imagined. There were few existing data on
minor injuries—to say nothing of no-injury
accidents. (H-16)
The credibility of such a revision after 28 years
must be questioned. In Heinrich’s second and third
editions, major changes were made in his presen-
tation on the ratios, without explanation.
1) The statement in the first edition that the 330
accidents all have the same cause was eliminated.
2) In the second edition, changes were made
indicating that the unit group of 330 accidents are
“similar” and “of the same kind” (H-17).
3) In the third edition, another significant addi-
tion is made. The 330 accidents now are “of the
same kind and involving the same person” (H-18).
The following appears in the third and fourth
editions, encompassing the changes noted.
Analysis proves that for every mishap result-
ing in an injury there are many other similar
accidents that cause no injuries whatever.
From data now available concerning the fre-
quency of potential-injury accidents, it is es-
timated that in a unit group of 330 accidents
of the same kind and involving the same person
[italics added], 300 result in no injuries, 29 in
minor injuries and 1 in a major or lost-time
injury. (H-19)
These changes are not explained. If the original
data were valid, how does one explain the sub-
stantial revisions in the conclusions eventually
drawn from an analysis of it? In the second, third
and fourth editions, Heinrich gives no indication
of other data collection activities or of other analy-
ses. How does one support using the ratios without
having explanations of the differing interpretations
Heinrich gives in each edition?
The changes made in the 300-29-1 ratios in the
second and third editions, and carried over into the
fourth edition, present other serious conceptual
problems. To which types of accidents does “in a
unit group of 330 accidents of the same kind and
occurring to the same person” apply? Certainly, it
does not apply to some commonly cited incident
types, such as falling to a lower level or struck by
objects.
For example, a construction worker rides the
hoist to the 10th floor and within minutes backs
into an unguarded floor opening, falling to his
death. Heinrich's ratios would give this person a 300-in-330
(10 out of 11) chance of suffering no injury at all. That is not
credible.
Consider the feasibility of finding data in the
5,000-plus cases studied to support the ratios,
keeping in mind that incidents are to be of the
same type and occurring to the same person.
•If the number of major or lost-time cases is 1, the
number of minor injury case files would be 29 and
the number of no-injury case files would be 300.
•If the number of major or lost-time cases is 5,
the number of minor injury case files would be 145
and the number of no-injury case files would be
1,500.
•If the number of major or lost-time cases is 10,
the number of minor injury case files would be 290
and the number of no-injury case files would be
3,000.
Because of the limitations Heinrich himself im-
poses, that all incidents are to be of the same type
and occurring to the same person, it is implausible
that his database could contain the information
necessary for analysis and the conclusions he drew
on his ratios. Particularly disconcerting is the need
for the database to contain information on more
than 4,500 no-injury cases (300 ÷ 330 × 5,000). Un-
less a special study was initiated, creating files on
no-injury incidents would be a rarity.
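As a rough check on that arithmetic, a minimal sketch (assuming Heinrich's stated 300-29-1 proportions and a study of roughly 5,000 cases, as the fourth edition claims) shows the implied composition of the case files:

# Implied composition of an "over 5,000 cases" study if the 300-29-1
# proportions held exactly (illustrative assumption).

ratio = {"no-injury": 300, "minor injury": 29, "major or lost-time": 1}
unit = sum(ratio.values())   # 330 accidents per unit group
total_cases = 5000

for category, count in ratio.items():
    implied = count / unit * total_cases
    print(f"{category}: about {implied:,.0f} case files")
# no-injury: about 4,545 -> the "more than 4,500 no-injury cases" noted above
# minor injury: about 439
# major or lost-time: about 15

# The falling-worker example: 300 of every 330 accidents are claimed to be harmless.
print(300 / 330)   # ~0.91, i.e., 10 out of 11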
Given this, one must ask, did a database exist
upon which Heinrich established his ratios, then
stated the premises that the most valuable clues for
accident causes are found in the minor injury cat-
egory? This author thinks not.
Statistical Indicators: Serious Injury Trending
Data on the trending of serious injuries and workers'
compensation claims contradict the premise that focusing on
incident frequency reduction will equivalently achieve severity
reduction. The following data have been extracted from
publications of the National Council on Compensation Insurance
(NCCI, 2005; 2006; 2009).

Table 2
Injury Reduction Categories
Note. Data from "State of the Line," by National Council on
Compensation Insurance, 2005, Boca Raton, FL: Author.
•In 2006, NCCI produced a 12-minute video,
The Remarkable Story of Declining Frequency—
Down 30% in the Past Decade. It shows that work-
ers’ compensation claim frequency was down
considerably in the decade cited. The video tells a
remarkable but not well-known story.
•A July 2009 NCCI bulletin, "Workers' Compensation Claim
Frequency Continues Its Decline in 2008," reports a reduction of
4.0%. A May 2010 NCCI report says that the cumulative reduction
in claims frequency from 1991 through 2008 is 54.7% (a short
sketch after this list shows how such annual declines compound).
•A 2005 NCCI paper, “Workers’ Compensation
Claim Frequency Down Again,” states, “There has
been a larger decline in the frequency of smaller
lost-time claims than in the frequency of larger
lost-time claims.” Also, consider that NCCI (2005)
reports reductions in selected categories of claim
values for the years 1999 and 2003, expressed in
2003 hard dollars (Table 2).
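For perspective on the frequency figures cited above, the following sketch shows how annual declines of roughly 4% compound into a cumulative decline of about 55%; it is illustrative arithmetic only, since the NCCI reports give the endpoints rather than a constant annual rate.

# How modest annual declines compound into a large cumulative decline
# (illustrative arithmetic; NCCI reports endpoints, not a constant rate).

cumulative_decline = 0.547   # 54.7% drop in claim frequency, 1991-2008 (NCCI, 2010)
years = 17                   # 1991 -> 2008

# Average annual decline implied by the cumulative figure
avg_annual = 1 - (1 - cumulative_decline) ** (1 / years)
print(f"Implied average annual decline: {avg_annual:.1%}")        # about 4.6%

# Conversely, a steady 4.0% annual decline (the 2008 figure) over 17 years
print(f"Cumulative decline at 4.0%/yr: {1 - 0.96 ** years:.1%}")  # about 50%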
While the frequency of workers’ compensation
cases is down, the greatest reductions are for less
serious injuries. The reduction in cases valued from
$10,000 to $50,000 is about one-third of that for
cases valued at less than $2,000. For cases valued
above $50,000, the reduction is about one-fifth of
that for the less costly and less serious injuries. The
data clearly show that a comparable reduction in
injury severity does not follow a reduction in injury
frequency.
A DNV (2004) bulletin is another resource of
particular note. It states that managing operations
to reduce frequency will not equivalently reduce
severity.
What about the pyramid?
Much has been said over the years about
the classical loss control pyramid, which in-
dicates the ratio between no loss incidents,
minor incidents and major incidents, and it
has often been argued that if you look after
the small potential incidents, the major loss
incidents will improve also.
The major accident reality however is
somewhat different. What we find is that if
you manage the small incidents effectively,
the small incident rate improves, but the
major accident rate stays the same, or even
slightly increases.
Contradictions: Unsafe Acts & Injuries
Heinrich’s texts contain contradictions about
when a major injury would occur and the relation-
ship between unsafe acts and a major injury. In all
editions, reference is made to 330 careless acts or
several hundred unsafe acts occurring before a ma-
jor injury occurs, as in the following examples from
the first and third editions.
•“Keep in mind that a careless act occurs ap-
proximately 300 times before [italics added] a seri-
ous injury results and that there is, therefore, an
excellent opportunity to detect and correct unsafe
practices before injury occurs” (H-20).
•“Keep in mind that an unsafe act occurs several
hundred times before [italics added] a serious injury
results” (H-21).
Before is a key word here. While an unsafe act
may be performed several times before a particu-
lar accident occurs, that is not the case in a large
majority of incidents which result in serious injury
or fatality. In his fourth edition, Heinrich gave this
view of the relationship of unsafe acts or exposures
to mechanical hazards.
If it were practicable to carry on appropriate
research, still another base therefore could be
established showing that from 500 to 1,000
or more unsafe acts or exposures to mechan-
ical hazards existed in the average case be-
fore even one of the 300 narrow escapes from
injury (events-accidents) occurred. (H-22)
There is a real problem here. All of those unsafe
acts or exposures to mechanical hazards take place
before even one accident occurs. That is illogical.
Summation on the 300-29-1 Ratios
Use of the 300-29-1 ratios is troubling. Since the
ratios are not soundly based, one must ask whether
the ratios have any substance. Does their use as a
base for a safety management system result in a
concentration of resources on the frequent and
lesser significant while ignoring opportunities to
reduce the more serious injuries?
One of Heinrich’s premises is that “the predomi-
nant causes of no-injury accidents are, in average
cases, identical with the predominant causes of
major injuries, and incidentally of minor injuries as
well.” This is wrong. It is a myth that must be dis-
lodged from the practice of safety.
Applying this premise leads to misdirection in
resource application and ineffectiveness, particu-
larly with respect to preventing serious injuries. In
this author’s experience, many incidents resulting
in serious injury are singular and unique events,
with multifaceted and complex causal factors, and
descriptions of similar incidents are rare in the his-
torical body of incident data. Furthermore, all haz-
ards do not have equal potential for harm. Some
risks are more significant than others. That requires
priority setting.
Misinterpretation of Terms
Not only have many safety practitioners used
the 300-29-1 ratios in statistical presentations, but
many also have misconstrued what Heinrich in-
tended with the terms major injury, minor injury
and no-injury accidents. Some practitioners who
cite these ratios in their presentations assume that
a “major injury” is a serious injury or a fatality. In
each edition, Heinrich gave nearly identical defini-
tions of the accident categories to which the 300-
29-1 ratios apply. This is how the definition reads
in the fourth edition.
In the accident group (330 cases), a major in-
jury is any case that is reported to insurance
carriers or to the state compensation com-
missioner. A minor injury is a scratch, bruise
or laceration such as is commonly termed a
first-aid case. A no-injury accident is an un-
planned event involving the movement of a
person or an object, ray or substance (e.g.,
slip, fall, flying object, inhalation) having
the probability of causing personal injury or
property damage. The great majority of re-
ported or major injuries are not fatalities or
fractures or dismemberments; they are not
all lost-time cases, and even those that are
do not necessarily involve payment of com-
pensation. (H-20)
These definitions compel the conclusion that any
injury requiring more than first-aid treatment is a
major injury. When these definitions were devel-
oped in the late 1920s, few companies were self-in-
sured for workers’ compensation. On-site medical
facilities were rare. Insurance companies typically
paid for medical-only claims and for minor and
major injuries. According to Heinrich’s definitions,
almost all such claims would be considered major
injuries. Then, is it not so that every OSHA record-
able injury is a major injury in this context?
Heinrich’s 300-29-1 ratios have been misused
and misrepresented many times as well. For exam-
ple, a safety director recently said that in the pre-
vious year his company sustained one fatality and
30 OSHA days-away-from-work incidents, and,
therefore, Heinrich’s progression was validated.
Not so. All of the injuries and the fatality would be
in the major or lost-time injury category.
In another instance, a speaker referred to Hein-
rich’s 300-29-1 ratios and said that the 300 were
unsafe acts, the 29 were serious injuries and the
1 was a fatality. These are but two examples of the
many misuses of these ratios.
Heinrich’s Premises Versus Current Safety Knowledge
Heinrich emphasized improving an individual
worker’s performance, rather than improving the
work system established by management. That is
not compatible with current knowledge. Unfortu-
nately, some safety practitioners continue to base
their counsel on Heinrich’s premises, which nar-
rows the scope of their activities as they attempt
principally to improve worker performance. In do-
ing so, they ignore the knowledge that has evolved
in the professional practice of safety. A few exam-
ples follow:
•Hazards are the generic base of, and the justi-
fication for the existence of, the practice of safety.
•Risk is an estimate of the probability of a hazard-related
incident or exposure occurring and the severity of harm or
damage that could result (a generic scoring sketch follows this
list).
•The entirety of purpose of those responsible for
safety, regardless of their titles, is to manage their
endeavors with respect to hazards so that the risks
deriving from those hazards are acceptable.
•All risks to which the practice of safety applies
derive from hazards. There are no exceptions.
•Hazards and risks are most effectively and eco-
nomically avoided, eliminated or controlled in the
design and redesign processes.
•The professional practice of safety requires con-
sideration of the two distinct aspects of risk:
1) avoiding, eliminating or reducing the prob-
ability of a hazard-related incident or exposure oc-
curring;
2) reducing the severity of harm or damage if an
incident or exposure occurs.
•Management creates the safety culture, wheth-
er positive or negative.
•An organization’s culture, translated into a
system of expected behavior, determines man-
agement’s commitment or lack of commitment to
safety and the level of safety achieved.
•Principal evidence of an organization’s culture
with respect to occupational risk management is
demonstrated through the design decisions that
determine the facilities, hardware, equipment,
tooling, materials, processes, configuration and
layout, work environment and work methods.
•Major improvements in safety will be achieved
only if a culture change takes place, only if major
changes occur in the system of expected behavior.
•While human errors may occur at the worker
level, preconditions for the commission of such er-
rors may derive from decisions made with respect
to the workplace and work methods at the man-
agement, design, engineering or technical expert
levels of an organization.
•Greater progress can be obtained with respect
to safety by focusing on system improvement to
achieve acceptable risk levels, rather than through
modifying worker behavior.
•A large proportion of problems in an opera-
tion are systemic, deriving from the workplace and
work methods created by management, and can
be resolved only by management. Responsibility
for only a relatively small remainder lies with the
worker.
•While employees should be trained and em-
powered up to their capabilities and encouraged to
make contributions with respect to hazard identifi-
cation and analysis, and risk elimination or control,
they should not be expected to do what they can-
not do.
•Accidents usually result from multiple and in-
teracting causal factors that may have organiza-
tional, cultural, technical or operational systems
origins.
•If accident investigations do not relate to actual
causal factors, corrective actions taken will be mis-
directed and ineffective.
•Causal factors for low-probability/high-conse-
quence events are rarely represented in the analyti-
cal data on incidents that occur frequently, and the
uniqueness of serious injury potential must be ad-
equately addressed. However, accidents that occur
frequently may be predictors of severity potential if
a high energy source was present (e.g., operation
of powered mobile equipment, electrical contacts).
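To make the two aspects of risk named in this list concrete, here is a generic probability-severity screening sketch; it is an illustration of the concept, not a method taken from this article, and the scales and threshold are arbitrary.

# Generic probability x severity risk screening (illustrative only; scales
# and the acceptability threshold are arbitrary, not from this article).

PROBABILITY = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "frequent": 5}
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "major": 4, "catastrophic": 5}

def risk_score(probability, severity):
    return PROBABILITY[probability] * SEVERITY[severity]

def acceptable(score, threshold=6):
    # Real acceptability criteria are set by the organization, not by this sketch.
    return score <= threshold

# A low-frequency, high-consequence hazard can still be unacceptable...
s = risk_score("unlikely", "catastrophic")
print(s, acceptable(s))   # 10 False
# ...while a frequent but low-severity exposure may screen as acceptable.
s = risk_score("likely", "negligible")
print(s, acceptable(s))   # 4 True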
As this list demonstrates, Heinrich’s premises
are not compatible with current knowledge.
Conclusion
As knowledge has evolved about how accidents
occur and their causal factors, the emphasis is now
properly placed on improving the work system,
rather than on worker behavior. As one colleague, disturbed by
safety professionals who reference Heinrich's premises as fact,
says, "It is borderline unethical on their part."
This article has reviewed the origin of certain
premises that have been accepted as truisms by
many educators and safety practitioners, and how
they evolved and changed over time; it also ex-
amined their validity. The two premises discussed
here are wrongly based and cannot be sustained
by safety practitioners. The premises themselves
and the methods used to establish them cannot
withstand a logic test. They are myths that have
become deeply embedded in the practice of safety
and safety professionals must take action to dis-
lodge them. PS
References
BP. (2010, Sept. 8). Deepwater Horizon accident investigation
report. Houston, TX: Author. Retrieved Aug. 30, 2011, from
www.bp.com/liveassets/bp_internet/globalbp/globalbp_uk_english/incident_response/STAGING/local_assets/downloads_pdfs/Deepwater_Horizon_Accident_Investigation_Report.pdf
Center for Chemical Process Safety (CCPS).
(1994). Guidelines for preventing human error in process
safety. New York: Author.
Columbia Accident Investigation Board. (2003).
Columbia accident investigation report, Vol. 1. Washing-
ton, DC: NASA. Retrieved Aug. 30, 2011, from www
.nasa.gov/columbia/home/CAIB_Vol1.html.
Deming, W.E. (1986). Out of the crisis. Cambridge,
MA: Center for Advanced Engineering Study, Massa-
chusetts Institute of Technology.
Det Norske Veritas (DNV) Consulting. (2004).
Leading indicators for major accident hazards: An invi-
tation to industry partners. Houston, TX: Author.
Ferry, T.S. (1981). Modern accident investigation and
analysis: An executive guide. New York: John Wiley &
Sons.
Heinrich, H.W. (1931, 1941, 1950, 1959). Industrial
accident prevention: A scientific approach. New York:
McGraw-Hill. (See Table 1, p. 53
for specific references.)
Heinrich, H.W., Petersen, D.
& Roos, N. Industrial accident
prevention (5th ed.). New York:
McGraw-Hill.
Krause, T.R. (2005). Leading
with safety. Hoboken, NJ: John
Wiley & Sons.
Manuele, F.A. (2002). Heinrich
revisited: Truisms or myths. Itasca,
IL: National Safety Council.
Manuele, F.A. (2003). On the
practice of safety (3rd ed.). New
York: John Wiley & Sons.
Manuele, F.A. (2008, Dec.).
Serious injuries and fatalities:
A call for a new focus on their prevention. Professional
Safety, 53(12), 32-39.
National Council on Compensation Insurance
(NCCI). (2005, May). State of the line. Boca Raton, FL:
Author. Retrieved Aug. 30, 2011, from www.ncci.com/
media/pdf/SOL_2005.pdf.
NCCI. (2006, June). Workers’ compensation claim
frequency down again in 2005 [Research brief]. Boca
Raton, FL: Author. Retrieved Aug. 30, 2011, from www
.ncci.com/documents/research-brief-august06.pdf.
NCCI. (2006, Nov.). The remarkable story of declining
frequency—down 30% in the past decade [Video]. Boca
Raton, FL: Author. Retrieved Aug. 30, 2011, from www
.ncci.com/nccimain/IndustryInformation/NCCIVid
eos/ArchivedArticles/Pages/video_declining _fre
quency_11-06.aspx.
NCCI. (2009, July).
Workers’ compensa-
tion claim frequency
continues its decline
in 2008 [Research
brief]. Boca Raton,
FL: Author. Retrieved
Aug. 30, 2011, from
www.ncci.com/
Documents/Work
ersCompensation
ClaimFrequency
2008.pdf.
NCCI. (2010, May). State of the line. Boca Raton, FL:
Author. Retrieved Aug. 30, 2011, from www.ncci.com/
Documents/AIS-2010-SOL-Presentation.pdf.
National Safety Council (NSC). (1980). Accident
prevention manual for industrial operations: Administra-
tion and programs (8th ed.). Itasca, IL: Author.
Reason, J. (1997). Managing the risks of organizational
accidents. London: Ashgate Publishing Co.
Stefansson, V. (1928). The standardization of error.
London: K. Paul, Trench, Trubner & Co. Ltd.
Tarrants, W.E. (1980). The measurement of safety
performance. New York: Garland Publishing Co.
U.S. Department of Energy. (1994). Guide to use of
the management oversight and risk tree (SSDC-103).
Washington DC: Author.
Walton, M. (1986). The Deming management method.
New York: The Putnam Publishing Group.
Recommendations
Safety professionals should ensure that the Heinrich misconceptions discussed in this article are discarded by the profession. To achieve this, each safety professional should:
• Stop using or promoting the premises that unsafe acts are the primary causes of accidents and that focusing on reducing accident frequency will equivalently reduce injury severity.
• Actively dispel these premises in presentations, writings and discussions.
• Politely but firmly refute allegations by others who continue to promote the validity of these premises.
• Apply current methods that look beyond Heinrich’s myths to determine true causal factors of accidents.
Acknowledgment
Parts of this article are updated material from three of the author’s works: Heinrich Revisited: Truisms or Myths; chapter seven in On the Practice of Safety (3rd ed.); and the article “Serious Injuries and Fatalities: A Call for a New Focus on Their Prevention,” from the December 2008 issue of Professional Safety.
Copyright of Professional Safety is the property of American Society of Safety Engineers and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use.
Safety paradoxes and safety culture
James Reason
Department of Psychology, University of Manchester, U.K.

Original paper. Injury Control & Safety Promotion, 2000, Vol. 7, No. 1, pp. 3-14. © Swets & Zeitlinger 2000. Accepted 15 November 1999.
Correspondence and reprint requests to: James Reason, Dept. Psychology, Univ. Manchester, Oxford Road, Manchester M13 9PL, England, U.K. Tel.: +44 161 275 2551; Fax: +44 161 275 2622; E-mail: [email protected]
Abstract This paper deals with four safety paradoxes: (1) Safety is defined and measured more by its absence than its presence. (2) Defences, barriers and safeguards not only protect a system, they can also cause its catastrophic breakdown. (3) Many organisations seek to limit the variability of human action, primarily to minimise error, but it is this same variability – in the form of timely adjustments to unexpected events – that maintains safety in a dynamic and changing world. (4) An unquestioning belief in the attainability of absolute safety can seriously impede the achievement of realisable safety goals, while a preoccupation with failure can lead to high reliability. Drawing extensively upon the study of high reliability organisations (HROs), the paper argues that a collective understanding of these paradoxes is essential for those organisations seeking to achieve an optimal safety culture. It concludes with a consideration of some practical implications.
Key words Safety promotion; culture; defences; errors; adaptability; beliefs; psychological factors; human behaviour
Introduction A paradox is ‘a statement contrary to received opinion; seemingly absurd though perhaps well-founded’ (Concise Oxford Dictionary). This paper contends that the pursuit of safety abounds with paradox, and that this is especially true of efforts to achieve a safer organisational culture. In safety, as in other highly interactive spheres, things are not always what they seem. Not only can they be contrary to surface appearances, they can also run counter to some of our most cherished beliefs. The better we understand these paradoxes, the more likely we are to create and sustain a truly safe culture.
A safe culture is an informed culture, one that knows continually where the ‘edge’ is without necessarily having to fall over it. The ‘edge’ lies between relative safety and unacceptable danger. In many industries, proximity to the ‘edge’ is the zone of greatest peril and also of greatest profit.1 Navigating this area requires considerable skill on the part of system managers and operators. Since such individuals come and go, however, only a safe culture can provide any degree of lasting protection.
Simply identifying the existence of a paradox is not enough. Unlike the ‘pure’ sciences, in which theories are assessed by how much empirical activity they provoke, the insights of safety scientists and safety practitioners are ultimately judged by the extent to which their practical application leads to safer systems. Each of the paradoxes considered below has important practical implications for the achievement of a safe culture. Indeed, it will be argued that a shared understanding of these paradoxes is a prerequisite for acquiring an optimal safety culture.
Most of the apparent contradictions discussed in this paper have
been
revealed not so much by the investigation of adverse events – a
topic
that comprises the greater part of safety research – as from the
close
observation of high reliability organisations (HROs). Safety has
both a
negative and a positive face. The former is revealed by
accidents with
bad outcomes. Fatalities, injuries and environmental damage are
con-
spicuous and readily quantifiable occurrences. Avoiding them as
far as
possible is the objective of the safety sciences. It is hardly
surprising,
therefore, that this darker face has occupied so much of our
attention
and shaped so many of our beliefs about safety. The positive
face, on
the other hand, is far more secretive. It relates to a system’s
intrinsic
resistance to its operational hazards. Just as medicine knows
more about
pathology than health, so also do the safety sciences understand
far
more about how bad events happen than about how human
actions and
organisational processes also lead to their avoidance, detection
and
containment. It is this imbalance that has largely created the
paradoxes.
The remainder of the paper is in six parts. The next section
previews
the four safety paradoxes to be considered here. The ensuing
four sec-
tions each consider one of these safety paradoxes in more detail.
The
concluding section summarises the practical implications of
these par-
adoxes for achieving and preserving a safer culture.
Previewing the safety paradoxes
• Safety is defined and measured more by its absence than by its presence.
• Measures designed to enhance a system’s safety – defences, barriers and safeguards – can also bring about its destruction.
• Many, if not most, engineering-based organisations believe that safety is best achieved through a predetermined consistency of their processes and behaviours, but it is the uniquely human ability to vary and adapt actions to suit local conditions that preserves system safety in a dynamic and uncertain world.
• An unquestioning belief in the attainability of absolute safety (zero accidents or target zero) can seriously impede the achievement of realisable safety goals.
A further paradox embodies elements from all of the above. If an organisation is convinced that it has achieved a safe culture, it almost certainly has not. Safety culture, like a state of grace, is a product of continual striving. There are no final victories in the struggle for safety.
The first paradox: how safety is defined and assessed
The Concise Oxford Dictionary defines safety as ‘freedom from
danger
and risks’. But this tells us more about what comprises
‘unsafety’ than
about the substantive properties of safety itself. Such a
definition is
clearly unsatisfactory. Even in the short term, as during a
working day
or on a particular journey, we can never escape danger – though
we
may not experience its adverse consequences in that instance. In
the
longer term, of course, most of the risks and hazards that beset
human
activities are universal constants. Gravity, terrain, weather, fire
and the
potential for uncontrolled releases of mass, energy and noxious
sub-
stances are ever-present dangers. So, in the strict sense of the
defini-
tion, we can never be safe. A more appropriate definition of
safety
would be ‘the ability of individuals or organisations to deal
with risks
and hazards so as to avoid damage or losses and yet still
achieve their
goals’.
Even more problematic, however, is that safety is measured by
its
occasional absences. An organisation’s safety is commonly
assessed by
the number and severity of negative outcomes (normalised for
expo-
sure) that it experiences over a given period. But this is a
flawed metric
for the reasons set out below.
First, the relationship between intrinsic ‘safety health’ and
negative
outcomes is, at best, a tenuous one. Chance plays a large part in
caus-
ing bad events – particularly so in the case of complex, well-
defended
technologies.2 As long as hazards, defensive weaknesses and
human
fallibility continue to co-exist, unhappy chance can combine
them in
various ways to bring about a bad event. That is the essence of
the term
‘accident’. Even the most resistant organisations can suffer a
bad acci-
dent. By the same token, even the most vulnerable systems can
evade
disaster, at least for a time. Chance does not take sides. It
afflicts the
deserving and preserves the unworthy.
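To make the role of chance concrete, here is a minimal simulation sketch (not taken from the paper; the risk and exposure figures are assumed purely for illustration). Two sites with an identical underlying probability of injury per hour worked can record quite different rates in any single year, simply because the expected number of events is small:

import random

random.seed(1)

P_INJURY_PER_HOUR = 1e-6   # assumed underlying risk, identical for both sites
HOURS_PER_YEAR = 400_000   # assumed annual exposure per site

def yearly_count(p=P_INJURY_PER_HOUR, hours=HOURS_PER_YEAR):
    # One independent chance of a recordable injury for each hour worked.
    return sum(1 for _ in range(hours) if random.random() < p)

for year in range(1, 6):
    a, b = yearly_count(), yearly_count()
    # Rates normalised to a common exposure base of 200,000 hours.
    print(f"Year {year}: site A = {a / HOURS_PER_YEAR * 200_000:.1f}, "
          f"site B = {b / HOURS_PER_YEAR * 200_000:.1f} per 200,000 h")

Over such a run the two sites’ recorded ‘safety performance’ diverges and crosses repeatedly even though their intrinsic ‘safety health’ is identical by construction; only the long-run accumulated rate, not any single accounting period, says anything reliable about the underlying risk.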
Second, a general pattern in organisational responses to a safety
management programme is that negative outcome data decline
rapidly
at first and then gradually bottom out to some asymptotic value.
In
commercial aviation, for example, a highly safety conscious
industry,
the fatal accident rate has remained relatively unchanged for the
past
25 years.3 Comparable patterns are found in many other
domains. During
the period of rapid decline, it seems reasonable to suppose that
the
marked diminution in accident rates actually does reflect some
im-
provement in a system’s intrinsic ‘safety health’. But once the
plateau
has been reached, periodic variations in accident rates contain
more
noise than valid safety signals. At this stage of an
organisation’s safety
development, negative outcome data are a poor indication of its
ability
to withstand adverse events in the future. This is especially true
of
well-defended systems such as commercial aviation and nuclear
power
generation that are, to a large extent, victims of their own
success. By
reducing accident rates to a very low level they have largely run
out of
‘navigational aids’ by which to steer towards some safer state.
The diminution in accident rates that is apparent in most
domains is
a product not only of local safety management efforts, but also
of a
growing public intolerance for third-party risks, environmental
damage
and work-related injuries. This, in turn, has led to increasingly
compre-
hensive safety legislation in most industrialised nations. Even
in the
least responsible organisations, merely keeping one step ahead
of the
regulator requires the implementation of basic safety measures
that are
often sufficient to bring about dramatic early reductions in
accident
rates. The important issue, however, is what happens once the
plateau
has been reached. It is at this point that an organisation’s safety
culture
takes on a profound significance. Getting from bad to average is
rela-
tively easy; getting from average to excellent is very hard. And
it is for
the latter purpose that an understanding of the paradoxes is
crucial.
In summary: while high accident rates may reasonably be taken
as
indicative of a bad safety state, low asymptotic rates do not
necessarily
signal a good one. This asymmetry in the meaning of negative
outcome
data lies at the heart of many of the subsequent paradoxes to be
dis-
cussed later. It also has far-reaching cultural implications.
There are at
least two ways to interpret very low or nil accident rates in a
given
accounting period. A very common one is to believe that the
organisa-
tion actually has achieved a safe state: that is, it takes no news
as good
news and sends out congratulatory messages to its workforce.
High-
reliability organisations, on the other hand, become worried,
accepting
that no news really is no news, and so adopt an attitude of
increased
vigilance and heightened defensiveness.4,5
The second paradox: dangerous defences A theme that recurs repeatedly in accident reports is that measures designed to enhance a system’s safety can also bring about its destruction. Since this paradox has been discussed at length elsewhere,6,7 we will focus on its cultural implications. Let us start with some examples of defensive failures that cover a range of domains.
• The Chernobyl disaster had its local origins in an attempt to test an electrical safety device designed to overcome the interruption of power to the emergency core cooling system that would ensue immediately after the loss of off-site electricity and before the on-site auxiliary generators were fully operative.8
• The advanced automation present in many modern technologies was designed, in part, to eliminate opportunities for human error. Experience in several domains, however, has shown that automation can create mode confusions and decision errors that can be more dangerous than the slips and lapses it was intended to avoid.9,10
• Emergency procedures are there to guide people to safety in the event of a dangerous occurrence. In a number of instances, however, strict compliance with safety procedures has killed people. On Piper Alpha, the North Sea gas and oil platform that exploded in 1988, most of the 165 rig workers who died complied strictly with the safety drills and assembled in the accommodation area. Tragically, this was directly in line with a subsequent explosion.11 The few fire-fighters who survived the Mann Gulch forest fire disaster in 1949 dropped their heavy tools and ran, while those who died obeyed the organisational instruction to keep their fire-fighting tools with them at all times.12
• Personal protective equipment can save many lives, but it can also pose a dangerous threat to certain groups of people. Swedish traffic accident studies have revealed that both elderly female drivers and infants in backward-facing seats have been killed by rapidly inflating airbags following a collision.13
• Finally, perhaps the best example of the defence paradox is that maintenance activities – intended to repair and forestall technical failures – are the largest single source of human factors problems in the nuclear power industry.14,15 In commercial aviation, quality lapses in maintenance are the second most significant cause of passenger deaths.16
There is no single reason why defences are so often instrumental in bringing about bad events. Errors in maintenance, for example, owe their frequency partly to the hands-on, high-opportunity nature of the task, and partly to the fact that certain aspects of maintenance, particularly installation and reassembly, are intrinsically error-provoking regardless of who is doing the job.6 But some of the origins of the defensive paradox have strong cultural overtones. We can summarise these cultural issues under three headings: the trade-off problem, the control problem and the opacity problem.
the trade-off problem An important manifestation of an organ-
isation’s cultural complexion is the characteristic way it
resolves con-
flicts. Virtually all of the organisations of concern here are in
the busi-
ness of producing something: manufactured goods, energy,
services,
the extraction of raw materials, transportation and the like. All
such
activities involve the need to protect against operational
hazards. A
universal conflict, therefore, is that between production and
protection.
Both make demands upon limited resources. Both are essential.
But
their claims are rarely perceived as equal. It is production rather
than
protection that pays the bills, and those who run these
organisations
tend to possess productive rather than protective skills.
Moreover, the
information relating to the pursuit of productive goals is
continuous,
credible and compelling, while the information relating to
protection is
discontinuous, often unreliable, and only intermittently
compelling (i.e.,
after a bad event). It is these factors that lie at the root of the
trade-off
problem. This problem can best be expressed as that of trading
protec-
tive gains for productive advantage. It has also been termed risk
ho-
meostasis17 or risk compensation – the latter term is preferable
since it
avoids some of Wilde’s more controversial assumptions.18
The trade-off problem has been discussed at length
elsewhere.18-20
Just one example will be sufficient to convey its essence. The
Davy
lamp, invented in 1815, was designed to isolate the light source,
a
naked flame, from the combustible gases present in mines. But
the
mine owners were quick to see that it also allowed miners to
work on
seams previously regarded as too dangerous. The incidence of
mine
explosions increased dramatically, reaching a peak in the
1860s.20
Improvements in protection afforded by technological
developments
are often put in place during the aftermath of a disaster. Soon,
however,
this increased protection is seen as offering commercial
advantage,
leaving the organisation with the same or even less protection
than it
had previously.
the control problem Another challenge facing all organisations
is
how to restrict the enormous variability of human behaviour to
that
which is both productive and safe. Organisational managers
have a
variety of means at their disposal:21,22 administrative controls
(prescrip-
tive rules and procedures), individual controls (selection,
training and
motivators), group controls (supervision, norms and targets) and
tech-
nical controls (automation, engineered safety features, physical
barri-
ers). In most productive systems, all of these controls are used
to some
degree; but the balance between them is very much a reflection
of the
organisational culture. What concerns us here, however, is the
often
disproportionate reliance placed upon prescriptive procedures.
Standard operating procedures are necessary. This is not in
dispute.
Since people change faster than jobs, it is essential that an
organisa-
tion’s collective wisdom is recorded and passed on. But
procedures are
not without problems, as indicated by some of the examples
listed
above. They are essentially feed-forward control devices –
prepared at
one time and place to be applied at some future time and place –
and
they suffer, along with all such control systems, the problem of
dealing
with local variations. Rule-based controls can encounter at least
three
kinds of situation: those in which they are correct and
appropriate,
those in which they are inapplicable due to local conditions, and
those
in which they are absent entirely. A good example of the latter
is the
predicament facing Captain Al Haynes and his crew in United
232
when he lost all three hydraulic systems on his DC10 due to the
explo-
sion of his tail-mounted, number two engine.23 The probability
of los-
ing all three hydraulic systems was calculated at one in a
billion, and
there were no procedures to cover this unlikely emergency. Far
more
common, however, are situations in which the procedures are
unwork-
able, incomprehensible or simply wrong. A survey carried out in
the
US nuclear industry, for example, identified poor procedures as
a factor
in some 60% of all human performance problems.15
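The ‘one in a billion’ figure quoted above for United 232 is itself the product of an independence assumption, and it shows in miniature why such numbers can mislead. A rough sketch, with per-flight failure probabilities assumed purely for illustration (they are not taken from any certification analysis):

\[
P(\text{all three hydraulic systems fail}) = p^{3} \approx (10^{-3})^{3} = 10^{-9}
\quad \text{(each system assumed independent, } p \approx 10^{-3}\text{)},
\]
\[
P(\text{all three fail}) \;\geq\; P(C) \approx 10^{-7}
\quad \text{(a single common cause } C\text{, e.g. an uncontained engine failure, assumed at } 10^{-7}\text{)}.
\]

In other words, one shared cause that defeats every channel at once dominates the product of independent failure probabilities by orders of magnitude – the same arithmetic that underlies the common-mode failure point made below under the opacity problem.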
There is a widespread belief among the managers of highly
procedur-
alised organisations that suitable training, along with rigid
compliance,
should eliminate the vast majority of human unsafe acts. When
such
errors and violations do occur, they are often seen as moral
issues
warranting sanctions. But, for the most part, punishing people
does not
eliminate the systemic causes of their unsafe acts. Indeed, by
isolating
individual actions from their local context, it can impede their
discov-
ery.
the opacity problem In the weeks following some foreign tech-
nological disaster, we often hear our country’s spokespeople
claiming
that it couldn’t happen here because our barriers and safeguards
are so
much more sophisticated and extensive. This assertion captures
an
important consequence of the opacity problem: the failure to
realise
that defences, particularly defences-in-depth, can create and
conceal
dangers as well as protect against them. When this ignorance
leads to
a collective belief in the security of high-technology systems,
the prob-
lem takes on cultural significance.
Defences-in-depth are created by diversity and redundancy.
Barriers
and safeguards take many forms. ‘Hard’ defences include
automated
safety features, physical containment, alarms and the like.
‘Soft’ de-
fences include rules and procedures, training, drills, briefings,
permit-to-work systems and many other measures that rely heavily on
people
and paper. This assortment of safety-enhancing measures is
widely
distributed throughout the organisation. This makes such
extensively
defended systems especially vulnerable to the effects of an
adverse
safety culture. Only culture can reach equally into all parts of
the sys-
tem and exert some consistent effect, for good or ill.24
While such diversity has undoubtedly enhanced the security of
high-
technology systems, the associated redundancy has proved to be
a mixed
blessing. By increasing complexity, it also makes the system
more
opaque to those who manage and control it.7,25,26 The opacity
problem
takes a variety of forms.
• Operator and maintainer failures may go unnoticed because they are caught and concealed by multiple backups.27
• Such concealment allows undiscovered errors and latent conditions (resident pathogens) to accumulate insidiously over time, thus increasing the possibility of inevitable weaknesses in the defensive layers lining up to permit the passage of an accident trajectory.6,28
• By adding complexity to the system, redundant defences also increase the likelihood of unforeseeable common-mode failures. While the assumption of independence may be appropriate for purely technical failures, errors committed by managers, operators and maintainers are uniquely capable of creating problems that can affect a number of defensive layers simultaneously. At Chernobyl, for example, the operators successively disabled a number of supposedly independent, engineered safety features in pursuit of their testing programme.
Dangerous concealment combined with the obvious
technological so-
phistication of redundant defences can readily induce a false
sense of
security in system managers, maintainers and operators. In
short, they
forget to be afraid – or, as in the case of the Chernobyl
operators, they
never learn to be afraid. Such complacency lies on the opposite
pole
from a safe culture.
The third paradox: consistency versus variability Holl-
nagel20 conducted a survey of the human factors literature to
identify
the degree to which human error has been implicated in accident
cau-
sation over the past few decades. In the 1960s, when the
problem first
began to attract serious attention, the estimated contribution of
human
error was around 20%. By the 1990s, this figure had increased
fourfold
to around 80%. One of the possible reasons for this apparent
growth in
human fallibility is that accident investigators are now far more
con-
scious that contributing errors are not confined to the ‘sharp
end’ but
are present at all levels of a system, and even beyond. Another
is that
the error causal category has, by default, moved more and more
into
the investigatory spotlight due to great advances in the
reliability of
mechanical and electronic components over the past forty years.
Whatever the reason, the reduction – or even elimination – of
human
error has now become one of the primary objectives of system
manag-
ers. Errors and violations are viewed, reasonably enough, as
deviations
from some desired or appropriate behaviour. Having mainly an
engineering background, such managers attribute human
unreliability to
unwanted variability. And, as with technical unreliability, they
see the
solution as one of ensuring greater consistency of human action.
They
do this, as we have seen, through procedures and by buying
more
automation. What they often fail to appreciate, however, is that
human
variability in the form of moment-to-moment adaptations and
adjust-
ments to changing events is also what preserves system safety
in an
uncertain and dynamic world. And therein lies the paradox. By
striving
to constrain human variability, they are also undermining one
the sys-
tem’s most important safeguards.
The problem has been encapsulated by Weick’s insightful
observa-
tion5 that ‘reliability is a dynamic non-event.’ It is dynamic
because
processes remain under control due to compensations by human
com-
ponents. It is a non-event because safe outcomes claim little or
no
attention. The paradox is rooted in the fact that accidents are
salient,
while non-events, by definition, are not. Almost all of our
methodolog-
ical tools are geared to investigating adverse events. Very few
of them
are suited to creating an understanding of why timely
adjustments are
necessary to achieve successful outcomes in an uncertain and
dynamic world.
Recently, Weick et al.4 challenged the received wisdom that an
or-
ganisation’s reliability depends upon the consistency,
repeatability and
invariance of its routines and activities. Unvarying
performance, they
argue, cannot cope with the unexpected. To account for the
success of
high reliability organisations (HROs) in dealing with
unanticipated
events, they distinguish two aspects of organisational
functioning: cog-
nition and activity. The cognitive element relates to being alert
to the
possibility of unpleasant surprises and having the collective
mindset
necessary to detect, understand and recover them before they
bring
about bad consequences. Traditional ‘efficient’ organisations
strive for
stable activity patterns yet possess variable cognitions – these
differing
cognitions are most obvious before and after a bad event. In
HROs, on
the other hand, ‘there is variation in activity, but there is
stability in the
cognitive processes that make sense of this activity’.4 This
cognitive
stability depends critically upon an informed culture – or what
Weick
and his colleagues have called ‘collective mindfulness’.
Collective mindfulness allows an organisation to cope with the
unan-
ticipated in an optimal manner. ‘Optimal’ does not necessarily
mean
‘on every occasion’, but the evidence suggests that the presence
of such
enduring cognitive processes is a critical component of
organisational
resilience. Since catastrophic failures are rare events,
collectively mind-
ful organisations work hard to extract the most value from what
little
data they have. They actively set out to create a reporting
culture by
commending, even rewarding, people for reporting their errors
and near
misses. They work on the assumption that what might seem to
be an
isolated failure is likely to come from the confluence of many
‘up-
stream’ causal chains. Instead of localising failures, they
generalise
them. Instead of applying local repairs, they strive for system
reforms.
They do not take the past as a guide to the future. Aware that
system
failures can take a wide variety of yet-to-be-encountered forms,
they
are continually on the lookout for ‘sneak paths’ or novel ways
in which
active failures and latent conditions can combine to defeat or
by-pass
the system defences. In short, HROs are preoccupied with the
possibil-
ity of failure – which brings us to the last paradox to be
considered
here.
The fourth paradox: target zero Some years ago, US Vice-
President Al Gore declared his intention of eradicating transport
acci-
dents. Comparable sentiments are echoed by the top managers
of by-
the-book companies, those having what Westrum29 has called
‘calculative’ cultures. They announce a corporate goal of ‘zero
acci-
dents’ and then set their workforce the task of achieving
steadily di-
minishing accident targets year by year – what I have earlier
termed the
‘negative production’ model of safety management.
It is easy to understand and to sympathise with such goal-
setting. A
truly committed management could hardly appear to settle for
anything
less. But ‘target zero’ also conveys a potentially dangerous
misrepre-
sentation of the nature of the struggle for safety: namely, that
the ‘safe-
ty war’ could end in a decisive victory of the kind achieved by a
Waterloo or an Appomattox. An unquestioning belief in victory
can
lead to defeat in the ‘safety war’. The key to relative success,
on the
other hand, seems to be an abiding concern with failure.
HROs see the ‘safety war’ for what it really is: an endless
guerrilla
conflict. They do not seek a decisive victory, merely a workable
sur-
vival that will allow them to achieve their productive goals for
as long
as possible. They know that the hazards will not go away, and
accept
that entropy defeats all systems in the end. HROs accept
setbacks and
nasty surprises as inevitable. They expect to make errors and
train their
workforce to detect and recover them. They constantly rehearse
for the
imagined scenarios of failure and then go on to brainstorm
novel ones.
In short, they anticipate the worst and equip themselves to cope
with
it.
A common response to these defining features of HROs is that
they
seem excessively bleak. ‘Doom-laden’ is a term often applied to
them.
Viewed from a personal perspective, this is an understandable
reaction.
It is very hard for any single individual to remain ever mindful
of the
possibility of failure, especially when such occurrences have
personal
significance only on rare occasions. No organisation is just in
the busi-
ness of being safe. The continuing press of productive demands
is far
more likely to engage the forefront of people’s minds than the
possi-
bility of some unlikely combination of protective failures. This
is ex-
actly why safety culture is so important. Culture transcends the
psy-
chology of any single person. Individuals can easily forget to be
afraid.
A safe culture, however, can compensate for this by providing
the
reminders and ways of working that go to create and sustain
intelligent
wariness. The individual burden of chronic unease is also made
more
supportable by knowing that the collective concern is not so
much with
the occasional – and inevitable – unreliability of its human
parts, as
with the continuing resilience of the system as a whole.
The practical implications By what means can we set about
transforming an average safety culture into an excellent one?
The answer, I believe, lies in recognising that a safe culture is the
product of
a number of inter-dependent sub-cultures, each of which – to
some
degree – can be socially engineered. An informed culture can
only be
built on the foundations of a reporting culture. And this, in turn,
de-
pends upon establishing a just culture. In this concluding
section, we
will look at how to build these two sub-cultures. The other
elements of
a safe culture – a flexible culture and a learning culture – hinge
largely
upon the establishment of the previous two. They have been
discussed
at length elsewhere5,6 and will not be considered further here.
In the absence of frequent bad outcomes, knowledge of where
the
‘edge’ lies can only come from persuading those at the human-
system
interface to report their ‘free lessons’. These are the mostly
inconse-
quential errors, incidents and near misses that could have
caused injury
or damage. But people do not readily confess their blunders,
particular-
ly if they believe such reports could lead to disciplinary action.
Estab-
lishing trust, therefore, is the first step in engineering a
reporting cul-
ture – and this can be a very big step. Other essential
characteristics are
that the organisation should possess the necessary skills and
resources
to collect, analyse and disseminate safety-related information
and, cru-
cially, it should also have a management that is willing to act
upon and
learn from these data.
A number of effective reporting systems have been established,
par-
ticularly in aviation. Two behavioural scientists involved in the
cre-
ation of two very successful systems, the Aviation Safety
Reporting
System developed by NASA and the British Airways Safety
Informa-
tion System, have recently collaborated to produce a blueprint
for en-
gineering a reporting culture.30 The main features are
summarised be-
low.
• A qualified indemnity against sanctions – though not blanket immunity.
• A reliance on confidentiality and de-identification rather than complete anonymity (a small illustrative sketch of this separation follows this list).
• The organisational separation of those who collect and analyse the data from those responsible for administering sanctions.
• Rapid, useful and intelligible feedback – after the threat of punishment, nothing deters reporters more than a lack of any response.
• Reports should be easy to make. Free text accounts appear to be more acceptable to reporters than forced-choice questionnaires.
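As a minimal sketch of how the confidentiality and organisational-separation points above might be realised (this illustrates the general idea only and does not describe the ASRS or BASIS implementations; every name and field below is invented):

from dataclasses import dataclass, field
from datetime import date
import itertools

# The gatekeeper holds the only link between a report and its author; analysts
# never see identities, so reporters are de-identified rather than anonymous.
_report_ids = itertools.count(1)
_identity_vault = {}   # report id -> reporter identity, held by the gatekeeper only

@dataclass
class DeidentifiedReport:
    report_id: int
    event_date: date
    free_text: str                        # free text preferred over forced-choice items
    contributing_factors: list = field(default_factory=list)

def submit_report(reporter_identity, event_date, free_text):
    # Accept a report, store the identity separately, pass on only the content.
    report_id = next(_report_ids)
    _identity_vault[report_id] = reporter_identity   # kept so follow-up questions remain possible
    return DeidentifiedReport(report_id=report_id, event_date=event_date, free_text=free_text)

# The analysis team receives only the de-identified record.
report = submit_report("J. Bloggs, night shift", date(2000, 1, 15),
                       "Nearly connected the test line to the wrong manifold; labels ambiguous.")
print(report)

The design point the list stresses is that the link between report and reporter still exists – so feedback and clarifying questions remain possible – but it is held by a gatekeeper who is organisationally separate from anyone able to administer sanctions.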
The first three of these measures relate to the issue of
punishment. In
the past, many organisations relied heavily upon the threat of
sanctions
to shape reliable human behaviour. More recently, the pendulum
has
swung towards the establishment of ‘no blame’ cultures. But
like the
excessively punitive culture it supplanted, this approach is
neither de-
sirable nor workable. A small proportion of unsafe acts are
indeed
reckless and warrant severe sanctions. What is needed is a just
culture,
one in which everyone knows where the line must be drawn
between
acceptable and unacceptable actions. When this is done, the
evidence
suggests that only around 10% of unsafe acts fall into the
unacceptable
category.6,31 This means that around 90% of unsafe acts are
largely
blameless and could be reported without fear of punishment.
So how should this line be drawn? Many organisations place the
boundary between errors and procedural violations, arguing that
only
the latter are deliberate actions. But there are two problems with
this:
some errors arise from unacceptable behaviours, while some
violations
are enforced by organisational rather than by individual
shortcomings,
and so should not be judged as unacceptable. Marx31 has
proposed a
better distinction. The key determinant of blameworthiness, he
argues,
is not so much the act itself – error or violation – as the nature
of the
behaviour in which it was embedded. Did this behaviour involve
un-
warranted risk-taking? If so, then the act would be blameworthy
re-
gardless of whether it was an error or a violation. Often, of
course, the
two acts are combined. For instance, a person may violate
procedures
by taking on a double shift and make a dangerous mistake in the
final
hour. Such an individual would merit punishment because he or
she
took an unjustifiable risk in working a continuous 18 hours,
thus in-
creasing the likelihood of an error.32
These are fine judgements and there is insufficient space to
pursue
them further here. The important point, however, is that such
determi-
nations – ideally involving both management and peers – lie at
the
heart of a just culture. Without a shared agreement as to where
such a
line should be drawn, there can never be an adequate reporting
culture.
Without a reporting culture, there could not be an informed
culture. It
is the knowledge so provided that gives an optimal safety
culture its
defining characteristics: a continuing respect for its operational
haz-
ards, the will to combat hazards in a variety of ways and a
commitment
to achieving organisational resilience. And these, I have argued,
re-
quire a ‘collective mindfulness’ of the paradoxes of safety.
References
1 Hudson PTW. Psychology and safety. Leiden: University of Leiden, 1997.
2 Reason J. Achieving a safe culture: theory and practice. Work & Stress 1998;12:293-306.
3 Howard RW. Breaking through the 10⁶ barrier. Proc Int Fed Airworthiness Conf, Auckland, NZ, 20-23 October 1991.
4 Weick KE, Sutcliffe KM, Obstfeld D. Organizing for high reliability: processes of collective mindfulness. In: Staw B, Sutton R, editors. Research in Organizational Behavior 1999;21:23-81.
5 Weick KE. Organizational culture as a source of high reliability. Calif Management Rev 1987;29:112-27.
6 Reason J. Managing the risks of organizational accidents. Aldershot: Ashgate, 1997.
Perception of Pleasure and Pain Presentation3 slides- An explanati.docx
 
Pennsylvania v. MarkMark Davis has been charged with Driving W.docx
Pennsylvania v. MarkMark Davis has been charged with Driving W.docxPennsylvania v. MarkMark Davis has been charged with Driving W.docx
Pennsylvania v. MarkMark Davis has been charged with Driving W.docx
 
PBAD201-1501A-02 Public AdministrationTask NamePhase 3 Individu.docx
PBAD201-1501A-02 Public AdministrationTask NamePhase 3 Individu.docxPBAD201-1501A-02 Public AdministrationTask NamePhase 3 Individu.docx
PBAD201-1501A-02 Public AdministrationTask NamePhase 3 Individu.docx
 
Part1 Q1. Classify each of the following as-      (i)qual.docx
Part1 Q1. Classify each of the following as-      (i)qual.docxPart1 Q1. Classify each of the following as-      (i)qual.docx
Part1 Q1. Classify each of the following as-      (i)qual.docx
 
Paul’s Letter to the EphesiansThe First Letter of PeterThe Fir.docx
Paul’s Letter to the EphesiansThe First Letter of PeterThe Fir.docxPaul’s Letter to the EphesiansThe First Letter of PeterThe Fir.docx
Paul’s Letter to the EphesiansThe First Letter of PeterThe Fir.docx
 
Past and FuturePlease respond to the followingImagine back .docx
Past and FuturePlease respond to the followingImagine back .docxPast and FuturePlease respond to the followingImagine back .docx
Past and FuturePlease respond to the followingImagine back .docx
 
Partisan considerations have increasingly influenced the selection.docx
Partisan considerations have increasingly influenced the selection.docxPartisan considerations have increasingly influenced the selection.docx
Partisan considerations have increasingly influenced the selection.docx
 

Recently uploaded

Wellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxWellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxJisc
 
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...Pooja Bhuva
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and ModificationsMJDuyan
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxmarlenawright1
 
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptxOn_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptxPooja Bhuva
 
Economic Importance Of Fungi In Food Additives
Economic Importance Of Fungi In Food AdditivesEconomic Importance Of Fungi In Food Additives
Economic Importance Of Fungi In Food AdditivesSHIVANANDaRV
 
OSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & SystemsOSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & SystemsSandeep D Chaudhary
 
How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17Celine George
 
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxCOMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxannathomasp01
 
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdfUGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdfNirmal Dwivedi
 
Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111GangaMaiya1
 
What is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptxWhat is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptxCeline George
 
How to Manage Call for Tendor in Odoo 17
How to Manage Call for Tendor in Odoo 17How to Manage Call for Tendor in Odoo 17
How to Manage Call for Tendor in Odoo 17Celine George
 
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...Nguyen Thanh Tu Collection
 
Introduction to TechSoup’s Digital Marketing Services and Use Cases
Introduction to TechSoup’s Digital Marketing  Services and Use CasesIntroduction to TechSoup’s Digital Marketing  Services and Use Cases
Introduction to TechSoup’s Digital Marketing Services and Use CasesTechSoup
 
FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024Elizabeth Walsh
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxheathfieldcps1
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxEsquimalt MFRC
 
NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...
NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...
NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...Amil baba
 

Recently uploaded (20)

Wellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxWellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptx
 
Our Environment Class 10 Science Notes pdf
Our Environment Class 10 Science Notes pdfOur Environment Class 10 Science Notes pdf
Our Environment Class 10 Science Notes pdf
 
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
Sensory_Experience_and_Emotional_Resonance_in_Gabriel_Okaras_The_Piano_and_Th...
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and Modifications
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
 
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptxOn_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
On_Translating_a_Tamil_Poem_by_A_K_Ramanujan.pptx
 
Economic Importance Of Fungi In Food Additives
Economic Importance Of Fungi In Food AdditivesEconomic Importance Of Fungi In Food Additives
Economic Importance Of Fungi In Food Additives
 
OSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & SystemsOSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & Systems
 
How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17
 
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxCOMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
 
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdfUGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
 
Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111
 
What is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptxWhat is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptx
 
How to Manage Call for Tendor in Odoo 17
How to Manage Call for Tendor in Odoo 17How to Manage Call for Tendor in Odoo 17
How to Manage Call for Tendor in Odoo 17
 
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
 
Introduction to TechSoup’s Digital Marketing Services and Use Cases
Introduction to TechSoup’s Digital Marketing  Services and Use CasesIntroduction to TechSoup’s Digital Marketing  Services and Use Cases
Introduction to TechSoup’s Digital Marketing Services and Use Cases
 
FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024FSB Advising Checklist - Orientation 2024
FSB Advising Checklist - Orientation 2024
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptxHMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
HMCS Max Bernays Pre-Deployment Brief (May 2024).pptx
 
NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...
NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...
NO1 Top Black Magic Specialist In Lahore Black magic In Pakistan Kala Ilam Ex...
 

Uber released its first diversity report on March 28, 2017, one month after these allegations. This report indicated that women and nonwhite employees are underrepresented at the firm, not overly dissimilar from other technology-based firms. Some of the most egregious statistics include: (a) racial configuration: 6% Hispanic, 9% black, 50% white; and (b) 85% of all technology jobs are held by men, with a mere 36% of the total workforce comprised of women. (7) Liane Hornsey, Uber's chief human resource officer, acknowledged, "We need to do better and have much more work to do." (8) Here are Uber's next steps:

We're dedicating $3 million over the next three years to support organizations working to bring more women and underrepresented people into tech. This year, our recruiting team is also embarking on a college tour to recruit talented students at colleges across the country, including a number of Historically Black Colleges and Universities (HBCUs) and Hispanic Serving Institutions (HSIs). Our employee resource groups play a huge role in all our recruiting events that are focused on hiring women and people of color at Uber. In recruiting, we've updated our job descriptions to remove potentially exclusionary language, and we are running interview training to make our hiring processes more inclusive for women in tech. We're also rolling out training to educate and empower employees, covering topics like "why diversity and inclusion matters," "how to be an ally," and "building inclusive teams." Training is not a panacea, but educating employees on the right behaviors is an important step in the right direction. This is just the beginning of our efforts. Whether you're a veteran returning from service or a person with a disability, and regardless of your religious beliefs, your sexual orientation, your gender identity, or the country you call home, at Uber, we
want to create an environment where you can be yourself. By deepening our commitment to diversity, we will strengthen our business and better serve our customers in over 450 cities in more than 70 countries. (9)

Only time will tell if this fast-growth firm can manage its aggressive culture and diversity as it continues to expand into new marketplaces and those with differing cultures.

Questions
1. Susan Fowler's complaint of being the target of sexual harassment by her manager would be categorized as falling under which employment law?
2. Which type(s) of harassment was Ms. Fowler exposed to?
3. What actions, if any, has Uber taken to limit their liability relative to sexual harassment charges?
4. Uber's diversity report indicates that 36 percent of Uber's workforce is made up of women (15% in technical jobs); 50% of Uber's employees in the United States are white, while 9% are black and 6% are Hispanic. Are they in violation of any EEOC and Affirmative Action laws? (A brief illustrative calculation follows this list.)
5. Why does diversity matter in general and more specifically to Uber?
6. What benefits and challenges does Uber derive from a more diverse workforce?
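To make Question 4 easier to discuss, the short sketch below converts the percentages quoted in the case into headcounts for a hypothetical workforce. This is a minimal illustration only: the 10,000-employee total and the function name are assumptions made for the example, not figures or code from Uber's report.

```python
# Minimal sketch: turn the workforce percentages quoted in the case into
# illustrative headcounts. The total_employees value is a placeholder
# assumption, not a figure disclosed by Uber.

def implied_headcounts(total_employees: int, shares: dict[str, float]) -> dict[str, int]:
    """Return the approximate number of employees implied by each percentage share."""
    return {group: round(total_employees * pct / 100) for group, pct in shares.items()}

if __name__ == "__main__":
    total_employees = 10_000  # hypothetical headcount, for illustration only

    # Percentages as reported in the case text (overall U.S. workforce).
    reported_shares = {"women": 36, "white": 50, "black": 9, "hispanic": 6}
    print(implied_headcounts(total_employees, reported_shares))

    # The case also notes that women hold only 15% of technical jobs,
    # versus 36% of the workforce overall: a gap of 21 percentage points.
    gap = 36 - 15
    print(f"Representation gap for women in technical roles: {gap} percentage points")
```

The sketch does not answer the EEOC question itself; it simply makes the scale of the reported gaps concrete for the written analysis.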
References
(1) Anderson, A. (n.d.). Uber International C.V. Hoovers. Retrieved April 4, 2017, from http://0-subscriber.hoovers.com.liucat.lib.liu.edu/H/company360/fulldescription.html?companyId=163401000000000, p. 109
(2) Uber. (n.d.). How do we want Uber to look and feel? Retrieved April 4, 2017, from https://www.uber.com/diversity/
(3) Fowler, S. (2017, February 19). Reflecting on one very, very strange year at Uber. Retrieved April 12, 2017, from https://www.susanjfowler.com/blog/2017/2/19/reflecting-on-one-very-strange-year-at-uber
(4) Ibid.
(5) Patnaik, S. (2017, February 21). Uber hires ex-US Attorney General Holder to probe sexual harassment. Reuters. Retrieved April 4, 2017, from http://www.reuters.com/article/us-uber-tech-sexual-harassment-idUSKBN160041
(6) Isaac, M. (2017, March 28). Uber releases diversity report and repudiates its "hard-charging attitude." The New York Times. Retrieved April 4, 2017, from http://www.cnbc.com/2017/03/28/uber-releases-diversity-report-and-repudiates-its-hard-charging-attitude.html
(7) Ibid.
(8) Uber slow on diversity. (2017, March 29). AM New York, p. A2.
(9) Uber. (n.d.). How do we want Uber to look and feel? Retrieved April 4, 2017, from https://www.uber.com/diversity/

Case written by Herbert Sherman, Long Island University

Unit II Case Study
Read "Case 3-1, You Can't Get There From Here: Uber Slow
On Diversity" on page 108 of your textbook. After you have read the case study, write an analysis of the case study. Write an introduction to give context to your paper by explaining what the paper will cover. Then, divide the body of your paper using the seven headers below. Address the points within each section, as indicated under the header.

Employment Law
Identify the employment law under which Susan Fowler's sexual harassment claim would be characterized. Be sure to develop your answer to include your rationale.

Type of Harassment
Identify the type(s) of harassment to which Ms. Fowler was exposed. Be sure to develop your answer to include your rationale.

Uber's Actions
Identify actions Uber has taken to limit their liability relative to sexual harassment charges. Be sure to develop your answer to include your rationale.

EEOC and Affirmative Action
After reviewing Uber's diversity report, does it appear Uber is in violation of any EEOC and affirmative action laws? Be sure to develop your answer to include your rationale.

Diversity Matters
Explain why diversity matters in general and more specifically to Uber. Be sure to develop your answer to include your rationale.

Benefits/Challenges of a Diverse Workforce
Identify and explain the benefits and challenges Uber derives from a more diverse workforce. Be sure to develop your answer to include your rationale.

Legal Provisions of Uber Case
Write a summary that identifies legal provisions or considerations covered within this case study as it relates to a human resource management (HRM) perspective. Conclude with an analysis of your thoughts on how ethics and HRM professional standards are framed by legal provisions
within a specific organization or industry (e.g., business, health care).

Your assignment should be two pages in length, not counting the title or reference pages. Adhere to APA style when constructing this assignment, including in-text citations and references for all sources that are used. Please note that no abstract is needed.

Reviewing Heinrich: Dislodging Two Myths From the Practice of Safety
By Fred A. Manuele

In The Standardization of Error, Stefansson (1928) makes the case that people are willing to accept as fact what is written or spoken without adequate supporting evidence. When studies show that a supposed fact is not true, dislodging it is difficult because that belief has become deeply embedded in the minds of people and, thereby, standardized. Stefansson pleads for a mind-set that accepts as knowledge only that which
can be proven and which cannot be logically contradicted. He states that his theme applies to all fields of endeavor except for mathematics. Safety is a professional specialty in which myths have become standardized and deeply embedded. This article examines two myths that should be dislodged from the practice of safety:

1) Unsafe acts of workers are the principal causes of occupational accidents.
2) Reducing accident frequency will equivalently reduce severe injuries.

These myths arise from the work of H.W. Heinrich (1931; 1941; 1950; 1959). They can be found in the four editions of Industrial Accident Prevention: A Scientific Approach. Although some safety practitioners may not recognize Heinrich's name, his misleading premises are perpetuated as they are frequently cited in speeches and papers. Analytical evidence indicates that these premises are not soundly based, supportable or valid, and, therefore, must be dislodged. Although this
article questions the validity of the work of an author whose writings have been the foundation of safety-related teaching and practice for many decades, it is appropriate to recognize the positive effects of his work as well.

This article was written as a result of encouragement from several colleagues who encountered situations in which these premises were cited as fact, with the resulting recommended preventive actions being inappropriate and ineffective. Safety professionals must do more to inform about and refute these myths so that they may be dislodged.

Recognition: Heinrich's Achievements
Heinrich was a pioneer in the field of accident prevention and must be given his due. Publication of his book's four editions spanned nearly 30 years. From the 1930s to today, Heinrich likely has had more influence than any other individual on the work of occupational safety practitioners. In retrospect, knowing the good done by him in promoting greater attention to occupational safety and health should be balanced with an awareness of the misdirection that has resulted from applying some of his premises.

Heinrich's Sources Unavailable
Attempts were made to locate Heinrich's research, without success. Dan Petersen, who with Nestor Roos, authored a fifth edition of Industrial Accident Prevention, was asked whether they had located Heinrich's research. Petersen said that they had to
IN BRIEF
• This article identifies two myths derived from the work of H.W. Heinrich that should be dislodged from the practice of safety: 1) unsafe acts of workers are the principal causes of occupational accidents; and 2) reducing accident frequency will equivalently reduce severe injuries.
• As knowledge has evolved about how accidents occur and their causal factors, the emphasis is now correctly placed on improving the work system, rather than on worker behavior. Heinrich's premises are not compatible with current thinking.
• A call is issued to safety professionals to stop using and promoting these premises; to dispel these premises in presentations, writings and discussions; and to apply current methods that look beyond Heinrich's myths to determine true causal factors of incidents.

About the author: Fred A. Manuele, P.E., CSP, is president of Hazards Limited, which he formed after retiring from Marsh & McLennan where he was a managing director and manager of M&M Protection Consultants. His books include Advanced Safety Management: Focusing on Z10 and Serious Injury Prevention, On the Practice of Safety, Innovations in Safety Management: Addressing Career Knowledge Needs, and Heinrich Revisited: Truisms or Myths. A professional member
  • 11. of ASSE’s North- eastern Illinois Chapter and an ASSE Fellow, Manuele is a former board member of ASSE, NSC and BCSP. Professional Development Peer-Reviewed http://www.asse.org www.asse.org OCTOBER 2011 ProfessionalSafety 53 rely entirely on the previous editions of Heinrich’s books as resources. Thus, the only data that can be reviewed are contained in Heinrich’s books. His information-gather- ing methods, survey documents that may have been used, the qual- ity of the information gathered and the analytical systems used cannot be examined. Two items of note for this article: Citations from Heinrich’s texts are numbered H-1, H-2, etc., and correspond to the chart in Table 1, which indicates the page numbers and editions in which each ci- tation appears. All other citations appear as in-text references in the journal’s standard style.
  • 12. Furthermore, in today’s social climate, some of Heinrich’s terminology would be considered sex- ist. He uses phrases such as man failure, the foreman and he is responsible. Consider the time in which he wrote. The fourth edition was published in 1959. Psychology & Safety Applied psychology dominates Heinrich’s work with respect to selecting causal factors and is given great importance in safety-related problem resolu- tion. Consider the following: 1) Heinrich expresses the belief that “psy- chology in accident prevention is a fundamen- tal of great importance” (H-1). 2) His premise is that “psychology lies at the root of sequence of accident causes” (H-2). 3) In the fourth edition, Heinrich states that he envisions “the more general acceptance by man- agement of the idea that an industrial psycholo- gist be included as a member of the plant staff as a physician is already so included” (H-3). 4) The focus of applied psychology on the em- ployee, as in the following quotation: Indeed, safety psychology is as fairly appli- cable to the employer as to the employee. The initiative and the chief burden of ac- tivity in accident prevention rest upon the employer; however the practical field of
  • 13. effort for prevention through psychology is confined to the employee, but through management and supervision. (H-4) Note that the focus of applied psychology is on the worker as are other Heinrichean premises. Since application of practical psychology is confined to the worker, who reports to a supervisor, the psy- chology applier is the supervisor. With due respect to managers, supervisors and safety practitioners, it is doubtful that many could knowledgeably apply psychology “as a fundamental of great importance” in their accident prevention efforts. Table 1 Pages Cited by Edition http://www.asse.org 54 ProfessionalSafety OCTOBER 2011 www.asse.org Heinrich’s Causation Theory: The 88-10-2 Ratio Heinrich professes that among the direct and proximate causes of industrial accidents: •88% are unsafe acts of persons; •10% are unsafe mechanical or physical condi- tions; •2% are unpreventable (H-5). According to Heinrich, man failure is the problem and psychology is an important element in correct- ing it. In his discussion of the relation of psychology
  • 14. to accident prevention, Heinrich advocates identi- fying the first proximate and most easily prevented cause in the selection of remedies. He says: Selection of remedies is based on practical cause-analysis that stops at the selection of the first proximate and most easily prevented cause (such procedure is advocated in this book) and considers psychology when re- sults are not produced by simpler analysis. (H-6) Note that the first proximate and most easily prevented cause is to be selected (88% of the time a human error). That concept permeates Hein- rich’s work. It does not encompass what has been learned subsequently about the complexity of ac- cident causation or that other causal factors may be more significant than the first proximate cause. For example, the Columbia Accident Investiga- tion Board (NASA, 2003) notes the need to con- sider the complexity of incident causation: Many accident investigations do not go far enough. They identify the technical cause of the accident, and then connect it to a vari- ant of “operator error.” But this is seldom the entire issue. When the determinations of the causal chain are limited to the technical flaw and individual failure, typically the actions taken to prevent a similar event in the future are also limited: fix the technical problem and replace or retrain the individual responsible. Putting these corrections in place leads to
  • 15. another mistake: The belief that the problem is solved. Too often, accident investigations blame a failure only on the last step in a com- plex process, when a more comprehensive understanding of that process could reveal that earlier steps might be equally or even more culpable. A recent example of the complexity of accident causation appears in this excerpt from the report prepared by BP personnel following the April 20, 2010, Deepwater Horizon explosion in the Gulf of Mexico (BP, 2010): The team did not identify any single action or inaction that caused this incident. Rather, a complex and interlinked series of mechanical failures, human judgments, engineering de- sign, operational implementation and team interfaces came together to allow the initia- tion and escalation of the accident. Consider another real-world situation in which a fatality resulted from multiple causal factors: An operation produces an odorless, color- less highly toxic gas in an enclosed area. The two-level gas detection and alarm system has deteriorated over many years of use, and the system often leaks gas. An internal auditor recommends it be replaced with a three-level system, the accepted practice in the industry for that type of gas. The auditor also recommends that maintenance give the existing system high priority.
  • 16. Management puts high profits above safety and tolerates excessive risk taking. That defines culture problems. Management decides not to replace the system, and fur- thermore begins a cost-cutting initiative that reduces maintenance staff by one-third. The gas detection and alarm system continue to deteriorate, and maintenance staff cannot keep up with the frequent calls for repair and adjustment. A procedure is installed that requires employees to test for gas before entering the enclosed area. But, supervisors condone employees entering the area without making the required test. Both detection and alarm systems fail. Gas accumulates. An employee enters the area without testing for gas. The result is a toxic gas fatality. Causal factor determination would com- mence with the deficiencies in the organiza- tion’s culture whereby: resources were not provided to replace a defective detection and alarm system in a critical area; staffing deci- sions resulted in inadequate maintenance; and excessive risk taking was condoned. The employee’s violation of the established procedure was a contributing factor, but not principle among several factors. Heinrich’s theory that an unsafe act is the sole cause of an accident is not supported in the cited examples. Also, note that Heinrich’s focus on man failure is singular in the following citation: “In the occurrence of accidental injury, it is apparent that
  • 17. man failure is the heart of the problem; equally ap- parent is the conclusion that methods of control must be directed toward man failure” (H-7). [Note: Heinrich does not define man failure. In making the case to support directing efforts toward con- trolling man failure, he cites personal factors such as unsafe acts, using unsafe tools and willful disre- gard of instruction.] A directly opposite view is expressed by Deming (1986). Deming is known for his work in quality principles, which this author finds comparable to the principles required to achieve superior results in safety. The supposition is prevalent throughout the world that there would be no problems in production or service if only our production workers would do their jobs in the way that we taught. Pleasant dreams. The workers are handicapped by the system, and the system belongs to the management. (p. 134) Analytical evidence indicates that several of Heinrich’s premis- es, first introduced in 1931, are not soundly based, supportable or valid, and, therefore, must be dislodged. http://www.asse.org
  • 18. www.asse.org OCTOBER 2011 ProfessionalSafety 55 Of all Heinrich’s concepts, his thoughts on ac- cident causation, expressed as the 88-10-2 ratios, have had a significant effect on the practice of safety, and have resulted in the most misdirection. Why is this so? Because when based on the premise that man failure causes the most accidents, preven- tive efforts are directed at the worker rather than toward the operating system in which the work is performed. Many safety practitioners operate on the belief that the 88-10-2 ratios are soundly based and, as a result, focus their efforts on reducing so-called man failure rather than attempting to improve the system. This belief also perpetuates because it is the path of least resistance for an organization. It is easier for supervisors and managers to be satisfied with taking superficial preventive action, such as retraining a worker, reinstructing the work group or reposting the standard operating procedure, than it is to try to correct system problems. Certainly, operator errors may be causal factors for accidents. However, consider Ferry’s (1981) comments on this subject: We cannot argue with the thought that when an operator commits an unsafe act, leading to a mishap, there is an element of human or operator error. We are, however, decades past the place where we once stopped in our search for causes. Whenever an act is considered unsafe we
  • 19. must ask why. Why was the unsafe act com- mitted? When this question is answered in depth it will lead us on a trail seldom of the operator’s own conscious choosing. (p. 56) If, during an accident investigation, a professional search is made for causal factors beyond an unsafe act, such as through the five-why method, one will likely find that the causal factors built into work sys- tems may be of greater importance than an employ- ee’s unsafe act. Fortunately, a body of literature has emerged that recognizes the significance of causal factors which originate from decisions made above the worker level. Several are cited here. Human Errors Above the Worker Level Much has been written about human error. Par- ticular attention is given to the Guidelines for Pre- venting Human Error in Process Safety (CCPS, 1994). Although process safety appears in the title, the first two chapters provide an easily read primer on hu- man error reduction. The content of those chapters was largely influenced by personnel with plant- or corporate-level safety management experience. Safety practitioners should view the following highlights as generic and broadly applicable. They advise on where human errors occur, who commits them and at what level, the effect of organizational culture and where attention is needed to reduce the occurrence of human errors. These highlights apply to organizations of all types and sizes. •It is readily acknowledged that human errors at the operational level are a primary contributor to
  • 20. the failure of systems. It is often not recognized, however, that these errors frequently arise from failures at the management, design or technical ex- pert levels of the company (p. xiii). •A systems perspective is taken that views error as a natural consequence of a mismatch between human capabilities and demands, and an inappro- priate organizational culture. From this perspec- tive, the factors that directly influence error are ultimately controllable by management (p. 3). •Almost all major accident investigations in re- cent years have shown that human error was a significant causal factor at the level of design, op- erations, maintenance or the management process (p. 5). •One central principle presented in this book is the need to consider the organizational factors that create the preconditions for errors, as well as the immediate causes (p. 5). •Attitudes toward blame will determine whether an organization develops a blame culture, which attributes error to causes such as lack of motivation or deliberate unsafe behavior (p. 5). •Factors such as the degree of participation that is encouraged in an organization, and the quality of the communication between different levels of management and the workforce, will have a major effect on the safety culture (p. 5). Since “failures at the management, design or
  • 21. technical expert levels of the company” affect the design of the workplace and the work methods— that is, the operating system—it is logical to suggest that safety professionals should focus on system improvement to attain acceptable risk levels rather than principally on affecting worker behavior. Reason’s (1997) book, Managing the Risks of Organizational Accidents, is a must-read for safety professionals who want an education in human er- ror reduction. It has had five additional printings since 1997. Reason writes about how the effects of decisions accumulate over time and become the causal factors for incidents resulting in serious in- juries or major damage when all the circumstances necessary for the occurrence of a major event fit together. This book stresses the need to focus on decision making above the worker level to prevent major accidents. Reason states: Latent conditions, such as poor design, gaps in supervision, undetected manufacturing defects or maintenance failures, unworkable procedures, clumsy automation, shortfalls in training, less than adequate tools and equip- ment, may be present for many years before they combine with local circumstances and active failures to penetrate the system’s lay- ers of defenses. They arise from strategic and other top- level decisions made by governments, regulators, manufacturers, designers and or- ganizational managers. The impact of these decisions spreads throughout the organiza- tion, shaping a distinctive corporate culture
  • 22. and creating error-producing factors within the individual workplaces. (p. 10) http://www.asse.org 56 ProfessionalSafety OCTOBER 2011 www.asse.org The traditional occupational safety ap- proach alone, directed largely at the unsafe acts of persons, has limited value with re- spect to the “insidious accumulation of la- tent conditions [that he notes are] typically present when organizational accidents occur. (pp. 224, 239) If the decisions made by management and oth- ers have a negative effect on an organization’s culture and create error-producing factors in the workplace, focusing on reducing human errors at the worker level—the unsafe acts—will not ad- dress the problems. Deming achieved world renown in quality assur- ance. The principle embodied in what is referred to as Deming’s 85-15 rule also applies to safety. The rule supports the premise that prevention efforts should be focused on the system rather than on the worker. This author draws a comparable conclu- sion as a result of reviewing more than 1,700 inci- dent investigation reports. This is the rule, as cited by Walton (1986): “The rule holds that 85% of the problems in any operation are within the system and are the responsibly of management, while only 15% lie with the worker” (p. 242).
  • 23. In 2010, ASSE sponsored the symposium, Re- think Safety: A New View of Human Error and Workplace Safety. Several speakers proposed that the first course of action to prevent human errors is to examine the design of the work system and work methods. Those statements support Dem- ing’s 85-15 rule. Consider this statement by a hu- man error specialist [from this author’s notes]: When errors occur, they expose weakness- es in the defenses designed into systems, processes, procedures and the culture. It is management’s responsibility to anticipate errors and to have systems and work meth- ods designed so as to reduce error potential and to minimize sever- ity of injury potential when errors occur. Since most problems in an operation are systemic, safety efforts should be directed to- ward improving the system. Unfortunately, the use of the terms unsafe acts and unsafe conditions focuses attention on a worker or a condition, and diverts attention from the root-causal factors built into an operation. Allied to Deming’s view is the work of Chapanis, who was prominent in the field of ergonomics and human fac-
  • 24. tors engineering. Represen- tative of Chapanis’s writings is “The Error-Provocative Situation,” a chapter in The Measurement of Safety Perfor- mance (Tarrants, 1980). Cha- panis’s message is that if the design of the work is error-provocative, one can be certain that errors will occur in the form of accident causal factors. It is illogical to conclude in an incident investigation that the principal causal factor is the worker’s un- safe act if the design of the workplace or the work methods is error-inviting. In such cases, the error- producing aspects of the work (e.g., design, layout, equipment, operations, the system) should be con- sidered primary. U.S. Department of Energy (1994) describes the management oversight and risk tree (MORT) as a “comprehensive analytical procedure that provides a disciplined method for determining the systemic causes and contributing factors of accidents.” The following reference to “performance errors” is of particular interest. It should be pointed out that the kinds of questions raised by MORT are directed at systemic and procedural problems. The ex- perience, to date, shows there are a few “un- safe acts” in the sense of blameful work level employee failures. Assignment of “unsafe act” responsibility to a work-level employee should not be made unless or until the pre- ventive steps of 1) hazard analysis; 2) man- agement or supervisory direction; and
  • 25. 3) procedures safety review have been shown to be adequate. (p. 19) Each of these more recent publications refutes the premise that unsafe acts are the primary causes of occupational accidents. Heinrich’s Data Gathering & Analytical Method Heinrich recognized that other studies on acci- dent causation identified both unsafe acts and un- safe conditions as causal factors with almost equal frequency. Those studies produced results different from the 88-10-2 ratios. For example, the Accident Figure 1 Foundation of a Major Injury Note. Adapted from Industrial Accident Prevention: A Scientific Approach (1st ed.) (p. 91), (2nd ed.) (p. 27), (3rd ed.) (p. 24), (4th ed.) (p. 27), by H.W. Heinrich, 1931, 1941, 1950, 1959, New York: McGraw-Hill. Heinrich’s 300-29-1 ratios have been depicted as a tri- angle or a pyramid. http://www.asse.org www.asse.org OCTOBER 2011 ProfessionalSafety 57
  • 26. Prevention Manual for Industrial Operations: Ad- ministration and Programs, 8th edition (NSC, 1980) contains these statements about studies of accident causation: Two historical studies are usually cited to pinpoint the contributing factor(s) to an ac- cident. Both emphasize that most accidents have multiple causes. •A study of 91,773 cases reported in Penn- sylvania in 1953 showed 92% of all nonfatal injuries and 94% of all fatal injuries were due to hazardous mechanical or physical condi- tions. In turn, unsafe acts reported in work injury accidents accounted for 93% of the nonfatal injuries and 97% of the fatalities. •In almost 80,000 work injuries re- ported in that same state in 1960, unsafe condition(s) was identified as a contributing factor in 98.4% of the nonfatal manufactur- ing cases, and unsafe act(s) was identified as a contributing factor in 98.2% of the nonfatal cases. (p. 241) Although aware that others studying accident causation had recognized the multifactorial nature of causes, Heinrich continued to justify selecting a single causal factor in his analytical process. Hein- rich’s data-gathering methods force the accident cause determination into a singular and narrow categorization. The following paragraph is found in the second through fourth editions. It follows an explanation of the study resulting in the formula- tion of the 88-10-2 ratios. “In this research, major
  • 27. responsibility for each accident was assigned either to the unsafe act of a person or to an unsafe me- chanical condition, but in no case were both per- sonal and mechanical causes charged” (H-8). Heinrich’s study resulting in the 88-10-2 ratios was made in the late 1920s. Both the relation- ship of a study made then to the work world as it now exists and the methods used in producing it are questionable and unknown. As to the study methods, consider the following paragraph, which appears in the first edition; minor revisions were made in later editions. Twelve thousand cases were taken at random from closed-claim-file insurance records. They covered a wide spread of territory and a great variety of industrial classifications. Sixty-three thousand other cases were taken from the records of plant owners. (H-9) The source of the data was insurance claims files and records of plant owners, which cannot provide reliable accident causal data because they rarely include causal factors. Nor are accident investiga- tion reports completed by supervisors adequate re- sources for causal data. When this author provided counsel to clients in the early stages of developing computer-based incident analysis systems, insur- ance claims reports and supervisors’ investigation reports were examined as possible sources for causal data. It was rare for insurance claims reports to include provisions to enter causal data. This author has examined more than 1,700 in- cident investigation reports completed by super-
  • 28. visors and investigation teams. In approximately 80% of those reports, causal factor information was inadequate. These reports are not a sound base from which to analyze and conclude with respect to the reality of causal factors. Summation on the 88-10-2 Ratios Heinrich’s data collection and analytical meth- ods in developing the 88-10-2 ratios are unsup- portable. Heinrich’s premise, that unsafe acts are the primary causes of occupational accidents, can- not be sustained. The myth represented by those ratios must be dislodged and actively refuted by safety professionals. An interesting message of support with respect to avoiding use of the 88-10-2 ratios comes from Krause (2005), a major player in worker-focused behavior-based safety: Many in the safety community believe a high percentage of incidents, perhaps 80% to 90%, result from behavioral causes, while the remainder relate to equipment and facilities. We made this statement in our first book in 1990. However, we now recognize that this dichotomy of causes, while ingrained in our culture generally and in large parts of the safety community, is not useful, and in fact can be harmful. (p. 10) The Foundation of a Major Injury: The 300-29-1 Ratios Heinrich’s conclusion with respect to the ratios of incidents that result in no injuries, minor injuries
  • 29. and a major lost-time case was the base on which educators taught and many safety practitioners came to believe that reducing accident frequency will achieve equivalent reduction in injury sever- ity. The following statement appears in all four edi- tions of his text: “The natural conclusion follows, moreover, that in the largest injury group—the minor injuries—lies the most valuable clues to ac- cident causes” (H-10). The following discussion and statistics establish that the ratios upon which the foregoing citation is based are questionable and that reducing incident frequency does not necessarily achieve an equiva- lent reduction in injury severity. Heinrich’s 300-29-1 ratios have been depicted as a triangle or a pyramid (Figure 1). In his first edition, Heinrich writes: Analysis proves that for every mishap re- sulting in an injury there are many other ac- cidents in industry which cause no injuries whatever. From data now available concern- ing the frequency of potential-injury acci- dents, it is estimated that in a unit group of 330 accidents, 300 result in no injuries, 29 in minor injuries, and 1 in a major or lost-time case. (H-11) In the second edition, “similar” was added to the citation: “Analysis proves that for every mishap, there are many other similar accidents in industry . . .” (H-12). Heinrich’s study resulting in the 88-10-2 ratios was
  • 30. made in the late 1920s. Both the relationship of a study made then to the work world as it now exists and the methods used in producing it are questionable and unknown. http://www.asse.org 58 ProfessionalSafety OCTOBER 2011 www.asse.org Within a chart displaying the 300-29-1 ratios in the first edition, Heinrich writes, “The total of 330 accidents all have the same cause.” Note that cause is singular (H-13). This statement, that all 330 in- cidents have the same cause, challenges credulity. Also, note that the sentence quoted in this para- graph appears only in the first edition. It does not appear in later editions (H-14). For background data, Heinrich says in the first, second and third editions: The determination of this no-injury accident frequency followed a most interesting and ab- sorbing study [italics added]. The difficulties can be readily imagined. There were few ex- isting data on minor injuries—to say nothing of no-injury accidents. (H-15) In the fourth edition, published 28 years after the
  • 31. first edition, the source of the data is more specifi- cally stated: The determination of this no-injury accident frequency followed a study of over 5,000 cases [italics added]. The difficulties can be readily imagined. There were few existing data on minor injuries—to say nothing of no-injury accidents. (H-16) The credibility of such a revision after 28 years must be questioned. In Heinrich’s second and third editions, major changes were made in his presen- tation on the ratios, without explanation. 1) The statement in the first edition that the 330 accidents all have the same cause was eliminated. 2) In the second edition, changes were made indicating that the unit group of 330 accidents are “similar” and “of the same kind” (H-17). 3) In the third edition, another significant addi- tion is made. The 330 accidents now are “of the same kind and involving the same person” (H-18). The following appears in the third and fourth editions, encompassing the changes noted. Analysis proves that for every mishap result- ing in an injury there are many other similar accidents that cause no injuries whatever. From data now available concerning the fre- quency of potential-injury accidents, it is es- timated that in a unit group of 330 accidents
  • 32. of the same kind and involving the same person [italics added], 300 result in no injuries, 29 in minor injuries and 1 in a major or lost-time injury. (H-19) These changes are not explained. If the original data were valid, how does one explain the sub- stantial revisions in the conclusions eventually drawn from an analysis of it? In the second, third and fourth editions, Heinrich gives no indication of other data collection activities or of other analy- ses. How does one support using the ratios without having explanations of the differing interpretations Heinrich gives in each edition? The changes made in the 300-29-1 ratios in the second and third editions, and carried over into the fourth edition, present other serious conceptual problems. To which types of accidents does “in a unit group of 330 accidents of the same kind and occurring to the same person” apply? Certainly, it does not apply to some commonly cited incident types, such as falling to a lower level or struck by objects. For example, a construction worker rides the hoist to the 10th floor and within minutes backs into an unguarded floor opening, falling to his death. Heinrich’s ratios would give this person fa- vorable odds of 300 to 330 (10 out of 11) of suffer- ing no injury at all. That is not credible. Consider the feasibility of finding data in the 5,000-plus cases studied to support the ratios, keeping in mind that incidents are to be of the same type and occurring to the same person.
  • 33. •If the number of major or lost-time cases is 1, the number of minor injury case files would be 29 and the number of no-injury case files would be 300. •If the number of major or lost-time cases is 5, the number of minor injury case files would be 145 and the number of no-injury case files would be 1,500. •If the number of major or lost-time cases is 10, the number of minor injury case files would be 290 and the number of no-injury case files would be 3,000. Because of the limitations Heinrich himself im- poses, that all incidents are to be of the same type and occurring to the same person, it is implausible that his database could contain the information necessary for analysis and the conclusions he drew on his ratios. Particularly disconcerting is the need for the database to contain information on more than 4,500 no-injury cases (300 ÷ 330 × 5,000). Un- less a special study was initiated, creating files on no-injury incidents would be a rarity. Given this, one must ask, did a database exist upon which Heinrich established his ratios, then stated the premises that the most valuable clues for accident causes are found in the minor injury cat- egory? This author thinks not. Statistical Indicators: Serious Injury Trending Data on the trending of serious injuries and workers’ compensation claims contradict the
• 34. Statistical Indicators: Serious Injury Trending
Table 2: Injury Reduction Categories. Note. Data from "State of the Line," by National Council on Compensation Insurance, 2005, Boca Raton, FL: Author.
Data on the trending of serious injuries and workers' compensation claims contradict the premise that focusing on incident frequency reduction will equivalently achieve severity reduction. The following data have been extracted from publications of the National Council on Compensation Insurance (NCCI, 2005; 2006; 2009).
• 35. •In 2006, NCCI produced a 12-minute video, The Remarkable Story of Declining Frequency—Down 30% in the Past Decade. It shows that workers' compensation claim frequency was down considerably in the decade cited. The video tells a remarkable but not well-known story.
•A July 2009 NCCI bulletin, "Workers' Compensation Claim Frequency Continues Its Decline in 2008," reported a reduction of 4.0%. A May 2010 NCCI report says that the cumulative reduction in claims frequency from 1991 through 2008 is 54.7% (see the sketch below).
•A 2005 NCCI paper, "Workers' Compensation Claim Frequency Down Again," states, "There has been a larger decline in the frequency of smaller lost-time claims than in the frequency of larger lost-time claims."
Also, consider that NCCI (2005) reports reductions in selected categories of claim values for the years 1999 and 2003, expressed in 2003 hard dollars (Table 2). While the frequency of workers' compensation cases is down, the greatest reductions are for less serious injuries. The reduction in cases valued from $10,000 to $50,000 is about one-third of that for cases valued at less than $2,000. For cases valued above $50,000, the reduction is about one-fifth of that for the less costly and less serious injuries. The data clearly show that a comparable reduction in injury severity does not follow a reduction in injury frequency.
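As a rough cross-check of those NCCI figures, the sketch below (illustrative only, and assuming a constant rate of decline over the 17 year-over-year steps from 1991 through 2008) converts the cumulative 54.7% reduction into an implied average annual decline.

# Back-of-the-envelope check of the cumulative vs. single-year NCCI figures.
cumulative_reduction = 0.547          # NCCI (2010): 54.7% from 1991 through 2008
steps = 2008 - 1991                   # 17 year-over-year changes (assumed)

remaining = 1 - cumulative_reduction  # fraction of the 1991 frequency left by 2008
annual_decline = 1 - remaining ** (1 / steps)

print(f"implied average annual decline: {annual_decline:.1%}")

The implied rate, about 4.5% per year, is consistent with the 4.0% single-year decline NCCI reported for 2008.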
• 36. A DNV (2004) bulletin is another resource of particular note. It states that managing operations to reduce frequency will not equivalently reduce severity.
What about the pyramid? Much has been said over the years about the classical loss control pyramid, which indicates the ratio between no loss incidents, minor incidents and major incidents, and it has often been argued that if you look after the small potential incidents, the major loss incidents will improve also. The major accident reality however is somewhat different. What we find is that if you manage the small incidents effectively, the small incident rate improves, but the major accident rate stays the same, or even slightly increases.
Contradictions: Unsafe Acts & Injuries
Heinrich's texts contain contradictions about when a major injury would occur and the relationship between unsafe acts and a major injury. In all editions, reference is made to 330 careless acts or several hundred unsafe acts occurring before a major injury occurs, as in the following examples from the first and third editions.
•"Keep in mind that a careless act occurs approximately 300 times before [italics added] a serious injury results and that there is, therefore, an excellent opportunity to detect and correct unsafe practices before injury occurs" (H-20).
  • 37. •“Keep in mind that an unsafe act occurs several hundred times before [italics added] a serious injury results” (H-21). Before is a key word here. While an unsafe act may be performed several times before a particu- lar accident occurs, that is not the case in a large majority of incidents which result in serious injury or fatality. In his fourth edition, Heinrich gave this view of the relationship of unsafe acts or exposures to mechanical hazards. If it were practicable to carry on appropriate research, still another base therefore could be established showing that from 500 to 1,000 or more unsafe acts or exposures to mechan- ical hazards existed in the average case be- fore even one of the 300 narrow escapes from injury (events-accidents) occurred. (H-22) There is a real problem here. All of those unsafe acts or exposures to mechanical hazards take place before even one accident occurs. That is illogical. Summation on the 300-29-1 Ratios Use of the 300-29-1 ratios is troubling. Since the ratios are not soundly based, one must ask whether the ratios have any substance. Does their use as a base for a safety management system result in a concentration of resources on the frequent and lesser significant while ignoring opportunities to reduce the more serious injuries? One of Heinrich’s premises is that “the predomi-
• 38. "the predominant causes of no-injury accidents are, in average cases, identical with the predominant causes of major injuries, and incidentally of minor injuries as well." This is wrong. It is a myth that must be dislodged from the practice of safety. Applying this premise leads to misdirection in resource application and ineffectiveness, particularly with respect to preventing serious injuries. In this author's experience, many incidents resulting in serious injury are singular and unique events, with multifaceted and complex causal factors, and descriptions of similar incidents are rare in the historical body of incident data. Furthermore, all hazards do not have equal potential for harm. Some risks are more significant than others. That requires priority setting.
Misinterpretation of Terms
Not only have many safety practitioners used the 300-29-1 ratios in statistical presentations, but many also have misconstrued what Heinrich intended with the terms major injury, minor injury and no-injury accidents. Some practitioners who cite these ratios in their presentations assume that a "major injury" is a serious injury or a fatality. In each edition, Heinrich gave nearly identical definitions of the accident categories to which the 300-29-1 ratios apply. This is how the definition reads in the fourth edition.
• 39. In the accident group (330 cases), a major injury is any case that is reported to insurance carriers or to the state compensation commissioner. A minor injury is a scratch, bruise or laceration such as is commonly termed a first-aid case. A no-injury accident is an unplanned event involving the movement of a person or an object, ray or substance (e.g., slip, fall, flying object, inhalation) having the probability of causing personal injury or property damage. The great majority of reported or major injuries are not fatalities or fractures or dismemberments; they are not all lost-time cases, and even those that are do not necessarily involve payment of compensation. (H-20) These definitions compel the conclusion that any injury requiring more than first-aid treatment is a major injury.
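A minimal sketch of those quoted definitions follows (a hypothetical reader's aid, not Heinrich's or this article's material); it simply shows how any case reported to an insurance carrier or the state compensation commissioner falls into the major injury category, whatever its clinical severity.

# Reader's sketch of the quoted fourth-edition categories.
def heinrich_category(reported_to_carrier_or_commissioner: bool,
                      first_aid_only: bool) -> str:
    """Classify a case using the fourth-edition definitions quoted above."""
    if reported_to_carrier_or_commissioner:
        return "major injury"          # reported cases, fatal or not
    if first_aid_only:
        return "minor injury"          # scratch, bruise or laceration
    return "no-injury accident"        # unplanned event with injury potential only

print(heinrich_category(True, False))   # a lost-time sprain  -> "major injury"
print(heinrich_category(False, True))   # a first-aid cut     -> "minor injury"
print(heinrich_category(False, False))  # a near miss         -> "no-injury accident"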
• 40. When these definitions were developed in the late 1920s, few companies were self-insured for workers' compensation. On-site medical facilities were rare. Insurance companies typically paid for medical-only claims and for minor and major injuries. According to Heinrich's definitions, almost all such claims would be considered major injuries. Then, is it not so that every OSHA recordable injury is a major injury in this context? Heinrich's 300-29-1 ratios have been misused and misrepresented many times as well. For example, a safety director recently said that in the previous year his company sustained one fatality and 30 OSHA days-away-from-work incidents, and, therefore, Heinrich's progression was validated. Not so. All of the injuries and the fatality would be in the major or lost-time injury category. In another instance, a speaker referred to Heinrich's 300-29-1 ratios and said that the 300 were unsafe acts, the 29 were serious injuries and the 1 was a fatality. These are but two examples of the many misuses of these ratios.
Heinrich's Premises Versus Current Safety Knowledge
Heinrich emphasized improving an individual worker's performance, rather than improving the work system established by management. That is not compatible with current knowledge. Unfortunately, some safety practitioners continue to base their counsel on Heinrich's premises, which narrows the scope of their activities as they attempt principally to improve worker performance. In doing so, they ignore the knowledge that has evolved in the professional practice of safety. A few examples follow:
• 41. •Hazards are the generic base of, and the justification for the existence of, the practice of safety.
•Risk is an estimate of the probability of a hazard-related incident or exposure occurring and the severity of harm or damage that could result.
•The entirety of purpose of those responsible for safety, regardless of their titles, is to manage their endeavors with respect to hazards so that the risks deriving from those hazards are acceptable.
•All risks to which the practice of safety applies derive from hazards. There are no exceptions.
•Hazards and risks are most effectively and economically avoided, eliminated or controlled in the design and redesign processes.
•The professional practice of safety requires consideration of the two distinct aspects of risk: 1) avoiding, eliminating or reducing the probability of a hazard-related incident or exposure occurring; 2) reducing the severity of harm or damage if an incident or exposure occurs.
•Management creates the safety culture, whether positive or negative.
•An organization's culture, translated into a system of expected behavior, determines management's commitment or lack of commitment to safety and the level of safety achieved.
• 42. •Principal evidence of an organization's culture with respect to occupational risk management is demonstrated through the design decisions that determine the facilities, hardware, equipment, tooling, materials, processes, configuration and layout, work environment and work methods.
•Major improvements in safety will be achieved only if a culture change takes place, only if major changes occur in the system of expected behavior.
•While human errors may occur at the worker level, preconditions for the commission of such errors may derive from decisions made with respect to the workplace and work methods at the management, design, engineering or technical expert levels of an organization.
•Greater progress can be obtained with respect to safety by focusing on system improvement to achieve acceptable risk levels, rather than through modifying worker behavior.
•A large proportion of problems in an operation are systemic, deriving from the workplace and work methods created by management, and can be resolved only by management. Responsibility for only a relatively small remainder lies with the worker.
•While employees should be trained and empowered up to their capabilities and encouraged to make contributions with respect to hazard identification and analysis, and risk elimination or control, they should not be expected to do what they cannot do.
• 43. •Accidents usually result from multiple and interacting causal factors that may have organizational, cultural, technical or operational systems origins.
•If accident investigations do not relate to actual causal factors, corrective actions taken will be misdirected and ineffective.
•Causal factors for low-probability/high-consequence events are rarely represented in the analytical data on incidents that occur frequently, and the uniqueness of serious injury potential must be adequately addressed. However, accidents that occur frequently may be predictors of severity potential if a high energy source was present (e.g., operation of powered mobile equipment, electrical contacts).
As this list demonstrates, Heinrich's premises are not compatible with
• 44. current knowledge.
Conclusion
As knowledge has evolved about how accidents occur and their causal factors, the emphasis is now properly placed on improving the work system, rather than on worker behavior. As one colleague, disturbed by safety professionals who reference Heinrich premises as fact, says, "It is borderline unethical on their part." This article has reviewed the origin of certain premises that have been accepted as truisms by many educators and safety practitioners, and how they evolved and changed over time; it also examined their validity. The two premises discussed here are wrongly based and cannot be sustained by safety practitioners. The premises themselves and the methods used to establish them cannot withstand a logic test. They are myths that have become deeply embedded in the practice of safety and safety professionals must take action to dislodge them. PS
References
BP. (2010, Sept. 8). Deepwater Horizon accident investigation report. Houston, TX: Author. Retrieved Aug. 30, 2011, from www.bp.com/liveassets/bp_internet/globalbp/globalbp_uk_english/incident_response/
• 45. STAGING/local_assets/downloads_pdfs/Deepwater_Horizon_Accident_Investigation_Report.pdf.
Center for Chemical Process Safety (CCPS). (1994). Guidelines for preventing human error in process safety. New York: Author.
Columbia Accident Investigation Board. (2003). Columbia accident investigation report, Vol. 1. Washington, DC: NASA. Retrieved Aug. 30, 2011, from www.nasa.gov/columbia/home/CAIB_Vol1.html.
Deming, W.E. (1986). Out of the crisis. Cambridge, MA: Center for Advanced Engineering Study, Massachusetts Institute of Technology.
Det Norske Veritas (DNV) Consulting. (2004). Leading indicators for major accident hazards: An invitation to industry partners. Houston, TX: Author.
Ferry, T.S. (1981). Modern accident investigation and analysis: An executive guide. New York: John Wiley & Sons.
Heinrich, H.W. (1931, 1941, 1950, 1959). Industrial accident prevention: A scientific approach. New York: McGraw-Hill. (See Table 1, p. 53 for specific references.)
Heinrich, H.W., Petersen, D. & Roos, N. Industrial accident prevention (5th ed.). New York: McGraw-Hill.
Krause, T.R. (2005). Leading
• 46. with safety. Hoboken, NJ: John Wiley & Sons.
Manuele, F.A. (2002). Heinrich revisited: Truisms or myths. Itasca, IL: National Safety Council.
Manuele, F.A. (2003). On the practice of safety (3rd ed.). New York: John Wiley & Sons.
Manuele, F.A. (2008, Dec.). Serious injuries and fatalities: A call for a new focus on their prevention. Professional Safety, 53(12), 32-39.
National Council on Compensation Insurance (NCCI). (2005, May). State of the line. Boca Raton, FL: Author. Retrieved Aug. 30, 2011, from www.ncci.com/media/pdf/SOL_2005.pdf.
NCCI. (2006, June). Workers' compensation claim frequency down again in 2005 [Research brief]. Boca Raton, FL: Author. Retrieved Aug. 30, 2011, from www.ncci.com/documents/research-brief-august06.pdf.
NCCI. (2006, Nov.). The remarkable story of declining frequency—down 30% in the past decade [Video]. Boca Raton, FL: Author. Retrieved Aug. 30, 2011, from www.ncci.com/nccimain/IndustryInformation/NCCIVideos/ArchivedArticles/Pages/video_declining_frequency_11-06.aspx.
NCCI. (2009, July). Workers' compensa-
• 47. tion claim frequency continues its decline in 2008 [Research brief]. Boca Raton, FL: Author. Retrieved Aug. 30, 2011, from www.ncci.com/Documents/WorkersCompensationClaimFrequency2008.pdf.
NCCI. (2010, May). State of the line. Boca Raton, FL: Author. Retrieved Aug. 30, 2011, from www.ncci.com/Documents/AIS-2010-SOL-Presentation.pdf.
National Safety Council (NSC). (1980). Accident prevention manual for industrial operations: Administration and programs (8th ed.). Itasca, IL: Author.
Reason, J. (1997). Managing the risks of organizational accidents. London: Ashgate Publishing Co.
Stefansson, V. (1928). The standardization of error. London: K. Paul, Trench, Trubner & Co. Ltd.
Tarrants, W.E. (1980). The measurement of safety performance. New York: Garland Publishing Co.
U.S. Department of Energy. (1994). Guide to use of the management oversight and risk tree (SSDC-103). Washington, DC: Author.
Walton, M. (1986). The Deming management method. New York: The Putnam Publishing Group.
• 48. Recommendations
Safety professionals should ensure that the Heinrich misconceptions discussed in this article are discarded by the profession. To achieve this, each safety professional should:
•Stop using or promoting the premises that unsafe acts are the primary causes of accidents and that focusing on reducing accident frequency will equivalently reduce injury severity.
•Actively dispel these premises in presentations, writings and discussions.
•Politely but firmly refute allegations by others who continue to promote the validity of these premises.
•Apply current methods that look beyond Heinrich's myths to determine true causal factors of accidents.
Acknowledgment
Parts of this article are updated material from three of the author's works: Heinrich Revisited: Truisms or Myths; chapter seven in On the Practice of Safety (3rd ed.); and the article, "Serious Injuries and Fatalities: A Call for a New Focus on Their Prevention," from the December 2008 issue of Professional Safety.
  • 51. Injury Control & Safety Promotion – 2000, Vol. 7, No. 1, pp. 3-14 © Swets & Zeitlinger 2000 Accepted 15 November 1999 Safety paradoxes and safety culture James Reason Department of Psychology, University of Manchester, U.K. Abstract This paper deals with four safety paradoxes: (1) Safety is defined and measured more by its absence than its presence. (2) De- fences, barriers and safeguards not only protect a system, they can also cause its catastrophic breakdown. (3) Many organisations seek to limit the variability of human action, primarily to minimise error, but it is this same variability – in the form of timely adjustments to unexpected events – that maintains safety in a dynamic and changing world. (4) An unquestioning belief in the attainability of absolute safety can seriously impede the achievement of realisable safety goals, while a preoccupa- tion with failure can lead to high reliability. Drawing extensively upon the study of high reliability organisations (HROs), the paper argues that a collective understanding of these paradoxes is essential for those
  • 52. organisations seeking to achieve an optimal safety culture. It concludes with a consideration of some practical implications. Key words Safety promotion; culture; defences; errors; adaptabil- ity; beliefs; psychological factors; human behaviour Introduction A paradox is ‘a statement contrary to received opin- ion; seemingly absurd though perhaps well-founded’ (Concise Oxford Dictionary). This paper contends that the pursuit of safety abounds with paradox, and that this is especially true of efforts to achieve a safer organisational culture. In safety, as in other highly interactive spheres, things are not always what they seem. Not only can they be contrary to surface appearances, they can also run counter to some of our most cherished beliefs. The better we understand these paradoxes, the more likely we are to create and sustain a truly safe culture. A safe culture is an informed culture, one that knows continually where the ‘edge’ is without necessarily having to fall over it. The ‘edge’ lies between relative safety and unacceptable danger. In many industries, proximity to the ‘edge’ is the zone of greatest peril and also of greatest profit.1 Navigating this area requires considerable skill on
  • 53. the part of system managers and operators. Since such individuals come and go, however, only a safe culture can provide any degree of lasting protection. Simply identifying the existence of a paradox is not enough. Unlike the ‘pure’ sciences, in which theories are assessed by how much em- 9920.p65 1/27/00, 11:11 AM3 J. Reason4 pirical activity they provoke, the insights of safety scientists and safety practitioners are ultimately judged by the extent to which their practical application leads to safer systems. Each of the paradoxes considered below has important practical implications for the achievement of a safe culture. Indeed, it will be argued that a shared understanding of these paradoxes is a prerequisite for acquiring an optimal safety cul- ture. Most of the apparent contradictions discussed in this paper have been revealed not so much by the investigation of adverse events – a topic that comprises the greater part of safety research – as from the
  • 54. close observation of high reliability organisations (HROs). Safety has both a negative and a positive face. The former is revealed by accidents with bad outcomes. Fatalities, injuries and environmental damage are con- spicuous and readily quantifiable occurrences. Avoiding them as far as possible is the objective of the safety sciences. It is hardly surprising, therefore, that this darker face has occupied so much of our attention and shaped so many of our beliefs about safety. The positive face, on the other hand, is far more secretive. It relates to a system’s intrinsic resistance to its operational hazards. Just as medicine knows more about pathology than health, so also do the safety sciences understand far more about how bad events happen than about how human actions and organisational processes also lead to their avoidance, detection and containment. It is this imbalance that has largely created the paradoxes. The remainder of the paper is in six parts. The next section previews the four safety paradoxes to be considered here. The ensuing four sec- tions each consider one of these safety paradoxes in more detail. The concluding section summarises the practical implications of these par-
  • 55. adoxes for achieving and preserving a safer culture. Previewing the safety paradoxes • Safety is defined and measured more by its absence than by its presence. • Measures designed to enhance a system’s safety – defences, barriers and safeguards – can also bring about its destruction. • Many, if not most, engineering-based organisations believe that safe- ty is best achieved through a predetermined consistency of their pro- cesses and behaviours, but it is the uniquely human ability to vary and adapt actions to suit local conditions that preserves system safe- ty in a dynamic and uncertain world. • An unquestioning belief in the attainability of absolute safety (zero accidents or target zero) can seriously impede the achievement of realisable safety goals. A further paradox embodies elements from all of the above. If an or- ganisation is convinced that it has achieved a safe culture, it almost certainly has not. Safety culture, like a state of grace, is a product of continual striving. There are no final victories in the struggle for safety.
  • 56. The first paradox: how safety is defined and assessed The Concise Oxford Dictionary defines safety as ‘freedom from danger and risks’. But this tells us more about what comprises ‘unsafety’ than 9920.p65 1/27/00, 11:11 AM4 5Safety paradoxes about the substantive properties of safety itself. Such a definition is clearly unsatisfactory. Even in the short term, as during a working day or on a particular journey, we can never escape danger – though we may not experience its adverse consequences in that instance. In the longer term, of course, most of the risks and hazards that beset human activities are universal constants. Gravity, terrain, weather, fire and the potential for uncontrolled releases of mass, energy and noxious sub- stances are ever-present dangers. So, in the strict sense of the defini- tion, we can never be safe. A more appropriate definition of safety would be ‘the ability of individuals or organisations to deal with risks and hazards so as to avoid damage or losses and yet still achieve their goals’.
  • 57. Even more problematic, however, is that safety is measured by its occasional absences. An organisation’s safety is commonly assessed by the number and severity of negative outcomes (normalised for expo- sure) that it experiences over a given period. But this is a flawed metric for the reasons set out below. First, the relationship between intrinsic ‘safety health’ and negative outcomes is, at best, a tenuous one. Chance plays a large part in caus- ing bad events – particularly so in the case of complex, well- defended technologies.2 As long as hazards, defensive weaknesses and human fallibility continue to co-exist, unhappy chance can combine them in various ways to bring about a bad event. That is the essence of the term ‘accident’. Even the most resistant organisations can suffer a bad acci- dent. By the same token, even the most vulnerable systems can evade disaster, at least for a time. Chance does not take sides. It afflicts the deserving and preserves the unworthy. Second, a general pattern in organisational responses to a safety management programme is that negative outcome data decline rapidly at first and then gradually bottom out to some asymptotic value. In commercial aviation, for example, a highly safety conscious
  • 58. industry, the fatal accident rate has remained relatively unchanged for the past 25 years.3 Comparable patterns are found in many other domains. During the period of rapid decline, it seems reasonable to suppose that the marked diminution in accident rates actually does reflect some im- provement in a system’s intrinsic ‘safety health’. But once the plateau has been reached, periodic variations in accident rates contain more noise than valid safety signals. At this stage of an organisation’s safety development, negative outcome data are a poor indication of its ability to withstand adverse events in the future. This is especially true of well-defended systems such as commercial aviation and nuclear power generation that are, to a large extent, victims of their own success. By reducing accident rates to a very low level they have largely run out of ‘navigational aids’ by which to steer towards some safer state. The diminution in accident rates that is apparent in most domains is a product not only of local safety management efforts, but also of a growing public intolerance for third-party risks, environmental damage and work-related injuries. This, in turn, has led to increasingly compre- hensive safety legislation in most industrialised nations. Even
  • 59. in the least responsible organisations, merely keeping one step ahead of the regulator requires the implementation of basic safety measures that are 9920.p65 1/27/00, 11:11 AM5 J. Reason6 often sufficient to bring about dramatic early reductions in accident rates. The important issue, however, is what happens once the plateau has been reached. It is at this point that an organisation’s safety culture takes on a profound significance. Getting from bad to average is rela- tively easy; getting from average to excellent is very hard. And it is for the latter purpose that an understanding of the paradoxes is crucial. In summary: while high accident rates may reasonably be taken as indicative of a bad safety state, low asymptotic rates do not necessarily signal a good one. This asymmetry in the meaning of negative outcome data lies at the heart of many of the subsequent paradoxes to be dis- cussed later. It also has far-reaching cultural implications. There are at least two ways to interpret very low or nil accident rates in a
  • 60. given accounting period. A very common one is to believe that the organisa- tion actually has achieved a safe state: that is, it takes no news as good news and sends out congratulatory messages to its workforce. High- reliability organisations, on the other hand, become worried, accepting that no news really is no news, and so adopt an attitude of increased vigilance and heightened defensiveness.4,5 The second paradox: dangerous defences A theme that recurs repeatedly in accident reports is that measures designed to en- hance a system’s safety can also bring about its destruction. Since this paradox has been discussed at length elsewhere,6,7 we will focus on its cultural implications. Let us start with some examples of defensive failures that cover a range of domains. • The Chernobyl disaster had its local origins in an attempt to test an electrical safety device designed to overcome the interruption of power to the emergency core cooling system that would ensue im- mediately after the loss of off-site electricity and before the on- site auxiliary generators were fully operative.8 • The advanced automation present in many modern technologies was designed, in part, to eliminate opportunities for human error.
  • 61. Expe- rience in several domains, however, has shown that automation can create mode confusions and decision errors that can be more danger- ous than the slips and lapses it was intended to avoid.9,10 • Emergency procedures are there to guide people to safety in the event of a dangerous occurrence. In a number of instances, however, strict compliance with safety procedures has killed people. On Piper Alpha, the North Sea gas and oil platform that exploded in 1988, most of the 165 rig workers that died complied strictly with the safety drills and assembled in the accommodation area. Tragically, this was directly in line with a subsequent explosion.11 The few fire- fighters that survived the Mann Gulf forest fire disaster in 1949 dropped their heavy tools and ran, while those who died obeyed the organisational instruction to keep their fire-fighting tools with them at all times.12 • Personal protective equipment can save many lives, but it can also pose a dangerous threat to certain groups of people. Swedish traffic accident studies have revealed that both elderly female drivers and infants in backward-facing seats have been killed by rapidly inflating airbags following a collision.13
  • 62. 9920.p65 1/27/00, 11:11 AM6 7Safety paradoxes • Finally, perhaps the best example of the defence paradox is that maintenance activities – intended to repair and forestall technical failures – are the largest single source of human factors problems in the nuclear power industry.14,15 In commercial aviation, quality laps- es in maintenance are the second most significant cause of passenger deaths.16 There is no single reason why defences are so often instrumental in bringing about bad events. Errors in maintenance, for example, owe their frequency partly to the hands- on, high-opportunity nature of the task, and partly to the fact that certain aspects of maintenance, partic- ularly installation and reassembly, are intrinsically error- provoking regardless of who is doing the job.6 But some of the origins of the defensive paradox have strong cultural overtones. We can summarise these cultural issues under three headings: the trade-off problem, the control problem and the opacity problem.
  • 63. the trade-off problem An important manifestation of an organ- isation’s cultural complexion is the characteristic way it resolves con- flicts. Virtually all of the organisations of concern here are in the busi- ness of producing something: manufactured goods, energy, services, the extraction of raw materials, transportation and the like. All such activities involve the need to protect against operational hazards. A universal conflict, therefore, is that between production and protection. Both make demands upon limited resources. Both are essential. But their claims are rarely perceived as equal. It is production rather than protection that pays the bills, and those who run these organisations tend to possess productive rather than protective skills. Moreover, the information relating to the pursuit of productive goals is continuous, credible and compelling, while the information relating to protection is discontinuous, often unreliable, and only intermittently compelling (i.e., after a bad event). It is these factors that lie at the root of the trade-off problem. This problem can best be expressed as that of trading protec- tive gains for productive advantage. It has also been termed risk ho- meostasis17 or risk compensation – the latter term is preferable since it
  • 64. avoids some of Wilde’s more controversial assumptions.18 The trade-off problem has been discussed at length elsewhere.18-20 Just one example will be sufficient to convey its essence. The Davy lamp, invented in 1815, was designed to isolate the light source, a naked flame, from the combustible gases present in mines. But the mine owners were quick to see that it also allowed miners to work on seams previously regarded as too dangerous. The incidence of mine explosions increased dramatically, reaching a peak in the 1860s.20 Improvements in protection afforded by technological developments are often put in place during the aftermath of a disaster. Soon, however, this increased protection is seen as offering commercial advantage, leaving the organisation with the same or even less protection than it had previously. the control problem Another challenge facing all organisations is how to restrict the enormous variability of human behaviour to that 9920.p65 1/27/00, 11:11 AM7
  • 65. J. Reason8 which is both productive and safe. Organisational managers have a variety of means at their disposal:21,22 administrative controls (prescrip- tive rules and procedures), individual controls (selection, training and motivators), group controls (supervision, norms and targets) and tech- nical controls (automation, engineered safety features, physical barri- ers). In most productive systems, all of these controls are used to some degree; but the balance between them is very much a reflection of the organisational culture. What concerns us here, however, is the often disproportionate reliance placed upon prescriptive procedures. Standard operating procedures are necessary. This is not in dispute. Since people change faster than jobs, it is essential that an organisa- tion’s collective wisdom is recorded and passed on. But procedures are not without problems, as indicated by some of the examples listed above. They are essentially feed-forward control devices – prepared at one time and place to be applied at some future time and place – and they suffer, along with all such control systems, the problem of dealing with local variations. Rule-based controls can encounter at least
  • 66. three kinds of situation: those in which they are correct and appropriate, those in which they are inapplicable due to local conditions, and those in which they are absent entirely. A good example of the latter is the predicament facing Captain Al Haynes and his crew in United 232 when he lost all three hydraulic systems on his DC10 due to the explo- sion of his tail-mounted, number two engine.23 The probability of los- ing all three hydraulic systems was calculated at one in a billion, and there were no procedures to cover this unlikely emergency. Far more common, however, are situations in which the procedures are unwork- able, incomprehensible or simply wrong. A survey carried out in the US nuclear industry, for example, identified poor procedures as a factor in some 60% of all human performance problems.15 There is a widespread belief among the managers of highly procedur- alised organisations that suitable training, along with rigid compliance, should eliminate the vast majority of human unsafe acts. When such errors and violations do occur, they are often seen as moral issues warranting sanctions. But, for the most part, punishing people does not eliminate the systemic causes of their unsafe acts. Indeed, by
  • 67. isolating individual actions from their local context, it can impede their discov- ery. the opacity problem In the weeks following some foreign tech- nological disaster, we often hear our country’s spokespeople claiming that it couldn’t happen here because our barriers and safeguards are so much more sophisticated and extensive. This assertion captures an important consequence of the opacity problem: the failure to realise that defences, particularly defences-in-depth, can create and conceal dangers as well as protect against them. When this ignorance leads to a collective belief in the security of high-technology systems, the prob- lem takes on cultural significance. Defences-in-depth are created by diversity and redundancy. Barriers and safeguards take many forms. ‘Hard’ defences include automated safety features, physical containment, alarms and the like. ‘Soft’ de- fences include rules and procedures, training, drills, briefings, permit- 9920.p65 1/27/00, 11:11 AM8 9Safety paradoxes
  • 68. to-work systems and many other measures that rely heavily on people and paper. This assortment of safety-enhancing measures is widely distributed throughout the organisation. This makes such extensively defended systems especially vulnerable to the effects of an adverse safety culture. Only culture can reach equally into all parts of the sys- tem and exert some consistent effect, for good or ill.24 While such diversity has undoubtedly enhanced the security of high- technology systems, the associated redundancy has proved to be a mixed blessing. By increasing complexity, it also makes the system more opaque to those who manage and control it.7,25,26 The opacity problem takes a variety of forms. • Operator and maintainer failures may go unnoticed because they are caught and concealed by multiple backups.27 • Such concealment allows undiscovered errors and latent conditions (resident pathogens) to accumulate insidiously over time, thus in- creasing the possibility of inevitable weaknesses in the defensive layers lining up to permit the passage of an accident trajectory.6,28
  • 69. • By adding complexity to the system, redundant defences also in- crease the likelihood of unforeseeable common-mode failures. While the assumption of independence may be appropriate for purely tech- nical failures, errors committed by managers, operators and main- tainers are uniquely capable of creating problems that can affect a number of defensive layers simultaneously. At Chernobyl, for exam- ple, the operators successively disabled a number of supposedly in- dependent, engineered safety features in pursuit of their testing pro- gramme. Dangerous concealment combined with the obvious technological so- phistication of redundant defences can readily induce a false sense of security in system managers, maintainers and operators. In short, they forget to be afraid – or, as in the case of the Chernobyl operators, they never learn to be afraid. Such complacency lies on the opposite pole from a safe culture. The third paradox: consistency versus variability Holl- nagel20 conducted a survey of the human factors literature to identify the degree to which human error has been implicated in accident cau- sation over the past few decades. In the 1960s, when the
  • 70. problem first began to attract serious attention, the estimated contribution of human error was around 20%. By the 1990s, this figure had increased fourfold to around 80%. One of the possible reasons for this apparent growth in human fallibility is that accident investigators are now far more con- scious that contributing errors are not confined to the ‘sharp end’ but are present at all levels of a system, and even beyond. Another is that the error causal category has, by default, moved more and more into the investigatory spotlight due to great advances in the reliability of mechanical and electronic components over the past forty years. Whatever the reason, the reduction – or even elimination – of human error has now become one of the primary objectives of system manag- ers. Errors and violations are viewed, reasonably enough, as deviations from some desired or appropriate behaviour. Having mainly an engi- 9920.p65 1/27/00, 11:11 AM9 J. Reason10 neering background, such managers attribute human unreliability to
  • 71. unwanted variability. And, as with technical unreliability, they see the solution as one of ensuring greater consistency of human action. They do this, as we have seen, through procedures and by buying more automation. What they often fail to appreciate, however, is that human variability in the form of moment-to-moment adaptations and adjust- ments to changing events is also what preserves system safety in an uncertain and dynamic world. And therein lies the paradox. By striving to constrain human variability, they are also undermining one the sys- tem’s most important safeguards. The problem has been encapsulated by Weick’s insightful observa- tion5 that ‘reliability is a dynamic non-event.’ It is dynamic because processes remain under control due to compensations by human com- ponents. It is a non-event because safe outcomes claim little or no attention. The paradox is rooted in the fact that accidents are salient, while non-events, by definition, are not. Almost all of our methodolog- ical tools are geared to investigating adverse events. Very few of them are suited to creating an understanding of why timely adjustments are necessary to achieve successful outcomes in an uncertain and dynamic
  • 72. world Recently, Weick et al.4 challenged the received wisdom that an or- ganisation’s reliability depends upon the consistency, repeatability and invariance of its routines and activities. Unvarying performance, they argue, cannot cope with the unexpected. To account for the success of high reliability organisations (HROs) in dealing with unanticipated events, they distinguish two aspects of organisational functioning: cog- nition and activity. The cognitive element relates to being alert to the possibility of unpleasant surprises and having the collective mindset necessary to detect, understand and recover them before they bring about bad consequences. Traditional ‘efficient’ organisations strive for stable activity patterns yet possess variable cognitions – these differing cognitions are most obvious before and after a bad event. In HROs, on the other hand, ‘there is variation in activity, but there is stability in the cognitive processes that make sense of this activity’.4 This cognitive stability depends critically upon an informed culture – or what Weick and his colleagues have called ‘collective mindfulness’. Collective mindfulness allows an organisation to cope with the unan-
  • 73. ticipated in an optimal manner. ‘Optimal’ does not necessarily mean ‘on every occasion’, but the evidence suggests that the presence of such enduring cognitive processes is a critical component of organisational resilience. Since catastrophic failures are rare events, collectively mind- ful organisations work hard to extract the most value from what little data they have. They actively set out to create a reporting culture by commending, even rewarding, people for reporting their errors and near misses. They work on the assumption that what might seem to be an isolated failure is likely to come from the confluence of many ‘up- stream’ causal chains. Instead of localising failures, they generalise them. Instead of applying local repairs, they strive for system reforms. They do not take the past as a guide to the future. Aware that system failures can take a wide variety of yet-to-be-encountered forms, they are continually on the lookout for ‘sneak paths’ or novel ways in which 9920.p65 1/27/00, 11:11 AM10 11Safety paradoxes active failures and latent conditions can combine to defeat or
  • 74. by-pass the system defences. In short, HROs are preoccupied with the possibil- ity of failure – which brings us to the last paradox to be considered here. The fourth paradox: target zero Some years ago, US Vice- President Al Gore declared his intention of eradicating transport acci- dents. Comparable sentiments are echoed by the top managers of by- the-book companies, those having what Westrum29 has called ‘calculative’ cultures. They announce a corporate goal of ‘zero acci- dents’ and then set their workforce the task of achieving steadily di- minishing accident targets year by year – what I have earlier termed the ‘negative production’ model of safety management. It is easy to understand and to sympathise with such goal- setting. A truly committed management could hardly appear to settle for anything less. But ‘target zero’ also conveys a potentially dangerous misrepre- sentation of the nature of the struggle for safety: namely, that the ‘safe- ty war’ could end in a decisive victory of the kind achieved by a Waterloo or an Appomattox. An unquestioning belief in victory can lead to defeat in the ‘safety war’. The key to relative success, on the other hand, seems to be an abiding concern with failure
  • 75. HROs see the ‘safety war’ for what it really is: an endless guerrilla conflict. They do not seek a decisive victory, merely a workable sur- vival that will allow them to achieve their productive goals for as long as possible. They know that the hazards will not go away, and accept that entropy defeats all systems in the end. HROs accept setbacks and nasty surprises as inevitable. They expect to make errors and train their workforce to detect and recover them. They constantly rehearse for the imagined scenarios of failure and then go on to brainstorm novel ones. In short, they anticipate the worst and equip themselves to cope with it. A common response to these defining features of HROs is that they seem excessively bleak. ‘Doom-laden’ is a term often applied to them. Viewed from a personal perspective, this is an understandable reaction. It is very hard for any single individual to remain ever mindful of the possibility of failure, especially when such occurrences have personal significance only on rare occasions. No organisation is just in the busi- ness of being safe. The continuing press of productive demands is far more likely to engage the forefront of people’s minds than the possi-
  • 76. bility of some unlikely combination of protective failures. This is ex- actly why safety culture is so important. Culture transcends the psy- chology of any single person. Individuals can easily forget to be afraid. A safe culture, however, can compensate for this by providing the reminders and ways of working that go to create and sustain intelligent wariness. The individual burden of chronic unease is also made more supportable by knowing that the collective concern is not so much with the occasional – and inevitable – unreliability of its human parts, as with the continuing resilience of the system as a whole. The practical implications By what means can we set about transforming an average safety culture into an excellent one? The an- 9920.p65 1/27/00, 11:11 AM11 J. Reason12 swer, I believe, lies in recognising that a safe culture is the product of a number of inter-dependent sub-cultures, each of which – to some degree – can be socially engineered. An informed culture can only be built on the foundations of a reporting culture. And this, in turn, de-
  • 77. pends upon establishing a just culture. In this concluding section, we will look at how to build these two sub-cultures. The other elements of a safe culture – a flexible culture and a learning culture – hinge largely upon the establishment of the previous two. They have been discussed at length elsewhere5,6 and will not be considered further here. In the absence of frequent bad outcomes, knowledge of where the ‘edge’ lies can only come from persuading those at the human- system interface to report their ‘free lessons’. These are the mostly inconse- quential errors, incidents and near misses that could have caused injury or damage. But people do not readily confess their blunders, particular- ly if they believe such reports could lead to disciplinary action. Estab- lishing trust, therefore, is the first step in engineering a reporting cul- ture – and this can be very big step. Other essential characteristics are that the organisation should possess the necessary skills and resources to collect, analyse and disseminate safety-related information and, cru- cially, it should also have a management that is willing to act upon and learn from these data. A number of effective reporting systems have been established, par-
  • 78. ticularly in aviation. Two behavioural scientists involved in the cre- ation of two very successful systems, the Aviation Safety Reporting System developed by NASA and the British Airways Safety Informa- tion System, have recently collaborated to produce a blueprint for en- gineering a reporting culture.30 The main features are summarised be- low. • A qualified indemnity against sanctions – though not blanket immu- nity. • A reliance on confidentiality and de-identification rather than com- plete anonymity. • The organisational separation of those who collect and analyse the data from those responsible for administering sanctions. • Rapid, useful and intelligible feedback – after the threat of punish- ment, nothing deters reporters more than a lack of any response. • Reports should be easy to make. Free text accounts appear to be more acceptable to reporters than forced-choice questionnaires. The first three of these measures relate to the issue of punishment. In the past, many organisations relied heavily upon the threat of sanctions
  • 79. to shape reliable human behaviour. More recently, the pendulum has swung towards the establishment of ‘no blame’ cultures. But like the excessively punitive culture it supplanted, this approach is neither de- sirable nor workable. A small proportion of unsafe acts are indeed reckless and warrant severe sanctions. What is needed is a just culture, one in which everyone knows where the line must be drawn between acceptable and unacceptable actions. When this is done, the evidence suggests that only around 10% of unsafe acts fall into the unacceptable category.6,31 This means that around 90% of unsafe acts are largely blameless and could be reported without fear of punishment. 9920.p65 1/27/00, 11:11 AM12 13Safety paradoxes So how should this line be drawn? Many organisations place the boundary between errors and procedural violations, arguing that only the latter are deliberate actions. But there are two problems with this: some errors arise from unacceptable behaviours, while some violations are enforced by organisational rather than by individual shortcomings, and so should not be judged as unacceptable. Marx31 has
  • 80. proposed a better distinction. The key determinant of blameworthiness, he argues, is not so much the act itself – error or violation – as the nature of the behaviour in which it was embedded. Did this behaviour involve un- warranted risk-taking? If so, then the act would be blameworthy re- gardless of whether it was an error or a violation. Often, of course, the two acts are combined. For instance, a person may violate procedures by taking on a double shift and make a dangerous mistake in the final hour. Such an individual would merit punishment because he or she took an unjustifiable risk in working a continuous 18 hours, thus in- creasing the likelihood of an error.32 These are fine judgements and there is insufficient space to pursue them further here. The important point, however, is that such determi- nations – ideally involving both management and peers – lie at the heart of a just culture. Without a shared agreement as to where such a line should be drawn, there can never be an adequate reporting culture. Without a reporting culture, there could not be an informed culture. It is the knowledge so provided that gives an optimal safety culture its defining characteristics: a continuing respect for its operational
• 81. hazards, the will to combat hazards in a variety of ways and a commitment to achieving organisational resilience. And these, I have argued, require a 'collective mindfulness' of the paradoxes of safety.
References
1 Hudson PTW. Psychology and safety. Leiden: University of Leiden, 1997.
2 Reason J. Achieving a safe culture: theory and practice. Work & Stress 1998;12:293-306.
3 Howard RW. Breaking through the 106 barrier. Proc Int Fed Airworthiness Conf, Auckland, NZ, 20-23 October 1991.
4 Weick KE, Sutcliffe KM, Obstfeld D. Organizing for high reliability: processes of collective mindfulness. In: Staw B, Sutton R, editors. Research in Organizational Behavior 1999;21:23-81.
5 Weick KE. Organizational culture as a source of high reliability. Calif Management Rev 1987;29:112-27.
6 Reason J. Managing the risks of organizational accidents. Aldershot: Ashgate, 1997.