Ben Hutchinson
1. Fantasy planning: the gap between systems of safety and safety of systems
Ben Hutchinson
National Zero Harm Manager
PhD Student, Safety Science Innovation Lab, Griffith University
The Day Book (Chicago, Ill.) Date: April 16, 1912
http://clickamericana.com/eras/1910s/scenes-of-horror-and-heroism-on-sinking-titanic-1912
"I can not imagine any condition which could cause this ship to founder. I can not conceive of any vital disaster happening to this vessel. Modern shipbuilding has gone beyond that."
Captain E.J. Smith, Master of SS Titanic, referring to the ship Adriatic.
2. An Oil Magnate, Nuclear Engineer and Postal Worker walk into a bar...
1 - Paul Searing https://www.flickr.com/photos/13995873@N05/
2 - The Atlantic https://www.theatlantic.com/photo/2014/03/the-exxon-valdez-oil-spill-25-years-ago-today/100703/
3 - Jim Brickett / Flickr https://www.flickr.com/photos/jimbrickett/
3. "The first [SMS] I designed was quite naturally perfect, it was a work of art, flawless, sublime. A triumph equaled only by its monumental failure. The inevitability of its doom is as apparent to me now as a consequence of the imperfection inherent in every human being. Thus I redesigned it based on your history to more accurately reflect the varying grotesqueries of your nature."
Silver, J. (Producer), Wachowski, A., & Wachowski, L. (Directors). (2003). The Matrix Reloaded [Motion Picture]. USA: Village Roadshow Pictures.
4. The Fables of Safety
Background: https://au.pinterest.com/pin/215258057162287016/
"For a successful technology, reality must take precedence over public relations, for nature cannot be fooled"
- Feynman, 1986. Report of the Presidential Commission on the Space Shuttle Challenger Accident.
5. "'Knowing' about the system was grounded in elaborate algorithms ... purported to show risk was declining, and yet this analysis was only tenuously linked to the actual level of danger. This is not to suggest that ... companies or individuals are deliberately fabricating this type of plan ... but rather that, in the face of significant uncertainty, earnest attempts to plan can lose touch with reality."
- Hayes, 2015, "Lessons for Effective Integrity Management from the San Bruno Pipeline Rupture", pg. 205
San Bruno pipeline explosion
http://articles.latimes.com/2010/oct/07/local/la-me-brittle-pipeline-20101007
6. How much safety in safety management?
• "The SMS ... was repetitive, circular, ... [and] much of its language was impenetrable ... [The SMS took] on a life of its own, divorced from operations in the field ... in some respects, maintenance of the [SMS] diverted attention from the ... practical functioning of the plant" - Dawson and Brooks, 1999, Report of the Longford Royal Commission, pg. 200
Longford
• "Nimrod Safety Case[s] were fatally undermined by an assumption by all the organisations and individuals involved that the Nimrod was 'safe anyway', because the Nimrod fleet had successfully flown for 30 years, and they were merely documenting something which they already knew. The Nimrod Safety Case became essentially a paperwork and 'tick-box' exercise" - Haddon-Cave, 2009, The Nimrod Review Report. Loss of XV230, pg. 190
Nimrod
• "Prior to Challenger, the can-do culture was a result not just of years of apparently successful launches, but of the cultural belief that the Shuttle Program's many structures, rigorous procedures, and detailed system of rules were responsible for those successes." - Columbia Accident Investigation Board, 2003, pg. 199
Columbia
7. How Does an Illusion of Safety Come About?
"[M]any of us ask for certainty from our bankers, our doctors, and our political leaders. What they deliver in response is the illusion of certainty. Many of us smile at old-fashioned fortune-tellers. But when the soothsayers work with computer algorithms rather than tarot cards, we take their predictions seriously and are prepared to pay for them" - Gigerenzer, 2014, Risk Savvy, pg. 26
[Diagram: some reasons (not exhaustive) an illusion of safety comes about: successful operations; cultural norms + biases; system design; poor understanding of work; high uncertainty and/or innovation]
8. "We found that many witnesses ... had been oblivious of what lay before their eyes. It did not enter their consciousness ... They were like moles being asked about the habits of birds."
- Report of the Tribunal Appointed to Inquire into the Disaster at Aberfan on October 21st, 1966
Myth busting #1: We need to care more about safety
11. Case Study: FIFO Facility Maintenance
• Extensive fatigue risk assessment. Covered work hours, sleep opportunity, rotation direction and more
• Referred to task rotation and increased supervision for higher-risk activities and during circadian lows
• But ... quiet on what exactly the high-risk activities were, and on how to control and verify them
• Said most of the right things, but not focussed on the things that mattered most
• A fantasy risk assessment? Decoy phenomena?
Questions for you:
• What do you know of your critical risks and critical controls?
• How do you know they are working? Have you checked?
• What feeds your confidence in the effectiveness of systems?
12. Myth busting #2: Oh, human error you say?
"A 'human error' problem is an organizational problem. A 'human error' problem is at least as complex as the organization that helped create it"
- Dekker, S. (2014). The Field Guide to Understanding 'Human Error', pg. 192.
[Diagram: the Safety Gap between Work-as-Imagined (managers, procedures etc.) and Work-as-Occurs (reality, shop-floor)]
"normal work [work-as-occurs] may contain daily frustrations and workarounds, as well as workers having to 'finish the design' with various improvisations"
- Dekker, S. (2014). The bureaucratization of safety. Safety Science, 70, 348-357.
13. A Firm Grip on Reality (...whatever that is)
Some scattered ideas (not linear):
[Diagram: moving from Fantasy to Reality via: value expertise and relationships; understand limits of knowledge + "meta-monitoring"; reveal and test assumptions; mind the gap (WAI vs WAO); engage possibilistic thinking]
"Beneath a public image of rule-following behavior [...] experts are operating with far greater levels of ambiguity, needing to make uncertain judgments in less than clearly structured situations" - Wynne, 1988, Unruly Technology, pg. 153
"[When] technology is involved, the illusion of certainty is amplified." - Gigerenzer, 2014, Risk Savvy, pg. 28
"[We have a] tendency to look at what confirms our knowledge, not our ignorance" - Taleb, 2007, The Black Swan, pg. 30
14. Is the [Swiss] Cheese Old and Mouldy?
[Diagram: a chain of bad decisions (#1, #2, #3, #4), with hindsight reactions: "How did I miss the signs!" and "It was all so obvious!"]
* Salmon et al. (2012). Systems-based accident analysis methods: A comparison of Accimap, HFACS, and STAMP. Safety Science, 50, 1158-1170.
15. Probabilistic vs Possibilistic thinking. Or: "things that have never happened before happen all the time." (Sagan, 1993, The Limits of Safety, pg. 12)
http://www.dailymail.co.uk/news/article-1388629/Japan-tsunami-destroyed-wall-designed-protect-Fukushima-nuclear-plant.html
"[S]eeing is not necessarily believing; sometimes, we must believe before we can see." - Perrow, 1999, Normal Accidents, pg. 9
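To make the contrast concrete, here is a minimal, illustrative Python sketch (not from the presentation; every hazard name and number is hypothetical) showing how a probabilistic ranking by expected loss and a possibilistic ranking by worst credible consequence can order the same hazards very differently:

# Illustrative sketch only: all hazards and figures below are made up.
# Probabilistic thinking ranks hazards by expected loss (probability x consequence);
# possibilistic thinking asks what the worst credible outcome is, probability aside.
hazards = {
    # name: (annual probability estimate, consequence if it occurs)
    "slip on stairs":       (0.10,     1),      # frequent, minor
    "vehicle collision":    (0.01,     50),     # occasional, serious
    "tsunami over seawall": (0.000001, 10000),  # "never happened here", catastrophic
}

# Probabilistic view: rank by expected loss (p * consequence).
by_expected_loss = sorted(hazards, key=lambda h: hazards[h][0] * hazards[h][1], reverse=True)

# Possibilistic view: rank by worst consequence alone.
by_worst_case = sorted(hazards, key=lambda h: hazards[h][1], reverse=True)

print("Expected-loss ranking:", by_expected_loss)  # tsunami ranks last
print("Worst-case ranking:  ", by_worst_case)      # tsunami ranks first

Under these made-up numbers the seawall scenario all but vanishes from the probabilistic ranking yet dominates the possibilistic one, which is the Fukushima pattern the slide points to.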
16. "We put too much emphasis on reducing errors and not enough on building expertise"
- Gary Klein. (2009). Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making, pg. 13.
US Airways Flight 1549, "Miracle on the Hudson"
17. Background: The Day Book (Chicago, Ill.) Date: April 16, 1912
http://clickamericana.com/eras/1910s/scenes-of-horror-and-heroism-on-sinking-titanic-1912
Questions?
Available through ResearchGate
and LinkedIn