Swiss Cheese Model
Individual and Organizational Accidents
Individual Accidents
• A specific person or group is often
both the agent and the victim of
the accident.
• The consequences to the people concerned may be great, but their spread is limited.
• The nature (though not necessarily the frequency) of individual accidents has remained relatively unchanged over the years.
Organizational Accidents
• Have multiple causes involving many people
operating at different levels of their
respective companies.
• Can have devastating effects on uninvolved
populations, assets and the environment.
• They are a product of recent times or, more
specifically, a product of technological
innovations which have radically altered the
relationship between systems and their
human elements.
Organizational Accidents
• They involve the unplanned release of destructive agencies
such as mass, energy, chemicals and the like.
• They entail the breaching of the barriers and safeguards that
separate damaging and injurious hazards from vulnerable
people or assets—collectively termed ‘losses’.
• This is in sharp contrast to individual accidents where such
defences are often either inadequate or lacking.
Organizational Accidents
The figure directs our attention to the central question in all accident investigation: by what means are the defences breached?
Three sets of factors are likely to be implicated: human, technical and organizational.
The Nature and Variety of Defences
Defences can be categorized both according to the various functions they serve
and by the ways in which these functions are achieved.
All defences are designed to serve one or more of the following functions:
• To create understanding and awareness of the local hazards.
• To give clear guidance on how to operate safely.
• To provide alarms and warnings when danger is imminent.
• To restore the system to a safe state in an off-normal situation.
• To interpose safety barriers between the hazards and the potential losses.
• To contain and eliminate the hazards should they escape this barrier.
• To provide the means of escape and rescue should hazard containment fail.
The Nature and Variety of Defences
Hard Defences
• Automated engineered safety features, physical barriers, alarms and annunciators, interlocks, keys, personal protective equipment, non-destructive testing, designed-in structural weaknesses (for example, fuse pins on aircraft engine pylons) and improved system design.
Soft Defences
• ‘Soft’ defences, as the term implies, rely heavily upon a combination of paper and people: legislation, regulatory surveillance, rules and procedures, training, drills and briefings, administrative controls (for example, permit-to-work systems and shift handovers), licensing, certification, supervisory oversight and—most critically—front-line operators, particularly in highly automated control systems.
The defensive functions are usually achieved through a mixture of
‘hard’ and ‘soft’ applications.
Ideal vs Reality
In an ideal world, all the defensive layers would be intact, allowing no penetration by possible accident trajectories.
In the real world, however, each layer has weaknesses and gaps of the kind revealed on the right-hand side of the figure.
The Swiss Cheese Model Of Defences
• Although the figure shows the defensive layers and their associated ‘holes’ as being fixed and static, in reality they are in constant flux.
• The ‘Swiss cheese’ metaphor is best
represented by a moving picture, with each
defensive layer coming in and out of the frame
according to local conditions.
• Particular defences can be removed deliberately
during calibration, maintenance and testing, or
as the result of errors and violations.
• Similarly, the holes within each layer could be
seen as shifting around, coming and going,
shrinking and expanding in response to operator
actions and local demands.
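To make this ‘moving picture’ idea concrete, the following is a minimal, hypothetical Python sketch (not part of Reason’s model; all numbers are invented for illustration). Each defensive layer is given a small probability of having an open hole at any given moment, and an accident trajectory succeeds only when every layer happens to be breached at once.

import random

# Illustrative sketch only: "moving" holes are modelled by re-sampling, at every
# moment, whether each defensive layer happens to have an open hole.
# All numbers below are assumptions chosen for illustration.

N_LAYERS = 4           # assumed number of defensive layers
P_HOLE_OPEN = 0.1      # assumed chance a given layer is breached at any instant
N_TRIALS = 1_000_000   # number of simulated moments

random.seed(42)

accidents = 0
for _ in range(N_TRIALS):
    # A trajectory passes only if it finds an open hole in every successive layer.
    if all(random.random() < P_HOLE_OPEN for _ in range(N_LAYERS)):
        accidents += 1

print(f"Chance any single layer is open: {P_HOLE_OPEN}")
print(f"Observed accident rate over {N_TRIALS} moments: {accidents / N_TRIALS:.1e}")
print(f"Independent-layers prediction (p^n): {P_HOLE_OPEN ** N_LAYERS:.1e}")

Under these assumptions the accident rate collapses to roughly p^n, which is why the ‘windows of opportunity’ described later are so rare. It also hints at why latent conditions are so dangerous: a condition that opens holes in several layers at once breaks the independence on which that multiplication rests.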
How Are The ‘Holes’ Created?
To answer this, we need to consider
the distinction between active
failures and latent conditions.
Active Failures and Latent Conditions
Active Failures
• Errors and violations committed at the ‘sharp end’ of the system by front-line personnel: pilots, air traffic controllers, police officers, insurance brokers, financial traders, ships’ crews, control room operators, maintenance personnel.
• Have a direct impact on the safety of the system.
Latent Conditions
• People working in complex systems make errors or violate procedures for reasons that generally go beyond the scope of individual psychology. These reasons are latent conditions.
• Examples include poor design, gaps in supervision, undetected manufacturing defects or maintenance failures, unworkable procedures, clumsy automation, shortfalls in training and less-than-adequate tools.
• Have an indirect impact on the safety of the system.
Active Failures and Latent Conditions
Active Failures
• Committed by those at the
human-system interface—the
front-line or ‘sharp-end’
personnel.
• Tend to be unique to a specific
event
Latent Conditions
• They are spawned in the upper
echelons of the organization
and within related
manufacturing, contracting,
regulatory and governmental
agencies.
• If undiscovered and
uncorrected—can contribute to
a number of different
accidents.
BP-Texas City Refinery Accident, 2005
• Facility: BP (British Petroleum) Texas City oil refinery
• When: March 23rd, 2005
• Where: Texas City, Texas, USA
• Time: approximately 1:20 pm
• Deaths: 15
• Injuries: 170
• Cause: a hydrocarbon vapor cloud exploded at the ISOM isomerization process unit
The Active Failure
• Required pre-start actions not completed
• Pre-Startup Safety Review not performed
• Key malfunctioning instrumentation not repaired
• Malfunctioning pressure control valve not repaired -- supervisor signed
off on startup procedure that control valves had tested satisfactorily
• Functionality checks of alarms and instruments not completed
• Night Lead Operator did not use startup procedure or record completed
steps when startup was partially completed on night shift
• Night Lead Operator left an hour before end of shift
The Active Failure
• ISOM-experienced Day Supervisor A arrived over an hour
late - did not conduct shift turnover with night shift
personnel
• Day Board Operator closed automatic tower level control
valve – although procedure required valve to be placed in
“automatic” and set at 50 percent
• Day Supervisor left the plant due to family emergency as
unit was being heated
The Latent Failure
• Work environment encouraged procedural noncompliance
• Ineffective communications for shift change and
hazardous operations (such as unit startup)
• Malfunctioning instrumentation and alarms
• Poorly designed computerized control system
• Ineffective supervisory oversight
• Insufficient staffing
• Lack of a human fatigue-prevention policy
The Latent Failure
• Inadequate operator training for abnormal and startup conditions
• Failure to establish effective safe operating limits
• Ineffective incident investigation management system
• Ineffective lessons learned program
• No coordinated line management self-assessment process
• No flare on blowdown drum
• No automatic safety shutdown system
• Key operational indicators and alarms inoperative
The Accident Trajectory
• The necessary condition for an organizational accident is the rare conjunction of a
set of holes in successive defences, allowing hazards to come into damaging
contact with people and assets.
• These ‘windows of opportunity’ are rare because of the multiplicity of defences
and the mobility of the holes.
• Active failures can create gaps in the defences in at least two ways.
• First, front-line personnel may deliberately disable certain defences in order to achieve local operational objectives. The most tragic instance of this was the decision by the control room operators to remove successive layers of defence from the Chernobyl RBMK nuclear reactor in order to complete their task of testing a new voltage generator.
The Accident Trajectory
• Second, front-line operators may unwittingly fail in their role as one of the system’s most important lines of defence. A common example would be the wrong diagnosis of an off-normal system state, leading to an inappropriate course of recovery actions.
• Such ‘sharp-end’ mistakes played a significant role in the Three Mile Island nuclear power plant accident, the Bhopal methyl isocyanate disaster, the Heysel and Sheffield football stadium crushes and numerous other organizational accidents.
The Accident Trajectory
The holes can be created by both active failures and latent conditions.
The Accident Trajectory
• The rectangular block at the top
represents the main elements of an
event while the triangular shape
below represents the system
producing it.
• This has three levels: the person (unsafe acts), the workplace (error-provoking conditions), and the organization.
• The black upward arrows indicate
the direction of causality and the
white downward arrows indicate the
investigative steps.
The Accident Trajectory
• Within the workplace, local factors combine with natural human
tendencies to produce errors and violations—collectively termed
‘unsafe acts’—committed by individuals and teams at the ‘sharp end’,
or the direct human-system interface. Large numbers of these unsafe
acts will be made, but only very few of them will create holes in the
defences.
• Although unsafe acts are implicated in most organizational accidents,
they are not a necessary condition. On some occasions, the defences
fail simply as the result of latent conditions—as, for example, in the
Challenger and King’s Cross Underground fire disasters.
The Accident Trajectory
• In the analysis or investigation of accidents, the direction is reversed.
• The inquiry begins with the bad outcome (what happened) and then
considers how and when the defences failed.
• For each breached or bypassed defence, it is necessary to establish what
active failures and latent conditions were involved. And for each
individual unsafe act that is identified, we must consider what local
conditions could have shaped or provoked it.
• For each of these local conditions, we then go on to ask what upstream
organizational factors could have contributed to it.
Swiss Cheese Model Of Defences
• Accidents are usually the end point of a series of events in which the
situation becomes increasingly unsafe.
• Organisations erect multiple barriers to prevent accidents and maximise
safety, but none are perfect.
• By looking beyond the immediate cause, back from the time the accident occurred and outwards to the wider context, accident investigators can often identify weaknesses at the organisational level from which useful lessons can be learned. Reason’s “Swiss Cheese” model illustrates how such events can unfold in the form of an accident trajectory.
Stages in the Development of the
‘Swiss cheese’ Model (SCM)
• The mid-to-late-1980s version
• Late 1980s Model
• Early 1990s version
• Mid-1990s Variant
• The Current Version, 1997
The Mid-To-Late-1980s Version
• The starting point for the model was the essential, benign components of any productive system:
• Decision-makers (plant and corporate management),
• Line management (operations, maintenance, training, and the like),
• Preconditions (reliable equipment and a skilled and motivated workforce),
• Productive activities (the effective integration of human and mechanical elements),
• Defences (safeguards against foreseeable hazards).
These productive ‘planes’ eventually became the cheese slices of the SCM.
The Mid-To-Late-1980s Version
First version of the Swiss cheese model (though it had not yet taken on its Emmenthale appearance). The various human contributions to the breakdown of complex systems are mapped onto the basic elements of production.
The Mid-To-Late-1980s Version
• The various human and organisational contributions to the breakdown of a complex system are mapped onto these basic productive elements.
• At that time they comprised two kinds of failure: latent failures (resident
‘pathogens’ within the system) and active failures (unsafe acts).
• The basic premise of the model was that organisational accidents have their primary
origins in the fallible decisions made by designers, builders and top-level
management.
• These are then transmitted via the intervening productive elements – line
management deficiencies, the psychological precursors of unsafe acts, the unsafe
acts themselves – to the point where these upstream influences combine with local
triggers and defensive weaknesses to breach the barriers and safeguards.
Late 1980s Model
Part of the earliest version
of the Swiss cheese model.
The diagram shows a
trajectory of accident
opportunity penetrating
several defensive layers, and
begins to have Emmenthale-ish features.
Early 1990s version
• A later variant assumed that a variety of
organisational factors could seed latent
pathogens into the system.
• These included management decisions,
core organisational processes – designing,
building, maintaining, scheduling,
budgeting, and the like – along with the
corporate safety culture.
• The significant thing about culture is that
it can affect all parts of the system for
good or ill.
• There were two ways in which the
consequences of these upstream factors
could impact adversely upon the defences.
Early 1990s version
• There was an active failure pathway in which error- and violation-
producing conditions in the workplace could, at the individual or team
level, create unsafe acts.
• A very large number of unsafe acts are likely to be committed, but only very few of them will find chinks in the system’s defences.
• There was also a latent failure pathway that transmitted pathogens to
the defences directly. Unsafe acts at the sharp end are not essential –
though common – for creating defensive gaps and weaknesses, as is
evident from the King’s Cross Underground fire, for example.
Mid-1990s Variant
• The principal innovations were:
– the identification of two distinct failure pathways, the human error pathway and the defensive failure pathway;
– the restriction of the term ‘latent failure’ to weaknesses or absences in the defences, barriers and safeguards.
• This scheme also required a clear separation of defensive functions from organisational processes.
• Unlike previous representations of the model, the causal sequence runs from top to bottom (rather than from left to right).
• The accident sequence is divided into four levels: culture, climate, situation and the event itself.
Mid-1990s Variant
• Both the human error and the defensive failure pathways have
their origins (within the system, at least) in the organisational
processes.
• These include goal-setting, policy-making, organising, forecasting,
planning, scheduling, managing, financing, allocating resources,
communicating, designing and specifying.
• All of these processes are likely to reflect cultural influences, and each of them can contribute to a breakdown in human performance or a defensive failure. Whereas culture emanates from the ‘strategic apex’ of the organisation, climate relates to specific workplaces and to their local conditions.
The Current Version, 1997
• The current model (1997) involves a
succession of defensive layers
separating potential losses from the
local hazards.
• Each ‘slice’ – like Emmenthale – has holes in it; but unlike the holes in cheese, these gaps are in continuous motion, moving from place to place, opening and shutting. Only when a series of holes ‘line up’ can an accident trajectory pass through the defences to cause harm to people, assets and the environment.
The Current Version, 1997
• The holes arise from unsafe acts (usually short-lived windows of opportunity) and latent
conditions.
• The latter occur because the designers, builders, managers and operators cannot foresee
all possible accident scenarios.
• They are much more long-lasting than the gaps due to active failures and are present
before an adverse event occurs.
• There were two important changes. First, the defensive layers were not specified. They
included a variety of barriers and safeguards – physical protection, engineered safety
features, administrative controls (regulations, rules and procedures), personal protective
equipment and the frontline operators themselves: pilots, drivers, watch keepers and
the like. They often constituted the last line of defence.
• The second change was the use of the term ‘latent conditions’. Conditions are not
causes, as such, but they are necessary for the causal agents to have their effects.
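As a hypothetical illustration of this difference (not taken from Reason or any investigation report; all probabilities are invented), the short Python sketch below treats a latent-condition hole as one that stays open day after day until it is found and fixed, while active-failure holes open only briefly, so harm requires the brief gap to coincide with the long-standing one.

import random

# Hypothetical sketch: contrast a long-lasting latent-condition hole with
# short-lived active-failure holes. All probabilities are invented.

random.seed(1)

N_DAYS = 10_000
P_ACTIVE = 0.02          # assumed daily chance of an unsafe act opening a brief hole
P_LATENT_PRESENT = 0.5   # assumed chance the system carries an undiscovered latent hole

# The latent hole (if any) exists before any event and persists unnoticed.
latent_hole = random.random() < P_LATENT_PRESENT

days_exposed = 0
for _ in range(N_DAYS):
    active_hole = random.random() < P_ACTIVE   # a short-lived window of opportunity
    if latent_hole and active_hole:
        days_exposed += 1                      # both gaps line up on the same day

print(f"Latent hole present from the start: {latent_hole}")
print(f"Days on which both gaps lined up: {days_exposed} of {N_DAYS}")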
‘Swiss Cheese Model’- BP-Texas City Refinery
Accident, 2005
• An explosion and fire occurred
at the refinery’s isomerization
unit
• The explosion happened at
13:20 (Houston time) on March
23, 2005
• 15 people died and many more
were injured
• Note: The isomerization unit
boosts the octane of gasoline
blend stocks.
Simplified Block Diagram of Raffinate Splitter
Raffinate Splitter and Blowdown Drum Stack
What happened ?
• Temporary trailers placed 150 feet from the Isomerization unit. They
were being used by personnel preparing for a turnaround at another part
of the refinery
• Shut down part of the Isomerization unit to refresh the catalyst in the
feed unit
• On the night shift, the raffinate splitter was being restarted after the
shutdown. The raffinate splitter is part of the Isomerization unit that
distils chemicals for the Isomerization process
• Splitter was over-filled and over-heated
• When liquid subsequently filled the overhead line the relief valves
opened
• This caused excessive liquid and vapour to flow to blowdown drum and
vent at top of the stack
• An explosion occurred which killed 15 people and injured many others
Key Issues
• Operator Inattention
• Following Procedures
• Supervisor Absence
• Communication
• Shift handover
• Trailers Too Close to Hazards
• Some Instrumentation Did Not Work
• Abnormal Start-ups
• Investigation of Previous Incidents
• Blowdown Drum Vented Hydrocarbons to Atmosphere
• Opportunities to Replace Blowdown Drum
Reminder of the ‘Swiss Cheese Model’
Strategic Concepts
• In order to reduce the potential for future major incidents and losses, three layers of protection are to be considered:
− Plant – engineering hardware, control systems, and layouts to eliminate, control and mitigate potential hazards to people, and improve productivity
− Processes – management systems to identify, control and mitigate risks, and drive continuous operational improvement
− People – capability of our people in terms of leadership skills, relevant knowledge and experience, and the organizational culture they create.
• In layers of protection, ‘hard barriers’ are more reliable than ‘soft barriers’, but all rely on people (see the sketch below).
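As a rough, hypothetical illustration of this layered-protection idea (the failure probabilities below are invented, not BP figures), the following Python sketch shows both the benefit of stacking imperfect layers and how a common people-related weakness can degrade every layer at once.

# Hypothetical layered-protection arithmetic; all probabilities are invented.

P_FAIL = {
    "plant (hard barriers)": 0.01,       # assumed: engineered safeguards fail rarely
    "processes (soft barriers)": 0.10,   # assumed
    "people (soft barriers)": 0.10,      # assumed
}

# If layers failed independently, protection is lost only when every layer
# fails on the same demand.
p_all_fail = 1.0
for p in P_FAIL.values():
    p_all_fail *= p
print(f"Independent layers, all fail together: {p_all_fail:.1e}")

# But 'all rely on people': a common degrading factor (fatigue, weak safety
# culture) can raise every layer's failure probability at the same time.
DEGRADATION = 5.0  # assumed common multiplier on each layer's failure probability
p_all_fail_degraded = 1.0
for p in P_FAIL.values():
    p_all_fail_degraded *= min(1.0, p * DEGRADATION)
print(f"With a common people-related weakness: {p_all_fail_degraded:.1e}")

Even with these made-up numbers the point stands: the combined protection is only as strong as the independence of the layers, which is exactly what a weak organizational culture undermines.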
