Dr. Adnan Maqsood
SYSE-812 Human Factors Engg
➢Safety in Human Factors Engineering
2
Topics of the Course SYSE-812
Handout Contents
1 Introduction to HFE
2 Human Centric System Analysis & Design
3 Investigation Techniques in HFE
4 Affective Design in HFE
5 Cognitive & Mental Workload Analysis
6 Physical Workload Assessment
7 Safety in HFE
8 Job Satisfaction
9 Social Implications in HFE
10 Future of HFE
3
What is an Accident?
◼ Something without apparent cause, unexpected,
unintentional act, mishap, chance occurrence, act of
God
◼ Chen (1972). “An error with sad consequences”.
Implies human error
◼ Arbous and Kerrick (1981). “Unplanned event in a
chain of planned and/or controlled events.” Implies
sequential development
◼ Schutzinger (1954). “Resulting from the integration of
a constellation of forces.” Implies mechanical or other
forces
4
What is an accident?
◼ Haddon (1964). “Occurrence of an unexpected
physical or chemical damage to living or non-living
structures.” Implies unexpected event
◼ Suchman (1961). “It is doubtful that any single
definition will cover all types of events or interests.”
Generally, to qualify as an accident, there should be:
❑ 1. Low degree of expectedness
❑ 2. Low degree of avoidability
❑ 3. Low degree of intention
❑ 4. Quick occurrence
5
Myths, Misconceptions & Problems in Safety
Analysis
◼ 1. Semantic confusion: A drops something on B, yet we
say that B has the accident (even though A caused it)
◼ 2. Accidents happen to other people – they are
accident prone – I am not. This belief means that safety
propaganda and safety programs at work have little effect
6
Accident Statistics from UK
Could be any country
Place Deaths Serious Injuries Slight Injuries
Home 7,561 120,000 1,500,000 (est)
Road 6,810 88,563 253,835
Rail 216 920 11,570
Aircraft 147 ? ?
Water transport 158 ? ?
Factory 628 ? 11,805 (3+days away)
Farm 136 ? 8,945 (3+days away)
7
What do we learn from this?
◼ Home vs. Roads and Road vs. Work
◼ Compare to Heinrich's theory, here stated for aircraft: out
of 330 mishaps, 300 produce no injury, 29 produce minor
injuries, and 1 produces a major injury
◼ It is difficult to get data with high reliability. Death counts
are usually the most accurate figures – but even they
depend on the country/culture
◼ What about trends? Technology brings its own
problems. In 1870, 8% of the accidents in UK were
traffic accidents – today 40%. Powered hand tools,
nuclear power plants, etc.
◼ Society matures with time. In general, the trend is
downwards; cf. Smeed's Law
8
Smeed’s Law (1972) – Revalidated several times
9
Smeed’s Law (1972) – Revalidated several times
◼ Increasing experience with greater motorization
◼ The more vehicles, the fewer miles driven per vehicle
◼ Improvements in legislation, roads, and vehicles
◼ In developing countries, the drivers get more
experienced over time
◼ Social protest regarding high death rates
◼ The dynamics of these factors are unknown. But it is
clear that large-scale actions, such as Sweden's switch
from left- to right-hand traffic in 1967, improved traffic
safety in the first year
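As a hedged illustration (not taken from the slide): Smeed's relation is commonly quoted as D = 0.0003(np²)^(1/3), where D is annual road deaths, n the number of registered vehicles, and p the population. The sketch below uses invented population and vehicle figures only to show the key prediction – deaths per vehicle fall as motorization grows.

```python
def smeed_deaths(vehicles: float, population: float) -> float:
    """Annual road deaths predicted by the commonly quoted form of
    Smeed's law: D = 0.0003 * (n * p**2) ** (1/3)."""
    return 0.0003 * (vehicles * population ** 2) ** (1 / 3)

# Illustrative (not historical) figures: a population of 50 million
# at increasing levels of motorization.
population = 50_000_000
for vehicles in (1_000_000, 5_000_000, 20_000_000):
    deaths = smeed_deaths(vehicles, population)
    print(f"{vehicles:>10,} vehicles: {deaths:6.0f} predicted deaths, "
          f"{deaths / vehicles * 10_000:.1f} deaths per 10,000 vehicles")
```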
10
Conceptual Models of the Accident Process
◼ 1. Chain of Multiple Events
◼ Each accident is the result of a series of events. No single
cause exists – many factors influence the accident.
◼ The probability p of an accident is a function of several
different variables: p = f(x1, x2, …, xn) – an illustrative
computational sketch appears at the end of this slide.
◼ This model is also used in epidemiological models, see below.
◼ 2. Epidemiological Model
◼ Originated from the study of disease. (Water supplies and
cholera in London).
◼ The host (accident victim) is described in terms of age, sex,
economic status, intelligence, behavior, etc. The agent (injury
deliverer) is described in terms of type, potential hazard,
method of use, etc. The environment is described in terms of
the effects on the host and agent: e.g. temperature, noise,
social climate.
◼ Useful for classifying accidents, but is not so helpful for
analyzing cause and effect
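The model leaves the functional form f open. As a purely illustrative sketch – the factor names, weights, and baseline below are invented, not part of the model – one simple choice is a logistic combination of weighted risk-factor scores:

```python
import math

def accident_probability(factors: dict[str, float],
                         weights: dict[str, float],
                         baseline: float = -4.0) -> float:
    """One illustrative realization of p = f(x1, ..., xn): a logistic
    combination of weighted risk-factor scores. Factor names, weights,
    and baseline are hypothetical; the point is only that no single
    variable determines the outcome."""
    z = baseline + sum(weights[name] * value for name, value in factors.items())
    return 1.0 / (1.0 + math.exp(-z))

weights = {"worn_tyres": 1.2, "wet_road": 0.9, "glare": 0.6, "fatigue": 1.5}
print(accident_probability({"worn_tyres": 1, "wet_road": 1, "glare": 0, "fatigue": 0}, weights))
print(accident_probability({"worn_tyres": 1, "wet_road": 1, "glare": 1, "fatigue": 1}, weights))
```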
11
Epidemiology Origin
◼ Investigation of Cholera Epidemics in London in 1855
Water Company | Number of Houses | Deaths from Cholera | Deaths per 10,000 Houses
Southwark and Vauxhall | 40,046 | 1,263 | 315
Lambeth | 26,107 | 98 | 37
Rest of London | 256,423 | 1,422 | 59
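The right-hand column is simple arithmetic: deaths ÷ houses × 10,000. A minimal check follows; note that the computed "Rest of London" rate comes out near 55 rather than the 59 published in the 1855 table, so that row's historical figure evidently reflects the original report's own derivation.

```python
# Deaths per 10,000 houses = deaths / houses * 10,000
data = {
    "Southwark and Vauxhall": (40_046, 1_263),
    "Lambeth": (26_107, 98),
    "Rest of London": (256_423, 1_422),
}

for company, (houses, deaths) in data.items():
    rate = deaths / houses * 10_000
    print(f"{company:<22} {rate:6.1f} deaths per 10,000 houses")
```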
12
Conceptual Models of the Accident Process
◼ 3. Energy Exchange model
❑ Injuries produced by energy exchange: e.g. mechanical,
chemical, thermal, electrical, etc.
❑ For example: a blow from a moving object crushes a
passenger's leg in a car.
❑ This concept is a bit naïve, since all physical events
involve energy exchange.
❑ It offers little insight into causation.
❑ But the classification can be useful to suggest barriers
against accidents
13
Conceptual Models of the Accident Process
◼ 4. Behavioral Models
❑ A. Risk-taking models: Whenever a decision is made,
it is affected by the degree of risk. Risk taking is affected
by the amount of uncertainty and the amount of danger.
The assumption is that those taking higher risks have
more accidents. But sometimes people are not aware of
risks at all.
❑ B. Accident Proneness: Proposes that some persons
are more liable, due to their personality, to have more
accidents. There has been a tremendous amount of
research. The notion of accident proneness has proven
not to be useful
❑ C. Concept of Overloading: Overload (e.g. information
overload) occurs when environmental demands exceed
operator capabilities; the aim is to match the two
14
Conceptual Models of the Accident Process
◼ 5. Systems Safety
❑ Safety is a systems problem and the person must be
understood in the context of the total system
[Diagram] Equipment factors, task factors, and environmental
factors act as predisposing factors; a failure of part of the
system is the precipitating factor; the operator's response to
the failure determines whether an accident occurs or is
avoided.
Example of a systems approach to accident analysis.
There are predisposing factors such as worn tires, wet
road, and glare. These can lead to precipitating factors
and eventually an accident
15
Conceptual Models of the Accident Process
◼ 6. Combined Models. Surry classified the process by
analyzing a series of questions
❑ Predisposing characteristics: susceptible host, hazardous
environment, injury-producing agent
❑ Situational characteristics: risk taking, appraisal of
hazard, margin of error
❑ Accident conditions: unexpected, unavoidable,
unintentional
16
Ramsey’s Model
◼ Old lady sees water
puddle when crossing
the road
◼ She recognizes the
slipping hazard
◼ She decides to avoid
the puddle
◼ But she does not step
aside quickly enough
◼ She slips and falls!
17
Human Error Classification Scheme. Rouse (1983)
◼ 1. Observation of System State: Incorrect reading of
appropriate state variables; Erroneous interpretation of
correct readings; Failure to observe sufficient number of
variables; Observation of inappropriate state variables;
◼ 2. Choice of Hypothesis: Hypothesis does not
functionally relate to variables observed. Hypothesis
could not cause the values of the state variables
observed; Formulate better hypotheses
◼ 3. Testing of Hypothesis: Hypothesis not tested.
Stopped before reaching a conclusion; Reached wrong
conclusion; Considered but discarded correct
conclusion;
18
Human Error Classification Scheme. Rouse (1983)
◼ 4. Choice of goal: Goal not chosen; Insufficient
specification of goal; Choice of a counter-productive goal
◼ 5. Choice of procedure: Procedure not chosen.
Choice would not achieve goal. Choice would achieve
incorrect goal; Choice unnecessary for achieving goal
◼ 6. Execution of procedure: Unrelated inappropriate
step executed; Required step omitted; Unnecessary
repetition of required step; Unnecessary step added;
Steps executed in wrong order; Step executed too
early or too late; Control in wrong position or range;
Stopped before procedure complete
19
Human Error
◼ Human error is the primary cause of 60-90 percent of
major accidents. Doctors and nurses make on average
1.7 errors per patient
◼ Error rates of about thirty percent have been reported for
command selection in word processing (Card et al., 1980)
◼ But many of these errors are the results of bad system
design and bad organization rather than irresponsible
actions
◼ 1. There are many causes of errors:
❑ Poor discriminability
❑ Memory lapses
❑ Communication breakdown
❑ Biases in decision making
❑ Selection of compatible, but incorrect response
20
Human Error
◼ 2. Speed-Accuracy Trade Off
❑ It is impossible to work very fast and very accurately at
the same time
❑ Fast and sloppy OR slow and accurate
◼ 3. Signal Detection Theory
❑ Assumes two kinds of human errors: false alarms and
misses
❑ The study of human error has become a science in its
own right
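A minimal sketch of the signal-detection framing (the trial data below are invented): each observation falls into one of four outcomes, and only two of them – misses and false alarms – are errors.

```python
def classify(signal_present: bool, operator_responded: bool) -> str:
    """Signal-detection outcome for one observation."""
    if signal_present and operator_responded:
        return "hit"
    if signal_present and not operator_responded:
        return "miss"             # error: hazard present but not reported
    if not signal_present and operator_responded:
        return "false alarm"      # error: reported a hazard that was absent
    return "correct rejection"

# Hypothetical trials: (signal actually present?, operator responded?)
trials = [(True, True), (True, False), (False, True), (False, False), (True, True)]
counts: dict[str, int] = {}
for present, responded in trials:
    outcome = classify(present, responded)
    counts[outcome] = counts.get(outcome, 0) + 1
print(counts)
```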
21
Categories of Human Errors by James Reason
◼ Mistakes & Slips
◼ Mistakes
❑ Failure to formulate the right intention, due to
shortcomings in: Perception, memory and cognition
❑ James Reason used Rasmussen’s distinction between:
◼ Knowledge-based mistakes
◼ Rule-based mistakes
22
Categories of Human Errors
◼ Knowledge-Based Mistakes
◼ These are due to failure to understand the situation. The
operator may not be able to consider alternative decisions,
since she is overwhelmed by the complexity of evidence
and cannot interpret it correctly
◼ Rule-Based Mistakes
◼ Example of a rule: turn the wheels in the direction you
want to go – unless you are skidding on ice. Such rules are
formulated as IF-THEN statements, and exceptions or
qualifications may be overlooked, so the THEN part can be
wrong. The choice of rule is guided by frequency and
reinforcement – rules that have been successful before are
chosen again (see the sketch after this slide)
◼ Rule-based mistakes tend to be made with high
confidence – "Strong but Wrong"
◼ There is less confidence in knowledge-based situations,
perhaps because they involve more conscious effort
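The sketch referred to above – an invented, minimal rule set – shows how a well-reinforced IF-THEN rule can fire with confidence while its overlooked qualification makes the THEN part wrong:

```python
def steering_rule(intended_direction: str, on_ice: bool,
                  exception_recalled: bool) -> str:
    """IF you want to go a certain direction THEN steer that way --
    unless skidding on ice. A rule-based mistake occurs when the
    frequently reinforced general rule fires and the qualification
    is overlooked, so the THEN part is wrong for this situation."""
    if on_ice and exception_recalled:
        return "apply the skid-recovery procedure instead of the general rule"
    # 'Strong but wrong': the familiar rule is applied with confidence.
    return f"steer {intended_direction}"

print(steering_rule("left", on_ice=True, exception_recalled=False))  # confident, but wrong
print(steering_rule("left", on_ice=True, exception_recalled=True))
```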
23
Categories of Human Errors
◼ Slips
❑ The right intention is carried out – but incorrectly. A
common class of slips is "capture errors". These may
happen when
◼ b. The action sequence is relatively automatic
◼ e.g. Pouring orange juice in the coffee cup while reading
the morning paper during breakfast
◼ These routine situations are not attended, and the
errors are produced because the stimulus and the
response are similar
◼ In flying, the controls for flaps and landing gear have
similar feel, appearance, direction of movement, and
location, and both are relevant during take-off and landing
24
Categories of Human Errors
◼ Lapses
❑ Failure to carry out an action – due to forgetfulness
❑ Sometimes an interruption may cause a sequence to be
stopped (What was I saying?)
◼ Mode Errors
❑ An action that is appropriate in one mode of operation is
not appropriate for another
❑ Example: raising the landing gear while the aircraft is still
on the runway, because the pilot thought it was airborne
❑ Mode errors are of great concern in flying and HCI,
where the same key may have different meanings
❑ Mode errors are a joint consequence of relatively
automated performance and improperly conceived
systems design.
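A minimal, invented sketch of the mode-error problem and one design remedy: the same command is valid in one mode and hazardous in another, so the system checks its actual mode (e.g. a weight-on-wheels interlock) instead of relying on the operator's belief about it.

```python
class LandingGearLever:
    """Toy model of a mode-dependent control: 'gear up' is meaningful
    only in AIR mode. Checking the aircraft's actual state (a
    weight-on-wheels interlock) is one design remedy for the mode
    error described in the bullet above."""

    def __init__(self) -> None:
        self.mode = "GROUND"    # actual state, not the pilot's belief
        self.gear_down = True

    def command_gear_up(self) -> str:
        if self.mode == "GROUND":
            return "Command refused: weight on wheels (raising gear here would be a mode error)"
        self.gear_down = False
        return "Gear retracting"

lever = LandingGearLever()
print(lever.command_gear_up())   # pilot believes the aircraft is airborne; interlock refuses
lever.mode = "AIR"
print(lever.command_gear_up())
```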
25
Remedial Actions of Errors
Potential Error | Error Type | Action
Loco not returned to service bay for 24 hr. check | Violation | Organization / Management
Setting off with parking brake on | Slip | Design
Driving locos despite earth tester warning on | Violation | Design / Training
Drivers leaning out of cab when travelling | Violation | Design / Training
Inadequate use of warning horns | Violation | Design
Misreading of displays | Slip | Design
Guards leaning out of cabs when travelling | Violation | Design / Training
Insufficient warning of objects/people on track | Slip | Design
Inability to effectively use fire extinguishers | Mistake | Design
Incorrect control operations | Mistake | Training / Design
26
How to deal with Mistakes?
◼ What can we do about Knowledge-based mistakes,
Rule-based mistakes, Slips, and lapses?
❑ Knowledge-based: Train the operator
❑ Rule-based: Training and redesign
❑ Slips: Redesign the task / environment
❑ Lapses: Redesign the task
27
Conclusion
◼ There are several ways to remedy causes of human
error
◼ In industry, it is common to implement work
procedures and training of operators
◼ In supervisory control, we try to redesign the
workplace and tools – and train the operator
◼ This approach has been adopted by many
organizations – e.g. the military. It is now common in
nuclear power plants and other complex
environments. Lately it has also been adopted by
industry
28
Reason’s Cheese Model of Accidents
◼ James Reason’s Swiss Cheese Model of
Organizational Accidents
29
Errors in Organizational Context
◼ Reason argues that human errors represent only a
small part of the deficiencies in an organization
◼ Accidents are visible, and are therefore analyzed. Less
visible organizational errors are often made in
management decision making
❑ Example: Industrial managers have limited resources –
often not enough to allocate to both productivity and
safety
❑ Managers get positive reinforcement from production,
whereas safety is often seen as a "show stopper" and its
success is characterized by an absence of evidence
30
Consequences of Reason’s Model of Human Error
◼ Training
❑ Lack of knowledge can lead to mistakes. Training is
therefore helpful. But operators must also practice
correcting errors – this is naturalistic; error-free training
is not
◼ Memory aids and rules
❑ For example, use memory aids for procedures (e.g.
checklists)
❑ Rules must be logical. The "band aid" approach to
human error makes the situation worse
31
Consequences of Reason’s Model of Human Error
◼ Error-Tolerant Systems
❑ There is one positive aspect of errors – the opportunity
for the operator to correct them. This gives the operator
a sense of control. Driving a car involves continuous
error correction (of lateral and longitudinal position).
❑ Often there are many strategies and the operator must
be allowed to act in an opportunistic fashion. The
operator must be allowed to respond differently
according to the conditions of the moment. Operators
must be given a chance to explore the functionality of
the system. Is there an undo button?
❑ In an error-tolerant system, one can recover by undoing
an action – there is a back-up option
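A minimal sketch of an error-tolerant design (the editor and actions are invented): every action is recorded so a slip can be backed out with an undo.

```python
class ErrorTolerantEditor:
    """Every action is recorded so the operator can recover from a slip
    by undoing it -- the 'back-up option' mentioned above."""

    def __init__(self) -> None:
        self.state: list[str] = []
        self.history: list[list[str]] = []

    def do(self, action: str) -> None:
        self.history.append(list(self.state))   # snapshot before acting
        self.state.append(action)

    def undo(self) -> None:
        if self.history:                         # nothing to undo otherwise
            self.state = self.history.pop()

editor = ErrorTolerantEditor()
editor.do("pour coffee")
editor.do("pour orange juice into the coffee cup")   # a slip
editor.undo()                                        # recover from it
print(editor.state)                                  # ['pour coffee']
```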
32
Starr (1969) risk taking model
◼ The horizontal line represents the natural death rate
due to old age
33
Human Errors are Commonplace
◼ But many of the errors people commit in operating
systems are the result of bad system design or bad
organizational structure rather than irresponsible
action (Norman 1988; Reason, 1990, 1997; Woods &
Cook, 1999)
◼ Although human error may be statistically identified as a
contributing cause of an accident, it is usually only one
link in a complex chain of breakdowns – many of them
mechanical or organizational in nature
◼ These breakdowns affect the system and weaken its
defenses (Perrow, 1984; Reason, 1997)
34
Stop Blaming the Operator
◼ By minimizing human error, we can improve both
safety and industrial production. This is a matter of
design and training
◼ The notion that the operator should be punished or
personally made responsible is unwarranted – (unless
there is a clear violation of regulations).
◼ Accident proneness is not a viable concept (Shaw and
Sichel). Therefore the blame for accidents and poor
quality falls on poor design, poor procedures, poor
training and in the end poor management!
35
Fault Tree Analysis
◼ Fault tree analysis has been used extensively in spacecraft
design, nuclear power plant analysis, system safety, etc.
❑ 1. The fault tree starts with a specific failure (the top of the
tree). Choice of failure is important: if it is too general, it
cannot be analyzed; if it is too specific, the analysis will not
produce enough information
❑ 2. The purpose is to find all credible ways in which the
undesirable event can occur. (Very expensive analysis)
❑ 3. It is a graphical model of various parallel and sequential
faults that will result in the occurrence of the undesired fault
(at the top of the tree)
❑ 4. Primary events are caused by inherent characteristics of a
component, such as failure of a light bulb due to a worn
filament. Secondary events are caused by external
sources – such as excessive voltage, which burns out the
filament
36
Construction of a fault tree
◼ A. By analysis (top-down)
❑ 1. Select one head event that is to be prevented
❑ 2. Determine all primary and secondary events that may
cause the head event
❑ 3. Determine relationships between causal events and
the head event in terms of AND and OR Boolean
operators
❑ 4. Determine the value and need for further analysis
according to steps 2 and 3
❑ 5. Continue to reiterate steps 2-4 until all events are
basic, or until it is not desirable to go further.
❑ 6. Diagram the events using the symbols below
❑ 7. Perform qualitative and quantitative analyses
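A minimal sketch of steps 3 and 7: the tree is represented as nested AND/OR gates over basic events, and – assuming the basic events are independent and their probabilities are known – the probability of the head event is computed bottom-up. The example tree and numbers are invented.

```python
def gate_probability(node) -> float:
    """Evaluate a fault tree given as nested tuples:
    ('AND', child, ...), ('OR', child, ...), or a float giving the
    probability of a basic event. Assumes basic events are independent."""
    if isinstance(node, (int, float)):
        return float(node)
    kind, *children = node
    probs = [gate_probability(child) for child in children]
    if kind == "AND":            # all inputs must occur
        result = 1.0
        for p in probs:
            result *= p
        return result
    if kind == "OR":             # at least one input occurs
        none_occur = 1.0
        for p in probs:
            none_occur *= (1.0 - p)
        return 1.0 - none_occur
    raise ValueError(f"unknown gate: {kind}")

# Hypothetical head event: "no light" = (bulb fails) OR (no mains power AND no backup supply)
tree = ("OR", 0.05, ("AND", 0.10, 0.20))
print(f"P(head event) = {gate_probability(tree):.3f}")   # 0.069
```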
37
Fault-Tree Analysis – Symbols
◼ Basic event: a fault that cannot be developed further
◼ Top event / intermediate event: a fault to be developed further
◼ Normal event: an event that is normal, but can become a fault
◼ Undeveloped event: inconsequential, or insufficient data to develop further
◼ AND gate: the output event occurs only if all input events occur
◼ OR gate: the output event occurs if at least one input event occurs