CS5032 LECTURE 5: HUMAN ERROR 1
DR JOHN ROOKSBY
IN THIS LECTURE …
This lecture focuses on human error in systems operation
Human error is implicated in many accidents and disasters.
• But this does not mean human error causes these
  accidents
• The media are often too quick to blame individuals … and
  computer scientists too quick to blame the user
We will look at issues in studying human error, and will
outline three approaches to modeling human error.
We will then discuss design-for-error.
In the next lecture I will suggest it is more important to look
at human reliability than human error.
HUMAN ERROR
Given that all complex systems involve people in their
production, maintenance and operation it should not be
surprising that human issues are often constituent in failures
and accidents.


According to one report, human error accounts for…
50-70% OF AVIATION DISASTERS




http://en.wikipedia.org/wiki/1994_Fairchild_Air_Force_Base_B-52_crash
70% OF SHIPPING ACCIDENTS




Image: Roberto Vongher
http://en.wikipedia.org/wiki/Costa_Concordia_disaster
60-85% OF SHUTTLE INCIDENTS AT NASA




http://en.wikipedia.org/wiki/Space_Shuttle_Challenger_Disaster
44,000 – 98,000 DEATHS A YEAR IN THE USA IN HEALTHCARE
(MAINLY THROUGH MEDICATION ERRORS)




http://en.wikipedia.org/wiki/Anaesthetic_machine
CONTRIBUTION OR CAUSE?
Human error often features in accidents and disasters. But
this does not mean it necessarily causes them.
• Accidents can have multiple causes, so why single out
  human error?
• Other factors can underlie human errors, such as poor
  design, lack of training, or overwork.
• There is not always a correct way to work. Some actions
  are just errors in hindsight
It is important to understand and reduce errors that happen
at “the sharp end” of system operation
• But this may be to treat symptoms from deeper troubles.
THE “SHARP END” OF FAILURE
[Diagram: the layers of a system, from the blunt end to the sharp end]
  Blunt end:  Regulations, Organisations, Groups
  Sharp end:  Users, Technology
http://gizmodo.com/5844628/a-passenger-airplane-nearly-flew-upside-down-because-of-a-dumb-pilot
STUDYING HUMAN ERROR
Human activity and human error are not simple topics. They
touch on areas that have been debated in psychology and
the humanities for decades, if not centuries.
Errors are predominantly studied by
   • Laboratory simulations
   • Field observation
   • Archive data

It is difficult to study human error. What constitutes an error
can be controversial:
   • What is the 'correct' action in any given situation?
MODELING HUMAN ERROR
There have been several attempts to build taxonomies of
human error. There is still no definitive model, and many
argue there never can be.
In this lecture I will briefly cover three
• THERP (Technique for Human Error Rate Prediction)
• GEMS (Generic Error Modeling System)
• CREAM (Cognitive Reliability and Error Analysis Method)
THERP
THERP states human actions can be:
• Correct: Actions are done as specified
• Errors of omission: An action is omitted
• Errors of commission: An action is inadequate: out of
  sequence, mistimed, or of poor quality (too much/too
  little/wrong way)
• Extraneous actions: An action is not expected at that time
THERP enables you to build probabilities of errors occurring
in a given context (a small numerical sketch follows below).
The key problem with THERP is that it assumes a correct
specification: all humans are expected to do is follow the
specification, and any departure from it is an error. Reality is not like this.
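
As an illustration of the probabilistic side of THERP only (not the full event-tree methodology), the sketch below combines per-step human error probabilities into an overall chance that a task contains at least one error. The step names and the numbers are invented for the example, and the steps are assumed to be independent.

# Illustrative only: combine per-step human error probabilities (HEPs)
# into an overall task error probability, assuming independent steps.
# Step names and HEP values are invented for this example.

def task_error_probability(step_heps):
    """Probability that at least one step is performed in error."""
    p_all_correct = 1.0
    for hep in step_heps:
        p_all_correct *= (1.0 - hep)
    return 1.0 - p_all_correct

# Hypothetical set-up task: read the order, program the dose, check the line.
steps = {
    "read the medication order": 0.001,
    "program the dose": 0.003,
    "check the IV line": 0.010,
}

print(f"P(at least one error) = {task_error_probability(steps.values()):.4f}")
# ~0.0140 for these invented numbers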
GEMS
The GEMS model has also had a wide impact. It draws on
cognitive psychology.
GEMS is founded upon an “Activity Space” model, which
represents human performance on three levels:
  • Skills based activity
       • Activities where we act more or less automatically
  • Rule based activity
       • Activities where we apply rules
  • Knowledge based activity
       • Activities where we fall back on our (sometimes patchy)
         knowledge
GEMS
• Errors in skills based activity:
      • Execution errors
           • Slips: an intended action is executed incorrectly
          • Lapses: steps are missed when executing an
            activity

• Errors in rule based activity
      • Planning errors
          • Rule-based mistake: a rule is misapplied

• Errors in knowledge based activity
      • Also planning errors
          • Knowledge-based mistakes: knowledge is
            wrong or misapplied
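
Purely as an illustration of how the GEMS categories above might be used in practice, here is a minimal sketch of encoding them for tagging incident reports. The classes, the example incident, and the idea of an "IncidentTag" are my own invention, not part of GEMS itself.

# Illustrative encoding of the GEMS performance levels and error types
# for tagging incident reports; the example incident is invented.
from dataclasses import dataclass
from enum import Enum

class PerformanceLevel(Enum):
    SKILL_BASED = "skill-based"
    RULE_BASED = "rule-based"
    KNOWLEDGE_BASED = "knowledge-based"

class ErrorType(Enum):
    SLIP = "slip"                            # execution error, skill level
    LAPSE = "lapse"                          # missed step, skill level
    RB_MISTAKE = "rule-based mistake"        # a rule is misapplied
    KB_MISTAKE = "knowledge-based mistake"   # knowledge wrong or misapplied

@dataclass
class IncidentTag:
    description: str
    level: PerformanceLevel
    error_type: ErrorType

tag = IncidentTag(
    description="Operator skipped the line-priming step during set-up",
    level=PerformanceLevel.SKILL_BASED,
    error_type=ErrorType.LAPSE,
)
print(tag)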
GEMS: CHARACTERISTICS OF
ERROR TYPES
                   Skills-Based Errors    Rule- and Knowledge-Based Errors

Main error type    Slips & lapses         RB mistakes / KB mistakes
Activity type      Routine actions        Problem-solving activities
Attention          Often elsewhere        Directed at problem-related issues
Control mode       Mainly automatic       More conscious
Predictability     Largely predictable    Variable
Frequency          Common                 Uncommon
Opportunity        Very high              Very low
Detection          Usually easy           Difficult, often through intervention
GEMS
GEMS is best thought of as a way of characterising rather
than defining error. There are a number of problems with it:
• It assumes human activities are goal or plan-driven (this is
  controversial in psychology and the humanities)
• The plan or goal is assumed to be correct but how do you
  judge this?
• It can be very difficult to definitively categorise any human
  action in terms of error
• Many different versions of the model exist
CREAM
CREAM (Cognitive Reliability and Error Analysis Method)
seeks to avoid the idea that humans simply introduce errors
into 'perfectly specified systems', and enables you to model
correct as well as incorrect actions.
CREAM sees action taking place across three levels
• Person Related
• Organisation Related
• Technology Related
CREAM
CREAM details the genotypes (possible causes) of error as
• Person related
   • Observation / Interpretation / Planning
• Organisation related
   • Communication / Training / Ambient Conditions / Working
     Conditions
• Technology related
   • Equipment Failure / Procedures / Temporary Interface
      Problems / Permanent Interface Problems
CREAM does not offer a specific error taxonomy, but offers
generic 'phenotypes' with which to build one.
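
As a small, hedged illustration (not part of CREAM itself), the genotype categories above can be restated as a lookup table that an analyst's tool might use when coding possible causes; the helper function is invented for the example.

# Illustrative restatement of the CREAM genotype categories as a lookup table.
from typing import Optional

CREAM_GENOTYPES = {
    "person": ["observation", "interpretation", "planning"],
    "organisation": ["communication", "training",
                     "ambient conditions", "working conditions"],
    "technology": ["equipment failure", "procedures",
                   "temporary interface problems",
                   "permanent interface problems"],
}

def genotype_group(cause: str) -> Optional[str]:
    """Return which level (person/organisation/technology) a cause sits at."""
    for group, causes in CREAM_GENOTYPES.items():
        if cause.lower() in causes:
            return group
    return None

print(genotype_group("Training"))   # -> "organisation"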
VIOLATIONS
We have assumed so far that errors are unintentional
aberrations of an intentional and correct activity.
• But sometimes rules and procedures can be deliberately
  violated
Sabotage is one reason, but many violations can be well
intentioned. These well-intentioned violations fall into three
major categories:
  • Routine violations: Taking the path of least effort.
  • Optimizing violations: Doing something (too) quickly or
    cheaply.
  • Necessary violations: These are provoked by
    organisational or contextual failings.
DESIGN FOR ERROR
If we design systems appropriately we can minimise error.
• This is called Design for Error
When designing any system we need to be aware how
human errors can and will be made.
• This is not to say that human error is a problem that can
  be solved through design – but to say that good design
  can play a role in minimising error.
Design for Error is similar to, but not the same as,
Design-for-Failure and Design-for-Recovery.
A helpful introduction to the concepts of Design for Error
can be found in Don Norman's book "The Design of
Everyday Things". He recommends…
DESIGN FOR ERROR
Put the required knowledge into the world.
  •   Don't require people to remember everything they need
      to know in order to operate a system
  •   This knowledge must be available in an appropriate
      form (manuals are often left on shelves unread)
  •   But understand that different people require different
      forms of guidance. An expert in some procedure will
      not want to be forced to follow the same sequence of
      steps that an amateur may need to go through.
DESIGN FOR ERROR
Design "forcing functions": physical or logical constraints
  • Interlocks: These force certain sequences of events, for
    example opening a microwave door turns it off, and to set
    off a fire extinguisher you must first remove the pin.
  • Lockins: These stop you from carrying out a certain action
    in a particular context. For example, most computers now
    cannot be shut down while there is unsaved work (see the
    sketch after this list).
  • Lockouts: These stop you from doing something. For
    example, the stairs to the basement of a tall building are
    usually designed differently or have a gate to stop people
    continuing down to the basement when evacuating.
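
To make the lockin idea concrete, here is a minimal sketch (my own example, not from the lecture) of an editor that refuses a close request while work is unsaved; a real interface would normally offer "save and close" rather than simply refusing.

# Illustrative "lockin": block a close action while unsaved work exists.
class Document:
    def __init__(self) -> None:
        self.unsaved_changes = False

    def edit(self, text: str) -> None:
        self.unsaved_changes = True      # editing marks the document dirty

    def save(self) -> None:
        self.unsaved_changes = False

class Editor:
    def __init__(self, document: Document) -> None:
        self.document = document

    def close(self) -> bool:
        # The lockin: refuse to close until the work has been saved.
        if self.document.unsaved_changes:
            print("Cannot close: there are unsaved changes.")
            return False
        print("Closed.")
        return True

editor = Editor(Document())
editor.document.edit("draft text")
editor.close()            # refused: unsaved changes
editor.document.save()
editor.close()            # succeeds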
DESIGN FOR ERROR
Narrow the gulf of execution and evaluation
  • Make things visible to the user and to others, make the
    results of each action apparent
  • Enable people to correct their own errors if they see
    them. Support double checking. Be aware that
    correcting another person's error can create social
    difficulties (especially if that person is a superior)
  • Provide support for evaluation in ways that are
    situationally appropriate. For example, people may stop
    reading common error messages, so if an uncommon
    error occurs consider not making it look like a
    run-of-the-mill event (see the sketch below).
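
As a small invented illustration of that last point, an application might present routine, recoverable errors quietly but deliberately make rare or serious ones look different, so they are less likely to be dismissed out of habit. The error categories here are my own examples.

# Illustrative only: give uncommon or serious errors a visibly different
# presentation from routine ones. The error categories are invented.
ROUTINE_ERRORS = {"network timeout", "file locked"}

def report_error(kind: str, detail: str) -> None:
    if kind in ROUTINE_ERRORS:
        print(f"[notice] {kind}: {detail}")       # quiet, inline message
    else:
        # Escalate: distinct banner so it does not look run-of-the-mill.
        print("!" * 60)
        print(f"UNUSUAL ERROR: {kind.upper()}")
        print(detail)
        print("!" * 60)

report_error("network timeout", "retrying in 5 seconds")
report_error("dose limit exceeded", "value outside the configured safe range")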
DESIGN FOR ERROR
Don't assume that technical approaches need to be taken to
reduce error.
• Human centred approaches can be effective, particularly
  training people.
• Organisational approaches such as planning workflows
  and shifts can be effective.
• The design of the working environment can also have a
  huge impact on error proneness.
KEY POINTS
Human error is often implicated in accidents and disasters
It is often wrong to say human error is the cause of an
accident or disaster, as there will be other underlying causes.
It can be difficult and controversial to label any particular
action as an error. Just because it varies from a procedure
does not mean it was the wrong thing to do.
There are several ways to model error. Often these are
derived from cognitive psychology, and concentrate on
where a correct action was intended but an erroneous action
performed.
We can design for error. Think of the ways people can
experience errors and provide resources to reduce these.
SOURCES / READING
Books by James Reason
• (1997) Managing the risks of organisational accidents. Ashgate.
• (1990) Human error. Cambridge University Press.
• (2008) The Human Contribution. Ashgate.

Donald Norman (1988) The design of everyday things. Basic.
• See the chapter "To Err is Human"

Sidney Dekker (2006) The Field Guide to Understanding Human Error. Ashgate.

L Kohn & M Donaldson (editors) (2000) To Err Is Human: Building a Safer
Health System. (http://www.nap.edu/openbook.php?isbn=0309068371)

Books by Erik Hollnagel
• (1998) Cognitive Reliability and Error Analysis Method (CREAM)
• (2006) Resilience Engineering
EXERCISE 1.
INFUSION DEVICE
For the next 5 minutes
  • read the "infusion devices" example.
Useful Definitions:
  • An infusion device is a mechanical device that administers
    intravenous solutions containing drugs to patients.
  • Hypertensive means the patient has high blood pressure
  • Cubic centimetres can be written as cc's or cm³
  • An Anaesthesiologist or anaesthetist is a medical doctor
    who administers the anaesthetic before, during and after
    surgery.
  • Intravenous (IV) fluid is supplied in plastic bags and
    administered using IV tubing.
INFUSION DEVICE
The “system” in this case was
  •   The digital technology,
  •   The equipment (tubing etc)
  •   The people, practices and procedures
  •   and the physical design of the surgical suite.
 The “failure” in this case was

 • The breakdown in delivery of IV medications during surgery -
   the free flow of the medication from the infusion device.
INFUSION DEVICE
Systemic Failures:
  • Multiple infusion devices, each requiring set-up, and each
    with a slightly different set-up.
  • Each of three different medications had to be programmed
    into the infusion device with the correct dose for the patient
  • Possible scheduling problems in the operating suites may
    have contributed to the anaesthesiologist having insufficient
    time to check the devices before surgery
  • A new nurse on the team means assumptions within the team
    about responsibilities and ways of working might be false.
  • The nurse found herself assembling a device she was
    unfamiliar with. Was she trained properly? Why didn’t she
    ask for help?
INFUSION DEVICE
Where was the error?
  • There is no single error here
  • As in any safety critical industry there are
    numerous faults and latent conditions that need to
    be addressed
  • Appropriate mechanisms need to be in place to
    trap errors
  • Blaming the nurse is a common but inappropriate
    reaction in this case. Hospitals often have a
    “blame culture”
            See “Making Information Technology a Team Player in Safety: The Case of
               Infusion Devices” (further reading section) for more on infusion devices
             The example is based upon To Err Is Human (see further reading section)
EXERCISE 2.
COMMON SLIPS AND LAPSES
Slips often occur in routine activities. We intend to do one
thing, but do another. There are many kinds of slip:


Capture Errors:
An activity you are doing is “captured” by another one. Often a
non-routine activity can be captured by a more routine one.
For example, sometimes when I am driving to St Andrews town
centre I pull into the work car park as if I was driving to work.
COMMON SLIPS AND LAPSES
Description Errors
Sometimes when we do a routine activity, we do it to something
that is similar to but not the same as the thing intended. (It is not
correct but “fits the description”)
For example sometimes if I leave my mobile next to my mouse, I
grab the mobile by mistake.
For example, I once dried my hands on my flatmate's coat, which
was hanging on the back of a chair where a tea-towel would
normally be.
COMMON SLIPS AND LAPSES
Data driven errors
Many human actions are responses to something. These
responses can enter into a process as an additional step or as
a mis-step.
For example, when I was typing a document, someone asked me
the meaning of a word. I then realised I had typed that word
instead of the word I meant to.
COMMON SLIPS AND LAPSES
Associative activation errors
Sometimes our own internal associations can trigger a slip.


For example picking up the telephone and saying “come in”


For example, I once went to a job interview and instead of saying
"Hi, I'm John", I said "Hi, I'm scared". (These kinds of associative
errors are called Freudian slips.)
COMMON SLIPS AND LAPSES
Loss of Activation Errors
Sometimes we set out to do something, but along the way forget
what we set out to do.


For example, I once went to my bedroom but once I was there
wondered what it was I went to do. Once I was back downstairs I
remembered I wanted to charge my phone.
COMMON SLIPS AND LAPSES
Mode Errors
Sometimes we operate a technology correctly, except that it is in
the wrong mode.


For example, when turning my car around, I reversed it but forgot
to put it in a forward gear before setting off forwards.
For example, I typed the body of a text message into the 'to' area
on my phone.



                        Source: Donald Norman (1988) The design of everyday things. Basic.
