CS5032 Lecture 5: Human Error 1
  • Note (slide 10): All Nippon Airways flight "flips" on 6/9/2011
  • Transcript

    • 1. DR JOHN ROOKSBY
    • 2. IN THIS LECTURE …
      This lecture focuses on human error in systems operation. Human error is implicated in many accidents and disasters.
      • But this does not mean human error causes these accidents
      • The media are often too quick to blame individuals … and computer scientists too quick to blame the user
      We will look at issues in studying human error, and will outline three approaches to modeling human error. We will then discuss design-for-error.
      In the next lecture I will suggest it is more important to look at human reliability than human error.
    • 3. HUMAN ERROR
      Given that all complex systems involve people in their production, maintenance and operation, it should not be surprising that human issues are often constituent in failures and accidents.
      According to one report, human error accounts for…
    • 4. 50-70% OF AVIATION DISASTERS
      http://en.wikipedia.org/wiki/1994_Fairchild_Air_Force_Base_B-52_crash
    • 5. 70% OF SHIPPING ACCIDENTS
      Image: Roberto Vongher
      http://en.wikipedia.org/wiki/Costa_Concordia_disaster
    • 6. 60-85% OF SHUTTLE INCIDENTS AT NASA
      http://en.wikipedia.org/wiki/Space_Shuttle_Challenger_Disaster
    • 7. 44,000 – 98,000 DEATHS A YEAR IN THE USA IN HEALTHCARE (MAINLY THROUGH MEDICATION ERRORS)
      http://en.wikipedia.org/wiki/Anaesthetic_machine
    • 8. CONTRIBUTION OR CAUSE?
      Human error often features in accidents and disasters. But this does not mean it necessarily causes accidents.
      • Accidents can have multiple causes, so why single out human error?
      • Other factors can underlie human errors, such as poor design, lack of training, or overwork.
      • There is not always a correct way to work. Some actions are just errors in hindsight.
      It is important to understand and reduce errors that happen at "the sharp end" of system operation
      • But this may be to treat the symptoms of deeper troubles.
    • 9. THE "SHARP END" OF FAILURE
      [Diagram] Blunt end: Regulations, Organisations, Groups → Sharp end: Users, Technology
    • 10. http://gizmodo.com/5844628/a-passenger-airplane-nearly-flew-upside-down-because-of-a-dumb-pilot
    • 11. STUDYING HUMAN ERROR
      Human activity and human error are not simple topics. They encounter areas that have been debated in psychology and the humanities for decades, if not centuries.
      Errors are predominantly studied by
      • Laboratory simulations
      • Field observation
      • Archive data
      It is difficult to study human error. What constitutes an error can be controversial
      • What is the 'correct' action in any given situation?
    • 12. MODELING HUMAN ERROR
      There have been several attempts to build taxonomies of human error. There is still no definitive model, and many argue there never can be.
      In this lecture I will briefly cover three:
      • THERP (Technique for Human Error Rate Prediction)
      • GEMS (Generic Error Modeling System)
      • CREAM (Cognitive Reliability and Error Analysis Method)
    • 13. THERP
      THERP states human actions can be:
      • Correct: Actions are done as specified
      • Errors of omission: An action is omitted
      • Errors of commission: An action is inadequate, out of sequence, mistimed, or not of the required quality (too much / too little / wrong way)
      • Extraneous actions: An action is not expected at that time
      THERP enables you to build probabilities of errors occurring in a given context (a minimal sketch of this kind of calculation follows below).
      The key problem with THERP is that it assumes a correct specification. The only thing humans do is follow the specification, and anything else is an error. Reality is not like this.
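      As an illustration of how per-step error probabilities might be combined into a task-level figure, here is a minimal Python sketch. The step names and nominal human error probabilities (HEPs) are invented for the example, not taken from THERP's handbook tables, and the sketch ignores the performance shaping factors and dependencies between steps that THERP also models.

        # Illustrative THERP-style calculation; step names and HEP values are hypothetical.

        def task_error_probability(heps):
            """Probability that at least one step of a task is performed in error,
            assuming the steps fail independently."""
            p_all_correct = 1.0
            for hep in heps:
                p_all_correct *= (1.0 - hep)
            return 1.0 - p_all_correct

        # Hypothetical task: programming an infusion device in three steps.
        step_heps = {
            "select drug": 0.003,             # error of commission
            "enter dose": 0.01,               # error of commission
            "confirm before starting": 0.005, # error of omission
        }

        print(f"P(at least one error) = {task_error_probability(step_heps.values()):.4f}")

      With the values above the sketch prints roughly 0.018, i.e. about a 1.8% chance that the three-step task contains at least one error.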
    • 14. GEMS
      The GEMS model has also had a wide impact. It draws on cognitive psychology.
      GEMS is founded upon an "activity space" model, which represents human performance on three levels:
      • Skill-based activity
        • Activities where we act more or less automatically
      • Rule-based activity
        • Activities where we apply rules
      • Knowledge-based activity
        • Activities where we fall back on our (sometimes patchy) knowledge
    • 15. GEMS
      • Errors in skill-based activity:
        • Execution errors
          • Slips: an error in executing an action correctly
          • Lapses: steps are missed when executing an activity
      • Errors in rule-based activity:
        • Planning errors
          • Rule-based mistakes: a rule is misapplied
      • Errors in knowledge-based activity:
        • Also planning errors
          • Knowledge-based mistakes: knowledge is wrong or misapplied
    • 16. GEMS: CHARACTERISTICS OF ERROR TYPES (skill-based vs rule-based vs knowledge-based errors)
      • Main error type: slips & lapses / RB mistakes / KB mistakes
      • Activity type: routine actions (skill-based) / problem-solving activities (rule- and knowledge-based)
      • Attention: often elsewhere (skill-based) / directed at problem-related issues (rule- and knowledge-based)
      • Control mode: mainly automatic (skill-based) / more conscious (rule- and knowledge-based)
      • Predictability: largely predictable (skill- and rule-based) / variable (knowledge-based)
      • Frequency: common (skill-based) / uncommon (knowledge-based)
      • Opportunity: very high (skill-based) / very low (knowledge-based)
      • Detection: usually easy (skill-based) / difficult, and often only through intervention (rule- and knowledge-based)
    • 17. GEMS
      GEMS is best thought of as a way of characterising rather than defining error. There are a number of problems with it:
      • It assumes human activities are goal- or plan-driven (this is controversial in psychology and the humanities)
      • The plan or goal is assumed to be correct, but how do you judge this?
      • It can be very difficult to definitively categorise any human action in terms of error
      • Many different versions of the model exist
    • 18. CREAM
      CREAM (Cognitive Reliability and Error Analysis Method) seeks to avoid the idea that humans simply introduce errors into 'perfectly specified systems', and enables you to model correct as well as incorrect actions.
      CREAM sees action taking place across three levels:
      • Person related
      • Organisation related
      • Technology related
    • 19. CREAM
      CREAM details the genotypes (possible causes) of error as:
      • Person related
        • Observation / Interpretation / Planning
      • Organisation related
        • Communication / Training / Ambient Conditions / Working Conditions
      • Technology related
        • Equipment Failure / Procedures / Temporary Interface Problems / Permanent Interface Problems
      CREAM does not offer a specific error taxonomy, but offers generic 'phenotypes' with which to build one.
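      To make the groupings above concrete, the genotype categories could be written down as a small data structure and used to tag incident reports, as in the sketch below. The incident description and the chosen tags are invented for illustration; CREAM itself provides much richer classification and linking tables than this.

        # Illustrative encoding of CREAM's genotype groupings; the tagged incident is a made-up example.
        from enum import Enum

        class Genotype(Enum):
            # Person related
            OBSERVATION = "observation"
            INTERPRETATION = "interpretation"
            PLANNING = "planning"
            # Organisation related
            COMMUNICATION = "communication"
            TRAINING = "training"
            AMBIENT_CONDITIONS = "ambient conditions"
            WORKING_CONDITIONS = "working conditions"
            # Technology related
            EQUIPMENT_FAILURE = "equipment failure"
            PROCEDURES = "procedures"
            TEMPORARY_INTERFACE = "temporary interface problems"
            PERMANENT_INTERFACE = "permanent interface problems"

        # A hypothetical incident report tagged with possible causes.
        incident = {
            "description": "Infusion device set up incorrectly before surgery",
            "genotypes": [Genotype.TRAINING, Genotype.WORKING_CONDITIONS, Genotype.PROCEDURES],
        }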
    • 20. VIOLATIONS
      We have assumed so far that errors are unintentional aberrations of an intentional and correct activity.
      • But sometimes rules and procedures can be deliberately violated
      Sabotage is one reason, but many violations can be well intentioned. These well-intentioned violations fall into three major categories:
      • Routine violations: Taking the path of least effort.
      • Optimizing violations: Doing something (too) quickly or cheaply.
      • Necessary violations: These are provoked by organisational or contextual failings.
    • 21. DESIGN FOR ERROR
      If we design systems appropriately we can minimise error.
      • This is called Design for Error
      When designing any system we need to be aware of how human errors can and will be made.
      • This is not to say that human error is a problem that can be solved through design, but that good design can play a role in minimising error.
      Design for Error is similar to, but not the same as, Design-for-Failure and Design-for-Recovery.
      A helpful introduction to the concepts of Design for Error can be found in Don Norman's book "The Design of Everyday Things". He recommends…
    • 22. DESIGN FOR ERROR
      Put the required knowledge into the world.
      • Don't require people to remember everything they need to know in order to operate a system
      • This knowledge must be available in an appropriate form (manuals are often left on shelves unread)
      • But understand that different people require different forms of guidance. An expert in some procedure will not want to be forced to follow the same sequence of steps that an amateur may need to go through.
    • 23. DESIGN FOR ERROR
      Design "forcing functions": physical or logical constraints (a minimal code sketch follows below).
      • Interlocks: These force certain sequences of events. For example, opening a microwave door turns it off, and to set off a fire extinguisher you must first remove the pin.
      • Lockins: These stop you from carrying out a certain action in a particular context. For example, most computers now cannot be shut down when there is unsaved work.
      • Lockouts: These stop you from doing something. For example, the stairs to the basement of a tall building are usually differently designed, or have a gate, to stop people continuing to the basement when evacuating.
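      To show what forcing functions can look like when they are logical rather than physical constraints, here is a minimal Python sketch of an interlock and a lockin. The Microwave and Editor classes are hypothetical, invented for this example rather than taken from any real device or application.

        # Illustrative forcing functions; both classes are hypothetical.

        class Microwave:
            """Interlock: opening the door forces heating off and prevents starting."""
            def __init__(self):
                self.door_open = False
                self.heating = False

            def open_door(self):
                self.door_open = True
                self.heating = False  # interlock: heating cannot continue with the door open

            def close_door(self):
                self.door_open = False

            def start(self):
                if self.door_open:
                    raise RuntimeError("Close the door before starting")
                self.heating = True

        class Editor:
            """Lockin: the application refuses to quit while there is unsaved work."""
            def __init__(self):
                self.unsaved_changes = False

            def edit(self, text):
                # Any edit marks the document as having unsaved work.
                self.unsaved_changes = True

            def save(self):
                self.unsaved_changes = False

            def quit(self):
                if self.unsaved_changes:
                    raise RuntimeError("Unsaved work: save or discard it before quitting")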
    • 24. DESIGN FOR ERROR
      Narrow the gulfs of execution and evaluation
      • Make things visible to the user and to others; make the results of each action apparent
      • Enable people to correct their own errors if they see them. Support double checking. Be aware that correcting another person's error can create social difficulties (especially if that person is a superior)
      • Provide support for evaluation in ways that are situationally appropriate. For example, people may stop reading common error messages, so if an uncommon error occurs consider not making it look like a run-of-the-mill event.
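      One way to act on the last point is to scale how an error is presented to how routine it is. The sketch below only illustrates that idea; the function, the set of routine errors and the messages are invented rather than taken from any particular toolkit.

        # Illustrative only: present rare problems more prominently than routine ones.
        import logging

        ROUTINE_ERRORS = {"network timeout", "printer offline"}  # hypothetical examples

        def report_problem(kind, detail):
            if kind in ROUTINE_ERRORS:
                # Familiar, frequently seen condition: report it in the usual, low-key way.
                logging.warning("%s: %s", kind, detail)
            else:
                # Uncommon condition: make it look different from a run-of-the-mill event,
                # e.g. also show a distinct dialog that requires explicit acknowledgement.
                logging.critical("UNUSUAL CONDITION %s: %s", kind, detail)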
    • 25. DESIGN FOR ERROR
      Don't assume that technical approaches need to be taken to reduce error.
      • Human-centred approaches can be effective, particularly training people.
      • Organisational approaches such as planning workflows and shifts can be effective.
      • The design of the working environment can also have a huge impact on error proneness.
    • 26. KEY POINTS
      Human error is often implicated in accidents and disasters.
      It is often wrong to say human error is the cause of an accident or disaster, as there will be other underlying causes.
      It can be difficult and controversial to label any particular action as an error. Just because it varies from a procedure does not mean it was the wrong thing to do.
      There are several ways to model error. Often these are derived from cognitive psychology, and concentrate on cases where a correct action was intended but an erroneous action performed.
      We can design for error. Think of the ways people can experience errors and provide resources to reduce these.
    • 27. SOURCES / READING
      Books by James Reason
      • (1997) Managing the Risks of Organisational Accidents. Ashgate.
      • (1990) Human Error. Cambridge University Press.
      • (2008) The Human Contribution. Ashgate.
      Donald Norman (1988) The Design of Everyday Things. Basic Books.
      • See the chapter "To Err is Human"
      Sidney Dekker (2006) The Field Guide to Understanding Human Error. Ashgate.
      L. Kohn & M. Donaldson (editors) (2000) To Err is Human: Building a Safer Health System. (http://www.nap.edu/openbook.php?isbn=0309068371)
      Books by Erik Hollnagel
      • (1998) Cognitive Reliability and Error Analysis Method (CREAM)
      • (2006) Resilience Engineering
    • 28. EXERCISE 1.
    • 29. INFUSION DEVICE
      For the next 5 minutes
      • read the "infusion devices" example.
      Useful definitions:
      • An infusion device is a mechanical device that administers intravenous solutions containing drugs to patients.
      • Hypertensive means the patient has high blood pressure.
      • Cubic centimetres can be written as cc's or cm3.
      • An anaesthesiologist or anaesthetist is a medical doctor who administers the anaesthetic before, during and after surgery.
      • Intravenous (IV) fluid is supplied in plastic bags and administered using IV tubing.
    • 30. INFUSION DEVICE
      The "system" in this case was
      • The digital technology
      • The equipment (tubing etc.)
      • The people, practices and procedures
      • and the physical design of the surgical suite.
      The "failure" in this case was
      • The breakdown in delivery of IV medications during surgery: the free flow of the medication from the infusion device.
    • 31. INFUSION DEVICE
      Systemic failures:
      • Multiple infusion devices, each requiring set-up, and each requiring a slightly different set-up.
      • Each of three different medications had to be programmed into the infusion device with the correct dose for the patient.
      • Possible scheduling problems in the operating suites may have contributed to the anaesthesiologist having insufficient time to check the devices before surgery.
      • A new nurse on the team means assumptions within the team about responsibilities and ways of working might be false.
      • The nurse found herself assembling a device she was unfamiliar with. Was she trained properly? Why didn't she ask for help?
    • 32. INFUSION DEVICE
      Where was the error?
      • There is no single error here.
      • As in any safety-critical industry, there are numerous faults and latent conditions that need to be addressed.
      • Appropriate mechanisms need to be in place to trap errors.
      • Blaming the nurse is a common but inappropriate reaction in this case. Hospitals often have a "blame culture".
      See "Making Information Technology a Team Player in Safety: The Case of Infusion Devices" (further reading section) for more on infusion devices.
      The example is based upon To Err is Human (see further reading section).
    • 33. EXERCISE 2.
    • 34. COMMON SLIPS AND LAPSES
      Slips often occur in routine activities. We intend to do one thing, but do another. There are many kinds of slip:
      Capture errors:
      An activity you are doing is "captured" by another one. Often a non-routine activity can be captured by a more routine one.
      For example, sometimes when I am driving to St Andrews town centre I pull into the work car park as if I was driving to work.
    • 35. COMMON SLIPS AND LAPSES
      Description errors:
      Sometimes when we do a routine activity, we do it to something that is similar to, but not the same as, the thing intended. (It is not correct but "fits the description".)
      For example, sometimes if I leave my mobile next to my mouse, I grab the mobile by mistake.
      For example, I once dried my hands on my flatmate's coat, which was hanging on the back of a chair where a tea-towel would normally be.
    • 36. COMMON SLIPS AND LAPSES
      Data-driven errors:
      Many human actions are responses to something. These responses can enter into a process as an additional step or as a mis-step.
      For example, when I was typing a document, someone asked me the meaning of a word. I then realised I had typed that word instead of the word I meant to.
    • 37. COMMON SLIPS AND LAPSES
      Associative activation errors:
      Sometimes our own internal associations can trigger a slip.
      For example, picking up the telephone and saying "come in".
      For example, I once went to a job interview and instead of saying "Hi, I'm John", I said "Hi, I'm scared". (These kinds of associative errors are called Freudian slips.)
    • 38. COMMON SLIPS AND LAPSES
      Loss-of-activation errors:
      Sometimes we set out to do something, but along the way forget what we set out to do.
      For example, I once went to my bedroom but, once I was there, wondered what it was I went to do. Once I was back downstairs I remembered I wanted to charge my phone.
    • 39. COMMON SLIPS AND LAPSES
      Mode errors:
      Sometimes we operate a technology correctly, except that it is in the wrong mode.
      For example, when turning my car around, I reversed it but forgot to put it in a forward gear before setting off forwards.
      For example, I typed the body of a text message into the 'to' area on my phone.
      Source: Donald Norman (1988) The Design of Everyday Things. Basic Books.