Human Factors – How to take the first steps? | Neil Clark – IHF Ltd

Global HSE Conference | Sept 26 - 27 | New Delhi, India

  • “Whatever you do … don’t pull that lever” was the first thing ever said to me when I arrived on my first fast jet training course at RAF Linton-on-Ouse. Indeed, it was the first thing said to every budding fast jet pilot when they arrived on the squadron. The lever I am talking about is the emergency shutdown lever on the RAF’s Tucano aircraft. When pulled, it did exactly what it said on the tin: it shut down the engine in the quickest possible time, cutting the fuel and turning off the igniters. The thing about this lever was that it sat behind the seat to the left and was perilously close to the aircraft flap lever – a lever that needed to be touched many times on every flight. Not only was the ESDL close, it was the same shape, it had the same feel and it was actually a little easier to reach than the flap lever. It did, however, have one distinguishing feature: it was yellow – which, when you consider where it was positioned, wasn’t a great deal of help. Now, I can already sense by the looks on your faces that there is a building anticipation for a story about how someone did indeed do what they should never do and pulled the lever. Well, you are right. Before I tell you the story, I want to state two things very clearly: nobody was hurt, and it wasn’t me! Tell the story: 500 ft circuit, newbie pilot, new instructor; talk about the complications and the intensity of flying a circuit – height, speed, RT, flaps, wind, ball, etc. Conclude by saying that they sorted it out and went round at 100 ft with one hand on the ejection handle. Latent error – sooner or later it will cause an accident.
  • Being a simple-minded pilot, I often think that it is the simplest things that can help us most. Properly describing the environment that one is entering is one of the basics. Whether it is through safety-critical communication, signage or barriers, people need to understand what they are entering into and what precautions they need to take before doing so. Describe, then show picture. Next few minutes.
  • We need to consider people’s states of mind, what time of day it is and how much risk (actual or perceived) one is exposed to. One of the biggest opportunities for people to make mistakes is a combination of stress and the routine. Again, this is a funny example; however, I often think about isolations when I see it. More specifically, I think about people identifying and applying isolations to the wrong pieces of kit. What can we do to stop that from happening?
  • The complexity of how we represent information can play an enormous role in whether operators can identify in a timely manner when things are going wrong. Indeed, some may say that designers of such systems and displays pride themselves on how complex they can make a system.
  • For me, signs are probably the most basic way in which we can support users within the environment in which we work. The pictures above not only show how simple signs can give us completely incorrect information, but also how the positioning of signs can be counter-intuitive. They also highlight the fact that the really effective signs often have no words: simple colours and shapes, or the brand, can tell everyone exactly what they need to know in an instant.
  • So, I hope I have demonstrated to you in a fairly light-hearted manner some of the considerations we can make when speaking about human factors. I’d now like to focus on a more serious note and give you a really stark example of how human factors can have devastating effects. I’d like to tell you about AF447. On 31 May 2009, 7 miles above the empty expanse of the Atlantic Ocean, Air France 447, an Airbus A330 passenger aircraft, cuts through the midnight darkness. The plane had taken off 3 hours earlier, climbing from Rio de Janeiro on a north-easterly heading. Prior to the recovery of the aircraft’s black box, the data implied that the plane had fallen foul of a technical problem – the icing up of the pitot tubes that give airspeed indications – which, in conjunction with severe weather, led to a complex “error chain” that ended in a crash and the loss of 228 lives. We now understand that AF447 did indeed pass into clouds associated with a large system of thunderstorms, its airspeed sensors became temporarily iced over and the autopilot disengaged. In the ensuing confusion, the pilots lost control of the plane because they reacted incorrectly to the loss of instrumentation and then seemed unable to comprehend the nature of the problems they had caused. Neither weather nor malfunction doomed AF447, nor a complex chain of errors – just a simple but persistent mistake on the part of the handling pilot. There were many human factors at play in this accident, but I would like to focus on what I believe to be the three most important. Human judgements, of course, are never made in a vacuum. Pilots are part of a complex system that can either increase or reduce the probability that they will make a mistake. Here we will examine the hardware, human and system contributory factors that led to this preventable accident. Briefly explain what happened.
  • Hardware and ergonomic issues: the Airbus A330 has some hardware issues that contributed to this accident – despite the cockpit voice recorder capturing the pilots, as they flew towards a storm cloud, saying “thank goodness we are in an Airbus”. Perhaps the biggest problem was the redesign of the more traditional “yoke”, as you can see in the top-left picture. There were two main problems with this redesign: one pilot could no longer see what the other pilot was doing, and there is no increase in pressure at higher airspeeds, no vibration felt through the stick during the stall buffet and, most importantly, no corresponding deflection on the other pilot’s side stick when a control input is made – making it impossible for the other pilot to know, without being explicitly told, what control inputs are being made. The aircraft is impossible to stall when the computer is operating in normal law, but when the computer trips to alternate law, as in this event, a lot of the safeguards are removed and the pilot can stall the aircraft. This computer-controlled safety can affect the pilot’s mindset and allow complacency to set in, and may have contributed to the accident by making the pilot believe the stall warning was spurious. The stall warning was designed to be impossible to ignore, yet it wasn’t mentioned once during the whole incident. The word “STALL” is loudly repeated every 3 seconds and is accompanied by a noise; however, it is inhibited if the aircraft’s AoA gets too high or the airspeed gets too low – this may have contributed to the flight deck confusion. Also, it has been proven that hearing is one of the first things to shut off when a person is working at capacity, and the warning itself may have led to further cognitive performance degradation. The left and right cockpit airspeed indicators work independently of each other; there is no way to know of a discrepancy unless the pilots are communicating, and there is no visual cue of any discrepancy.
  • We have spoken a few times now about how understanding the environment in which one operates is absolutely critical to successful operations. Part of the AF447 report states that, due to a deteriorated cockpit environment, an experienced crew flew a perfectly serviceable aircraft into the sea. So what human factors contributed to this deteriorated environment? Startle effect – both pilots appeared at capacity (especially the handling pilot), fully tipped over the precipice of the performance–arousal curve; neither heard the stall warning despite it sounding 58 times. CRM – the handling pilot should be concentrating on flying the aircraft while the non-handling pilot should be looking up emergency procedures and keeping a handle on the bigger picture; this was exacerbated by having two first officers in the cockpit and no captain, so nobody was in charge of the situation. Circadian rhythm – the crew were at their circadian low at 2 am Paris time, and were also jet-lagged and fatigued. The crew failed to follow the ‘loss of airspeed indications’ emergency procedure. The crew failed to hear the stall warning, and failed to recognise that the aircraft had stalled. The handling pilot had the stick on full rear deflection throughout. At the point when the other pilot, who had up until then seemed to have more understanding of the situation, took control, he immediately pulled back on the stick too; if he had lowered the nose at this point, the accident would have been averted. The crew were unable to see the bigger picture and put together the information in front of them by monitoring their airspeed, vertical speed, angle of attack and degrees nose up.
  • System failings: when looking at human factors incidents, statistically 80% of the findings are system failings – people have been set up to fail. Air France had a poor learning culture. Nine other incident reports of ‘loss of airspeed indications’ on A330/340 aircraft were made during 2008–2009, prior to this accident, including one from Madagascar that contained all of the information the crew needed to deal with just this emergency. These incident reports were not distributed to Air France’s other pilots. The Madagascar incident proved that every Airbus pilot needed immediate training on unreliable airspeed occurrences and new emergency procedures spelling out the importance of ignoring the flight director, which became a critical problem on Flight 447.
  • CRM!!! HDT!
  • So all that is well and good.
  • Is there anything that has happened in the course of this presentation that I can draw upon?
  • I hope I have demonstrated to you in a fairly light-hearted manner some of the considerations we can make when speaking about human factors. I’d like to close on a more serious note and give you a really stark example of how human factors can have devastating effects. I’d like to tell you about AF447. This will be a very abridged version of the accident, which killed 228 people on 1 June 2009 – numbers very similar to those involved in the Piper Alpha disaster. Before I tell you about the accident, it is important to note that the crew had over 20,000 flying hours combined – a very experienced crew. The aircraft crashed after temporary inconsistencies between the airspeed measurements caused the autopilot to disconnect, after which the crew reacted incorrectly. The result was that a perfectly serviceable aircraft belly-flopped at high speed into the mid-Atlantic, with all systems functioning as designed and the pilots in a state of utter confusion and disarray. It was at night and, because of the environment they were in, they were completely disorientated. In summary, I would like to close by reaffirming the importance of human factors, and also state that it is often the simplest things that make the biggest difference. Tom Gilchrist is going to follow on from me and tell you about the work that we have been doing at Step Change, which we hope will give you the tools to improve various facets of your organisation, business or department. I would like to leave you with this: people make mistakes, and no amount of training or education can or will eliminate that, so the more we can do to prepare the environment that we and our colleagues work in to eliminate the opportunity for human error, the fewer mistakes we will make and the fewer incidents and accidents we should have. To err is human.

    1. Technical Session # 1A – Human Factors – How to take the first steps? By: Neil Clark – IHF Ltd
    2. Why is Human Factors so Important to me? “Whatever you do, don’t pull that lever!”
    3. Neil Who? Mechanical Engineering and Maths – Professional Sportsman – Military Pilot – IHF
    4. Why is Human Factors so Important to me? “Whatever you do, don’t pull that lever!”
    5. Human Factors: Workload, Design, Ergonomics, Situational Awareness, Stress, Fatigue, Procedures, Training, Culture
    6. So What Can We Take From This? Sometimes mistakes are inevitable … the environment virtually guarantees them!
    7. So What is Human Factors? “The way in which man interacts with the environment” – “Human Factors does not just focus on Health and Safety” – “There is so much complexity in the world today that the integration of human factors allows individuals to focus on the task at hand rather than worrying about adapting to the environment they are working in.” – “Saving money and saving lives”
    8. Understanding Your Environment
    9. Understanding Your People
    10. Complexity
    11. Clarity of Information
    12. Human Factors Gone Wrong – Air France: 1. Human factors in design; 2. The effect of the environment; 3. The role of training and competence
    13. Air France – Flight 447 – Human Factors in Design
    14. Air France – Flight 447 – The Effect of the Environment
    15. Air France – Flight 447 – Training and Competence and Organisational Culture
    16. Human Factors Gone Right
    17. How are we bringing Human Factors to Aberdeen? Awareness – First Steps; Assessment – Next Steps; Action
    18. Categorisation
    19. Independent Assessments
    20. Direction and Urgency
    21. Summary – How are you going to take Human Factors into your business?
