Automation

Paper I prepared for one of my classes at ERAU WWOnline


Running head: IMPACT OF AUTOMATION ON FLIGHT CREWS’ GROUP PROCESS

Impact of Aircraft Automation on Flight Crews’ Group Process
Mersie A. Melke
Embry-Riddle Aeronautical University, Daytona Beach, Florida
Department of Distance Learning
Instructor: Susan Bailey-Schmidt
January 29, 2015
Abstract

The intent of automation in aircraft is to decrease the number of human errors and the incidents or accidents that follow from them. In line with this, automation aims to relieve the pilots flying the aircraft of removable workload. It therefore seems fair to assert that the automated system is an additional crewmember alongside the existing human pilots. This paper examines how automation affects the group process of a human flight crew. To analyze a typical flight crew environment, the paper first defines the components of a flight crew’s group process during an uneventful phase of flight. On that basis, it then analyzes both the positive and the negative implications of automation for pilots’ group process.
Flight Crew Group Processes

The duty of a flight crew assigned to transport passengers over a route is to provide safe passage for the travelers. At the same time, the crew should perform this duty efficiently to bring profit to the airline. As a result, flight crews capable of carrying out a safe and efficient flight share a common group process.

One component of this group process is communication. Communication in a flight crew is the process by which crewmembers build a model of their work environment. It is through briefing, a form of communication, that the crew of a given flight leg learns its duties and responsibilities. Communication in the cockpit is the tool that creates a shared mental model of the flight instruments, the flying environment and the individual inputs from the pilots. Inquiry, assertion and self-critique are all part of the communication process of a healthy flight crew (Helmreich & Foushee, 1993).

Closely tied to communication, decision tasks are another component of flight crew group processes (Helmreich & Foushee, 1993). This component covers the crew activities that determine the course of flight maneuvers while accommodating external inputs from the aircraft, the air traffic controller and fellow crewmembers. It also extends to crew conflict resolution, which harnesses the optimal output from individual crewmembers.

A third component of pilots’ group process is team formation and management tasks (Helmreich & Foushee, 1993). From the moment a pilot accepts a flight assignment to the time the pilot completes that job, the crewmember is part of a team. This team has a shared purpose of getting passengers safely and efficiently to their destination. However, since the team is made up of individuals, it also contains varied individual viewpoints. Therefore, in order to arrive at a shared team thought process, team management is necessary from the moment of formation to the time of disbandment.
A fourth group process is the situational awareness and workload management of the flight crew (Helmreich & Foushee, 1993). This component is interrelated with the group process components mentioned above, yet it has its own identifying features. One is the vigilance required to establish and maintain situational awareness; this vigilance in turn calls for preparation and planning for the flight route. Ensuring an optimal distribution of workload among the crewmembers, prioritizing tasks and avoiding distraction are also identifying features of this component of pilots’ group process.

The processes above are the cognitive and interpersonal functions of pilots’ group processes. Another part of these processes is the man-machine interface task, which reflects the technical proficiency of the crew. Aircraft control tasks such as power control, flight control and navigation fall within this category. In addition, procedural tasks such as checklist and manual handling, communication with air traffic control, systems operation, and expertise and skill in abnormal operations are part of this category (Helmreich & Foushee, 1993). The integration of these group processes is what maintains a purposeful flight crew.

Now that the components of pilots’ group process have been identified, it is possible to examine the effect of automation on them. The following paragraphs therefore examine the effects of automation on the group processes within a cockpit, and also assess the effects on individual pilots, since changes in the individuals make up the change in the whole group.

Effects of Automation on Group Processes

It is a fact that aircraft automation has decreased the workload of the flight crew. At the same time, the way an aircraft is operated has changed with the advent of automation. The shift from active control to monitoring changed not only the type of cognitive activity demanded, but also the associated consequences (Mosier, 2002).
Although the impact of automation on the number of decision errors is positive, the reduction of some types of errors has been accompanied by the introduction of new varieties of automation-related errors (Mosier, Skitka, Dunbar, & McDonnell, 2001). By replacing the human operator with automated equipment in an attempt to eliminate human error, another potential form of human error emerges (Mouloua, Smither, Vincenzi, & Smith, 2002). The perfection and reliability of the automated equipment depend on hardware and software capabilities designed and built by humans. Even though a system becomes less vulnerable to operator error through the introduction of automation, it becomes more vulnerable to designer error (Mouloua et al., 2002).

In particular, the availability of automation and automated decision aids feeds an existing human tendency to expend the least possible cognitive effort. Decision makers have displayed a tendency to use these aids as a replacement for vigilant information seeking and processing, a phenomenon referred to as automation bias (Mosier et al., 2001). Automation bias manifests as omission errors or commission errors. Omission errors are failures to take necessary action or to respond to system irregularities when not prompted to do so by an automated device. Commission errors occur when operators inappropriately follow an automated directive or recommendation without verifying it against other available information, or despite contradictions from other sources of information (Mosier et al., 2001).

Airline cockpit crews are a hierarchical team, and the decision-making process they go through is influenced by three constructs of team or group process (Mosier et al., 2001): team informity, staff validity and hierarchical sensitivity. Team informity is the degree to which the team, as a whole, is aware of all of the relevant cues or information. Staff validity is the degree to which each member of the team can produce valid judgments on the decision object.
The third construct, hierarchical sensitivity, is the degree to which the team leader effectively weights team member judgments in arriving at the team’s decision (Mosier et al., 2001). The level of autonomy and authority incorporated into modern automated systems, however, may also give them the character of independent agents, or team members, whose expertise may encourage team leaders to weight their input above that of other members, affecting hierarchical sensitivity (Mosier et al., 2001). Another automation-related error is that team members may use automated input as a shortcut in decision making, whether or not its staff validity is high for the characteristics of a particular situation. Time constraints, especially in critical events, may exacerbate the tendency to rely on only a few information cues (Mosier et al., 2001).

Effect of Automation on Individual Pilots

In modern, high-tech aircraft, the flying task is much more cognitively than physically demanding. The cockpit is now an electronic, deterministic environment in which the primary task of the pilot is to supervise and monitor systems and information displays, to ensure consistency, or coherence, of the “world,” and to restore it when disruptions occur (Mosier, 2002). The automated cockpit brings cues that used to be in the outside environment into the cockpit and displays them as reliable and accurate information rather than probabilistic cues. This changes the goal of pilot cognition from correspondence, or empirical accuracy in using probabilistic cues for diagnosis, judgment and prediction, to coherence, or rationality and consistency in diagnostic and judgment processes (Mosier, 2002). Amalberti and Sarter (2000, p. 4) described the change automation has brought to the cockpit crew as follows:
“The development and introduction of modern automation technology has led to new cognitive demands. The result is new knowledge requirements (e.g. understanding the functional structure of the system), new communication tasks (e.g. knowing how to instruct the automation to carry out a particular task), new data management tasks (e.g. knowing when to look for, and where to find, relevant information in the system’s data architecture), and new attentional demands (e.g. tracking the status and behavior of the automation as well as the controlled process).”

The two cognitive processes, coherence and correspondence, differ in how they are carried out and in the resources they use. They are an either/or process and cannot be pursued simultaneously by an individual (Mosier, 2002). Correspondence involves relating work-environment cues to previous experience, instrument readings or both in order to reach a decision; its goal is empirical, objective accuracy in human judgment (Mosier, 2002). Coherence, by contrast, involves a thorough diagnosis of a situation before a decision is made. Coherence competence refers to an individual’s ability to maintain logical consistency in judgments and decisions (Mosier, 2002). A pilot, for example, exercises coherence competence when scanning the information displayed inside the cockpit to ensure that system parameters, flight modes and navigational displays are consistent with each other and with what should be present in a given situation (Mosier, 2002). Correspondence judgments, conversely, are impossible without reference to the real world, and they are evaluated according to how well they represent, predict or explain objective reality (Mosier, 2002).

Humans are “cognitive misers” who tend to take the path of least cognitive effort when making decisions, which leads to the use of heuristic judgment (Mosier, 2002). In aviation, the use of heuristics such as cue availability, judging the likelihood of events by the ease with which a pilot can recall similar ones, and the tendency to use event congruency to judge whether the current event will have the same outcome as a past one has been identified as a factor in incidents and accidents (Mosier, 2002).
Automated environments induce errors in coherence judgment that are not observed in non-automated environments. This susceptibility persists even when two people share responsibility for monitoring, judgment and decision making (Mosier, 2002).

The degree of autonomy and authority of modern automated systems creates a gap in the operator’s mental model of the automation and its interactions. Low-observability interfaces, combined with situations containing non-routine elements and conjunctions, make it difficult for human pilots to operate, track and anticipate the activities of their automated partners, producing automation surprises (Sarter & Woods, 1999). Observability refers to the ability of the available feedback to actively support operators in monitoring and staying ahead of system activities and transitions. Observability is more than data availability: it depends on the cognitive work needed to extract meaning from what is available (Sarter & Woods, 1999).

An Automated Flight System (AFS) is characterized by its operating modes. An operating mode is a way the AFS controls the path of the aircraft: the AFS takes inputs from the environment (e.g. speed, altitude, geographical position and attitude) and, based on these, sends outputs to the aircraft’s control surfaces and other systems in order to reach or maintain some value of the inputs (Zuschlag, 2002). An operating mode differs from an interface mode, which is a set of parameters displayed and parameter inputs allowed under certain conditions; examples are the page collections on the Central Display Unit (CDU) of a Flight Management System (FMS). While some operating modes have corresponding interface modes (e.g. HOLD), there is generally not a one-to-one correspondence (Zuschlag, 2002). Furthermore, even for operating modes that have corresponding interface modes, at any given time the operating mode and the interface mode are independent; for example, the CDU does not have to be on the Vertical Navigation (VNAV) page in order for the aircraft to be flying in the VNAV operating mode (Zuschlag, 2002).
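To make that independence concrete, the sketch below is a hypothetical toy model only; it is not drawn from Zuschlag (2002) and is not real avionics code, and the class and mode names are assumptions chosen for illustration. It keeps the operating mode and the displayed CDU page as two separate pieces of state, so changing what the crew is looking at never changes what the automation is doing.

```python
# Toy model (hypothetical names): operating mode and CDU interface mode
# are independent state variables, as described in the text.
from enum import Enum


class OperatingMode(Enum):
    """What the AFS is actually doing to the aircraft's path."""
    HOLD = "HOLD"
    LNAV = "LNAV"
    VNAV = "VNAV"


class CduPage(Enum):
    """Which page the crew currently has displayed on the CDU."""
    HOLD = "HOLD"
    VNAV = "VNAV"
    LEGS = "LEGS"
    PROGRESS = "PROGRESS"


class AfsStateSketch:
    """Two independent mode variables; no coupling between them."""

    def __init__(self) -> None:
        self.operating_mode = OperatingMode.LNAV
        self.cdu_page = CduPage.LEGS

    def engage(self, mode: OperatingMode) -> None:
        # Changes how the automation controls the flight path.
        self.operating_mode = mode

    def show_page(self, page: CduPage) -> None:
        # Only changes what is displayed to the crew.
        self.cdu_page = page


afs = AfsStateSketch()
afs.engage(OperatingMode.VNAV)    # aircraft is now flying in VNAV
afs.show_page(CduPage.PROGRESS)   # crew displays an unrelated page
# The aircraft remains in VNAV even though no VNAV page is shown:
assert afs.operating_mode is OperatingMode.VNAV
assert afs.cdu_page is not CduPage.VNAV
```

In such a design, nothing on the display is guaranteed to reveal the current operating mode, which is exactly the gap that makes the mode-awareness problems discussed next possible.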
A real-world example of mode confusion, or lack of mode awareness, is the crash of China Airlines Flight 140, described by Hammer (1999, p. 549):

“The crew mistakenly activated an automatic go-around without realizing it. The autopilot tried to increase power and gain altitude while the crew attempted to maintain power and reduce altitude. Because the crew controlled the elevators and automation controlled the horizontal stabilizer, the nature of conflict was not apparent. The aircraft eventually attained an unrecoverable state and then crashed. If the aircraft had been flown completely manually to a landing, or fully automatic in a go-around, no accident would have occurred.”

According to Zuschlag (2002), the operation of flight automation has contributed to accidents and incidents for five reasons. The first is a lack of mode awareness: the crew does not perceive signals indicating an uncommanded or programmed change in a mode or mode parameter, or that a mode has failed to engage as intended. The second is a mode prediction problem: the crew does not anticipate an uncommanded change, or a failure to change, in the AFS’s mode or in a mode parameter. The third is programming error, which occurs when the crew makes an entry error in the AFS and the error surfaces only after the aircraft deviates from the intended path. Closely related to this, awkward programming is the fourth contributing reason. The fifth arises when the crew tries to diagnose a problem with the AFS, either because the crew cannot program the AFS as desired or because the AFS has done something undesirable; this distracts the crew from other tasks necessary to fly the airplane correctly. Time pressure has a negative effect on information search and diagnosis accuracy, and the presence of non-congruent information heightens these negative effects (Mosier, Sethi, McCauley, Khoo, & Orasanu, 2007).
Diagnosis confidence is unrelated to accuracy and is negatively related to the amount of information accessed (Mosier et al., 2007).

Another study, by Damos, John and Lyall (2005), on the effect of automation on cockpit crew activities found that different activities respond differently to automation. The study, conducted on actual revenue flights, found that pilots of aircraft with advanced levels of cockpit automation spent more time looking inside the aircraft during approach, when the aircraft was below 10,000 ft. In contrast, the amount of time spent looking inside the cockpit was not affected by the level of cockpit automation during climb or during descent to 10,000 ft. Defining activities such as manipulating the frequency on the communication radios, the controls on the communication selector panel, the cabin temperature controls, the controls on the front panel and the navigation radios as housekeeping activities, Damos, John and Lyall (2005) further found that increased automation did not affect these activities or communication.

Conclusions

What started out as a search for a remedy for pilot error due to workload has led to today’s modern aircraft. Contemporary aircraft are capable of collecting data about the environment they fly in, assimilating the data gathered and presenting it in the cockpit as decision aids for flight crews. It is evident that removable workload has been lifted from the flight crew; pilots can now use their time in the skies for things other than looking for the North Star as a navigational aid. In addition, aircraft reliability has increased with the advent of automation.

However, along with these positive effects, automation has brought other effects. First, the way a flight is conducted has changed and now carries a cognitive requirement. In addition, crews are required to be vigilant not only toward the environment they are flying in but also toward the onboard automated cues that follow and update the status of that environment.
Flight crews therefore have to recognize the automated partner as part of their team and make it part of their group process. As defined at the beginning of this paper, flight crew group processes manifest through communication, workload management, situational awareness, team building and management, decision tasks, aircraft control tasks and procedural tasks (Helmreich & Foushee, 1993). Incorporating the automated partner into a team that practices all of the above will require specialized training and a design philosophy that foster these elements of flight crew group process in the presence of an automated aircraft. Assigning automation to alleviate flight crew workload without due consideration of the group processes, however, will lead to automation-related errors. What started out as an initiative to decrease error would then end up producing another form of error, which may still cause incidents and accidents.
References

Amalberti, R., & Sarter, N. B. (2000). Cognitive Engineering in the Aviation Domain: Opportunities and Challenges. In N. B. Sarter & R. Amalberti (Eds.), Cognitive Engineering in the Aviation Domain (pp. 1-9). Mahwah, NJ: Lawrence Erlbaum Associates.

Damos, D. L., John, R. S., & Lyall, E. A. (2005). Pilot Activities and the Level of Cockpit Automation. The International Journal of Aviation Psychology, 15(3), 251-268.

Hammer, J. M. (1999). Human Factors of Functionality and Intelligent Avionics. In D. J. Garland, J. A. Wise, & V. D. Hopkin (Eds.), Handbook of Aviation Human Factors (pp. 549-567). Mahwah, NJ: Lawrence Erlbaum Associates.

Helmreich, R. L., & Foushee, H. C. (1993). Why Crew Resource Management? Empirical and Theoretical Bases of Human Factors Training in Aviation. In E. L. Wiener, B. G. Kanki, & R. L. Helmreich (Eds.), Cockpit Resource Management (pp. 3-41). San Diego, CA: Academic Press.

Mosier, K. L. (2002). Automation and Cognition: Maintaining Coherence in the Electronic Cockpit. In E. Salas (Ed.), Advances in Human Performance and Cognitive Engineering Research (pp. 93-121). U.K.: Elsevier Science Limited.

Mosier, K. L., Sethi, N., McCauley, S., Khoo, L., & Orasanu, J. M. (2007, April). What You Don’t Know Can Hurt You: Factors Impacting Diagnosis in the Automated Cockpit. Human Factors: The Journal of the Human Factors and Ergonomics Society, 49(2), 300-310.

Mosier, K. L., Skitka, L. J., Dunbar, M., & McDonnell, L. (2001). Aircrews and Automation Bias: The Advantages of Teamwork? The International Journal of Aviation Psychology, 11(1), 1-14.
Mouloua, M., Smither, J. A., Vincenzi, D. A., & Smith, L. (2002). Automation and Aging: Issues and Considerations. In E. Salas (Ed.), Advances in Human Performance and Cognitive Engineering Research (pp. 213-237). U.K.: Elsevier Science Limited.

Sarter, N. B., & Woods, D. D. (1999, October). Team Play with a Powerful and Independent Agent: Operational Experiences and Automation Surprise on the Airbus A320. Human Performance in Extreme Environments, 4(2), 60-70.

Zuschlag, M. (2002). Human Factor Issues Regarding Automated Systems. In E. Salas (Ed.), Advances in Human Performance and Cognitive Engineering Research (pp. 157-199). U.K.: Elsevier Science Limited.
