Human Factors in UAS Automation

  1. Automation and Autonomy
     Lisa Jo Elliott, Ph.D., University of South Florida
  2. Automation
     “any sensing, detection, information-processing, decision-making, or control action that could be performed by humans but is actually performed by machine.”
     - Moray et al. (2000, p. 44)
  3. System designers promise automation delivers…
     • decreased workload
     • increased precision
     • better system performance
     System operators find automation delivers…
     • system failures
     • automation-induced accidents
     • an imperfect team member
  4. UAS vs. Manned Flight Operation
     • Automation is one of the few areas where manned-flight studies translate to unmanned systems.
     • The amount of automation depends on the size and use of the UAS:
       – Smaller vehicles carry less automation, and its role is well defined (waypoint navigation).
       – Larger systems carry complex automation along with several team members and several roles.
  5. Automation has Behavior
     • Automation interacts with the operator’s behavior:
       – the operator’s mental model of the system
       – the operator’s trust of the system
     • Automation changes an operator’s:
       – training
       – task assignments
       – workload
       – situation awareness
       – trust
       – skill set
  6. Automation is not an Equal Team Member
     • Automation changes the dynamics between the operator and the system:
       – it is limited in its abilities
       – it is deaf and cannot freely communicate
       – it has only the abilities that the system designer deemed necessary
  7. Re-Distributing Workload from Operators to the Automation
     • Often, aids appear at precisely the time when workload is greatest.
     • Operators do not have the time to set up the automation in addition to flying the plane.
     • For automation to relieve workload, it must be incorporated gradually and before operator workload is at its maximum (see the sketch below).
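To make the "engage before the peak" principle concrete, here is a minimal sketch of a workload-relief aid whose authority ramps up gradually once a soft threshold is crossed, rather than switching on all at once at the workload ceiling. The normalized workload estimate and both threshold values are illustrative assumptions, not figures from the presentation.

```python
# Hypothetical sketch: engage a workload-relief aid *before* operator
# workload peaks, ramping its authority gradually rather than switching
# it on all at once at the worst possible moment.

WORKLOAD_MAX = 1.0       # normalized ceiling of what the operator can handle
ENGAGE_AT = 0.6          # assumed soft threshold, well below the maximum
FULL_AUTHORITY_AT = 0.9  # aid reaches full authority before the ceiling

def aid_authority(workload: float) -> float:
    """Return the aid's authority in [0, 1] for a workload estimate.

    Below ENGAGE_AT the operator flies unaided; between the thresholds
    authority ramps linearly, so the aid is already set up and running
    before workload reaches its maximum.
    """
    if workload < ENGAGE_AT:
        return 0.0
    if workload >= FULL_AUTHORITY_AT:
        return 1.0
    return (workload - ENGAGE_AT) / (FULL_AUTHORITY_AT - ENGAGE_AT)

if __name__ == "__main__":
    for w in (0.3, 0.6, 0.75, 0.95):
        print(f"workload={w:.2f} -> aid authority={aid_authority(w):.2f}")
```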
  8. Skill Decrement???!
  9. Trust
     • “Automation etiquette guidelines,” based on the Gricean maxims, can overcome low automation reliability:
       – performance in the low-reliability/good-etiquette condition was almost as good as that in the high-reliability/poor-etiquette condition.
     Miller et al., 2004; Sheridan and Parasuraman, 2006, p. 103
  10. Automation is not All or None
      • The amount of automation and the roles it plays are dictated by:
        – system size
        – number of operators
        – mission
      • Static automation is hard-wired:
        – the system designer chooses who (the system or the operator) will perform a task and how (manually or automatically).
      • Adaptive automation is called up by an operator event (see the sketch below):
        – explicit (a request for aid)
        – implicit (tied to operator workload)
        – a situational event (e.g., takeoff speed)
        – it also has the ability to turn itself on or off.
      McCarley and Wickens, 2005
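A minimal sketch of the three invocation routes the slide lists for adaptive automation. All names and threshold values here are illustrative assumptions, not from McCarley and Wickens (2005).

```python
# Three ways adaptive automation can be called up: an explicit operator
# request, an implicit workload trigger, or a situational event such as
# reaching takeoff speed.

from dataclasses import dataclass
from typing import Optional

WORKLOAD_THRESHOLD = 0.8   # assumed implicit-trigger level (normalized)
TAKEOFF_SPEED_KTS = 120.0  # assumed takeoff speed for the example

@dataclass
class OperatorState:
    requested_aid: bool    # explicit: the operator asked for help
    workload: float        # implicit: estimated workload in [0, 1]
    airspeed_kts: float    # situational: current airspeed

def engagement_trigger(state: OperatorState) -> Optional[str]:
    """Return which event engages the adaptive automation, or None."""
    if state.requested_aid:
        return "explicit request for aid"
    if state.workload >= WORKLOAD_THRESHOLD:
        return "implicit workload trigger"
    if state.airspeed_kts >= TAKEOFF_SPEED_KTS:
        return "situational event (takeoff speed)"
    return None

# High estimated workload engages the aid even without a request.
print(engagement_trigger(OperatorState(False, 0.9, 60.0)))
```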
  11. Taxonomies
      • Human-centered taxonomies are useful for:
        – isolating operator issues
        – isolating system performance issues
        – defining what the automation can and should be doing in terms of human cognitive performance
      Wickens, 2008a
  12. Two Types of Taxonomies
      • Function-centered taxonomies
      • Technocentric taxonomies
  13. Functional Taxonomy: Four Classes of Automation Functions
      (1) information acquisition
      (2) information analysis
      (3) decision and action selection
      (4) action implementation
      Parasuraman et al., 2000, p. 288
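In the Parasuraman et al. (2000) model, each of these four classes can be automated to a different degree. A minimal sketch follows; the 10-point level scale and the example profile values are illustrative assumptions, not figures from the paper.

```python
# The four automation classes from Parasuraman et al. (2000), each
# assigned its own level of automation.

from enum import Enum

class Stage(Enum):
    INFORMATION_ACQUISITION = 1
    INFORMATION_ANALYSIS = 2
    DECISION_AND_ACTION_SELECTION = 3
    ACTION_IMPLEMENTATION = 4

# Example profile: sensing and analysis heavily automated, decision and
# action left mostly to the operator (values are illustrative).
profile = {
    Stage.INFORMATION_ACQUISITION: 8,
    Stage.INFORMATION_ANALYSIS: 7,
    Stage.DECISION_AND_ACTION_SELECTION: 3,
    Stage.ACTION_IMPLEMENTATION: 2,
}

for stage, level in profile.items():
    print(f"{stage.name}: level {level}/10")
```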
  14. Technocentric Taxonomy: NASA FLOAAT
      • NASA developed the Function-Specific Level of Autonomy and Automation Tool (FLOAAT) to facilitate its system requirements development.
      • At the lowest level of automation, all data monitoring, calculation, decisions, and tasks are executed by the ground station.
      • At the highest level, all data monitoring, calculation, decisions, and task execution move to the onboard system without any assistance from the human.
      Proud and Hart, 2005
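The two endpoints above can be read as function-allocation tables. A hypothetical sketch of just those endpoints follows; the real FLOAAT scale defines intermediate levels between them, which are omitted here, and only the two extremes come from the slide itself.

```python
# The lowest and highest FLOAAT levels described on the slide
# (Proud and Hart, 2005), expressed as function-to-executor mappings.
# Function names follow the slide's wording.

FUNCTIONS = ("data monitoring", "calculation", "decisions", "task execution")

def allocation(level: str) -> dict[str, str]:
    """Map each function to its executor at an endpoint level."""
    if level == "lowest":
        return {f: "ground station" for f in FUNCTIONS}
    if level == "highest":
        return {f: "onboard system (no human assistance)" for f in FUNCTIONS}
    raise ValueError("only the slide's two endpoint levels are modeled")

print(allocation("lowest"))
print(allocation("highest"))
```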
  15. Ongoing Challenges
      • Sensory system integration: the inability of autonomous systems to create consistent object representations from a variety of sensors.
      • System communication with an operator in case of system failure.
      • Intra-system issues of trust, situation awareness, and workload.
  16. References
      Billings, C. E., and D. D. Woods. 1994. Concerns about adaptive automation in aviation systems. In Human Performance in Automated Systems: Current Research and Trends, ed. M. Mouloua and R. Parasuraman, 24–29. Hillsdale, NJ: Erlbaum.
      Heintz, F., and P. Doherty. 2004. DyKnow: An approach to middleware for knowledge processing. Journal of Intelligent and Fuzzy Systems 15: 3–13.
      Lee, J. D., and K. A. See. 2004. Trust in automation and technology: Designing for appropriate reliance. Human Factors 46: 50–80.
      McCarley, J. S., and C. D. Wickens. 2005. Human factors implications of UAVs in the national airspace (Technical Report AHFD-05-05/FAA-05-01). Savoy, IL: Aviation Human Factors Division.
      Miller, C. A., R. P. Goldman, and H. B. Funk. 2004. Delegation approaches to multiple unmanned vehicle control. In Proceedings of the Workshop on Human Factors of Unmanned Aerial Vehicles: Manning the Unmanned. Tempe, AZ: CERI.
      Moray, N., T. Inagaki, and M. Itoh. 2000. Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied 6: 44–58.
      Parasuraman, R., and C. A. Miller. 2004. Trust and etiquette in high-criticality automated systems. Communications of the ACM 47: 51–55.
      Parasuraman, R., and V. A. Riley. 1997. Humans and automation: Use, misuse, disuse and abuse. Human Factors 39: 230–253.
      Parasuraman, R., T. B. Sheridan, and C. D. Wickens. 2000. A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man and Cybernetics—Part A: Systems and Humans 30: 286–297.
      Parasuraman, R., and C. D. Wickens. 2008. Humans: Still vital after all these years of automation. Human Factors 50: 511–520.
      Pettersson, P. O., and P. Doherty. 2004. Probabilistic roadmap based planning for an autonomous unmanned helicopter. Sensors: 1–6.
      Proud, R. W., and J. J. Hart. 2005. FLOAAT: A tool for determining levels of autonomy and automation, applied to human-rated space systems. AIAA Infotech@Aerospace 2005. Arlington, VA: American Institute of Aeronautics and Astronautics.
      Sarter, N., D. D. Woods, and C. E. Billings. 1997. Automation surprises. In Handbook of Human Factors and Ergonomics (2nd ed.), ed. G. Salvendy, 1926–1943. New York: Wiley.
      Sheridan, T., and R. Parasuraman. 2006. Human-automation interaction. In Reviews of Human Factors and Ergonomics, ed. R. S. Nickerson, Vol. 1, 89–129. Santa Monica, CA: Human Factors and Ergonomics Society.
      Wickens, C. D. 2008a. Functional allocation and the degree of automation. Presentation at the Rocky Mountain Chapter of HFES. http://Function_allocation_and_the_degree_of_automation_C_Wickens_16_Oct_2008.pdf (accessed June 6, 2010).
      Wickens, C. D. 2008b. Situation awareness: Review of Mica Endsley’s 1995 articles on SA theory and measurement. Human Factors 50: 397–403.
