Blay Whitby: Flying lessons (UX Brighton 2013)

  1. Flying Lessons: or what aviation can tell other disciplines about user interfaces. Blay Whitby, blaywhitby.org
  2. (no slide text)
  3. In other words:
     • Aviation has a rather clear way of exposing false assumptions and outdated work practices.
     • Other areas of human activity might be wise to take notice.
  4. (no slide text)
  5. (no slide text)
  6. What I'm not saying…
     • Aviation always got it right.
     • Aviation has all the answers.
     • Other fields should simply imitate aviation.
  7. On the other hand…
     • Aviation has some hard-won knowledge from which other fields might learn.
  8. The Aviation Model – some key features:
     • Blame doesn't help fix anything.
     • Stopping it happening again is the goal.
     • Technical and human factors are usually combined.
     • Chains of events, not single causes.
     • Management (and even national culture) is part of the problem.
  9. So:
     • Don't assume. No, really: DON'T ASSUME!
     • Blaming the user is worthless – it's not the user's fault.
     • Gather data – data shows where the problems are.
  10. "Pilot Error"
     • Now never used in aviation. Why?
     • It won't stop the same set of events happening again.
     • We need to know why people make errors – in particular, this error.
  11. "Pilot Error"
     • Tiredness…
     • Lack of attention…
     • Lack of training…
     • Lack of adequate information…
     • Loss of situational awareness
     • CRM
  12. Data collection
     • Find out why people made a particular error.
     • Improve displays, communication, training.
     • Examine the psychology of decision making.
  13. Psychology of Decision Making
     • The 'risky shift'
     • The domineering captain and compliant co-pilot
     • CRM
  14. (no slide text)
  15. (no slide text)
  16. (no slide text)
  17. Current research topics in aviation human factors:
     • Cultural variations
     • Situational awareness
     • Mode confusion
  18. Cultural variation
     • KAL 801
  19. Cultural variation
     • KAL 801
     • "Captain, the weather radar has helped us a lot."
  20. Situational awareness
     • AA965
  21. Situational awareness
     • AA965
     • FMS logic that dropped all intermediate fixes from the display(s) when a direct routing was executed.
     • FMS-generated navigational information that used a different naming convention from that published in the navigational charts.
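The FMS behaviour on slide 21 is a concrete interface lesson: executing a "direct to" silently deleted the intermediate fixes from the display, so the crew lost exactly the route context they would later need. A minimal Python sketch of the two design choices (illustrative only, not real FMS code; ULQ, TULUA and ROZO are fixes from the published route, everything else is invented):

```python
# Hypothetical sketch: a "direct to" that deletes the bypassed fixes
# versus one that keeps them visible but marked as skipped.
from dataclasses import dataclass, field

@dataclass
class FlightPlan:
    waypoints: list                      # ordered fix names
    display: list = field(default_factory=list)

    def direct_to(self, fix, drop_intermediate=True):
        i = self.waypoints.index(fix)
        skipped = self.waypoints[1:i]    # fixes the shortcut bypasses
        self.waypoints = [self.waypoints[0]] + self.waypoints[i:]
        if drop_intermediate:
            # the Cali-style choice: the skipped fixes simply vanish
            self.display = list(self.waypoints)
        else:
            # safer UI choice: bypassed fixes stay visible, parenthesised
            self.display = ([self.waypoints[0]]
                            + [f"({w})" for w in skipped]
                            + self.waypoints[1:])
        return skipped

route = ["BOG", "ULQ", "TULUA", "ROZO", "CLO"]
plan = FlightPlan(list(route))
plan2 = FlightPlan(list(route))

plan.direct_to("ROZO")                        # display loses ULQ and TULUA
plan2.direct_to("ROZO", drop_intermediate=False)  # they remain, marked
```

With the first choice the answer to "where are we along the original route?" is simply no longer on the screen; the second preserves it at the cost of a slightly busier display.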
  22. Mode Confusion
     • AF 447
  23. Mode Confusion
     • AF 447
     • Not one cause but a chain of events:
     • Tropical storm – turbulence.
     • At 02:10:06, in cruise: pitot tubes blocked; the ADCs lost their input.
  24. Mode Confusion
     • Handover to the humans: 'the startle effect'.
     • Alternate law (suddenly it's a different aircraft).
     • At 02:11:30 the PF declares the a/c out of control.
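"Suddenly it's a different aircraft" is the essence of mode confusion: the system's state changes underneath the user while their mental model stays put. A hedged sketch of the idea (hypothetical class and method names, not Airbus logic):

```python
# Hypothetical sketch: the same aircraft offers different protections
# depending on the active control law, and the law can change without
# any action by the pilot.
class FlightControls:
    def __init__(self):
        self.law = "normal"

    def air_data_lost(self):
        # Unreliable airspeed drops the system into alternate law.
        self.law = "alternate"

    def protections(self):
        if self.law == "normal":
            return {"stall_protection": True, "bank_angle_limit": True}
        return {"stall_protection": False, "bank_angle_limit": False}

fc = FlightControls()
fc.air_data_lost()   # pitot icing: the mode changes, silently for the crew
```

A pilot whose mental model is still "normal law" now expects protections the aircraft no longer provides; the interface lesson is that mode changes this consequential must be made unmissable.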
  25. Mode Confusion
     • In alternate law, stall protection is inoperative.
     • An aural warning, "stall, stall", sounded.
     • When the airspeed drops below about 60 kts, the stall warning stops!
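The cut-off on slide 25 is worth spelling out, because it inverts the feedback loop: below about 60 kts the airspeed and angle-of-attack data are treated as invalid, so the warning is inhibited precisely when the stall is deepest, and lowering the nose (the correct recovery) brings the warning back. A sketch of the gating logic (illustrative thresholds and function name, not the real system):

```python
# Hypothetical sketch of a validity-gated stall warning: a guard meant
# to suppress spurious alerts ends up silencing the real one.
def stall_warning(airspeed_kts, angle_of_attack_deg, stall_aoa_deg=10.0):
    """Return True if the aural 'stall, stall' warning should sound."""
    if airspeed_kts < 60:
        # Sensor data deemed unreliable: warning inhibited.
        return False
    return angle_of_attack_deg > stall_aoa_deg

# Deeply stalled, very low forward speed: no warning sounds.
# Nose lowered, speed building through 80 kts, still stalled: it returns.
```

From the crew's side the stimulus is backwards: doing the right thing makes the alarm start, doing the wrong thing makes it stop.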
  26. Mode Confusion
     • The CRM was also not quite right.
     • The Captain and PNF may not have fully realised what control inputs the PF was making.
     • None of the crew realised in time that the a/c was completely stalled.
  27. The problem of mode confusion will be solved
  28. The problem of mode confusion will be solved
     • Learn from the mistakes of others: you won't live long enough to make all of them yourself.
