Wayne Giang - the 10th annual Human Factors IUW Workshop


  1. Use of Multi-Sensory Temporal Synchrony as a Method for Showing Data Relationships Across Modalities (Wayne Giang, Nov. 13th, 2010)
  2–7. Image slides: photographs of nuclear power plant control rooms
     (http://www.photosfan.com/images/nuclear-power-plant-control-point1.jpg,
     http://hackedgadgets.com/wp-content/2/russian_nuclear_power_plant_control_room.jpg)
  8. Presentation Outline
     • Thought Experiment
     • Complex Systems and EID
     • Multimodal Interfaces
     • Layouts in Multimodal Interfaces
     • Temporal Synchrony
  9. Controlling Complex Systems
     • Large, socio-technical, real-time, dynamic systems
     • Complex systems require special interfaces because (Vicente & Rasmussen, 1992):
       – Complex systems require complex controllers
       – Physical systems are governed by constraints
       – Good controllers must possess a model of the system
  10. Ecological Interface Design
      • Helps operators understand underlying system constraints
      • Helps operators recognize and respond to abnormal events
      • Maps system constraints and relationships onto perceptual objects
      • Allows operators to use skill-based behaviour
  11. Perceptual Relationships
      • "Visual ways of displaying information that can reduce the need for memory or mental calculation" (Burns & Hajdukiewicz, 2004)
      • Visual comparisons of orientation, size, shape, and location
        – Visual characteristics that are perceptually easy to evaluate
        – Also provide grouping and layout information (configural displays)
      • Figure: example configural display from Burns (2000)
  12. Multi-modal Interfaces
      • Images: http://www.eyewriter.org/images/data/TEMPT-ONE/eye-tracking/ASL_EyeTracker.JPG,
        http://commons.wikimedia.org/wiki/File:RobertFuddBewusstsein17Jh.png
  13. Multi-modal Interfaces
      • Input and output beyond traditional keyboard/mouse + monitor interactions
      • Multi-modal presentation → audition, touch, vision
      • Multiple Resource Theory (Wickens & Hollands, 2004)
  14. Layouts in Multimodal Interfaces
      • Different models (Sarter, 2006):
        – Redundancy
        – Supplementary presentation
        – Sensory modalities as separate channels of information
      • Little research on how to lay out information across modalities, especially for abstract data
      • Hypothesis: layouts in multimodal interfaces will depend on cross-modal relationships
  15. Showing Cross-modal Relationships
      • Different degrees of cross-modal relationship:
        – Completely automatic: multisensory integration
        – Based on a judgement: cross-modal matching
      • Perceptual relationships → EID
      • Can we group multi-modal interface information into perceptual objects?
      • Are there processing advantages to grouping display information into perceptual events?
  16. Three Principles of Multi-sensory Integration (Meredith & Stein, 1993)
      • Spatial Rule: integration is more likely when the individual sensory stimuli come from roughly the same location
      • Temporal Rule: integration is more likely when the individual sensory stimuli start at roughly the same time
      • Principle of Inverse Effectiveness: integration is more likely when the individual sensory stimuli are vague or weak
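These rules lend themselves to a simple computational reading. Below is a minimal Python sketch of how a designer might test a pair of stimuli against the spatial and temporal rules; the `Stimulus` fields and the threshold values are illustrative assumptions, not figures from Meredith and Stein.

```python
from dataclasses import dataclass

@dataclass
class Stimulus:
    x_cm: float       # location along a 1-D display axis (cm)
    onset_ms: float   # onset time (ms)
    intensity: float  # normalized strength, 0.0-1.0

# Illustrative thresholds only: real integration windows are empirical
# and task-dependent (assumed values, not from the slides).
MAX_SPATIAL_GAP_CM = 10.0
MAX_ONSET_GAP_MS = 100.0

def likely_to_integrate(a: Stimulus, b: Stimulus) -> bool:
    """Spatial and temporal rules as simple threshold checks."""
    spatially_close = abs(a.x_cm - b.x_cm) <= MAX_SPATIAL_GAP_CM
    temporally_close = abs(a.onset_ms - b.onset_ms) <= MAX_ONSET_GAP_MS
    return spatially_close and temporally_close

def integration_benefit(a: Stimulus, b: Stimulus) -> float:
    """Inverse effectiveness as a crude proxy: the weaker the pair
    of stimuli, the more their integration is likely to matter."""
    return 1.0 - (a.intensity + b.intensity) / 2.0
```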
  17. Display Integration in Ecological Displays (Burns, 2000)
      • Examined the impact of spatial proximity (things presented close together) and temporal proximity (things presented at nearly the same time) on understanding ecological displays
      • Recommends higher spatial and temporal proximity for adaptive problem-solving tasks
  18. Possible Methods for Grouping Crossmodal Data
      • Spatial location?
      • Temporal occurrence?
  19. Temporal Synchrony
      • A type of cross-modal matching: are things happening at the same time / for the same duration?
      • The ability to detect temporal synchrony develops at a very young age (Lewkowicz, 2000)
      • A "synchrony window" exists within which events are perceived as synchronous
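As a rough illustration of the synchrony window, the sketch below (Python; the 200 ms width is a placeholder assumption, since the talk gives no number) flags audio and tactile onsets that would plausibly be perceived as one event.

```python
SYNCHRONY_WINDOW_MS = 200.0  # placeholder width; real windows vary by task and modality pair

def perceived_synchronous(audio_onset_ms: float, tactile_onset_ms: float,
                          window_ms: float = SYNCHRONY_WINDOW_MS) -> bool:
    """Events read as synchronous when their onset asynchrony
    falls inside the synchrony window."""
    return abs(audio_onset_ms - tactile_onset_ms) <= window_ms

print(perceived_synchronous(1000.0, 1120.0))  # True: 120 ms offset, inside the window
print(perceived_synchronous(1000.0, 1400.0))  # False: 400 ms offset, outside it
```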
  20. Why Temporal Synchrony?
      • Not as constrained by the physical location of displays:
        – Higher mobility for operators
        – Smaller space footprint
      • Can be more dynamic
      • Different methods for invoking temporal synchrony (see the sketch after this slide):
        – Onset
        – Duration
        – Rate
        – Rhythm
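A minimal sketch of how onset, duration, and rate could be realized as matched pulse schedules for two display channels (Python; the channel names and timing values are hypothetical):

```python
def pulse_schedule(onset_ms: float, duration_ms: float, rate_hz: float,
                   n_pulses: int) -> list[tuple[float, float]]:
    """Return (start_ms, end_ms) pairs for a train of pulses.
    Channels that share onset, duration, or rate can signal that
    their underlying variables are related."""
    period_ms = 1000.0 / rate_hz
    return [(onset_ms + i * period_ms, onset_ms + i * period_ms + duration_ms)
            for i in range(n_pulses)]

# Rendering two related variables on different modalities: giving the
# earcon and the tactor the same onset, pulse duration, and rate marks
# them as one perceptual group (values are illustrative).
earcon = pulse_schedule(onset_ms=0.0, duration_ms=150.0, rate_hz=2.0, n_pulses=4)
tactor = pulse_schedule(onset_ms=0.0, duration_ms=150.0, rate_hz=2.0, n_pulses=4)
assert earcon == tactor  # fully synchronized, so likely grouped
```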
  21. Audition and Touch
      • The most commonly used non-visual modalities for information presentation
      • Share many "amodal" characteristics
      • Benefits:
        – Can be perceived even when attention is directed elsewhere
        – More accurate representation of temporal information (cf. the double-flash illusion)
  22. Possible Groupings using Temporal Synchrony: Onset
  23. Possible Groupings using Temporal Synchrony: Duration
  24. Possible Groupings using Temporal Synchrony: Rate
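Slides 22–24 are figures. As a hedged sketch of how one of these groupings might be computed, the Python below clusters channels whose pulse rates agree within a tolerance; the channel names, rates, and tolerance are all assumptions for illustration.

```python
def group_by_rate(channels: dict[str, float], tol_hz: float = 0.1) -> list[list[str]]:
    """Cluster display channels whose pulse rates agree within tol_hz.
    channels maps a channel name (an earcon or tactor stream) to its
    pulse rate in Hz; adjacent rates within tolerance are chained."""
    groups: list[list[str]] = []
    for name, rate in sorted(channels.items(), key=lambda kv: kv[1]):
        if groups and abs(channels[groups[-1][-1]] - rate) <= tol_hz:
            groups[-1].append(name)
        else:
            groups.append([name])
    return groups

# Hypothetical channels: the ~2 Hz audio and tactile streams group together.
print(group_by_rate({"earcon_A": 2.0, "tactor_A": 2.05, "earcon_B": 4.0}))
# [['earcon_A', 'tactor_A'], ['earcon_B']]
```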
  25. Proposed Methodology: Research Questions
      • To what degree can we group or compare auditory (earcon/sonification) and tactile (tactor/tactification) information using temporal synchrony, and how is this affected by operator workload?
      • Are there processing advantages to grouping display information using these perceptual relationships?
      • Figure: modalities + temporal synchrony? = perceptual objects?
  26. Take-Home Message
      • Layouts in multi-modal interfaces may depend on cross-modal perceptual relationships
      • EID can leverage these relationships in complex systems
      • Temporal synchrony is one method that may be used to group information in multimodal interfaces
  27. Questions?
