Astute symposium 10/10/2013 - HMI design patterns (Elena Tsiporkova & Tom Stevens)



  1. 2013, Elena Tsiporkova & Tom Stevens
  2. ASTUTE project:
     - 7th EU Framework Programme
     - Generic architecture
     - Proactive decision support
     - 5 demonstrators
  3. Demonstrators:
     - Emergency dispatching
     - Production management
     Patterns:
     - Document
     - Share
     - Formalize knowledge to exploit
  4. - Observation during field studies
     - Comparison of design challenges for the demonstrators during a workshop
     - Enrichment through an information modelling workshop
     - Prototyping
     - Forces identification
     - Applicability analysis
  5. - Multiple warnings
     - Meaningful sounds
     - Goal progression-based information detail and adaptation
  6. - Mode switch
     - Interactive instructions
     - Modality combination for urgent messages
  7. - Spatial point representation
     - Context-based information articulation
  8. Generic attributes for patterns:
     - Interaction, Device, Information, Environment, Event, Task, Modality, User, System
     Knowledge described in the form of:
     - attribute values, depending on context
     - relationships
     - rules
  9. The semantic modelling and reasoning environment aims at facilitating the design of multimodal interfaces in practice and provides the capacity to:
     - capture and model HMI design patterns, formal guidelines and expert domain knowledge in the field of multimodal interface design
     - reason and derive design recommendations (e.g. applicable HMI patterns) during interface design
     - allow dynamic interface adaptation (e.g. context-aware output modality) at runtime
  10. The developed semantic model is a hierarchical class structure: a nested set of ontological models organised in three levels of abstraction (Core, Design Pattern, Model).
  11. The core model defines a set of parameters and a multitude of relationships between them.
  12. The core model implements two types of reasoning rules:
      - Chain rules: deduce a relationship from the transitive application of two other relationships, e.g.
        (System runsOn Device) AND (Device isUsedBy User) → (System interactsWith User)
      - SWRL(1) rules: express common-sense knowledge or logic rules applicable to any design situation, e.g.
        IF (System detects Event) AND (Event hasCriticalityLevel severe) THEN (Event hasPriority high)
      (1) SWRL: Semantic Web Rule Language
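The two rule types above can be sketched as forward chaining over a set of subject-predicate-object triples. This is a minimal illustration, not the ASTUTE implementation; only the predicate names and example facts come from the slide, the Python structure is assumed.

```python
# Minimal sketch of the two rule types over (subject, predicate, object) triples.

def apply_chain_rule(facts, rel1, rel2, derived):
    """Chain rule: (a rel1 b) AND (b rel2 c) -> (a derived c)."""
    new = set()
    for (a, p, b) in facts:
        if p != rel1:
            continue
        for (b2, q, c) in facts:
            if b2 == b and q == rel2:
                new.add((a, derived, c))
    return new

facts = {
    ("system", "runsOn", "device"),
    ("device", "isUsedBy", "user"),
    ("system", "detects", "event"),
    ("event", "hasCriticalityLevel", "severe"),
}

# Chain rule from the slide:
# (System runsOn Device) AND (Device isUsedBy User) -> (System interactsWith User)
facts |= apply_chain_rule(facts, "runsOn", "isUsedBy", "interactsWith")

# SWRL-style rule from the slide:
# IF (System detects Event) AND (Event hasCriticalityLevel severe)
# THEN (Event hasPriority high)
if ("system", "detects", "event") in facts and \
   ("event", "hasCriticalityLevel", "severe") in facts:
    facts.add(("event", "hasPriority", "high"))

print(("system", "interactsWith", "user") in facts)  # True
print(("event", "hasPriority", "high") in facts)     # True
```

In a real ontology environment these rules would be run by an OWL/SWRL reasoner rather than hand-coded loops; the sketch only makes the deduction pattern concrete.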
  13. Applicability conditions for design patterns are described via SWRL rules:
      - Important Message pattern:
        IF (System detects Event) AND (Event hasPriority high) AND (User hasAttention on_environment)
        THEN (Interaction exhibitsPattern important_message_pattern)
      - Combination of Modalities pattern:
        IF (System detects Event) AND (Event impacts Environment) AND (Environment hasSafetyLevel risk_full) AND (System sends Information) AND (Information hasType instruction)
        THEN (Interaction exhibitsPattern combination_of_modalities_pattern)
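A hypothetical sketch of how the two applicability rules on this slide could be checked against a set of context facts. Predicate and value names are taken verbatim from the slide; the Python representation is an assumption.

```python
# Each pattern's SWRL body becomes a conjunction of triple checks.

def important_message_applies(facts):
    return (("system", "detects", "event") in facts
            and ("event", "hasPriority", "high") in facts
            and ("user", "hasAttention", "on_environment") in facts)

def combination_of_modalities_applies(facts):
    return (("system", "detects", "event") in facts
            and ("event", "impacts", "environment") in facts
            and ("environment", "hasSafetyLevel", "risk_full") in facts
            and ("system", "sends", "information") in facts
            and ("information", "hasType", "instruction") in facts)

context = {
    ("system", "detects", "event"),
    ("event", "hasPriority", "high"),
    ("user", "hasAttention", "on_environment"),
}

patterns = []
if important_message_applies(context):
    patterns.append("important_message_pattern")
if combination_of_modalities_applies(context):
    patterns.append("combination_of_modalities_pattern")

print(patterns)  # ['important_message_pattern']
```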
  14. The purpose of this level is to:
      - model a concrete application domain, e.g. emergency dispatching
      - derive new knowledge from the models and data in the inner abstraction layers
      The semantic models are instantiated with information such as:
      - types of users involved: fire fighters, fire commanders
      - activities and tasks: evacuation, search and rescue
      - applications and devices: application features (supported users, detected events); device type, status (active, idle), components (audio, haptic)
      - working environment: inside/outside, noise/security level
      - events: type (toxic smoke formation, approaching dangerous goods), priority, criticality level
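The domain-level instantiation listed above might look as follows in code. The class and field names are illustrative assumptions; only the example values (users, tasks, device components, event types) come from the slide.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    type: str
    status: str                                      # e.g. "active" or "idle"
    components: list = field(default_factory=list)   # e.g. ["audio", "haptic"]

@dataclass
class Event:
    type: str
    priority: str
    criticality_level: str

@dataclass
class DomainModel:
    users: list
    tasks: list
    devices: list
    environment: dict
    events: list

# Instantiation for the emergency-dispatching domain; the device name and
# environment values are hypothetical examples.
model = DomainModel(
    users=["fire_fighter", "fire_commander"],
    tasks=["evacuation", "search_and_rescue"],
    devices=[Device("handheld", "active", ["audio", "haptic"])],
    environment={"location": "inside", "noise_level": "high"},
    events=[Event("toxic_smoke_formation", "high", "severe")],
)

print(model.events[0].type)  # toxic_smoke_formation
```

In the actual environment these would be OWL individuals in the ontology rather than Python objects; the sketch only shows the kind of information instantiated at this level.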
  15. Produce design recommendations during the interface design process:
      - enables easy simulation of different situations, e.g. if approaching dangerous goods are detected, then send an alarm message and display a detailed map
      - allows modelling of a concrete application situation, e.g. a fire commander instructs his firefighters to evacuate a burning building, and derives the applicable design patterns for that concrete interaction
      Dynamic interface adaptation at runtime:
      - suggests the modality to use according to context
      - determines the type of message and the level of detail
      - selects the device to which the message needs to be sent
      - …
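The runtime adaptation step could be sketched as below: given the current context, pick an output modality, a message type, and a target device. The slides only state that these three things are selected from context; the concrete decision logic and all names here are illustrative assumptions.

```python
def adapt_output(context):
    """Pick (modality, message, device) for the current context (sketch)."""
    # If the user's attention is on the environment, prefer audio over visual.
    modality = "audio" if context["user_attention"] == "on_environment" else "visual"
    # High-priority events get a short alarm; others a detailed map.
    message = "alarm" if context["event_priority"] == "high" else "detailed_map"
    # Send to the first active device.
    device = next(d for d in context["devices"] if d["status"] == "active")
    return modality, message, device["name"]

ctx = {
    "user_attention": "on_environment",
    "event_priority": "high",
    "devices": [{"name": "helmet_speaker", "status": "active"},
                {"name": "tablet", "status": "idle"}],
}
print(adapt_output(ctx))  # ('audio', 'alarm', 'helmet_speaker')
```

In the described environment this selection would be driven by the ontology and its SWRL rules rather than hard-coded conditionals.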
  16. (Diagram slide: an example interaction linking User, Event, Interaction and Application, annotated with modality attributes such as body awareness, visual output, and loud.)
  17. - Validate and enrich patterns
      - Refine and extend the ontology
      - Implement the ontology at runtime
      - Collaboratively enrich the ontology
  18. Acknowledgements:
      - Nàdia Ferreira, Sabine Geldof, Anna Hristoskova, Tom Tourwé
      - Artemis Joint Undertaking for embedded computing
      - Innoviris research funding