Model-based Design and Generation of a Gesture-based User Interface Navigation Control

Gesture-based control of interfaces could enable interaction in situations where hardware controls are missing and could support impaired people where other controls fail. The rich spectrum of combining hand postures with movements offers great interaction possibilities but requires extensive user testing to identify an optimal control with sufficient control performance and a low error rate. In this paper we describe a declarative, model-based gesture navigation design based on state charts that can be used for the rapid generation of different prototypes, accelerating user testing and the comparison of different interaction controls. We use this declarative modeling to design and generate several variants of a gesture-based interface navigation control. The models are described using state charts and are transformed into state machines at system runtime, where they can be executed directly to form a multimodal interaction.
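
To make the runtime idea concrete, the following is a minimal sketch in which a plain Python dictionary plays the role of the declarative model; it is not the paper's MINT implementation, and all names (GESTURE_NAVIGATION, StateMachine, the event names) are illustrative:

    # Minimal sketch: a declaratively described state chart interpreted as a
    # runnable state machine at runtime. All names here are assumptions, not
    # the MINT framework's actual notation or API.

    # Declarative model: {state: {event: next_state}}
    GESTURE_NAVIGATION = {
        "NoHands": {"one_hand": "OneHand"},
        "OneHand": {"no_hands": "NoHands",
                    "previous": "OneHand",   # move to the predecessor
                    "next": "OneHand",       # move to the successor
                    "select": "Selected"},
        "Selected": {"confirm": "Confirmed", "no_hands": "NoHands"},
        "Confirmed": {},
    }

    class StateMachine:
        """Executes a declarative transition table directly."""

        def __init__(self, model, initial):
            self.model = model
            self.state = initial

        def fire(self, event):
            target = self.model[self.state].get(event)
            if target is not None:
                print(f"{self.state} --{event}--> {target}")
                self.state = target

    # Different prototype variants are just different transition tables, so
    # comparing controls in user tests requires no code changes.
    machine = StateMachine(GESTURE_NAVIGATION, "NoHands")
    for event in ["one_hand", "next", "select", "confirm"]:
        machine.fire(event)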


  1. Model-based Design and Generation of a Gesture-based User Interface Navigation Control. Sebastian Feuerstack, Mauro dos Santos Anjo, and Ednaldo Pizzolato. Universidade Federal de São Carlos, Departamento de Computação. 4 November 2011. Contact: Sebastian@Feuerstack.org
  2. Basic Question: How to model and run multimodal interaction prototype variants?
  3. Why modeling? One (visual) language that is declarative and precise enough to discuss and execute, is located between a tool and source code, and supports different forms of interaction and flexible prototyping of multimodal interfaces.
  4. What is the state of the art? [Diagram contrasting two approaches: (1) a transformational pipeline from a task model through AUI and CUI to the FUI, and (2) an assembly of widgets (W1, W2), algorithms (A1), and drivers (D1).]
  5. Use Case: Test and Evaluation of a Gesture-based Interface Navigation. [State chart of the hand-gesture input interactor (:IN:hand_gestures): NoHands and OneHand states; navigation commands previous, next, closer, farer, select, and confirm; and a ticker that advances the selection every second.] Rapid model-based design and comparison of three variants, including the augmented "Drag-and-Drop".
  6. How to model? Models: interactors (abstract and concrete media, modes), with their statics as class diagrams and their behavior as state charts, plus mappings for mode-to-media synchronization, written in a custom notation.
  7. Abstract Media Model. [Class diagram of the abstract media model.]
  8. Abstract Media Model: Single Choice. Aggregates a set of entities from which only one can be chosen at a time. Examples: a direction (left or right), a shopping cart. Further properties: it is a container (aggregation), discrete, and an output to the user, and it contains single-choice elements that are inputs.
  9. Abstract Behavior Model: AUI:AIO:AIChoiceElement:AISingleChoiceElement. [State chart with a presentation region (initialized, organized, focused, presenting, dragging, dropping) and a selection region (chosen, unchosen, suspended); choosing an element unchooses the previously chosen siblings. A sketch of this behavior follows the slide list.]
  10. Mode Model (Example: Gesture Interactor). [State chart of IR:IN:HandGesture:FlexibleTicker: NoHands and Right Hand states entered on right_hand_appeared/right_hand_disappeared; a ticker emits a tick every t milliseconds; a SpeedAdjustment region switches between normal (t = 1200 ms), faster (t = 1000 ms), and fastest (t = 800 ms) on closer/farer gestures, restarting the ticker each time; ticks drive the previous/next navigation and the select/confirm commands. A sketch of the ticker follows the slide list.]
  11. Multimodal Mapping (combining a mode with concrete media). [Mapping diagram; a sketch of such a mapping rule follows the slide list.]
  12. Conclusions and Future Work. What is the advantage? Detailed modeling of multimodal interactions, including their behavior; existing tools can be reused; and there is no gap between design time and runtime.
  13. What's next? Focus on: fusion, paradigm design, formalization, tools, and the MINT framework. Visit our website for all papers, videos, and the open-source MINT framework software: http://www.multi-access.de
  14. [Closing slide.]
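
The single-choice behavior from slides 8 and 9 boils down to one invariant: choosing an element unchooses its previously chosen siblings. A minimal sketch, using class names that mirror the slides but an implementation that is illustrative rather than the paper's:

    # Sketch of the AISingleChoiceElement behavior (slides 8-9). The names
    # AISingleChoice and AISingleChoiceElement come from the slides; the
    # code itself is an assumption, not the MINT framework's API.

    class AISingleChoiceElement:
        def __init__(self, name):
            self.name = name
            self.state = "unchosen"

    class AISingleChoice:
        """Container aggregating elements of which only one can be chosen."""

        def __init__(self, names):
            self.children = [AISingleChoiceElement(n) for n in names]

        def choose(self, name):
            # Mirrors the state-chart action on slide 9: find the chosen
            # siblings, unchoose them all, then choose the new element.
            for child in self.children:
                if child.state == "chosen":
                    child.state = "unchosen"
            for child in self.children:
                if child.name == name:
                    child.state = "chosen"

    direction = AISingleChoice(["left", "right"])
    direction.choose("left")
    direction.choose("right")  # "left" is implicitly unchosen
    print([(c.name, c.state) for c in direction.children])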
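The FlexibleTicker mode on slide 10 couples a periodic tick with gesture-controlled speed. A sketch using Python's threading.Timer, with the 1200/1000/800 ms levels taken from the slide; the class layout and method names are assumptions:

    import threading

    class FlexibleTicker:
        """Emits a tick every t seconds; closer/farer gestures switch
        between the normal, faster, and fastest levels from slide 10 and
        restart the ticker. Illustrative sketch, not the MINT code."""

        SPEEDS = [1.2, 1.0, 0.8]  # normal, faster, fastest (seconds)

        def __init__(self, on_tick):
            self.on_tick = on_tick
            self.level = 0
            self.timer = None

        def start_ticker(self):
            self.timer = threading.Timer(self.SPEEDS[self.level], self._fire)
            self.timer.start()

        def stop_ticker(self):
            if self.timer is not None:
                self.timer.cancel()

        def _fire(self):
            self.on_tick()
            self.start_ticker()  # re-arm for the next tick

        def closer(self):
            # Speed up one level and restart, as in the SpeedAdjustment region.
            self.level = min(self.level + 1, len(self.SPEEDS) - 1)
            self.stop_ticker()
            self.start_ticker()

        def farer(self):
            # Slow down one level and restart.
            self.level = max(self.level - 1, 0)
            self.stop_ticker()
            self.start_ticker()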
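Slide 11's multimodal mappings synchronize mode events with actions on concrete media. A sketch of such a rule table and dispatcher; the interactor and action names are hypothetical, not taken from the paper:

    # Sketch of a mode-to-media mapping (slide 11): each rule forwards an
    # observed mode event to an action on a concrete media interactor.

    MAPPINGS = [
        # (source interactor, event)  ->  (target interactor, action)
        ("HandGesture", "next",     "NavigationList", "focus_next"),
        ("HandGesture", "previous", "NavigationList", "focus_previous"),
        ("HandGesture", "confirm",  "NavigationList", "choose_focused"),
    ]

    def dispatch(source, event, interactors):
        """Forward a mode event to every mapped media action."""
        for src, evt, target, action in MAPPINGS:
            if src == source and evt == event:
                getattr(interactors[target], action)()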

Editor's Notes

  • Transformational approach: not targeted to multimodal interfaces; models are inspectable, but transformations are complex. Assembly approach: black-boxed components, extensibility problem.