
Comparing the Multimodal Interaction Technique Design of MINT with NiMMiT

With new sensors that can capture hand and body movements in 3D, novel interaction techniques gain importance. However, the development of new forms of interaction is highly iterative, depends on extensive user testing, and is therefore expensive. We propose a model-based notation using statecharts and mappings to ease multimodal interaction technique design. This model-based specification can be used to communicate designs, to support evaluation, and to enable re-use. Our contribution continues previous research on model-based interaction technique design, considers multimodal interaction, and addresses problems such as state explosion, error management, and the consideration of output modalities mentioned by earlier research. We evaluate our notation by comparing it with NiMMiT on the same use case to identify similarities, strengths, and problems.

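
The central idea is the combination of state-driven interactor behavior with data-flow mappings between interactors. The following is a minimal conceptual sketch of that split in Python, assuming hypothetical class, event, and method names; it is not the MINT notation or framework API, only an illustration of how a mapping observes events of one interactor and triggers actions on another.

```python
# A minimal sketch of the statechart-plus-mappings idea with hypothetical
# names (not the MINT notation or framework API): interactors carry
# state-driven behavior, mappings connect events of one interactor to
# actions of another.

class Interactor:
    """An interactor with a tiny state machine and event observers."""

    def __init__(self, name, initial, transitions):
        self.name = name
        self.state = initial
        self.transitions = transitions   # {(state, event): next_state}
        self.observers = []              # callbacks notified on every fired event

    def fire(self, event, **data):
        key = (self.state, event)
        if key in self.transitions:      # events not enabled in this state are ignored
            self.state = self.transitions[key]
            for observer in self.observers:
                observer(self, event, data)


class Mapping:
    """Observes a source interactor and triggers an action on a target."""

    def __init__(self, source, on_event, action):
        self.on_event = on_event
        self.action = action
        source.observers.append(self.notify)

    def notify(self, source, event, data):
        if event == self.on_event:
            self.action(data)


# Example wiring: a pointing device that highlights a 3D object when it stops.
pointer = Interactor("PointingDevice", "moving",
                     {("moving", "stop"): "stopped", ("stopped", "move"): "moving"})
obj3d = Interactor("3DObject", "displayed",
                   {("displayed", "highlight"): "highlighted",
                    ("highlighted", "unhighlight"): "displayed"})

Mapping(pointer, "stop", lambda data: obj3d.fire("highlight"))
pointer.fire("stop", x=10, y=20)
print(obj3d.state)  # -> "highlighted"
```

In the actual notation the mappings are given graphically; the sketch only mirrors their observe-and-trigger semantics.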


  1. Comparing the Multimodal Interaction Technique Design of MINT with NiMMiT. Sebastian Feuerstack and Ednaldo B. Pizzolato. Dr.-Ing. Sebastian Feuerstack, OFFIS – Institute for Information Technology, Oldenburg, Germany. 21 December 2013. Sebastian@Feuerstack.org
  2. Outline: 1. Introduction / Use Case; 2. Related Work; 3. NiMMiT vs. MINT; 4. Results of Comparison.
  3. Post-WIMP Interfaces: non-traditional UIs, specifically designed, with a high degree of freedom. Modes: speech, gestures. Media: augmented reality.
  4. Basic Question: How to model multimodal interaction techniques?
  5. Object-in-Hand Metaphor: utilized in 3D worlds for navigation, object selection, and object manipulation. (Images taken from Boeck et al. 2007.)
  6. Why modeling? Design, prototype, communicate, and store interaction techniques so they can be re-used. A high-level visual language: incorporated in a tool, abstracts from source code, declarative and precise enough to be executed.
  7. What is the state of the art? State-driven: Object Graphs [Carr97], ICOs [Navarre05]. Dataflow-driven: InTml [Figueroa02], Icon [Dragicevic04]. NiMMiT [Boeck06]: state + data + event + hierarchy.
  8. The Augmented "Drag-and-Drop".
  9. What are the problems? State explosion, missing undo steps, and incorporation of output modalities.
  10. How do we model? Models: interactors (abstract and concrete, covering media and modes), each described statically by a class diagram and behaviorally by a state chart, plus mappings (data flow) between interactors, written in a custom notation.
  11. UI Model: the 3DObject CUI interactor is specified as a class (attributes origin_x, origin_y; x, y, z; face; texture; rotation; operations move(), rotate()) together with a state chart covering states such as hidden, initialized, positioned, displayed, highlighted, selected, dragging, dropped, and rotating, connected by events like display, position, highlight/unhighlight, select/deselect (including select_face and select_texture), drag (with /parent.remove(self)), drop, and rotate/stop.
  12. Resource Interactors: gesture interactors for the dominant hand (PointingDevice, with states such as stopped, moving, waiting, and clicked and events move and click) and the non-dominant hand (with states such as stopped, moving, grabbing, translating, and rotating and events like move, grab, open/released, move_away, and rotate).
  13. Selection Technique. Highlighting mapping: when PointingDevice.stopped delivers coordinates x, y, obj = FC.collision(x, y) is determined and obj.highlight is triggered. Selection mapping: PointingDevice.clicked combined complementarily with obj = 3DObject.highlighted triggers obj.select. (A conceptual sketch of these two mappings follows the slide transcript.)
  14. Object-in-Hand Metaphor. Grab-and-move mapping (object in hand and moving): while Non-DominantHand.grabbing and obj = 3DObject.selected, with x1, y1 = PointingDevice.moving, x2, y2 = Non-DominantHand, and FC.proximity(x1, x2, y1, y2) holding within a time window Tw < 0.3 s (complementary), n_x, n_y, offset = FC.calculateOffset(x1, y1, obj) is computed and obj.move(n_x, n_y) is triggered. Withdrawal mapping: when Non-DominantHand.move_away, all objs = 3DObject.selected are deselected via obj.deselect and moved back via obj.move(obj.origin_x, obj.origin_y). (A simplified sketch of the grab-and-move mapping follows the slide transcript.)
  15. Results of Comparison. 1. State explosion: the data flow is separated from the state-driven behavior model. 2. Recovery options, e.g. drag-and-drop: while Non-DominantHand.grabbing, item = 3DObject.selected is dragged (item.drag) and src = item.parent is remembered; on Non-DominantHand.released with dst = Box.highlighted (complementary), dst.drop(item) is triggered; on failure, src.drop(item) restores the previous state. 3. Multimodal output: (a) redundant output, e.g. on PointingDevice.stopped with obj = FC.collision(x, y), obj.highlight and Sound.click are triggered redundantly; (b) assignment of output, e.g. 3DObject.highlighted is assigned Sound.click. (A small sketch of the recovery and output cases follows the slide transcript.)
  16. Results of Comparison (continued). 4. Design process: NiMMiT is sequential and iterative, with execution of an XML-based model and model observation; MINT uses an initial deployment, model observation, and model manipulation at runtime. 5. Tool support: NiMMiT has one tool and one model; MINT has two tools and two (three) different models.
  17. Questions or Comments?
  18. Bio: Sebastian Feuerstack. Master in Information Systems and Artificial Intelligence, Technical University Berlin. PhD on model-based design of interactive systems, TU Berlin, and researcher at the DAI-Labor. Post-doc on the design of multimodal interaction, UFSCar and German Research Foundation. Project manager and senior researcher in the European research project "Holistic Human Factors and System Design of Adaptive Cooperative Human-Machine Systems", OFFIS, Oldenburg, Germany. Research interests: engineering interactive systems; models, methods, and tools. Application: assistance systems. Publications, slides, videos, and projects: http://www.multi-access.de & http://www.feuerstack.org
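
As referenced on slide 13, below is a rough Python sketch of the selection technique's two mappings, assuming hypothetical helper names (Scene and Object3D stand in for the FC collision facet and the 3DObject interactor of the slide); it illustrates the logic only and is not the MINT notation or its execution engine.

```python
# Sketch of the slide-13 selection technique with hypothetical helper names:
# a highlighting step reacts to a stopped pointer, a selection step
# complementarily requires a click plus a highlighted object.

class Object3D:
    def __init__(self, name, x, y, w, h):
        self.name, self.x, self.y, self.w, self.h = name, x, y, w, h
        self.state = "displayed"

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def highlight(self):
        self.state = "highlighted"

    def select(self):
        if self.state == "highlighted":   # only highlighted objects are selectable
            self.state = "selected"


class Scene:
    def __init__(self, objects):
        self.objects = objects

    def collision(self, x, y):
        # Return the first object whose bounding box contains (x, y), if any.
        return next((o for o in self.objects if o.contains(x, y)), None)


def highlighting_mapping(scene, x, y):
    """On PointingDevice.stopped: highlight the object under the pointer."""
    obj = scene.collision(x, y)
    if obj is not None:
        obj.highlight()


def selection_mapping(scene):
    """On PointingDevice.clicked: select the currently highlighted object."""
    for obj in scene.objects:
        if obj.state == "highlighted":
            obj.select()


scene = Scene([Object3D("box", 0, 0, 50, 50)])
highlighting_mapping(scene, 10, 10)   # pointer stopped over the box
selection_mapping(scene)              # pointer clicked
print(scene.objects[0].state)         # -> "selected"
```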
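
As referenced on slide 14, here is a simplified sketch of the grab-and-move and withdrawal mappings, again with hypothetical names: a toy distance test replaces FC.proximity, and the offset computation (FC.calculateOffset) and the Tw < 0.3 s time window are omitted.

```python
# Simplified sketch of the slide-14 grab-and-move and withdrawal mappings
# (hypothetical names): while the non-dominant hand grabs and both hands are
# close together, the selected object follows the dominant hand's pointer.

import math

PROXIMITY_THRESHOLD = 100.0   # assumed distance below which the hands count as "together"


def proximity(x1, y1, x2, y2, threshold=PROXIMITY_THRESHOLD):
    """True if pointer and non-dominant hand are close enough for 'object in hand'."""
    return math.hypot(x1 - x2, y1 - y2) <= threshold


def grab_and_move(obj, grabbing, pointer_xy, other_hand_xy):
    """While the non-dominant hand grabs, the selected object follows the pointer."""
    (x1, y1), (x2, y2) = pointer_xy, other_hand_xy
    if grabbing and obj["state"] == "selected" and proximity(x1, y1, x2, y2):
        obj["x"], obj["y"] = x1, y1


def withdrawal(obj):
    """Non-dominant hand moves away: deselect and return the object to its origin."""
    obj["state"] = "displayed"
    obj["x"], obj["y"] = obj["origin_x"], obj["origin_y"]


cup = {"state": "selected", "x": 5, "y": 5, "origin_x": 5, "origin_y": 5}
grab_and_move(cup, grabbing=True, pointer_xy=(40, 40), other_hand_xy=(60, 55))
print(cup["x"], cup["y"])                # -> 40 40
withdrawal(cup)
print(cup["x"], cup["y"], cup["state"])  # -> 5 5 displayed
```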
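
As referenced on slide 15, a small sketch of the redundant-output and drag-and-drop recovery cases follows, with hypothetical names: the Sound and Box interactors of the slide are replaced by a plain function and dictionaries.

```python
# Sketch of slide 15's redundant output and drag-and-drop recovery cases
# (hypothetical names): redundant output fires two modalities for one event,
# and a failed drop returns the dragged item to its previous parent.

def sound_click():
    print("sound: click")                  # stand-in for an audio output modality


def redundant_highlight(obj):
    """Redundant output: visual highlight and audio feedback fire together."""
    obj["state"] = "highlighted"
    sound_click()


def drag_and_drop(item, src, dst, drop_allowed):
    """Drag-and-drop with recovery: a failed drop returns the item to its source."""
    src["children"].remove(item)           # item.drag: detach from the old parent
    if drop_allowed(dst):
        dst["children"].append(item)       # dst.drop(item)
    else:
        src["children"].append(item)       # fail: src.drop(item) restores the old state


table = {"name": "table", "children": []}
box = {"name": "box", "children": []}
cup = {"name": "cup", "state": "displayed"}
table["children"].append(cup)

redundant_highlight(cup)
drag_and_drop(cup, table, box, drop_allowed=lambda d: d["name"] == "box")
print(box["children"][0]["name"])          # -> "cup"
```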

Editor's Notes

  • The Post-WIMP term sums up the trend to design interaction techniques specifically for a certain combination of an application with one or more interaction devices and their control modes. Non-traditional interfaces that consider modes like speech or gestures and media such as augmented and hyper-reality offer a high degree of freedom in interaction design, but they make the design process cumbersome, since extensive user testing is usually required to figure out an efficient and accessible way of interaction.
  • Transformational approach: not targeted at multimodal interfaces; models are inspectable, but the transformations are complex. Assembly approach: black-boxed components, extensibility problem.