Next Generation User Interfaces
Multimodal Interaction
Prof. Beat Signer
Department of Computer Science
Vrije Universiteit Brussel
http://www.beatsigner.com
Human Senses and Modalities
 Input as well as output modalities
Sensory Perception Sense Organ Modality
sight eyes visual
hearing ears auditory
smell nose olfactory
taste tongue gustatory
touch skin tactile
(balance) vestibular system vestibular
Modality “refers to the type of communication channel used to convey
or acquire information. It also covers the way an idea is expressed or
perceived, or the manner an action is performed.”
Nigay and Coutaz, 1993
Bolt's "Put-that-there" (1980)
Bolt, 1980
Video: “Put-that-there”
Bolt's "Put-that-there" (1980) ...
 Combination of two input modalities
 speech is used to issue the semantic part of the command
 gestures are used to provide the locations on the screen
 Complementary use of both modalities
 using speech or gestures alone would not enable the same commands
 The "Put-that-there" interaction is still regarded as a rich
case of multimodal interaction
 however, it is based on an old mouse-oriented metaphor for
selecting objects
Multimodal Interaction
 Multimodal interaction is about human-machine
interaction involving multiple modalities
 input modalities
- speech, gesture, gaze, emotions, …
 output modalities
- voice synthesis, visual cues, …
 Humans are inherently multimodal!
Example: SpeeG2
[Architecture diagram: user input captured via speech recognition (Microsoft SAPI 5.4) and skeletal tracking (Microsoft Kinect), feeding the SpeeG2 GUI; image by Sven De Kock]
Example: SpeeG2 …
 While a user speaks a sentence, the most probable words are shown on the screen
 Using hand gestures, the user can correct words (by selection) if necessary
 Text entry rates of up to 21 words per minute (WPM)
SpeeG2, WISE Lab
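A minimal sketch of the underlying idea, assuming a simplified n-best word list per spoken word (this is an illustration only, not the actual SpeeG2 implementation, which builds on Microsoft SAPI 5.4 and Kinect skeletal tracking): the recogniser proposes candidate words, the most probable candidate is shown by default, and a gesture-based selection can overwrite individual words.

```python
# Illustrative simplification of speech input with gesture-based correction;
# data structures and function names are assumptions, not the SpeeG2 API.
def best_sentence(n_best_lists, corrections=None):
    """Pick the top candidate per word unless a gesture selected another one."""
    corrections = corrections or {}
    return " ".join(corrections.get(i, candidates[0])
                    for i, candidates in enumerate(n_best_lists))

n_best = [["I"], ["want"], ["two", "to", "too"], ["fly"]]
print(best_sentence(n_best))             # "I want two fly" (recognition error)
print(best_sentence(n_best, {2: "to"}))  # gesture-corrected: "I want to fly"
```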
Video: SpeeG2
GUIs vs. Multimodal Interfaces
Graphical User Interfaces | Multimodal Interfaces
Single event input stream that controls the event loop | Typically continuous and simultaneous input from multiple input streams
Sequential processing | Parallel processing
Centralised architecture | Often distributed architectures (e.g. multi-agent) with high computational and memory requirements
No temporal constraints | Requires timestamping and temporal constraints for multimodal fusion
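The right-hand column can be made concrete with a minimal sketch, assuming a simplified event model (timestamps in seconds, one stream per recogniser); the event fields and values are illustrative assumptions and not tied to any particular toolkit.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class InputEvent:
    timestamp: float                      # seconds since session start
    modality: str = field(compare=False)  # e.g. "speech" or "gesture"
    payload: str = field(compare=False)   # recognised token or gesture label

def merge_streams(*streams):
    """Merge per-modality event streams into one time-ordered stream."""
    return list(heapq.merge(*streams, key=lambda e: e.timestamp))

# Two parallel, simultaneous input streams (illustrative values only)
speech = [InputEvent(0.40, "speech", "show"),
          InputEvent(0.90, "speech", "this"),
          InputEvent(1.30, "speech", "list")]
gesture = [InputEvent(0.95, "gesture", "point@list-of-flights")]

for event in merge_streams(speech, gesture):
    print(f"{event.timestamp:5.2f}s  {event.modality:8s}  {event.payload}")
```

Each per-modality stream is already time-ordered, so merging by timestamp gives the fusion engine a single stream to which temporal constraints can then be applied.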
Multimodal Human-Machine Interaction Loop
Dumas et al., 2009
Multimodal Fusion
 Fusion in a multimodal system can happen at three
different levels
 data level
 feature level
 decision level
Dumas et al., 2009
Multimodal Fusion …
 Data-level fusion
 focuses on the fusion of identical or tightly linked types of
multimodal data
- e.g. two video streams capturing the same scene from different angles
- low semantics and highly sensitive to noise (no preprocessing)
 Feature-level fusion
 fusion of features extracted from the data
- e.g. speech and lip movements
 Decision-level fusion
 focuses on interpretation based on semantic data
- e.g. Bolt's "Put-that-there" (speech and gesture)
 fusion of high-level information derived from data- and feature-
level fusion
- highly resistant to noise and failures
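As a hedged illustration of decision-level (late) fusion in the spirit of "Put-that-there", the sketch below pairs each deictic speech token with the closest pointing event inside a temporal window; the window size, event format and helper names are assumptions for illustration only.

```python
# Minimal decision-level fusion sketch: pair deictic words ("that", "there")
# with the nearest pointing event inside a temporal window. All values and
# names are illustrative assumptions.
WINDOW = 0.5  # seconds; assumed integration window

def fuse(speech_events, pointing_events, window=WINDOW):
    """Return (word, location) pairs for deictic words resolved by pointing."""
    fused = []
    for word_time, word in speech_events:
        if word not in {"that", "there"}:
            continue  # only deictic words need a location
        candidates = [(abs(point_time - word_time), location)
                      for point_time, location in pointing_events
                      if abs(point_time - word_time) <= window]
        if candidates:
            fused.append((word, min(candidates)[1]))
    return fused

speech = [(0.4, "put"), (0.95, "that"), (1.8, "there")]
pointing = [(1.0, (120, 340)), (1.85, (560, 210))]
print(fuse(speech, pointing))  # [('that', (120, 340)), ('there', (560, 210))]
```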
Comparison of Fusion Levels
Data-Level Fusion | Feature-Level Fusion | Decision-Level Fusion
Used for | raw data of the same modality | closely coupled modalities | loosely coupled modalities
Level of detail (data) | highest level of detail | moderate level of detail | difficult to recover data that has been fused on the data and feature level
Sensitivity to noise and failures | highly sensitive | moderately sensitive | not very sensitive
Usage | not frequently used for multimodal interfaces | used to improve the recognition rate | most widely used level of fusion for multimodal interfaces
Advantages of Multimodal Input
 Support and accommodate a user's perceptual
and communicative capabilities
 natural user interfaces with new ways of engaging interaction
 Integration of the computational skills of computers into the real world by offering more natural ways of human-machine interaction
 Enhanced robustness due to the combination of different
(partial) information sources
 Flexible personalisation by using different modalities
based on user preferences and context
 Helps visually or physically impaired users
Multimodal Fission
 Forms a comprehensible answer for the user by making
use of multiple output modalities based on the device
and context
 builds an abstract message through a combination of channels
 Three main steps
 selection and structuring of content
 selection of modalities
 output coordination
 Coordination of the output on each channel is necessary
to form a coherent message
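The three steps can be illustrated with a small, assumed sketch; the device profile, helper names and fallback rules are hypothetical and only serve to show the flow from content selection over modality selection to output coordination.

```python
# Hypothetical fission sketch: select content, choose output modalities based
# on the device/context and coordinate them into one coherent message.
def select_content(answer):
    return {"summary": answer["text"], "details": answer.get("table", [])}

def select_modalities(device):
    modalities = []
    if device.get("screen"):
        modalities.append("visual")
    if device.get("speaker") and not device.get("silent_mode"):
        modalities.append("voice")
    return modalities or ["visual"]  # fall back to a single default modality

def coordinate(content, modalities):
    plan = []
    if "voice" in modalities:
        plan.append(("voice", content["summary"]))  # speak the summary
    if "visual" in modalities:
        plan.append(("visual", content["details"] or content["summary"]))
    return plan  # ordered output plan; both channels refer to the same answer

answer = {"text": "Three flights match your query.",
          "table": ["BRU-LHR 08:10", "BRU-LHR 11:45", "BRU-LHR 17:30"]}
device = {"screen": True, "speaker": True, "silent_mode": False}
print(coordinate(select_content(answer), select_modalities(device)))
```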
Ten Myths of Multimodal Interaction
1. If you build a multimodal system, users
will interact multimodally
 users show a strong preference to interact
multimodally but this does not mean that they
always interact multimodally
 unimodal and multimodal interactions are mixed
depending on the task to be carried out
2. Speech and pointing is the dominant
multimodal integration pattern
 heritage from Bolt's "Put-that-there" where gesture and speech
are used to select an object (mouse-oriented metaphor)
 modalities can be used for more than just object selection
- written input, facial expressions, ...
Sharon Oviatt
Ten Myths of Multimodal Interaction ...
3. Multimodal input involves simultaneous signals
 signals are often not overlapping
 users frequently introduce (consciously or not) a small delay
between two modal inputs
 developers should not rely on an overlap for the fusion process
4. Speech is the primary input mode in any multimodal
system that includes it
 in human-human communication speech is indeed the primary
input mode
 in human-machine communication, speech is not the exclusive carrier of important information, nor does it have temporal precedence over other modalities
Ten Myths of Multimodal Interaction ...
5. Multimodal language does not differ linguistically from
unimodal language
 different modalities complement each other
- e.g. avoid error-prone voice descriptions of spatial locations by pointing
 in many respects multimodal language is substantially simplified
- might lead to more robust systems
6. Multimodal integration involves redundancy of content
between modes
 early research in multimodal interaction assumed that different
redundant modalities could improve the recognition
 complementary modalities are more important and designers
should not rely on duplicated content
Ten Myths of Multimodal Interaction ...
7. Individual error-prone recognition technologies combine
multimodally to produce even greater unreliability
 assumes that using two error-prone input modes such as speech
and handwriting recognition results in greater unreliability
 however, increased robustness due to mutual disambiguation
between modalities can be observed
8. All users’ multimodal commands are integrated in a
uniform way
 different users use different integration patterns (e.g. parallel vs.
sequential)
 a system that can detect and adapt to a user's dominant integration pattern might lead to increased recognition rates
Ten Myths of Multimodal Interaction ...
9. Different input modes are capable of transmitting
comparable content
 some modalities can be compared (e.g. speech and writing)
 however, each modality has its own very distinctive gestures
- e.g. gaze vs. a pointing device such as the Wiimote
10.Enhanced efficiency is the main advantage of
multimodal systems
 often there is only a minimal increase in efficiency
 however, many other advantages
- less errors, enhanced usability, flexibility, mutual disambiguation, …
Formal Models for Combining Modalities
 Formalisation of multimodal human-machine interaction
 Conceptualise the different possible relationships
between input and output modalities
 Two conceptual spaces
 CASE: use and combination of modalities on the system side
(fusion engine)
 CARE: possible combination of modalities at the user level
CASE Model
 3 dimensions in the
CASE design space
 levels of abstraction
 use of modalities
 fusion
Nigay and Coutaz, 1993
CASE Model: Levels of Abstraction
 Data received from a
device can be processed
at multiple levels of
abstraction
 Speech analysis example
 signal (data) level
 phonetic (feature) level
 semantic level
 A multimodal system is
always classified under
the meaning category
Nigay and Coutaz, 1993
CASE Model: Use of Modalities
 The use of modalities
expresses the temporal
availability of multiple
modalities
 parallel use allows the user
to employ multiple modalities
simultaneously
 sequential use forces the
user to use the modalities
one after another
Nigay and Coutaz, 1993
CASE Model: Fusion
 Possible combination of
different types of data
 combined means that there
is fusion
 independent means that
there is an absence of fusion
Nigay and Coutaz, 1993
CASE Model: Classification
 Concurrent
 two distinct tasks
executed in parallel
 e.g. draw a circle and draw
a square next to it
 Alternate
 task with temporal alternation
of modalities
 e.g. "Draw a circle there"
followed by pointing to the
location
Nigay and Coutaz, 1993
CASE Model: Classification
 Synergistic
 task using several coreferent
modalities in parallel
 e.g. "Draw a circle here" with
concurrent pointing to the
location
 synergistic multimodal
systems are the ultimate goal
 Exclusive
 one task after the other with
one modality at a time (no
coreference)
Nigay and Coutaz, 1993
CARE Properties
 Four properties to characterise and assess aspects of
multimodal interaction in terms of the combination of
modalities at the user level [Coutaz et al., 1995]
 Complementarity
 Assignment
 Redundancy
 Equivalence
Complementarity
 Multiple modalities are to be used within a temporal window in order to reach a given state
 No single modality on its own is sufficient to reach the
state
 Integration (fusion) can happen sequentially or in parallel
 Example
 “Please show me this list”
and <pointing at ‘list of flights’ label>
Assignment
 Only one modality can be used to reach a given state
 Absence of choice
 Example
 Moving the mouse to change the position of a window
(considering that the mouse is the only modality available for that
operation)
Redundancy
 If multiple modalities have the same expressive power
(equivalent) and if they are used within the same
temporal window
 Repetitive behaviour without increasing the expressive
power
 Integration (fusion) can happen sequentially or in parallel
 Example
 “Could you show me the list of flights?”
and <click on ‘list of flights’ button>
Equivalence
 Necessary and sufficient to use any single one of the available modalities
 Choice between multiple modalities
 No temporal constraints
 Example
 “Could you show me the list of flights?”
or <click on ‘list of flights’ button>
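The four CARE properties can be summarised in a coarse classification sketch; the decision rules below are a simplification of Coutaz et al., 1995 for illustration only, not the formal definitions.

```python
# Coarse sketch of the CARE properties (Coutaz et al., 1995); the decision
# rules are a simplification for illustration, not the formal definitions.
def care_property(capable_modalities, used_together, same_temporal_window):
    """capable_modalities: modalities that can reach the target state on their own."""
    if not capable_modalities and used_together:
        return "Complementarity"  # no single modality suffices, fusion is needed
    if len(capable_modalities) == 1:
        return "Assignment"       # exactly one modality can do it, no choice
    if used_together and same_temporal_window:
        return "Redundancy"       # equivalent modalities used at the same time
    return "Equivalence"          # free choice among equivalent modalities

print(care_property([], used_together=True, same_temporal_window=True))  # Complementarity
print(care_property(["mouse"], False, False))                            # Assignment
print(care_property(["speech", "button"], True, True))                   # Redundancy
print(care_property(["speech", "button"], False, False))                 # Equivalence
```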
Example for Equivalence: EdFest 2004
 The EdFest 2004 prototype consisted of a digital pen and paper interface in combination with speech input and voice output [Belotti et al., 2005]
 Users had the choice to execute some commands either via pen input or through speech input
Multimodal Interaction Frameworks
 Software frameworks for the creation of multimodal
interfaces
 Squidy
 OpenInterface
 HephaisTK
 Mudra
 Three families of frameworks
 stream based
- Squidy, OpenInterface
 event based
- HephaisTK
 hybrid
- Mudra
Squidy
 Integrated tool with a GUI
 Interfaces are defined by pipelining components and filters
 Online knowledge base of
device components and
filters
 Semantic zooming
 e.g. zooming on a filter
shows a graph of the data
passing through the filter
http://www.squidy-lib.de
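As a language-agnostic sketch of this pipelining idea (the node names below are hypothetical and not part of Squidy's actual component library or API), a pipeline can be modelled as a chain of processing nodes that each transform the event stream.

```python
# Generic dataflow sketch of chaining components and filters; the node names
# are hypothetical and not taken from Squidy's real component library.
class Pipeline:
    def __init__(self, *nodes):
        self.nodes = nodes

    def process(self, events):
        for node in self.nodes:
            events = node(events)  # each node transforms the event stream
        return events

def smoothing_filter(events):        # placeholder filtering stage
    return [{**e, "smoothed": True} for e in events]

def to_screen_coordinates(events):   # placeholder mapping stage
    return [{**e, "x": e["x"] * 1920, "y": e["y"] * 1080} for e in events]

pipeline = Pipeline(smoothing_filter, to_screen_coordinates)
print(pipeline.process([{"x": 0.5, "y": 0.25}]))
```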
OpenInterface Framework
 European research project
(2006-2009)
 Component-based
framework
 Online component library
with components for
numerous modalities and
devices
 Design an application by
pipelining components
http://www.openinterface.org
HephaisTK
 Software agent-based framework
 Focus on events
 Different fusion algorithms
 Description of the
dialogue via the SMUIML
language
http://www.hephaistk.org
UsiXML
 User Interface eXtended
Markup Language
 XML-based declarative User Interface Description Language (UIDL) to describe abstract interfaces
 can later be mapped to different physical characteristics (including multimodal user interfaces)
Stanciulescu et al., 2005
Mudra
 Fusion across different
levels of abstraction (data,
feature and decision)
 via fact base
 Interactions described via
a declarative rule-based
language
 Rapid prototyping
 simple integration of new
input devices
 integration of external
gesture recognisers
Hoste et al., 2011
References
 R.A. Bolt, "Put-that-there": Voice and Gesture at the Graphics Interface, In Proceedings of SIGGRAPH 1980, 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, USA, July 1980
 http://dx.doi.org/10.1145/965105.807503
 L. Nigay and J. Coutaz, A Design Space for Multimodal
Systems: Concurrent Processing and Data Fusion, In
Proceedings of CHI 1993, 3rd International Conference
on Human Factors in Computing Systems, Amsterdam,
The Netherlands, April 1993
 http://dx.doi.org/10.1145/169059.169143
References …
 J. Coutaz et al., Four Easy Pieces for
Assessing the Usability of Multimodal Interaction: The
CARE Properties, In Proceedings of Interact 1995, 5th
International Conference on Human-Computer
Interaction, Lillehammer, Norway, June 1995
 http://dx.doi.org/10.1007/978-1-5041-2896-4_19
 B. Dumas, D. Lalanne and S. Oviatt, Multimodal Interfaces: A Survey of Principles, Models and Frameworks, Human Machine Interaction, LNCS 5440, 2009
 http://dx.doi.org/10.1007/978-3-642-00437-7_1
References …
 S. Oviatt, Ten Myths of Multimodal Interaction,
Communications of the ACM 42(11), November 1999
 http://dx.doi.org/10.1145/319382.319398
 Put-that-there demo
 https://www.youtube.com/watch?v=RyBEUyEtxQo
 R. Belotti, C. Decurtins, M.C. Norrie, B. Signer and L. Vukelja, Experimental Platform for Mobile Information Systems, Proceedings of MobiCom 2005, 11th Annual International Conference on Mobile Computing and Networking, Cologne, Germany, August 2005
 http://dx.doi.org/10.1145/1080829.1080856
References …
 L. Hoste and B. Signer, SpeeG2: A Speech- and Gesture-based Interface for Efficient Controller-free Text Entry, Proceedings of ICMI 2013, 15th International Conference on Multimodal Interaction, Sydney, Australia, December 2013
 http://beatsigner.com/publications/hoste_ICMI2013.pdf
 L. Hoste, B. Dumas and B. Signer, Mudra: A Unified
Multimodal Interaction Framework, Proceedings of
ICMI 2011, 13th International Conference on Multimodal
Interaction, Alicante, Spain, November 2011
 http://beatsigner.com/publications/hoste_ICMI2011.pdf
References …
 L. Hoste, A Declarative Approach for Engineering Multimodal Interaction, PhD thesis, Vrije Universiteit Brussel (VUB), Brussels, Belgium, June 2015
 http://beatsigner.com/theses/PhdThesisLodeHoste.pdf
 A. Stanciulescu, Q. Limbourg, J. Vanderdonckt, B. Michotte and F. Montero, A Transformational Approach for Multimodal Web User Interfaces Based on UsiXML, Proceedings of ICMI 2005, 7th International Conference on Multimodal Interfaces, Trento, Italy, October 2005
 http://dx.doi.org/10.1145/1088463.1088508
References …
 SpeeG2 demo
 https://www.youtube.com/watch?v=ItKySNv8l90
Next Lecture
Pen-based Interaction
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Gardella_PRCampaignConclusion Pitch Letter
Gardella_PRCampaignConclusion Pitch LetterGardella_PRCampaignConclusion Pitch Letter
Gardella_PRCampaignConclusion Pitch Letter
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024
 

Multimodal Interaction - Lecture 05 - Next Generation User Interfaces (4018166FNR)

  • 1. 2 December 2005 Next Generation User Interfaces Multimodal Interaction Prof. Beat Signer Department of Computer Science Vrije Universiteit Brussel http://www.beatsigner.com
  • 2. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 2October 24, 2016 Human Senses and Modalities  Input as well as output modalities Sensory Perception Sense Organ Modality sight eyes visual hearing ears auditory smell nose olfactory taste tongue gustatory touch skin tactile (balance) vestibular system vestibular Modality “refers to the type of communication channel used to convey or acquire information. It also covers the way an idea is expressed or perceived, or the manner an action is performed.” Nigay and Coutaz, 1993
  • 3. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 3October 24, 2016 Bolt's "Put-that-there" (1980) Bolt, 1980
  • 4. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 4October 24, 2016 Video: “Put-that-there”
  • 5. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 5October 24, 2016 Bolt's "Put-that-there" (1980) ...  Combination of two input modalities  speech is used to issue the semantic part of the command  gestures are used to provide the locations on the screen  Complementary use of both modalities  use of only speech or gestures does not enable the same commands  The "Put-that-there" interaction is still regarded as a rich case of multimodal interaction  however, it is based on an old mouse-oriented metaphor for selecting objects
  • 6. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 6October 24, 2016 Multimodal Interaction  Multimodal interaction is about human-machine interaction involving multiple modalities  input modalities - speech, gesture, gaze, emotions, …  output modalities - voice synthesis, visual cues, …  Humans are inherently multimodal!
  • 7. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 7October 24, 2016 Example: SpeeG2 User Speech recognition (Microsoft SAPI 5.4) Skeletal tracking (Microsoft Kinect) 5 4 2 3 SpeeG2 GUI 6 1 Sven De Kock
  • 8. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 8October 24, 2016 Example: SpeeG2 …  While as user speaks a sentence, the most probable words are shown on the screen  By using hand gestures the user can correct words (by selection) if necessary  Improved recognition rate with up to 21 words per minute (WPM) SpeeG2, WISE Lab
  • 9. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 9October 24, 2016 Video: SpeeG2
  • 10. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 10October 24, 2016 GUIs vs. Multimodal Interfaces Graphical User Interfaces Multimodal Interfaces Single event input stream that controls the event loop Typically continuous and simultaneous input from multiple input streams Sequential processing Parallel processing Centralised architecture Often distributed architectures (e.g. multi agent) with high computational and memory requirements No temporal constraints Requires timestamping and temporal constraints for multimodal fusion
  • 11. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 11October 24, 2016 Multimodal Human-Machine Interaction Loop Dumas et al., 2009
  • 12. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 12October 24, 2016 Multimodal Fusion  Fusion in a multimodal system can happen at three different levels  data level  feature level  decision level Dumas et al., 2009
  • 13. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 13October 24, 2016 Multimodal Fusion …  Data-level fusion  focuses on the fusion of identical or tightly linked types of multimodal data - e.g. two video streams capturing the same scene from different angles - low semantics and highly sensitive to noise (no preprocessing)  Feature-level fusion  fusion of features extracted from the data - e.g. speech and lip movements  Decision-level fusion  focuses on interpretation based on semantic data - e.g. Bolt’s "Put-that-there"  speech and gesture  fusion of high-level information derived from data- and feature-level fusion - highly resistant to noise and failures
  • 14. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 14October 24, 2016 Comparison of Fusion Levels
    - Data-level fusion: used for raw data of the same modality; highest level of detail; highly sensitive to noise and failures; not frequently used for multimodal interfaces
    - Feature-level fusion: used for closely coupled modalities; moderate level of detail; moderately sensitive to noise and failures; used to improve the recognition rate
    - Decision-level fusion: used for loosely coupled modalities; difficult to recover data that has been fused at the data and feature level; not very sensitive to noise and failures; most widely used level of fusion for multimodal interfaces
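To make the decision-level idea concrete, the following minimal Python sketch pairs the deictic words of a recognised utterance with pointing gestures that fall inside a temporal window, in the spirit of Bolt's "Put-that-there". It is an illustration only; the event types, the window size and all names are assumptions and do not come from any of the frameworks discussed later in the lecture.

```python
# Minimal sketch of decision-level fusion with a temporal window (illustrative
# only; SpeechEvent, PointEvent, WINDOW_MS and fuse_put_that_there are assumed
# names, not part of any framework referenced in these slides).
from dataclasses import dataclass
from typing import List, Optional

WINDOW_MS = 1500  # assumed integration window for coreferent inputs


@dataclass
class SpeechEvent:
    timestamp_ms: int
    tokens: List[str]          # e.g. ["put", "that", "there"]


@dataclass
class PointEvent:
    timestamp_ms: int
    target: str                # object/location id resolved by the gesture recogniser


def fuse_put_that_there(speech: SpeechEvent, points: List[PointEvent]) -> Optional[dict]:
    """Resolve deictic words ('this', 'that', 'here', 'there') against pointing
    gestures that fall inside the temporal window around the utterance."""
    candidates = sorted(
        (p for p in points if abs(p.timestamp_ms - speech.timestamp_ms) <= WINDOW_MS),
        key=lambda p: p.timestamp_ms)
    deictics = [t for t in speech.tokens if t in ("this", "that", "here", "there")]
    if len(candidates) < len(deictics):
        return None  # not enough gestural referents -> no fused command
    return {
        "action": speech.tokens[0],  # e.g. "put"
        "referents": dict(zip(deictics, (p.target for p in candidates))),
    }


if __name__ == "__main__":
    speech = SpeechEvent(1000, ["put", "that", "there"])
    points = [PointEvent(1100, "ship_7"), PointEvent(1900, "grid_B4")]
    print(fuse_put_that_there(speech, points))
    # {'action': 'put', 'referents': {'that': 'ship_7', 'there': 'grid_B4'}}
```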
  • 15. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 15October 24, 2016 Advantages of Multimodal Input  Support and accommodate a user's perceptual and communicative capabilities  natural user interfaces with new ways of engaging interaction  Integration of computational skills of computers in the real world by offering more natural ways of human-machine interaction  Enhanced robustness due to the combination of different (partial) information sources  Flexible personalisation by using different modalities based on user preferences and context  Helps visually or physically impaired users
  • 16. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 16October 24, 2016 Multimodal Fission  Forms a comprehensible answer for the user by making use of multiple output modalities based on the device and context  builds an abstract message through a combination of channels  Three main steps  selection and structuring of content  selection of modalities  output coordination  Coordination of the output on each channel is necessary to form a coherent message
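A rough sketch of these three fission steps, assuming a simple flight-information answer and a context dictionary; the data structures and all field and function names are invented purely for illustration.

```python
# Illustrative sketch of the three fission steps: content selection and
# structuring, modality selection, and output coordination. The answer/context
# structures and all names are assumptions made for this example.
def fission(answer: dict, context: dict) -> list:
    # 1. Selection and structuring of content
    parts = [("summary", answer["summary"]), ("details", answer["details"])]

    # 2. Selection of modalities based on device and context
    plan = []
    for role, content in parts:
        if context.get("eyes_free"):                        # e.g. user is driving
            plan.append(("speech", content))
        elif role == "details" and context.get("has_display"):
            plan.append(("display", content))
        else:
            plan.append(("speech", content))

    # 3. Output coordination: order the channels so the combined output
    # forms one coherent message (summary first, details afterwards)
    return [{"order": i, "channel": channel, "content": content}
            for i, (channel, content) in enumerate(plan)]


print(fission({"summary": "3 flights found",
               "details": "BRU-LHR at 08:10, 12:40 and 17:55"},
              {"has_display": True}))
```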
  • 17. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 17October 24, 2016 Ten Myths of Multimodal Interaction 1. If you build a multimodal system, users will interact multimodally  users show a strong preference to interact multimodally but this does not mean that they always interact multimodally  unimodal and multimodal interactions are mixed depending on the task to be carried out 2. Speech and pointing is the dominant multimodal integration pattern  heritage from Bolt's "Put-that-there" where gesture and speech are used to select an object (mouse-oriented metaphor)  modalities can be used for more than just object selection - written input, facial expressions, ... Sharon Oviatt
  • 18. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 18October 24, 2016 Ten Myths of Multimodal Interaction ... 3. Multimodal input involves simultaneous signals  signals are often not overlapping  users frequently introduce (consciously or not) a small delay between two modal inputs  developers should not rely on an overlap for the fusion process 4. Speech is the primary input mode in any multimodal system that includes it  in human-human communication speech is indeed the primary input mode  in human-machine communication speech is not the exclusive carrier of important information, nor does it have temporal precedence over other modalities
  • 19. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 19October 24, 2016 Ten Myths of Multimodal Interaction ... 5. Multimodal language does not differ linguistically from unimodal language  different modalities complement each other - e.g. avoid error-prone voice descriptions of spatial locations by pointing  in many respects multimodal language is substantially simplified - might lead to more robust systems 6. Multimodal integration involves redundancy of content between modes  early research in multimodal interaction assumed that different redundant modalities could improve the recognition  complementary modalities are more important and designers should not rely on duplicated content
  • 20. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 20October 24, 2016 Ten Myths of Multimodal Interaction ... 7. Individual error-prone recognition technologies combine multimodally to produce even greater unreliability  assumes that using two error-prone input modes such as speech and handwriting recognition results in greater unreliability  however, increased robustness due to mutual disambiguation between modalities can be observed 8. All users’ multimodal commands are integrated in a uniform way  different users use different integration patterns (e.g. parallel vs. sequential)  a system that can detect and adapt to a user’s dominant integration pattern might lead to increased recognition rates
  • 21. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 21October 24, 2016 Ten Myths of Multimodal Interaction ... 9. Different input modes are capable of transmitting comparable content  some modalities can be compared (e.g. speech and writing)  however, each modality has its own very distinctive gestures - e.g. gaze vs. a pointing device such as the Wiimote 10. Enhanced efficiency is the main advantage of multimodal systems  often there is only a minimal increase in efficiency  however, many other advantages - fewer errors, enhanced usability, flexibility, mutual disambiguation, …
  • 22. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 22October 24, 2016 Formal Models for Combining Modalities  Formalisation of multimodal human-machine interaction  Conceptualise the different possible relationships between input and output modalities  Two conceptual spaces  CASE: use and combination of modalities on the system side (fusion engine)  CARE: possible combination of modalities at the user level
  • 23. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 23October 24, 2016 CASE Model  3 dimensions in the CASE design space  levels of abstraction  use of modalities  fusion Nigay and Coutaz, 1993
  • 24. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 24October 24, 2016 CASE Model: Levels of Abstraction  Data received from a device can be processed at multiple levels of abstraction  Speech analysis example  signal (data) level  phonetic (feature) level  semantic level  A multimodal system is always classified under the meaning category Nigay and Coutaz, 1993
  • 25. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 25October 24, 2016 CASE Model: Use of Modalities  The use of modalities expresses the temporal availability of multiple modalities  parallel use allows the user to employ multiple modalities simultaneously  sequential use forces the user to use the modalities one after another Nigay and Coutaz, 1993
  • 26. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 26October 24, 2016 CASE Model: Fusion  Possible combination of different types of data  combined means that there is fusion  independent means that there is an absence of fusion Nigay and Coutaz, 1993
  • 27. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 27October 24, 2016 CASE Model: Classification  Concurrent  two distinctive tasks executed in parallel  e.g. draw a circle and draw a square next to it  Alternate  task with temporal alternation of modalities  e.g. "Draw a circle there" followed by pointing to the location Nigay and Coutaz, 1993
  • 28. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 28October 24, 2016 CASE Model: Classification  Synergistic  task using several coreferent modalities in parallel  e.g. "Draw a circle here" with concurrent pointing to the location  synergistic multimodal systems are the ultimate goal  Exclusive  one task after the other with one modality at a time (no coreference) Nigay and Coutaz, 1993
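The four CASE categories can be read as a simple decision over two observable properties: parallel vs. sequential use of the modalities and presence or absence of coreference (fusion). The sketch below assumes this two-flag representation purely for illustration; it is not part of Nigay and Coutaz's formal design space.

```python
# Sketch of the CASE classification from two observable properties; the
# boolean representation is an assumption made for illustration only.
def case_class(parallel: bool, coreferent: bool) -> str:
    if parallel and coreferent:
        return "Synergistic"   # "Draw a circle here" with simultaneous pointing
    if parallel:
        return "Concurrent"    # two independent tasks carried out in parallel
    if coreferent:
        return "Alternate"     # "Draw a circle there", followed by pointing
    return "Exclusive"         # one task after the other, one modality at a time


print(case_class(parallel=True, coreferent=True))    # Synergistic
print(case_class(parallel=False, coreferent=True))   # Alternate
```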
  • 29. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 29October 24, 2016 CARE Properties  Four properties to characterise and assess aspects of multimodal interaction in terms of the combination of modalities at the user level [Coutaz et al., 1995]  Complementarity  Assignment  Redundancy  Equivalence
  • 30. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 30October 24, 2016 Complementarity  Multiple modalities are to be used within a temporal window in order to reach a given state  No single modality on its own is sufficient to reach the state  Integration (fusion) can happen sequentially or in parallel  Example  “Please show me this list” and <pointing at ‘list of flights’ label>
  • 31. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 31October 24, 2016 Assignment  Only one modality can be used to reach a given state  Absence of choice  Example  Moving the mouse to change the position of a window (considering that the mouse is the only modality available for that operation)
  • 32. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 32October 24, 2016 Redundancy  If multiple modalities have the same expressive power (equivalent) and if they are used within the same temporal window  Repetitive behaviour without increasing the expressive power  Integration (fusion) can happen sequentially or in parallel  Example  “Could you show me the list of flights?” and <click on ‘list of flights’ button>
  • 33. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 33October 24, 2016 Equivalence  Necessary and sufficient to use any single one of the available modalities  Choice between multiple modalities  No temporal constraints  Example  “Could you show me the list of flights?” or <click on ‘list of flights’ button>
  • 34. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 34October 24, 2016 Example for Equivalence: EdFest 2004  EdFest 2004 prototype consisted of a digital pen and paper interface in combination with speech input and voice output [Belotti et al., 2005]  User had a choice to execute some commands either via pen input or through speech input
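The CARE properties can likewise be approximated from what is observed at the user level. The sketch below assumes a simplified representation (which modalities could reach the state on their own, which were actually used and when); it is an illustration of the four categories, not Coutaz et al.'s formal definition.

```python
# Rough approximation of the CARE properties over a simplified representation;
# the tuple format and the 2-second window are assumptions for illustration.
def care_property(capable, used, window_ms=2000):
    """capable: set of modalities that can each reach the target state alone.
    used: list of (modality, timestamp_ms, sufficient_alone) observed for one goal."""
    modalities = {m for m, _, _ in used}
    timestamps = [t for _, t, _ in used]
    in_window = max(timestamps) - min(timestamps) <= window_ms

    if capable == modalities and len(capable) == 1:
        return "Assignment"       # no choice: only one modality can do the job
    if len(modalities) == 1:
        return "Equivalence"      # several modalities would work, one was chosen
    if in_window and all(sufficient for _, _, sufficient in used):
        return "Redundancy"       # same expressive power, repeated in the window
    if in_window:
        return "Complementarity"  # only the combination reaches the state
    return "unclassified"


# "Please show me this list" combined with pointing at the 'list of flights' label
print(care_property({"speech", "pen"},
                    [("speech", 0, False), ("pen", 400, False)]))  # Complementarity
```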
  • 35. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 35October 24, 2016 Multimodal Interaction Frameworks  Software frameworks for the creation of multimodal interfaces  Squidy  OpenInterface  HephaisTK  Mudra  Three families of frameworks  stream based - Squidy, OpenInterface  event based - HephaisTK  hybrid - Mudra
  • 36. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 36October 24, 2016 Squidy  Integrated tool with a GUI  Interfaces are defined by pipelining components and filters  Online knowledge base of device components and filters  Semantic zooming  e.g. zooming on a filter shows a graph of the data passing through the filter http://www.squidy-lib.de
  • 37. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 37October 24, 2016 OpenInterface Framework  European research project (2006-2009)  Component-based framework  Online component library with components for numerous modalities and devices  Design an application by pipelining components http://www.openinterface.org
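The "pipeline of components and filters" idea shared by the stream-based frameworks can be illustrated with a generic sketch. The classes and method names below are invented for this example and do not correspond to the actual APIs of Squidy or OpenInterface.

```python
# Generic illustration of a stream-based pipeline of components and filters;
# all class and method names are invented for this sketch.
class Node:
    def process(self, event):
        return event


class ExponentialSmoother(Node):
    """Hypothetical filter node that smooths a noisy pointer coordinate."""
    def __init__(self, alpha=0.5):
        self.alpha, self.last = alpha, None

    def process(self, event):
        x = event["x"]
        self.last = x if self.last is None else self.alpha * x + (1 - self.alpha) * self.last
        return {**event, "x": self.last}


class Pipeline:
    """Pushes each incoming event through the chained nodes in order."""
    def __init__(self, *nodes):
        self.nodes = nodes

    def push(self, event):
        for node in self.nodes:
            event = node.process(event)
            if event is None:      # a filter may drop an event entirely
                return None
        return event


pipe = Pipeline(ExponentialSmoother(alpha=0.3))
for raw in ({"x": 10.0}, {"x": 14.0}, {"x": 11.0}):
    print(pipe.push(raw))
```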
  • 38. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 38October 24, 2016 HephaisTK  Software agents-based framework  Focus on events  Different fusion algorithms  Description of the dialogue via the SMUIML language http://www.hephaistk.org
  • 39. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 39October 24, 2016 UsiXML  User Interface eXtended Markup Language  XML-based declarative User Interface Description Language (UIDL) to describe abstract interfaces  can later be mapped to different physical characteristics (including multimodal user interfaces) Stanciulescu et al., 2005
  • 40. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 40October 24, 2016 Mudra  Fusion across different levels of abstraction (data, feature and decision)  via fact base  Interactions described via a declarative rule-based language  Rapid prototyping  simple integration of new input devices  integration of external gesture recognisers Hoste et al., 2011
  • 41. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 41October 24, 2016 References  R.A. Bolt, “Put-that-there”: Voice and Gesture at the Graphics Interface, In Proceedings of Siggraph 1980, 7th Annual Conference on Computer Graphics and Interactive Techniques, Seattle, USA, July 1980  http://dx.doi.org/10.1145/965105.807503  L. Nigay and J. Coutaz, A Design Space for Multimodal Systems: Concurrent Processing and Data Fusion, In Proceedings of CHI 1993, 3rd International Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands, April 1993  http://dx.doi.org/10.1145/169059.169143
  • 42. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 42October 24, 2016 References …  J. Coutaz et al., Four Easy Pieces for Assessing the Usability of Multimodal Interaction: The CARE Properties, In Proceedings of Interact 1995, 5th International Conference on Human-Computer Interaction, Lillehammer, Norway, June 1995  http://dx.doi.org/10.1007/978-1-5041-2896-4_19  B. Dumas, D. Lalanne and S. Oviatt, Multimodal Interfaces: A Survey of Principles, Models and Frameworks, Human Machine Interaction, LNCS 5440, 2009  http://dx.doi.org/10.1007/978-3-642-00437-7_1
  • 43. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 43October 24, 2016 References …  S. Oviatt, Ten Myths of Multimodal Interaction, Communications of the ACM 42(11), November 1999  http://dx.doi.org/10.1145/319382.319398  Put-that-there demo  https://www.youtube.com/watch?v=RyBEUyEtxQo  R. Belotti, C. Decurtins, M.C. Norrie, B. Signer and L. Vukelja, Experimental Platform for Mobile Information Systems, Proceedings of MobiCom 2005, 11th Annual International Conference on Mobile Computing and Networking, Cologne, Germany, August 2005  http://dx.doi.org/10.1145/1080829.1080856
  • 44. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 44October 24, 2016 References …  L. Hoste and B. Signer, SpeeG2: A Speech- and Gesture-based Interface for Efficient Controller-free Text Entry, Proceedings of ICMI 2013, 15th International Conference on Multimodal Interaction, Sydney, Australia, December 2013  http://beatsigner.com/publications/hoste_ICMI2013.pdf  L. Hoste, B. Dumas and B. Signer, Mudra: A Unified Multimodal Interaction Framework, Proceedings of ICMI 2011, 13th International Conference on Multimodal Interaction, Alicante, Spain, November 2011  http://beatsigner.com/publications/hoste_ICMI2011.pdf
  • 45. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 45October 24, 2016 References …  L. Hoste, A Declarative Approach for Engineering Multimodal Interaction, PhD thesis, Vrije Universiteit Brussel (VUB), Brussels, Belgium, June 2015  http://beatsigner.com/theses/PhdThesisLodeHoste.pdf  A. Stanciulescu, Q. Limbourg, J. Vanderdonckt, B. Michotte and F. Montero, A Transformational Approach for Multimodal Web User Interfaces Based on UsiXML, Proceedings of ICMI 2005, 7th International Conference on Multimodal Interfaces, Trento, Italy, October 2005  http://dx.doi.org/10.1145/1088463.1088508
  • 46. Beat Signer - Department of Computer Science - bsigner@vub.ac.be 46October 24, 2016 References …  SpeeG2 demo  https://www.youtube.com/watch?v=ItKySNv8l90
  • 47. 2 December 2005 Next Lecture Pen-based Interaction