ICIW 2010 - The Fifth International Conference on Internet and Web Applications and Services
May 9-15, 2010 - Barcelona, Spain
Multimodal Interaction in Distributed and Ubiquitous Computing
Marc Pous and Luigi Ceccaroni
bdigital
SUMMARY
- Why multimodality?
- Multimodal distributed services deployment
- User tests and conclusions
In our cities, urban services have practically not changed in a century
MOTIVATIONS 1: Location-based services
http://www.flickr.com/photos/nnova/4512346900/
http://picasaweb.google.com/marc.pous/Xina2007#5102695479944027890
http://picasaweb.google.com/marcpous.nyc2009/NYC2009#5415374502175450370
http://picasaweb.google.com/marcpous.nyc2009/NYC2009#5415371534810675298
http://www.flickr.com/photos/loquat73/3385335980/
MOTIVATIONS 2: Interactions and multimodality
http://www.flickr.com/photos/nnova/4423756547/in/set-72157623598200316/
http://www.flickr.com/photos/nnova/4424521014/in/set-72157623598200316/
MOTIVATIONS 3: Accessibility
http://www.dynamiclanguageblog.com/2009/12/technology-grows-for-hard-of-hearing.html
MOTIVATIONS: What is an ICD? (Interactive Community Display)
INREDIS
3G = ACCESSIBILITY + UBIQUITY + INTEROPERABILITY
http://www.inredis.es
INREDIS ICD
Management of distributed services that offer the capability of processing and synthesizing multiple modalities of interaction
DISTRIBUTED INTERACTION SERVICES
- Voice services
  - Voice recognition: via microphone, real-time streaming converting the voice signal into text
  - Voice synthesis
- Sign language service
  - Spanish sign language recognition: via webcam, real-time streaming converting the image into text
- Emotion service
  - Emotion recognition (voice + image): via webcam and microphone, real-time streaming
- Avatar service
  - Emotional avatar
  - Sign-language speaker avatar
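The common pattern behind these services is that each modality (voice, sign language, emotion) normalizes its input stream into text or event output that the rest of the platform can consume uniformly. A minimal sketch of that pattern, in Python, is shown below; the class names (`VoiceRecognizer`, `SignLanguageRecognizer`, `Dispatcher`) are illustrative assumptions, not the INREDIS implementation, and the recognition itself is stubbed out.

```python
# Illustrative sketch: modality services as pluggable recognizers that all
# normalize their input into a common event type, as on the slide above.
# Real services would stream audio/video to ASR or sign-language engines;
# here the recognition step is stubbed.

from dataclasses import dataclass


@dataclass
class InteractionEvent:
    modality: str   # "voice", "sign-language", "emotion", ...
    payload: str    # recognized text (or emotion label)


class VoiceRecognizer:
    modality = "voice"

    def process(self, audio_chunk: bytes) -> InteractionEvent:
        # Stub: a real service converts the voice signal into text.
        return InteractionEvent(self.modality, "recognized: " + audio_chunk.decode())


class SignLanguageRecognizer:
    modality = "sign-language"

    def process(self, frame: bytes) -> InteractionEvent:
        # Stub: a real service converts the webcam image stream into text.
        return InteractionEvent(self.modality, "recognized: " + frame.decode())


class Dispatcher:
    """Routes raw input from any registered modality service to one handler."""

    def __init__(self):
        self.services = {}

    def register(self, service):
        self.services[service.modality] = service

    def handle(self, modality: str, data: bytes) -> InteractionEvent:
        return self.services[modality].process(data)


dispatcher = Dispatcher()
dispatcher.register(VoiceRecognizer())
dispatcher.register(SignLanguageRecognizer())
event = dispatcher.handle("voice", b"hello")
```

Because every service emits the same `InteractionEvent` shape, adding a new modality (e.g. the emotion service) only requires registering another recognizer.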
DISTRIBUTED SERVICES: SIGN LANGUAGE SERVICES
DISTRIBUTED SERVICES: VOICE SERVICES
How to consume multimodal distributed services offered by an ICD?
W3C MMI Architecture
http://www.w3.org/TR/mmi-arch/
W3C MMI Framework
http://www.w3.org/TR/mmi-framework/
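The W3C MMI Architecture coordinates modality components through an Interaction Manager that exchanges life-cycle events with them (e.g. StartRequest, StartResponse, DoneNotification). The sketch below models that exchange in Python under simplifying assumptions: the event names come from the MMI Architecture specification, but the `InteractionManager` and `ModalityComponent` classes are illustrative, event transport is a direct method call rather than HTTP, and the result is plain text instead of EMMA markup.

```python
# Minimal sketch of the W3C MMI life-cycle event exchange between an
# Interaction Manager (IM) and a Modality Component (MC).
# Event names follow the MMI Architecture spec; everything else is a stub.

import itertools


class ModalityComponent:
    def __init__(self, name):
        self.name = name

    def start(self, context_id, request_id):
        # On StartRequest, the MC begins processing and replies StartResponse.
        return {"event": "StartResponse", "context": context_id,
                "source": self.name, "requestID": request_id,
                "status": "success"}

    def done(self, context_id, result):
        # When finished, the MC sends DoneNotification carrying its result
        # (EMMA in the real architecture; plain text here).
        return {"event": "DoneNotification", "context": context_id,
                "source": self.name, "result": result}


class InteractionManager:
    _ids = itertools.count(1)

    def new_context(self):
        # NewContextRequest/Response pair, collapsed into one call here.
        return f"ctx-{next(self._ids)}"

    def start(self, mc, context_id):
        # Issue a StartRequest to the modality component.
        request_id = f"req-{next(self._ids)}"
        return mc.start(context_id, request_id)


im = InteractionManager()
voice_mc = ModalityComponent("voice-recognition")
ctx = im.new_context()
response = im.start(voice_mc, ctx)
notification = voice_mc.done(ctx, "hello world")
```

The key design point the ICD borrows from this architecture is that all modality components share one context identifier, so the Interaction Manager can correlate events arriving from voice, sign-language, and avatar services belonging to the same user interaction.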
Proposed multimodal interaction framework
ICD ARCHITECTURE
ICD DESIGN and INTERFACES
(1) Interaction identification
(2) Main page: map or avatar, command buttons, content, scenario options
(3) Contextualized map with services
(4) Avatar with map and Street View service
USER-CENTERED DESIGN
User profiles (personas):
- Blind users
- Deaf users
- Users unable to read
- People living in or visiting a city
Scenarios:
- Context-aware informative scenario
- Emergency scenario
IMPLEMENTATION
USABILITY TESTING
- PLI (People-Led Innovation) methodology
- Preliminary usability tests with a limited number of users
- But the ICD implementation received good feedback!
CONCLUSIONS and FUTURE WORK
- Implementation based on the W3C MMI Architecture
- Platform able to integrate multimodal interactive distributed services and offer interaction in real time
- Web-based applications and device-independent implementation
- Improved user experience and accessibility for people with special needs
What are we working on now?
- Real-time interaction: reducing the delay between interactions
- Browser-based apps not being able to easily access device hardware
- Synchronous interaction vs. asynchronous technologies
- Improved orchestration of the distributed services
- Enhanced mobile usability
MOBILE PHONES?http://www.slideshare.net/nokiaconversations/younghee-jung-and-jan-chipchase-presentation-for-nokia-design-london-studio-event-london-29th-april-2008
Thank you! Moltes gràcies! ¡Muchas gracias!
Marc Pous and Luigi Ceccaroni
mpous@bdigital.org and lceccaroni@bdigital.org

Multimodal Interaction in Distributed and Ubiquitous Computing - ICIW 2010