A Test-bed For Quality of Multimedia Experience Evaluation of Sensory Effects

Slide notes:
  • Only every n-th frame is taken for the automatic color calculation, for performance reasons (sketched below). HSV and HMMD are used since these color spaces are closer to the human perception of color than RGB.
  • Video 1 (A Chinese Ghost Story 1 - Taoist Monk Fight Scene, http://www.youtube.com/watch?v=TzBkL_1kCUc) has a length of 63 s, 25 fps, 624x336 pixels, and a 1058 kbit/s bitrate, with a more or less constant color pattern. Video 2 (Alien Quadrilogy (2003) Trailer, http://www.youtube.com/watch?v=gIWLwen1Rf8) has a length of 62 s, 25 fps, 640x464 pixels, and a 702 kbit/s bitrate, with a lot of different colors which change very rapidly.
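As a rough illustration of the sampling described in the first note, the sketch below reads a video, keeps only every n-th frame, and converts it to HSV before any color analysis. This is a minimal sketch assuming OpenCV (cv2) is available; the slides do not say which library the test-bed actually uses, and HMMD is omitted because OpenCV has no built-in conversion for it.

```python
# Minimal sketch (not the test-bed's actual code): sample every n-th frame of a
# video and convert it to HSV before color analysis. OpenCV is an assumption.
import cv2

def sampled_hsv_frames(video_path, n=5):
    """Yield every n-th frame of the video, converted from OpenCV's BGR to HSV."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame_bgr = cap.read()
        if not ok:
            break
        if index % n == 0:
            yield cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        index += 1
    cap.release()

# Usage (hypothetical file name):
# for hsv_frame in sampled_hsv_frames("video1.avi", n=5):
#     ...  # feed the frame into the color calculation
```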

    1. QoMEX’09
       A Test-bed For Quality of Multimedia Experience Evaluation of Sensory Effects
       Christian Timmerer and Markus Waltl
       Klagenfurt University (UNIKLU) | Faculty of Technical Sciences (TEWI)
       Department of Information Technology (ITEC) | Multimedia Communication (MMC)
       http://research.timmerer.com | http://blog.timmerer.com | mailto:christian.timmerer@itec.uni-klu.ac.at
       Co-authors: Markus Waltl, Christian Timmerer, and Hermann Hellwagner
       Acknowledgment: This work is supported in part by the European Commission in the context of the InterMedia project. Further information is available at http://intermedia.miralab.unige.ch/.
       Slides available at http://www.slideshare.net/christian.timmerer
    2. Outline
       Introduction / Motivation
       Sensory Effect Description Language
       Test-Bed: Annotation Tool and Simulator
       Test Environment and Preliminary Results
       Conclusions
       Demo & Video
    3. Introduction
       Universal Multimedia Access (UMA)
       - Anywhere, anytime + technically feasible
       - Main focus on devices and network connectivity issues
       Universal Multimedia Experience (UME)
       - Take the user into account
       Multimedia Adaptation and Quality Models/Metrics
       - Single modality (i.e., audio, image, or video only) or a simple combination of two modalities (i.e., audio and video)
       Triple user characterization model
       - Sensorial, e.g., sharpness, brightness
       - Perceptual, e.g., what/where is the content
       - Emotional, e.g., feeling, sensation
       Ambient Intelligence
       - Additional light effects are highly appreciated for both audio and visual content
       Calls for a scientific framework to capture, measure, quantify, judge, and explain the user experience
       B. de Ruyter, E. Aarts, “Ambient intelligence: visualizing the future”, Proceedings of the Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2004, pp. 203–208.
       E. Aarts, B. de Ruyter, “New research perspectives on Ambient Intelligence”, Journal of Ambient Intelligence and Smart Environments, IOS Press, vol. 1, no. 1, 2009, pp. 5–14.
       F. Pereira, “A triple user characterization model for video adaptation and quality of experience evaluation”, Proc. of the 7th Workshop on Multimedia Signal Processing, Shanghai, China, October 2005, pp. 1–4.
    4. Motivation
       Consumption of multimedia content may also stimulate senses other than vision or audition:
       olfaction, mechanoreception, equilibrioception, thermoception, …
       Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects
       … giving her/him the sensation of being part of the particular media
       ➪ worthwhile, informative user experience
    5. Sensory Effect Description Language (SEDL)
       XML Schema-based language for describing sensory effects
       - Basic building blocks to describe, e.g., light, wind, fog, vibration, scent
       - MPEG-V Part 3, Sensory Information
       - Adopted MPEG-21 DIA tools for adding time information (synchronization)
       Actual effects are not part of SEDL but defined within the Sensory Effect Vocabulary (SEV)
       - Extensibility: additional effects can be added easily without affecting SEDL
       - Flexibility: each application domain may define its own sensory effects
       A description conforming to SEDL is called Sensory Effect Metadata (SEM)
       - May be associated with any kind of multimedia content (e.g., movies, music, Web sites, games)
       - Steers sensory devices like fans, vibration chairs, lamps, etc. via an appropriate mediation device
       ➪ Increase the experience of the user
       ➪ Worthwhile, informative user experience
    6. Sensory Effect Description Language (cont’d)
       SEM              ::= [DescriptionMetadata] (Declarations | GroupOfEffects | Effect | ReferenceEffect)+
       Declarations     ::= (GroupOfEffects | Effect | Parameter)+
       GroupOfEffects   ::= timestamp EffectDefinition EffectDefinition (EffectDefinition)*
       Effect           ::= timestamp EffectDefinition
       EffectDefinition ::= [activate] [duration] [fade] [alt] [priority] [intensity] [position] [adaptability]
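To make the production rules above more concrete, the following sketch assembles a small SEM-like description in Python. This is illustrative only: element and attribute names follow the EBNF on this slide, while the normative MPEG-V Part 3 schema (namespaces, exact element names, effect types from the SEV, and the timestamp syntax inherited from MPEG-21 DIA) will differ in detail.

```python
# Illustrative only: names mirror the EBNF on the slide, not the normative
# MPEG-V Part 3 schema; timestamps and effect types are placeholder assumptions.
import xml.etree.ElementTree as ET

def build_sem():
    sem = ET.Element("SEM")

    # A single wind effect: a timestamp plus some of EffectDefinition's optional attributes.
    ET.SubElement(sem, "Effect", {
        "timestamp": "PT10S",    # assumed timing syntax
        "type": "wind",          # effect type would come from the SEV, not SEDL
        "activate": "true",
        "duration": "PT5S",
        "intensity": "0.5",
        "position": "front",
    })

    # A GroupOfEffects shares one timestamp and holds at least two EffectDefinitions.
    group = ET.SubElement(sem, "GroupOfEffects", {"timestamp": "PT20S"})
    ET.SubElement(group, "EffectDefinition", {"type": "light", "intensity": "1.0"})
    ET.SubElement(group, "EffectDefinition", {"type": "vibration", "intensity": "0.3"})

    return ET.tostring(sem, encoding="unicode")

if __name__ == "__main__":
    print(build_sem())
```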
    7. Test-Bed: Annotation Tool and Simulator
       Annotation Tool: SEVino
       Simulator: SESim
    8. Test Environment
       Based on the amBX (Ambient Experience) system + SDK
       - Two fan devices, a wrist rumbler, two sound speakers, a subwoofer, two lights, and a wall washer
       Everything is controlled by SEM descriptions except the light effect
       ➪ automatic color calculation is deployed (see the sketch below)
       Advantages
       - Reduction of description size
       - Speeds up the authoring stage
       Different automatic color calculation methods may lead to different user experiences
       (1) Average color in the RGB color space
       (2-4) Dominant color in the RGB, HSV, and HMMD color spaces
       Computational cost: (2-4) > (1) due to the management of color bins; note that amBX supports only RGB
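The sketch below contrasts the two strategies compared on this slide: the plain average color and a dominant color obtained via color bins, which is the source of the extra computational cost. It is a minimal sketch assuming frames are available as NumPy RGB arrays; only RGB binning is shown, whereas the test-bed also evaluates HSV and HMMD.

```python
# Sketch of the two color-calculation strategies (not the test-bed's actual code).
# Frames are assumed to be NumPy arrays of shape (H, W, 3), dtype uint8, in RGB.
import numpy as np

def average_color(frame_rgb):
    """Strategy (1): mean R, G, B over all pixels - cheap, reacts immediately."""
    return tuple(int(c) for c in frame_rgb.reshape(-1, 3).mean(axis=0))

def dominant_color(frame_rgb, bins_per_channel=8):
    """Strategies (2-4), shown here for RGB only: the most populated color bin.
    More expensive than (1) because every pixel is quantized into a bin first."""
    pixels = frame_rgb.reshape(-1, 3).astype(np.int64)
    step = 256 // bins_per_channel
    binned = pixels // step                       # quantize each channel
    flat = (binned[:, 0] * bins_per_channel + binned[:, 1]) * bins_per_channel + binned[:, 2]
    counts = np.bincount(flat, minlength=bins_per_channel ** 3)
    winner = int(counts.argmax())
    r = winner // (bins_per_channel ** 2)
    g = (winner // bins_per_channel) % bins_per_channel
    b = winner % bins_per_channel
    # Return the center of the winning bin as the RGB value sent to the lights.
    return (r * step + step // 2, g * step + step // 2, b * step + step // 2)
```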
    9. Preliminary Results
       Video 1: more or less constant color pattern
       Video 2: a lot of different colors which change very rapidly
       Color calculation is performed only on every p-th frame (p=5) for efficiency reasons
    10. Conclusions
        Test-bed for the QoMEX evaluation of sensory effects
        - SEVino: a video annotation tool for sensory effects
        - SESim: a corresponding simulation tool
        - A real world test environment based on the amBX system and SDK
        Major findings
        - Average color for the automatic color calculation ➪ immediate reaction to color changes, appealing effects, low computational requirements, real-time applicable
        - RGB, HSV, and HMMD dominant color ➪ smoother reaction to color changes, higher computational requirements
        Future work
        - Optimization of the automatic color calculation (real-time)
        - Subjective tests (already started & stay tuned)
        - (Semi-)automatic extraction of sensory effect information
    11. References
        M. Waltl, C. Timmerer, and H. Hellwagner, “A Test-Bed for Quality of Multimedia Experience Evaluation of Sensory Effects”, Proceedings of the First International Workshop on Quality of Multimedia Experience (QoMEX 2009), San Diego, USA, July 29-31, 2009.
        C. Timmerer, J. Gelissen, M. Waltl, and H. Hellwagner, “Interfacing with Virtual Worlds”, accepted for publication in the Proceedings of the 2009 NEM Summit, Saint-Malo, France, September 28-30, 2009.
        C. Timmerer, “MPEG-V: Media Context and Control”, 89th ISO/IEC JTC 1/SC 29/WG 11 (MPEG) Meeting, London, UK, June 2009. https://www-itec.uni-klu.ac.at/mmc/blog/2009/07/08/mpeg-v-media-context-and-control/
        MPEG-V: http://www.chiariglione.org/mpeg/working_documents.htm#MPEG-V
        MPEG-V reflector: http://lists.uni-klu.ac.at/mailman/listinfo/metaverse
    12. Demo & Video
    13. Thank you for your attention
        ... questions, comments, etc. are welcome …
        Ass.-Prof. Dipl.-Ing. Dr. Christian Timmerer
        Klagenfurt University, Department of Information Technology (ITEC)
        Universitätsstrasse 65-67, A-9020 Klagenfurt, AUSTRIA
        christian.timmerer@itec.uni-klu.ac.at
        http://research.timmerer.com/
        Tel: +43/463/2700 3621, Fax: +43/463/2700 3699
        © Copyright: Christian Timmerer
