This presentation was given at QoMEX 2014, the 6th International Workshop on Quality of Multimedia Experience.
Abstract:
This paper investigates the Quality of Experience (QoE) of multisensory media by analysing biosignals collected by electroencephalography (EEG) and eye gaze sensors and comparing them with subjective ratings. Also investigated is the impact on QoE of various levels of synchronicity between the sensory effect and target video scene. Results confirm findings from previous research showing that sensory effects added to videos increase the QoE rating. While there was no statistical difference observed in the QoE ratings for different levels of sensory effect synchronicity, an analysis of raw EEG data showed 25% more activity in the temporal lobe during asynchronous effects and 20-25% more activity in the occipital lobe during synchronous effects. The eye gaze data showed more deviation for a video with synchronous effects, and the EEG showed correlating occipital lobe activity for this instance. These differences in physiological responses indicate that sensory effect synchronicity may affect QoE despite subjective ratings appearing similar.
QoMEX2014 - Analysing the Quality of Experience of Multisensory Media from Measurements of Physiological Responses
1. Analysing the Quality of Experience of Multisensory Media from Measurements of Physiological Responses. PhD Candidate Jacob Donley, Christian Ritz, Muawiyath Shujau. School of Electrical, Computer and Telecommunications Engineering, ICT Research Institute & Global Challenges, University of Wollongong
2. INTRODUCTION
• Multisensory Media (Multiple Sensory Stimuli): ambient lighting, vibration, wind
• Subjective testing of multisensory media
 – Discrete method
 – Continuous method
• Discrete: MOS has previously shown advantageous use of multisensory media
• Continuous: Does the MOS reflect the experience evenly?
• What effect does synchronicity of effects have?
3. EMOTION FROM BIOSIGNALS
• Electroencephalography (EEG)
 – Emotiv EEG (14 channels)
 – Analysis:
  • Emotiv suites (proprietary algorithms)
  • Event-Related Potential (ERP) (e.g. P300 responses)
  • Sub-band frequency powers (Alpha, Beta, etc.)
  • Standardised Low Resolution Brain Electromagnetic Tomography (sLORETA)
• Eye Gaze Tracking
 – Sony PS3 Eye and dual infrared lights
  • Modified variable-focus lens (no IR filter)
  • Large sensor designed for variable lighting
  • High frame rate (60 fps)
 – ITU Gaze Tracker (open source)
  • Iris, pupil and glint detection (Haar features)
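The sub-band frequency powers mentioned above (Alpha roughly 8-13 Hz, Beta roughly 13-30 Hz) can be estimated from raw EEG samples with a plain DFT periodogram. This is a minimal stdlib-only sketch, not the Emotiv or paper pipeline; the function name and the toy signal are illustrative.

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Approximate power of `signal` in the band [f_lo, f_hi) Hz
    using a direct DFT periodogram (O(n^2), fine for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):  # skip DC, keep positive frequencies only
        f = k * fs / n
        if f_lo <= f < f_hi:
            X = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += abs(X) ** 2 / n
    return power

# Toy check: a pure 10 Hz tone should land in the alpha band (8-13 Hz)
fs = 128  # the Emotiv EPOC delivers samples at 128 Hz
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
alpha = band_power(sig, fs, 8, 13)
beta = band_power(sig, fs, 13, 30)
```

In practice one would window and average overlapping segments (Welch's method) rather than take a single raw periodogram, which is noisy.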
4. BIOSENSOR-BASED QOE EVALUATION SYSTEM
• Supports Full HD Audio/Video
• Multisensory Media via Philips amBX system
 – MPEG-V capable (standard for multisensory media)
• Emotiv EEG calibration & recording
 – Facial expressions via Expressiv™ Suite
 – Subjective emotions via Affectiv™ Suite
 – Raw EEG potentials and gyroscope data
• ITU Gaze Tracker calibration & recording
5. BIOSENSOR-BASED QOE EVALUATION SYSTEM
6. SUBJECTIVE TESTS
• Asynchronous => All effects preceding A/V by 500 ms
• 500 ms => Average skew perceptually noticeable for media synchronisation [1, 2]
• Procedure: EEG Setup → Eye Gaze Calibration → Example Media → Trial → QoE Vote (× 3)
• Effects: None, Asynchronous, Synchronous; Subjects: 10; Videos per subject: 15

[1] R. Steinmetz, "Human perception of jitter and media synchronization," IEEE Journal on Selected Areas in Communications, vol. 14, pp. 61-72, 1996.
[2] W. Yaodu, X. Xiang, K. Jingming, and H. Xinlu, "A speech-video synchrony quality metric using CoIA," in Packet Video Workshop (PV), 2010 18th International, 2010, pp. 173-177.
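The asynchronous condition above is produced by sliding every sensory-effect timestamp 500 ms ahead of the A/V track. A minimal sketch of that shift; the effect-list representation is hypothetical, not the MPEG-V tooling used for the actual tests:

```python
def shift_effects(effects, offset_ms=-500):
    """Return effect events with start times shifted by offset_ms.
    A negative offset means effects precede the A/V they annotate."""
    return [(name, start_ms + offset_ms) for name, start_ms in effects]

# Hypothetical effect timeline for one video (name, start time in ms)
sync_effects = [("wind", 2000), ("vibration", 5500), ("light", 9000)]
async_effects = shift_effects(sync_effects)  # each effect fires 500 ms early
```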
7. RESULTS
• QoE votes agree with previous research
• [Chart: QoE votes (%) with 95% confidence intervals per video, for Without Effects, With Asynchronous Effects and With Synchronous Effects]
8. RESULTS
• The effect of synchronicity on QoE votes
• [Chart: t-test probability (alpha = 0.05) that the mean QoE would be observed the same, per video, for Without Effects & Async, Without Effects & Sync, and Async & Sync]
• N.B. One-way ANOVA showed no discernible differences
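The pairwise comparisons behind the chart above boil down to a two-sample t-test on the QoE votes for each pair of conditions. A stdlib-only sketch of Welch's t statistic; the vote values are invented, and the paper's exact test variant is not restated here:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples of QoE votes."""
    va, vb = variance(a), variance(b)  # sample variances
    return (mean(a) - mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

# Hypothetical QoE votes (%) for one video under two conditions
no_effects = [52, 48, 55, 50, 47]
sync = [68, 72, 65, 70, 74]
t = welch_t(sync, no_effects)  # large |t| => the means likely differ
```

The p-value then comes from the t distribution with Welch-Satterthwaite degrees of freedom; in practice a library such as `scipy.stats.ttest_ind` handles that step.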
9. DISCUSSION
• What if a subject's vote is biased in favour of a single event?
• Is there any observable physiological difference between the asynchronous and synchronous events?
• Can we find a correlation with the votes from either EEG or Gaze data?
10. RESULTS
• Emotiv's algorithms are proprietary & values undefined
• Brain lobe functionality well documented
• Using the sLORETA algorithm:
 – [Chart: most active lobe (%) vs. type of effects (No Effects, Async, Sync), for Frontal, Parietal, Limbic, Occipital and Temporal lobes, with QoE MOS overlaid]
 – [Renders: most active brain lobes (Frontal, Temporal, Occipital) for No, Sync and Async sensory effects; 3D video at bit.ly/EEG_3D_Vid]
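The "most active lobe" summary above can be approximated by grouping the 14 Emotiv EPOC channels by the lobe they sit over and picking the group with the highest mean power. The channel-to-lobe grouping below follows the standard 10-20 reading of the EPOC montage; the power values are placeholders, and the paper's actual localisation uses sLORETA, which is considerably more involved than this crude proxy:

```python
from statistics import mean

# Emotiv EPOC's 14 channels grouped by the lobe they sit over (10-20 system)
LOBES = {
    "frontal":   ["AF3", "AF4", "F3", "F4", "F7", "F8", "FC5", "FC6"],
    "temporal":  ["T7", "T8"],
    "parietal":  ["P7", "P8"],
    "occipital": ["O1", "O2"],
}

def most_active_lobe(channel_power):
    """Lobe whose channels have the highest mean power.
    A scalp-level proxy only, not sLORETA source localisation."""
    return max(LOBES, key=lambda lobe: mean(channel_power[ch]
                                            for ch in LOBES[lobe]))

# Placeholder power values: uniform background with strong occipital activity
power = {ch: 1.0 for chans in LOBES.values() for ch in chans}
power["O1"] = power["O2"] = 3.0
```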
11. PRELIMINARY RESULTS
• Possible EEG correlation with Eye Gaze deviation
• [Chart: standard deviation (pixels) of Subject 1 and Subject 2 X/Y gaze coordinates vs. type of effects (No Effects, Async, Sync), annotated with the most active lobe (Frontal or Occipital)]
• [Renders: most active brain lobes for No, Sync and Async sensory effects]
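The gaze-deviation measure above is simply the standard deviation of the recorded screen coordinates, computed per axis. A minimal sketch with invented coordinates:

```python
from statistics import pstdev

def gaze_deviation(points):
    """Standard deviation of gaze positions, per screen axis, in pixels."""
    xs, ys = zip(*points)
    return pstdev(xs), pstdev(ys)

# Hypothetical gaze samples: a tight fixation vs. wide scanning of the frame
steady = [(960, 540), (962, 538), (958, 541), (961, 539)]
scanning = [(300, 200), (1500, 900), (700, 100), (1200, 800)]
steady_dev = gaze_deviation(steady)
scan_dev = gaze_deviation(scanning)
```

Higher deviation indicates the viewer's attention moved around the frame more, which is what the chart compares across effect conditions.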
12. CONCLUSIONS
• Multisensory effects enhance QoE
• Synchronicity at this level & direction is indiscernible
• Correlating EEG lobe activity:
 – 25% increase in temporal lobe for async
 – 20% increase in occipital lobe for sync
• Software package for physiological subjective testing and multisensory media playback
• Large dataset of physiological responses for multisensory stimuli (availability depending on ethics restrictions)
42.Questions?