Sound Events and Emotions: Investigating the Relation of Rhythmic Characteristics and Arousal
 


A variety of recent studies in Audio Emotion Recognition (AER) report high performance and retrieval accuracy. However, in most works music is considered the original sound content that conveys the identified emotions, and rhythm-related acoustic cues are among the musical characteristics found to be a fundamental means of conveying emotion. Although music is an important aspect of everyday life, humans are surrounded by numerous non-linguistic and non-musical sounds, generally defined as sound events (SEs). Despite this considerable impact of SEs on humans, investigations of AER from SEs remain scarce. Only a few recent studies are concerned with SEs and AER, presenting a semantic connection between the SE and the listener's triggered emotion. In this work we analytically investigate the connection between rhythm-related characteristics of a wide range of common SEs and the arousal of the listener, using sound events with semantic content. To this end, several feature evaluation and classification tasks are conducted using different ranking and classification algorithms. High accuracy results are obtained, demonstrating a significant relation of the SEs' rhythmic characteristics to the elicited arousal.



    Presentation Transcript

    • Sound Events and Emotions: Investigating the Relation of Rhythmic Characteristics and Arousal. K. Drossos¹, R. Kotsakis², G. Kalliris², A. Floros¹. ¹Digital Audio Processing & Applications Group, Audiovisual Signal Processing Laboratory, Dept. of Audiovisual Arts, Ionian University, Corfu, Greece. ²Laboratory of Electronic Media, Dept. of Journalism and Mass Communication, Aristotle University of Thessaloniki, Thessaloniki, Greece.
    • Agenda: 1. Introduction (Everyday life; Sound and emotions) 2. Objectives 3. Experimental Sequence (Experimental Procedure's Layout; Sound corpus & emotional model; Processing of Sounds; Machine learning tests) 4. Results (Feature evaluation; Classification) 5. Discussion & Conclusions 6. Future work
    • Everyday life: Many stimuli surround us, e.g. visual and aural; the aural ones appear either as a structured form of sound or as general sound. Interaction with these stimuli produces reactions and emotions felt.
    • Sound and emotions: Music and emotions. Music is a structured form of sound, primarily used to mimic and extend voice characteristics and to enhance the conveyance of emotion(s). Its relation to emotions is studied through (but not only) MIR (Music Information Retrieval) and MER (Music Emotion Recognition).
    • Music and emotions (cont'd): Research results & applications. Accuracy results of up to 85% in large music databases. As opposed to the typical "Artist/Genre/Year" classification: categorisation according to emotion, retrieval based on emotion, and preliminary applications for the synthesis of music based on emotion.
    • From music to sound events: Sound events are a non-structured/general form of sound, carrying additional information regarding source attributes, environment attributes, and sound-producing mechanism attributes. They are present almost all the time, if not always, in everyday life.
    • From music to sound events (cont'd): Sound events are used in many applications, such as audio interactions/interfaces, video games, soundscapes, and artificial acoustic environments. They trigger reactions and elicit emotions.
    • Objectives of the current study: Data: aural stimuli can elicit emotions; music is a structured form of sound, sound events are not; music's rhythm is arousing. Research question: is sound events' rhythm arousing too?
    • Experimental Procedure's Layout: emotional model selection; annotated sound corpus collection; processing of sounds (pre-processing, feature extraction); machine learning tests.
    • Sound corpus & emotional model: Sound corpus: International Affective Digitized Sounds (IADS), 167 sounds, duration 6 s, variable sampling frequency. Emotional model: continuous model; sounds annotated on Arousal-Valence-Dominance using the Self Assessment Manikin (S.A.M.).
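To make the corpus description concrete, the minimal Python sketch below shows one way to represent IADS-style entries with their Arousal-Valence-Dominance ratings. The field names, file paths, and the 1-9 rating scale are assumptions for illustration; the slides only state the corpus size, clip duration, and the S.A.M. annotation scheme.

```python
from dataclasses import dataclass

@dataclass
class SoundEvent:
    """One annotated sound event from an IADS-style corpus."""
    sound_id: str    # corpus identifier
    wav_path: str    # path to the ~6 s audio clip
    arousal: float   # mean S.A.M. arousal rating (assumed 1-9 scale)
    valence: float   # mean S.A.M. valence rating
    dominance: float # mean S.A.M. dominance rating

def load_corpus(annotation_rows):
    """Build the corpus from (id, path, arousal, valence, dominance) tuples."""
    return [SoundEvent(*row) for row in annotation_rows]

# Example with made-up ratings, only to show the structure:
corpus = load_corpus([
    ("105", "sounds/105.wav", 6.2, 3.1, 4.0),
    ("151", "sounds/151.wav", 4.5, 6.8, 5.2),
])
```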
    • Processing of Sounds, Pre-processing: General procedure: normalisation; segmentation/windowing; clustering into two classes (Class A: 24 members, Class B: 143 members). Segmentation/windowing uses different window lengths, ranging from 0.8 to 2.0 seconds in 0.2-second increments.
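As an illustration of the pre-processing step, here is a minimal sketch of peak normalisation and fixed-length windowing in Python. Loading with librosa, the 16 kHz resampling rate, and the use of non-overlapping windows are assumptions; the slides only specify the 0.8-2.0 s window lengths.

```python
import numpy as np
import librosa

def preprocess(wav_path, win_sec, sr=16000):
    """Load a clip, peak-normalise it, and split it into fixed-length windows.

    win_sec is one of the studied window lengths (0.8, 1.0, ..., 2.0 s).
    Non-overlapping windows are an assumption; the slides do not state the hop.
    """
    y, sr = librosa.load(wav_path, sr=sr, mono=True)
    y = y / (np.max(np.abs(y)) + 1e-12)          # peak normalisation
    win = int(win_sec * sr)
    n_full = len(y) // win
    return [y[i * win:(i + 1) * win] for i in range(n_full)]

# Every window length from 0.8 to 2.0 s in 0.2 s steps, as in the slides:
window_lengths = np.arange(0.8, 2.01, 0.2)
```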
    • Processing of Sounds, Feature Extraction: Only rhythm-related features are extracted for each segment, together with statistical measures, giving a set of 26 features in total.
      Extracted features: beat spectrum, onsets, tempo, fluctuation, event density, pulse clarity.
      Statistical measures: mean, standard deviation, gradient, kurtosis, skewness.
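The descriptors above (beat spectrum, fluctuation, pulse clarity, etc.) are typical of MATLAB's MIRtoolbox. As an illustrative approximation only, the sketch below computes a comparable subset (onset-based event density, tempo, and onset-strength statistics) per window with librosa and scipy; it is not the authors' exact feature set, and pulse clarity and fluctuation are omitted because librosa has no direct equivalent.

```python
import numpy as np
import librosa
from scipy.stats import kurtosis, skew

def rhythm_features(segment, sr=16000):
    """Approximate rhythm descriptors for one audio window.

    Only a stand-in for the 26 MIRtoolbox-style features used in the study.
    """
    onset_env = librosa.onset.onset_strength(y=segment, sr=sr)
    onsets = librosa.onset.onset_detect(onset_envelope=onset_env, sr=sr)
    tempo, _ = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)
    duration = len(segment) / sr
    return {
        "event_density": len(onsets) / duration,       # onsets per second
        "tempo_bpm": float(tempo),
        "onset_mean": float(np.mean(onset_env)),
        "onset_std": float(np.std(onset_env)),
        "onset_kurtosis": float(kurtosis(onset_env)),
        "onset_skewness": float(skew(onset_env)),
    }
```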
    • Machine learning tests, Feature Evaluation: Objectives: find the most valuable features for arousal detection, and the dependencies between the selected features and the different window lengths. Algorithms used: InfoGainAttributeEval and SVMAttributeEval (WEKA software).
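For readers without WEKA, an analogous feature ranking can be sketched in Python: mutual information plays the role of InfoGainAttributeEval and the absolute weights of a linear SVM the role of SVMAttributeEval. This is an assumed equivalent workflow, not the exact configuration used in the study.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def rank_features(X, y, feature_names):
    """Rank features by information gain (mutual information) and by |SVM weight|."""
    mi = mutual_info_classif(X, y, random_state=0)
    svm = LinearSVC(C=1.0, max_iter=10000).fit(StandardScaler().fit_transform(X), y)
    svm_w = np.abs(svm.coef_).ravel()
    by_mi = [feature_names[i] for i in np.argsort(mi)[::-1]]
    by_svm = [feature_names[i] for i in np.argsort(svm_w)[::-1]]
    return by_mi, by_svm
```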
    • Machine learning tests, Classification: Objectives: arousal classification based on rhythm, and the dependencies between the different window lengths and the classification results. Algorithms used: artificial neural networks, logistic regression, and k-Nearest Neighbours (WEKA software).
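The same three classifier families are available in scikit-learn, so a rough Python counterpart of the WEKA experiments could look like the sketch below. The 10-fold cross-validation, the hidden-layer size, and k=5 are assumptions; the slides do not report these settings.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def evaluate_classifiers(X, y):
    """Mean cross-validated accuracy for the three classifier families."""
    models = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "knn": KNeighborsClassifier(n_neighbors=5),                    # k assumed
        "neural_network": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000),
    }
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)    # folds assumed
    return {
        name: cross_val_score(make_pipeline(StandardScaler(), model), X, y, cv=cv).mean()
        for name, model in models.items()
    }
```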
    • Feature Evaluation, general results: Two groups of features were formed, with 13 features in each group: an upper and a lower group. The ranking within each group is not always the same, but the upper and lower groups remain constant for all window lengths.
    • Feature Evaluation, upper group: beat spectrum std, event density std, onsets gradient, fluctuation kurtosis, beat spectrum gradient, pulse clarity std, fluctuation mean, fluctuation std, fluctuation skewness, onsets skewness, pulse clarity kurtosis, event density kurtosis, onsets mean.
    • Classification, general results: The features are relatively uncorrelated, and results vary across the different window lengths. Accuracy results: highest accuracy score 88.37% (window length 1.0 second, logistic regression); lowest accuracy score 71.26% (window length 1.4 seconds, artificial neural network).
    • Discussion, Features: The most informative features are the rhythm's periodicity in the auditory channel (fluctuation), onsets, event density, beat spectrum, and pulse clarity. These are independent of semantic content.
    • Discussion, Classification results: Algorithm-related: LR's minimum score was 81.44%; KNN's minimum score was 82.05%. Results-related: sound events' rhythm affects arousal; thus, sound's rhythm in general affects arousal.
    • Future work, Features & Dimensions: Other features related to arousal; connection of the features with valence.
    • Thank you!