Auditory, visual, and auditory-visual perception of emotions
by individuals with cochlear implants, hearing aids, and
normal hearing


 Most, T., & Aviner, C. (2009). Auditory, visual, and auditory-visual
 perception of emotions by individuals with cochlear implants, hearing
 aids, and normal hearing. Journal of Deaf Studies and Deaf Education,
 14, 449-464.
Introduction
• Spoken language communication comprises both linguistic and
  vocal nonverbal information (Mozziconacci, 2002)
  –   linguistic: lexical items, syntactic patterns
  –   vocal nonverbal: the speaker’s emotional state with respect
      to the topic or the listener
• This study focuses upon nonverbal information,
  specifically upon the emotional state of the
  speaker.
Introduction
• Perception of the speaker’s emotional state
  – auditory cues
     • fundamental frequency, intensity, duration, rate of speech
     • identification accuracy varies across emotions
        – easier: anger, sadness
        – harder: surprise, disgust
  – visual cues
     • eye region: fear, anger, sadness
     • mouth region: happiness, disgust
        – easiest: happiness
        – hardest: fear, disgust
        – in between: anger, sadness
Introduction
• Comparing perception of emotions across presentation modes
  – visual mode (V) > auditory mode (A)
  – A+V > A
  – A+V vs. V: unclear (> or =)
Introduction


• Emotion Perception by Individuals With Hearing
  Loss Using Hearing Aids
Introduction
• Hearing loss (HL)
  – difficulty perceiving the spoken signal in general
  – difficulty perceiving auditory nonverbal cues of emotions
• Auditory perception of emotions
  – HA users (hearing aids) < NH (normal hearing)
  – negative correlation between emotion perception and degree of HL
• Visual perception of emotions
  – HA users vs. NH: unclear (>, =, or <)
Introduction


• Emotion Perception by Individuals With Hearing
  Loss Using Cochlear Implants
Introduction
• A cochlear implant increases the audibility of the speech signal
  → better speech perception
  – CI > HA (for a similar degree of HL)
• Relevant variables
  – age at implantation
  – duration of CI use
  – onset of deafness
Purpose
• Evaluate emotion perception
  – CI, HA, and NH groups
• Hypotheses
  –   auditory emotion perception: NH > all HL groups
  –   auditory emotion perception: CI > HA
  –   A and A+V modes: early-CI > late-CI
  –   NH: A+V mode > V mode
  –   HL: A+V > A or V mode
  –   also examined: the perception of specific emotions through
      the different modes by all participants
Materials and Methods
• Participants:

  Group      Number of      Age                        Duration (CI) /
             participants                              HL range (HA)
  Early CI   10 (4 males)   10:10–15:7                 6:9–13:1
                            (M = 3:11, SD = 1:3)       (M = 9:7, SD = 2.0)
  Late CI    10 (4 males)   10–17:6                    1–6:9
                            (M = 15:1, SD = 2:7)       (M = 3.8, SD = 2:10)
  HA         10 (5 males)   10:2–15:9                  73–110 dB
                            (M = 13:10, SD = 1.11)     (M = 88, SD = 9.6)
  NH         10 (4 males)   10:5–16:10
                            (M = 14:10, SD = …)
Materials and Methods
• Instrument:
  – Identification of Emotion Test (IET)
  – a total of 36 video-recorded items
     • six emotions: anger, fear, sadness, happiness, disgust, and surprise
     • all productions of the same neutral sentence: ‘‘I am going out
       now and I’ll be back later.’’
Materials and Methods
• Procedure:
  – three individual sessions
     • emotion vocabulary test
     • Ling 6-sound test
     • emotion test in three modes (block design)
  – design: group (4) × mode (3) × emotion (6)
Results
• Scores were adjusted according to the following formula by
  Boothroyd (1998) to take guessing into account:
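The adjustment formula itself is not reproduced in the text of the slide. As a minimal sketch, assuming the conventional correction for guessing in a closed-set task with six response alternatives (an assumption, not necessarily the exact Boothroyd, 1998, formulation):

```python
# Sketch of a standard correction for guessing in a k-alternative
# identification task (assumption: the conventional formula; the exact
# Boothroyd (1998) adjustment is not shown on the slide). With k
# alternatives, chance performance is 1/k; the adjusted score rescales
# the observed proportion so that chance maps to 0 and perfect to 1.

def adjust_for_guessing(p_observed: float, k: int = 6) -> float:
    """Chance-corrected proportion correct for a k-alternative task."""
    p_chance = 1.0 / k
    return (p_observed - p_chance) / (1.0 - p_chance)

# Example: 24 of 36 items correct in the six-emotion IET.
print(adjust_for_guessing(24 / 36))  # ≈ 0.6
```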
Results
• Group results (group (4) × mode (3)):
  – main effects:
     • group: F(3, 36) = 6.47, p < .01
     • mode: F(2, 35) = 303.88, p < .001
  – interaction: F(6, 70) = 5.26, p < .001
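For readers who want to see how a between-within analysis of this shape is typically set up, here is an illustrative sketch, not the authors' analysis code; the long data format, column names, and the use of the pingouin library are assumptions:

```python
# Illustrative group (between) × mode (within) mixed ANOVA, matching
# the design summarized above. Column names and the input file are
# hypothetical placeholders.
import pandas as pd
import pingouin as pg

# df: one row per participant × mode, with columns
#   subject, group ('ECI'/'LCI'/'HA'/'NH'), mode ('A'/'V'/'AV'), score
df = pd.read_csv("emotion_scores_long.csv")  # hypothetical file

aov = pg.mixed_anova(data=df, dv="score", within="mode",
                     subject="subject", between="group")
print(aov)  # F and p values for group, mode, and their interaction
```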
Results
• Identification of specific emotions
  (group (4) × mode (3) × emotion (6))
  – main effects:
     • mode effect for each emotion
     • group effect for happiness, sadness, and disgust
  – interaction:
     • group × mode for happiness, anger, and disgust
Results
• A mode:
  – happiness, sadness, and disgust: NH better than the other groups
  – anger: NH > LCI & HA; NH = ECI
  – fear: NH > ECI & LCI; NH = HA
Results
• V mode:
  – no significant group differences for any emotion
  – order of identification accuracy (all groups):
     • happiness > anger, sadness, and disgust > surprise and fear
Results
• A+V mode:
  – HA group did not differ significantly from the other groups
  – surprise: NH > LCI & ECI
  – order of identification accuracy (all groups):
     • happiness > anger > disgust > fear
Results
• The V and A+V modes revealed a similar hierarchy of emotion
  identification, but the A mode differed.
  – happiness was the easiest emotion to perceive in the V and
    A+V modes
  – sadness was easier to perceive in the A mode than in the
    other modes
Results
• Pearson correlations (between modes, within each group)
  – significant: V and A+V scores for ECI (r = .82, p < .01),
    LCI (r = .68, p < .05), and NH (r = .88, p < .001)
• Pearson correlations (modes vs. chronological age, duration of
  CI use, degree of HL, age at onset of hearing loss)
  – significant: chronological age with A+V scores (r = .45,
    p < .01) and with V scores (r = .32, p < .05)
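A minimal sketch of how one such correlation (e.g., V vs. A+V scores within a group) could be computed; the score arrays below are hypothetical placeholders, not the study's data:

```python
# Pearson correlation between two score vectors, as reported above.
from scipy.stats import pearsonr

v_scores  = [72, 81, 65, 90, 78, 84, 69, 88, 75, 80]   # hypothetical
av_scores = [75, 85, 70, 92, 80, 88, 72, 90, 79, 83]   # hypothetical

r, p = pearsonr(v_scores, av_scores)
print(f"r = {r:.2f}, p = {p:.3f}")
```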
Discussion
• Hypotheses
  –   auditory emotion perception: NH > all HL (supported)
  –   auditory emotion perception: CI > HA
  –   A and A+V modes: early-CI > late-CI
  –   NH: A+V mode > V mode
  –   HL: A+V > A or V mode
  –   the perception of specific emotions through the different
      modes by all participants
Discussion
• The hypothesis that the auditory emotion perception of
  individuals with NH would be superior to that of all
  participants with hearing loss was supported.
Discussion
• Hypotheses
  –   auditory emotion perception: NH > all HL
  –   auditory emotion perception: CI > HA (not supported)
  –   A and A+V modes: early-CI > late-CI
  –   NH: A+V mode > V mode
  –   HL: A+V > A or V mode
  –   the perception of specific emotions through the different
      modes by all participants
Discussion
• The CI users did not demonstrate any advantage over the HA
  users in identifying emotions through either the auditory or
  the auditory–visual mode.
  – possible explanation: CIs mainly improve the perception of
    segmental features, whereas emotion perception relies on
    suprasegmental (prosodic) cues
Discussion
• Hypotheses
  –   auditory emotion perception: NH > all HL
  –   auditory emotion perception: CI > HA
  –   A and A+V modes: early-CI > late-CI (not supported)
  –   NH: A+V mode > V mode
  –   HL: A+V > A or V mode
  –   the perception of specific emotions through the different
      modes by all participants
Discussion
• No significant differences emerged between the early and late
  implantees in any of the presentation modes, and no significant
  correlation emerged between the perception of emotions and age
  at implantation.
  – both groups were small
  – age (2.6)
Discussion
• Hypotheses
  –   auditory emotion perception: NH > all HL
  –   auditory emotion perception: CI > HA
  –   A and A+V modes: early-CI > late-CI
  –   NH: A+V mode > V mode (supported)
  –   HL: A+V > A or V mode
  –   the perception of specific emotions through the different
      modes by all participants
Discussion
• NH participants’ auditory identification of emotions was
  significantly lower than their visual or combined
  auditory–visual identification.
  – they were nonetheless able to utilize the auditory information
    and to benefit from the additional auditory information about
    emotions in the combined mode
Discussion
• Hypotheses
  – auditory emotion perception: NH > all HL
  – CI users would perform differently from HA users in auditory
    emotion perception
  – A and A+V modes: early-CI > late-CI
  – NH: A+V mode > V mode
  – HL: A+V > A or V mode (not supported)
  – the perception of specific emotions through the different
    modes by all participants
Discussion
• Participants with HL were unable to benefit from the additional
  auditory information provided in the combined mode.
  – individual differences
Discussion
• Hypotheses
  – auditory emotion perception: NH > all HL
  – CI users would perform differently from HA users in auditory
    emotion perception
  – A and A+V modes: early-CI > late-CI
  – NH: A+V mode > V mode
  – HL: A+V > A or V mode
  – examine the perception of specific emotions through the
    different modes by all participants
Discussion
• Visual information was more dominant in the
  combined mode.
Discussion
• Chronological age correlated significantly with
  – visual scores
  – auditory–visual scores
  – older participants obtained better perception scores
• Mental knowledge
  – representations of the typical emotions
  – grow with age
Discussion
• Future studies (emotion identification)
  – age at implantation
  – CI combined with HA vs. CI alone
  – postlingually vs. prelingually deafened participants
