
The Rise of Emotion-aware Conversational Agents: Threats in Digital Emotions

Presentation of the paper at the workshop "#RCBlackMirror2018: Re-Coding Black Mirror"

Part of WWW'18, April 23, 2018, Lyon, France

  1. The Rise of Emotion-aware Conversational Agents: Threats in Digital Emotions. Martino Mensio, Giuseppe Rizzo, Maurizio Morisio. #RCBlackMirror2018 @ WWW2018, 24 April 2018, Lyon, FR
  2. Objective
     - analyze theoretical advances enabling Conversational Agents
     - analyze emotional side-effects of user-centricity
     Considering similarities with Black Mirror episodes and other real-world examples
  3. Main episode of inspiration: s02e01 “Be Right Back”, a person “reconstructed” from digital traces
     - different interaction channels
     - emotional attachment to bots
     - free will, autonomy and ethics [1]
     [1] Moor, J. (2009). Four kinds of ethical robots. Philosophy Now, 72, 12-14.
  4. Background: Conversational Agents
  5. Background: three stages
     - text
     - voice
     - embodied
  6. Background: stage #1, textual interaction [2]
     [2] Serban, I. V., Sordoni, A., Bengio, Y., Courville, A. C., & Pineau, J. (2016). Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models. In AAAI (Vol. 16, pp. 3776-3784).
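The textual stage can be illustrated with a toy retrieval-based agent: map a user utterance to the closest known one and return its canned reply. This is a hypothetical sketch for the interaction loop only, not the generative hierarchical neural model of [2]; the utterances and replies are invented.

```python
import difflib

# Canned utterance -> reply pairs (invented for illustration).
RESPONSES = {
    "hello": "Hi! How are you feeling today?",
    "i am sad": "I'm sorry to hear that. Want to talk about it?",
    "tell me a joke": "Why did the bot cross the road? To parse the other side.",
}

def reply(utterance: str) -> str:
    # Fuzzy-match the input against known utterances; fall back otherwise.
    keys = list(RESPONSES)
    match = difflib.get_close_matches(utterance.lower(), keys, n=1, cutoff=0.6)
    return RESPONSES[match[0]] if match else "Sorry, I didn't understand that."

print(reply("Hello"))  # → "Hi! How are you feeling today?"
```

A real end-to-end system replaces the lookup table with a learned response generator; the surrounding loop stays the same.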
  7. Background: stage #2, voice interaction
     - speech-to-text and text-to-speech
     - voice imitation capabilities
     - tonal features in input/output
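The voice stage wraps the textual agent between a speech-to-text and a text-to-speech step. The sketch below shows only that composition, with stub functions standing in for the real recognition, dialogue, and synthesis models (all names are hypothetical):

```python
# Stub stages: each stands in for a real model in the voice pipeline.
def speech_to_text(audio: bytes) -> str:
    return audio.decode("utf-8")       # stub: pretend the audio is its transcript

def dialogue_step(text: str) -> str:
    return f"You said: {text}"         # stub dialogue manager

def text_to_speech(text: str) -> bytes:
    return text.encode("utf-8")        # stub synthesiser

def voice_turn(audio_in: bytes) -> bytes:
    # One conversational turn: recognise, respond, synthesise.
    return text_to_speech(dialogue_step(speech_to_text(audio_in)))

print(voice_turn(b"hello"))  # → b'You said: hello'
```

Tonal features and voice imitation would be extra inputs/outputs of the first and last stages, not changes to the loop itself.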
  8. Background: stage #3, embodied agent
     - body movements
     - facial expression
     - humanoid robots: Sophia [3], Nadine [4]
     [3] http://www.hansonrobotics.com/robot/sophia/
     [4] http://bit.ly/telegraph_knapton_nadine
  9. Background: additional ingredients
     - topic focus [5]
     - persona-based generation [6]
     - emotions (recognise, process, simulate) [7]
     - channel-specific features
     [5] Chen Xing, Wei Wu, Yu Wu, Jie Liu, Yalou Huang, Ming Zhou, and Wei-Ying Ma. 2017. Topic Aware Neural Response Generation. In AAAI, Vol. 17, 3351-3357.
     [6] Jiwei Li, Michel Galley, Chris Brockett, Georgios P. Spithourakis, Jianfeng Gao, and Bill Dolan. 2016. A Persona-Based Neural Conversation Model. In the 54th Annual Meeting of the Association for Computational Linguistics, 994-1003.
     C. L. Lisetti. 1998. Affective computing.
     [7] https://cakechat.replika.ai/
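The "recognise" part of the emotion ingredient can be sketched as a lexicon lookup: count hits of emotion keywords per category and pick the best-scoring one. This is a deliberately naive illustration (the lexicon is invented); systems such as [7] use trained classifiers instead.

```python
# Tiny invented emotion lexicon; real systems learn these associations.
LEXICON = {
    "joy": {"happy", "great", "love"},
    "sadness": {"sad", "alone", "miss"},
    "anger": {"hate", "angry", "annoyed"},
}

def recognise_emotion(text: str) -> str:
    # Score each emotion by keyword overlap with the utterance.
    words = set(text.lower().split())
    scores = {emo: len(words & kws) for emo, kws in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(recognise_emotion("i feel so sad and alone"))  # → "sadness"
```

The "process" and "simulate" steps would then condition the agent's reply on the recognised label.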
  10. Background: Replika example [8]
      [8] https://www.youtube.com/watch?v=yQGqMVuAk04
  11. Effects
  12. Short-term therapeutic effect
      Someone who:
      - always listens to you
      - always agrees
      - never complains
      - is available on-demand
      Provides:
      - relief
      - self-consciousness
  13. Addiction
      - “race for attention” [9]
      - escape from reality
      [9] http://bit.ly/ted_harris_talk
  14. Isolation
      - side effects of tools that should enable more connections
      - hikikomori (extreme social withdrawal)
      - virtual friends can make the situation worse
  15. Personality change: Big Five Personality Traits [10]
      - openness reduction ← “filter bubble”
      - extraversion reduction ← isolation
      Could be measured specifically for Conversational Agents
      [10] Goldberg, L. R. (1993). The structure of phenotypic personality traits. American Psychologist, 48(1), 26.
  16. Societal implication: inability to connect with other people
  17. Conclusions
      - Conversational Agents should be tools, not substitutes for human relationships
      - need for regulation in the emotional domain
  18. Let’s keep emotions for humans
      https://www.slideshare.net/MartinoMensio
      https://twitter.com/MartinoMensio
