T7 Embodied conversational agents and affective computing


14th European Agent Systems Summer School


  1. 1. Embodied Conversational Agents and Affective Computing Alexandre Pauchet Alexandre.Pauchet@insa-rouen.fr Nicolas Sabouret Nicolas.Sabouret@lip6.fr EASSS 2012 http://acai.lip6.fr/
  2. 2. French working group ACAIhttp://acai.lip6.fr/
  3. 3. GT ACAI 2004: GT ACA (ECA) → 2012: GT ACAI (Affects, Artificial Companions and Interaction). GDR I3, SMA group from AFIA. Thematic meetings: corpus, virtual agents, affective computing, ... From 10 to 30 people.
  4. 4. GT ACAI Workshop WACA 2005, 2006, 2008, 2010  Chatbots, dialogical agents  Assistant agents  Virtual agents  Emotions WACAI 2012 in Grenoble  Affects, Artificial Companions and Interaction Web Site: http://acai.lip6.fr/  GT ACA: http://www.limsi.fr/aca/ Mailing list: acai@poleia.lip6.fr
  5. 5. Plan Embodied Conversational Agents Introduction to ECAs, talking heads, gestural agents, deploying environments, assistant agents, usefulness of embodiment, human-ECA interaction Affective Computing Introduction to affective computing, affective models, ECAs and emotions
  6. 6. Embodied Conversational Agents
  7. 7. Introduction
  8. 8. Definitions ECA: "Embodied Conversational Agent"; IVA: "Intelligent Virtual Agent". An agent combines Perception (awareness), Decision (rationality, AI) and Expression (affects). Properties: rational, proactive, conversational (multimodal interaction: Graphical User Interface (GUI), Natural Language Processing (NLP)), social, embodied (personification, situated). Examples: Microsoft™ Agents, La Cantoche™, Sony™ Aibo, icons, virtual characters, robots.
  9. 9. Roles of ECAs ECAs are "Interactive Virtual Characters" that are situated in mediated (often distributed) environments. They can play four main roles: Assistants, to welcome users and to help them understand and use the structure and functioning of applications; Tutors, for students in mediated human-learning environments, or for patients in psychological/pathological monitoring systems; Partners, for actors in virtual environments: partner or adversary in a game, participant in creative design groups, member of a mixed-initiative community, ...; Companions, as a friend in a long-term relationship.
  10. 10. Categories of ECAs Talking heads, e.g. GRETA (Pélachaud, LTCI): fixed, realistic; facial expressions, lip-synch, emotions. Gestural agents, e.g. STEVE (Rickel et al., USC): fixed/floating, moving arms; dialogue, deictic gestures, sign language; tutoring, assistance. Situated agents, e.g. VTE (Gratch et al., USC): complete mobile characters; virtual/augmented reality; training, action.
  11. 11. Deploying environments Applications (education, assistance), web pages (welcoming, mixed communities, ©CANAL+), ambient environments (virtual reality, smart objects).
  12. 12. Scientific issues Agent models (interaction performance/efficiency of models): interaction models with users (multi-modality, H/M dialogue); models of cognitive agents (BDI logics and planning, affective logics, ...); task models and user models (symbolic representation and reasoning). Human modelling (ecological relevance of models): capture, representation of physiology, reproduction of human expressions, evaluation of behaviours, application.
  13. 13. Scientific communities Computer Science and Natural Language Processing: multi-agent systems, human-machine interaction/interfaces, human-machine dialogue. Humanities and Social Sciences: psychology of language interaction, ergonomic psychology.
  14. 14. Projects and teams European project SEMAINE (http://www.semaine-project.eu/); LTCI (France): Greta; LIMSI (France): MARC; Bielefeld University (Germany): MAX; USC (USA): STEVE, VTE.
  15. 15. Design of animated characters (ECAs or avatars), independent of the embodiment level. Non-verbal behaviours (postures, gestures, ...): what does the user feel? Theoretical approach (related works) or empirical approach (corpus analysis). Animations are generated from an intention (triggering action), automatically and cyclically, e.g. an XML configuration giving a timecode, body, head, eyes, arms and speech. Pipeline: corpora → representation → reproduction → evaluation.
  16. 16. Talking heads Nikita by Kozaburo © 2002 Done with Poser 5 / No PostworkBeauty contest for Embodied Conversational agents Model: DAZ-3D Victoria-3www.missdigitalworld.com Skin Texture:Mec4D
  17. 17. Models based on visemes Viseme: elementary unit of the position of the muscles of the human face (3D model). Experimental settings: small red balls manually placed on the face; 3 to 5 cameras; training corpus. G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
  18. 18. Transitions between expressions Torsion model of the face and the neck. The positions of the red balls are set manually; pattern recognition is used to monitor ball movements; automatic and manual extraction of key frames. G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
  19. 19. Analysis and synthesis Visemes + 3D model are used to (re-)generate expressions (6 to 11 parameters). Validation by comparison with real expressions. G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
  20. 20. ‘Talking head’ project (LIMSI) Based on visemes. Video corpus of 21 utterances of the form « C'est /phoneme/ ici » ("It is /phoneme/ here"), e.g. « C'est u ici », segmented as ... u C'est i ci ...
  21. 21. Realistic talking heads Greta (Prudence, Obadiah, Poppy and Spike), LTCI – Télécom ParisTech; MARC, LIMSI.
  22. 22. Gestural agents http://marc.limsi.fr/
  23. 23. Design of realistic gestural agents and avatars MARC (LIMSI), GRETA (LTCI). Sources: annotated corpus of videos (e.g. with Anvil) and psychological knowledge (related works).
  24. 24. 2D cartoon agents: Lea, Marco, Jules, Genius LEA Agents (LIMSI): 2D cartoon characters (110 GIF files); easily integrated into Java applications and JavaScript; description language for high-level behaviours.
  25. 25. Lea: gestural expressiveness Deictic gestures, iconic gestures, adaptors, emblematic gestures, ... rendered as transparent GIF files.
  26. 26. Specifying attitudes and animations in XML A <configurationsequence> (its nbrconfig attribute gives the number of configurations, i.e. attitudes) contains <configuration> elements; each attitude specifies <timecode>10</timecode>, <body>front</body>, <head>front</head>, <eyes>open happy</eyes>, <gaze>middle</gaze>, <facialExpression>close</facialExpression>, <bothArms>null</bothArms>, <leftArm>hello</leftArm>, <rightArm>hip</rightArm> and <speech>Hello!</speech>.
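As a sketch, an attitude specification like the one above can be read with Python's standard library; the XML below reproduces the slide's element names, and the dictionary conversion is an illustrative choice, not part of the LEA toolkit:

```python
# Parse a LEA-style attitude specification (element names from the slide).
import xml.etree.ElementTree as ET

SPEC = """\
<configurationsequence nbrconfig="1">
  <configuration>
    <timecode>10</timecode>
    <body>front</body>
    <head>front</head>
    <eyes>open happy</eyes>
    <gaze>middle</gaze>
    <facialExpression>close</facialExpression>
    <bothArms>null</bothArms>
    <leftArm>hello</leftArm>
    <rightArm>hip</rightArm>
    <speech>Hello!</speech>
  </configuration>
</configurationsequence>"""

root = ET.fromstring(SPEC)
for config in root.iter("configuration"):
    # Collect each attitude as a tag -> value dictionary.
    attitude = {child.tag: child.text for child in config}
    print(attitude["timecode"], attitude["speech"])  # -> 10 Hello!
```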
  27. 27. Dynamic deictic table: Frédéric Vernier, LIMSI-CNRS
  28. 28. Designation of the DOM objects KUB-OLO-ZIM GLA-TEK-ZIM
  29. 29. Deploying environmentsAPPLICATIONS WEB PAGES AMBIENT
  30. 30. Chat bots on the Internet Chat bot = "chatterbox" + "robot": a program capable of NL dialogue. Usually keyword spotting; no dialogue session or context; no embodiment. Goal: only chat. ELIZA (J. Weizenbaum, 1966)
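A minimal ELIZA-style chatbot boils down to the keyword spotting described above; the keywords and canned responses in this sketch are invented for illustration:

```python
# Minimal keyword-spotting chatbot: no dialogue context, no embodiment.
# Rules are scanned in order; the first matching keyword wins.
RULES = [
    ("mother", "Tell me more about your family."),
    ("sad",    "Why do you feel sad?"),
    ("help",   "What kind of help would you like?"),
]
DEFAULT = "Please, go on."

def reply(utterance: str) -> str:
    """Return the response of the first matching keyword rule."""
    text = utterance.lower()
    for keyword, response in RULES:
        if keyword in text:
            return response
    return DEFAULT

print(reply("I feel sad today"))  # -> Why do you feel sad?
```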
  31. 31. Assistant agents (application/web)
  32. 32. Social avatars on the web The recent evolution of communication over the web has prompted the development of so-called "avatars": graphical entities dedicated to mediating between humans in distributed virtual environments (e.g. the "meeting services"). Contrary to an ECA, which is the personification of a computer program (an artificial agent), an avatar is the personification of the real human who controls it (avatar = mask ≠ agent). Second Life, presented as a « metaverse » by its developer, Linden Lab, is a web-based virtual simulation of the social life of ordinary people: 5,000,000 avatars have been created, and ~200,000 visitors are permanently logged in.
  33. 33. Mixed communities Project « Le deuxième Monde » (1997-2001). Company: Canal+™; head: Sylvie Pesty (LIG, Grenoble); PhD: Guillaume Chicoisne (LIG, Grenoble). Navigation and chat interface. Mixed community: people chatting and purchasing in a virtual world, with an embodied conversational agent interacting with the users' avatars. © CANAL+
  34. 34. Mixed communities and heterogeneous MAS Metaphor: meetings. Definition: computerised environments that transparently integrate humans and artificial agents within meetings. Mixed-initiative communities: brainstorming, design, decision, ... Virtual Training Environments (VTE): training, simulation, ... Deploying environments: smart rooms (physical places with augmented reality) and web-based (distributed and mediated) environments. Animated characters: avatars (virtual characters driven by humans) and agents (virtual characters which embody artificial agents).
  35. 35. Social interaction in virtual environments Mixed training environments  Avatars  ECAs + Task: virtual firemen  Recognition of fire  Activity reports Interaction  Emotion management  Gestures and facial expressions  NL dialogue managementM. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 2005.11.15 — www.irit.fr/GRIC/VR/
  36. 36. Realistic behaviour of agents The study of videos made it possible to highlight how deictic gestures and gazes towards items are used concomitantly with natural-language words ("here", "it", "now", ...). M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 2005.11.15 — www.irit.fr/GRIC/VR/
  37. 37. From assistant agents to ambient environments Application taken from the DIVA toolkit (LIMSI): transporting the function of assistance from stand-alone applications to room-based ambient environments.
  38. 38. DIVA display devices HD TV screen, large touch screen, portable touch screen. Presentation of any Internet content and of the IRoom controls. Diva Toolkit + IRoom (LIMSI)
  39. 39. DIVA 3D-pointing gestures 9 loci Diva Toolkit + IRoom (LIMSI)
  40. 40. Pointing objects in the physical world Body zooming for close objects “Which of the two remote controllers should I use?” Diva Toolkit + IRoom (LIMSI)
  41. 41. Assistant agents
  42. 42. Examples from professional web sites Lucie, virtual assistant of SFR; question ECA on the EDF site. P. Suignard, GT ACA, 2004.10.26, http://www.limsi.fr/aca/
  43. 43. A counter-example: the « Clippy effect » Microsoft Agents™: free technology (no support); 2D cartoon agents; can be used on web pages; third-party characters (La Cantoche™); API for JavaScript & VBScript. Limitations: inputs limited to clicks and menus; no proper dialogue model; no application model; no synchronization between speech and movements; emotions = cartoon 'emotes'. In the screenshot, an intuitive request expressed in NL yields helping text corresponding only to the first line of results, while two quite relevant results are ranked too far down.
  44. 44. Assistant agents: definition « An Assistant Agent is a software tool with the capacity to resolve help requests, issued by novice users, about the static structure and the dynamic functioning of software components or services » (InterViews Project, February 1999, following Pattie Maes, MIT, 1994). User: person with poor knowledge of the component (novice). Request: help demand in natural language (speech/text). Component: computer application, web service, ambient appliance. Agent: rational, assistant, conversational (can be embodied). Mediator: symbolic model of the structure and the functioning.
  45. 45. The assistant metaphor The novice user sends a request to the assistant agent, which relies on a symbolic model of the software component.
  46. 46. The agents of the Daft project An assistant ECA answers user requests such as « Comment je peux … » ("How can I …"). The slide shows the GRASP linguistic analysis of the noun phrase « le très petit bouton rouge » ("the very small red button"): inflected forms, lemmas, POS tags and semantic keys (THE, INTLARGE, SIZESMALL, BUTTON, RED) are combined over two rule phases (rule lists of length 74 then 9) into a structured representation. The architecture couples a Human/Linguistic agent, a Rational agent (dialogue, session, profile) and a Modeling agent holding the component model: MODEL[ID[main] PARTS[TITLE, FIELD, GROUP[ID[command], PARTS[BUTTON, BUTTON, BUTTON]]] USAGE["This component is …"]]
  47. 47. The Rational Assisting Agent « How many buttons are there? » becomes the formal request ASK[COUNT[REF(BUTTON)]]. The rational agent holds an interaction model (session, profile, task) and a component model (structure, action, process); it reasons on and modifies these models. Loop: 1) read a formal request; 2) contextualize it (explore the interaction model and the component model); 3) process it (update both models); 4) produce the formal answer, e.g. TELL[EQUAL[COUNT[REF(BUTTON)],3]] (« There are three buttons »).
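The read/contextualize/process/answer loop can be sketched as follows; the toy component model and the string handling of the formal language are assumptions for illustration, not the Daft implementation:

```python
# Toy rational assisting agent: answers ASK[COUNT[REF(<PART>)]] requests
# against a (hypothetical) component model of part counts.
COMPONENT_MODEL = {"BUTTON": 3, "FIELD": 1}

def process(request: str) -> str:
    """Read a formal request, resolve it on the model, produce the answer."""
    prefix, suffix = "ASK[COUNT[REF(", ")]]"
    if request.startswith(prefix) and request.endswith(suffix):
        part = request[len(prefix):-len(suffix)]       # e.g. "BUTTON"
        count = COMPONENT_MODEL.get(part, 0)           # resolve the reference
        return f"TELL[EQUAL[COUNT[REF({part})],{count}]]"
    return "TELL[UNKNOWN]"

print(process("ASK[COUNT[REF(BUTTON)]]"))
# -> TELL[EQUAL[COUNT[REF(BUTTON)],3]]
```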
  48. 48. Dialogue with components Java interface for the ECA: DaftLea (K. LeGuern, 2004). A standard GUI can contain Java applets: "Counter" component, "Hanoi" component, "AMI web site" component. ECA LEA (Java, J-C. Martin & S. Abrilian); speech synthesis: Elan Speech.
  49. 49. Processing of a Daft request « restart the counter » 1) The utterance is analysed to build a formal request: RESTART[REF["COUNTER"]]. 2) The reference REF is resolved within the model. 3) The restart action is applied on the boolean of the counter, causing the model variable INT[8] to be incremented. 4) An update is sent to the boolean variable of the application. 5) The behavioural answer « ACKNOWLEDGE » is sent to the virtual agent. When the user enters « restart the counter » in the chatbox, he/she sees the number increment (8, 10, 12, 14, ...), LEA says « Done! » and LEA expresses the ACKNOWLEDGE emote.
  50. 50. Signing agents LIMSI: Diva/DivaLite
  51. 51. Usefulness of embodiment From eyeballs to 2D cartoons, 3D realistic agents and humans.
  52. 52. Evaluation of ECAs Objective criteria: Efficiency measures the actual performance of the user-agent couple when carrying out a given task; Usability measures the capacity of the user to understand the functioning of the system and the fluency of the control. Subjective criteria: Friendliness measures the user's "feeling" about the features of the system (attraction, commitment, aesthetics, comfort, ...); Believability measures the user's "feeling" that the agent can understand the user's problems and has the capacity to help with real competence; Trust measures the user's "feeling" that the agent behaves as a trustworthy and cooperative entity.
  53. 53. Ratio realism / friendliness From eyeballs to 2D cartoons, 3D realistic agents and humans: friendliness (%) plotted against similarity to humans (%) exhibits the "uncanny valley", with the Bunraku puppet and the toy as landmarks. Masahiro Mori, Robotics Institute, Tokyo, in Jasia Reichardt, "Robots are coming", Thames and Hudson Ltd, 1978.
  54. 54. The Persona Effect of Lester 30 subjects (15 M & 15 F, age ≈ 28 years) saw a presentation either with an agent (an ECA) or without (an arrow). Measured improvement of the parameters: comprehension performance, recall (memory) performance, and subjective questions → satisfaction. Lester et al. (1997). The Persona Effect: Affective impact of Animated Pedagogical Agents. CHI'97. van Mulken et al. (1998). The Persona Effect: How substantial is it? HCI'98.
  55. 55. Persona effect and memorizing « I am Annita, the assistant for this experiment » (in Dutch). 61 subjects (39 M, 22 F). Experiment: stories are told to subjects in three different situations (without agent, realistic agent, cartoon agent); recall questions are then asked, and the three situations are compared. Measured improvement of story-recall performance. Subjective evaluation through anthropomorphic questions, e.g. "Did you find the agent sensitive?". Result: an ECA improves the memorizing of information. Beun et al. (2003). Embodied conversational agents: Effects on memory performance and anthropomorphisation. IVA'03.
  56. 56. Experiments on functional description Three different agents: 2 men (Marco, Jules) with different clothing, 1 woman (Lea). Three cooperation strategies: redundancy, complementarity, specialization. Three objects to describe: 1. video recorder remote controller; 2. photocopier control panel; 3. application for designing graphic documents. 3³ = 27 experiments. Stéphanie Buisine, LIMSI 2004
  57. 57. Functional description: evaluation results Quality of explanation: no noticeable effect of the appearance; noticeable effect of the multimodal behaviour (redundant explanations are preferred). Sympathy: no noticeable effect of the multimodal behaviour; noticeable effect of the appearance. Performance: same as for the appearance, but no correlation between sympathy and performance. Impact of the user's gender: men react differentially and prefer redundant explanations; women do not react differentially. Stéphanie Buisine, LIMSI 2004
  58. 58. Human-ECA Interaction
  59. 59. Turing test: conversational intelligence Turing Test, Alan M. Turing (1912-1954): "the imitation game", in "Computing Machinery and Intelligence", Mind, Vol. 59, No. 236, pp. 433-460, 1950. It can be replaced by a Turing-like test on output data instead of during direct interaction.
  60. 60. Wizard of Oz methodology An experimenter pilots an avatar; the user does not know that he/she is facing a human (he/she believes it is a virtual agent).
  61. 61. Human-Agent interaction Multiple inputs: textual (text boxes), audio (microphone), video (camera), sensors (AmI environment), ... Multiple outputs: textual, audio, facial expressions, postures, gestures, lighting, ... Multiple models: conversational, emotional, of the environment, of the user, ...
  62. 62. European project SEMAINE: an ECA architecture http://www.semaine-project.eu/
  63. 63. Example of ECA architecture Semaine ProjectSpike http://semaine.opendfki.de/wiki/ComponentArchitecture
  64. 64. Interaction inputs Camera Microphone Text box Sensors Microphone + transcript / text box: dialogue input (semantic) Microphone (emotion) Camera:  Gestures (semantic)  Facial expressions / postures (emotions) Sensors:  Context  Physiological signals (emotions)
  65. 65. Human-ECA Dialogue S. Pesty, GT ACA, 2005.11.15, http://www.limsi.fr/aca/
  66. 66. Conversational agents Textual/GUI vs embodied. Non-linguistic: contextual help (pop-ups, ...), online manuals (hypertext navigation), assistants using a decision tree | state display (emotions, emotes). Unidirectional, linguistic (input or output): query fields embedded in an application window (keywords, free/open questions) | feature presenter. Conversational (bidirectional, linguistic): chat bots (Eliza), dialogical assistants (InterViews, Daft, Artemis, ...) | mediators acting as representatives of an application or a service; avatars.
  67. 67. The reference among chatbots: A.L.I.C.E. Author: Richard S. Wallace. Website: http://www.alicebot.org Documentation: http://alicebot.org/TR/2001/WD-aiml-1.0.1-20011025-006.html
  68. 68. Jabberwacky wins the Loebner Prize 2005
…
PROGRAM: Hello, my name is George and I am the human.
JUDGE: Hi George, cslkjslw weicmslw weos, right?
PROGRAM: Don't repeat your name, csthiang!
JUDGE: If lsjkel lwehlx wouldn't you say?
PROGRAM: No, he's still good.
JUDGE: Who.
PROGRAM: You and me.
JUDGE: If you are human, tell me which is larger, a salt crystal or the Empire State Building.
PROGRAM: It's caring about another person's well being more than your own.
JUDGE: What if I care about another person's well being more than I should?
PROGRAM: How would you do that?
JUDGE: Which is further away from us, Queens or Inner Mongolia?
PROGRAM: HA! HA! That was stupid. Tell me a GOOD joke this time.
JUDGE: You say you are human but you can't answer a simple question.
PROGRAM: Because I am alive.
JUDGE: You are not following the rules. You are supposed to convince me that you are a human, not convince me that you are mysterious.
PROGRAM: I can't very well convince one lacking a mind, can I?
JUDGE: It isn't my mind that is in question, it is yours!
PROGRAM: No my name is not katie.
JUDGE: Which is larger, your brain or Belgium?
PROGRAM: Oh the one on the far left without a doubt.
JUDGE: Francais.
PROGRAM: Yes do you speak it?
JUDGE: Francais.
PROGRAM: Francais?
JUDGE: Francais.
PROGRAM: O well I don't speak French.
JUDGE: Who.
PROGRAM: Mine.
JUDGE: The ljwli is what we want, right?
PROGRAM: I can not do it right now because I'm talking to you a million miles away.
JUDGE: What do you think the ljwli.
PROGRAM: Depends on the person.
JUDGE: But the ljwli.
PROGRAM: So they can chuck more wood?
…
  69. 69. Dialogue systems The feedback loop is not taken into account: knowing the state of the dialogue (in the dialogue manager, DM) could improve the automatic speech recognizer and the natural-language understanding. The DM should manage multimodality, both as input (e.g. gestures, facial expressions, emotions) and as output (e.g. emphasis, voice modulation, utterance + gesture).
  70. 70. Utterance understanding Speech act theory (Searle & Vanderveken, 1985), in accordance with Bratman's theory of intention (Bratman, 1987). Illocutionary acts (Vanderveken, 1990); formalisation: F(P); 5 speech acts: assertive, commissive, directive, declarative, expressive. Transposition to MAS: KQML, FIPA ACL.
  71. 71. Existing dialogue systems Case-based systems using keyword spotting: chatbots. Communication protocols (FIPA ACL): MAS. Planning systems (Allen 80, Traum 96). POMDPs: call centres (Frampton & Lemon 09). Dialogue games (Maudet 00). Information-State Update (Larsson 00): TrindiKit, GoDiS, IbiS. A scientific challenge still unsolved! (see the Mission Rehearsal Exercise, Swartout et al. 2006)
  72. 72. Ratio effort/performance Actual performance vs effort (code and resources, on a logarithmic scale from 1 to 1000): chat bots reach ~10% (games, socialization, affects, ...), H/M dialogue systems ~50%, and control/command/assistance systems ~100%.
  73. 73. Affective Computing
  74. 74. What is it? Affective Computing: book by R. Picard (1997). Detect, interpret, process and simulate human emotions. How is it connected to ECAs? (Krämer et al., 2003) People, when interacting with an ECA, tend to be more polite, more nervous, and to behave socially. (Hess et al., 1999) Emotions play a crucial role in social interactions. → Affective Conversational Agents!
  75. 75. The story so far... (timeline from the 1970s to the 2010s) Emotions: R. Picard → Humaine → ACII. Virtual agents: video games → CHI → IVA → serious games, simulation, online: ECAs everywhere. H-M dialogue: Eliza → Trains. Agents: MAS → AAMAS.
  76. 76. Links from the community IVA (Intelligent Virtual Agents): http://iva2012.soe.ucsc.edu/ ACII (Affective Computing & Intelligent Interaction): http://www.acii2011.org/ Humaine: FP6 project (1/1/2004 – 31/12/2007), then an association; http://emotion-research.net/
  77. 77. HUMAINE http://emotion-research.net
  78. 78. Research questions Detect, interpret, process and simulate human emotions → affects. Cognitive sciences: AI and psychology (for emotions), AI and sociology (for social relations). 3 domains: recognition (pattern recognition, AI, image and signal processing, machine learning, fusion), models (formal models of interaction, knowledge representation & reasoning, behaviour models), synthesis (synchronisation, turn taking).
  79. 79. Examples Recognition ● Affectiva Q sensor : http://www.youtube.com/watch?v=1fW3hEvxGUU Synthesis ● Agent : http://www.youtube.com/watch?v=CiuoBiJjGG4 ● Robot (Hanson) : http://www.youtube.com/watch?v=pkpWCu1k0ZI Models ● Fatima (LIREC) : http://www.youtube.com/watch?v=ZiPlE80Xiv4
  80. 80. Emotion recognition General principle: corpus tagging, then supervised learning of a classifier. A corpus (images, films, ...) is annotated with emotion tags (e.g. with Anvil); extraction criteria (durations, pitches, frequencies, colours, etc.) yield features; supervised learning (e.g. ID3) produces a classifier, such as a decision tree mapping feature values to emotion classes.
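The tag-then-learn pipeline can be illustrated with a toy classifier; a nearest-centroid rule stands in for ID3 here, and the (pitch, duration) features and labels are made up for the sketch:

```python
# Toy emotion-recognition pipeline: an annotated corpus of
# (pitch in Hz, duration in s) features with emotion tags,
# "learned" as one mean vector (centroid) per tag.
TAGGED_CORPUS = [
    ((220.0, 0.9), "anger"), ((235.0, 1.1), "anger"),
    ((110.0, 2.0), "sadness"), ((100.0, 2.4), "sadness"),
]

def train(corpus):
    """Compute one centroid per emotion tag (the 'learning' step)."""
    sums = {}
    for (pitch, dur), tag in corpus:
        s = sums.setdefault(tag, [0.0, 0.0, 0])
        s[0] += pitch; s[1] += dur; s[2] += 1
    return {tag: (p / n, d / n) for tag, (p, d, n) in sums.items()}

def classify(centroids, pitch, dur):
    """Predict the tag whose centroid is closest (squared distance)."""
    return min(centroids,
               key=lambda t: (centroids[t][0] - pitch) ** 2
                           + (centroids[t][1] - dur) ** 2)

model = train(TAGGED_CORPUS)
print(classify(model, 225.0, 1.0))  # -> anger
```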
  81. 81. Multi-modal behaviour annotation: the ANVIL tool
  82. 82. Emotion synthesis From emotion to parameters: parametrization, state transitions, expression dynamics; generalization from an annotated corpus of emotions. Evaluation: human recognition of expressed emotions is not reliable, even in H-H interaction (with spontaneous emotions). Some examples: virtual agents (Greta/Semaine); robots (Kismet, iCat, Nao).
  83. 83. From annotations to animations Original GRETA videos: Anger, Desperation (C. Pélachaud, LTCI, and J-C. Martin, LIMSI); Joy, Anger, Fear, Surprise, Distress.
  84. 84. Affective Models
  85. 85. A bit of human science « The question is not whether intelligent machines can have any emotion, but whether machines can be intelligent without any emotions. » Marvin Minsky, 1986. Darwin, 1872 (Ed. 2001), The expression of emotions in man and animals → adaptation mechanism + role in communication (Ekman, 1973). 2 schools: James, 1887 — organism changes → emotion; Lazarus, 1984, and Scherer, 1984 — appraisal theory.
  86. 86. Appraisal theory The individual perceives stimuli from the environment; an evaluation in context (with respect to goals, preferences and personality) produces an emotion; adaptation (a.k.a. coping) then drives actions upon the environment.
  87. 87. Appraisal theory (2) Coping (adaptation strategy) = the process that one puts between oneself and the distressing event. Active coping (i.e. problem-focused) → act upon one's environment. Passive coping (i.e. emotion-focused) → act upon one's emotion. Rousseau & Hayes-Roth, 1997. Personality → emotions (Watson & Clark, 1992). Emotions → social relations (Walker, 1997).
  88. 88. Architecture Events and personality feed emotions and mood; the social role feeds social relations; together they determine actions.
  89. 89. Architecture The same architecture, with the three influence links numbered 1, 2 and 3 on the diagram (events/personality, emotions/mood, social relations/actions).
  90. 90. Models of personality Personality = the stable component in an individual's behaviour, attitudes, reactions... Computational models: define variables, their value domains, and the characteristic values. Eysenck, 1967: Extraversion → higher sensitivity to positive emotions (and more expressive); Neuroticism → higher sensitivity to negative emotions (and more frequent changes).
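Eysenck's two rules can be encoded as a small sketch; the linear gain and the 0.5 factor are arbitrary modelling choices for illustration, not part of the theory:

```python
# Personality-modulated emotion intensity (Eysenck-style rules):
# extraversion amplifies positive emotions, neuroticism negative ones.
def felt_intensity(stimulus: float, valence: str,
                   extraversion: float, neuroticism: float) -> float:
    """Scale a stimulus intensity in [0, 1] by the relevant trait in [0, 1]."""
    if valence == "positive":
        gain = 1.0 + 0.5 * extraversion   # 0.5 is an arbitrary sketch factor
    else:
        gain = 1.0 + 0.5 * neuroticism
    return min(1.0, stimulus * gain)

# An extraverted profile feels the same positive event more strongly (~0.9
# vs 0.6 for a profile with extraversion 0):
print(felt_intensity(0.6, "positive", extraversion=1.0, neuroticism=0.0))
```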
  91. 91. Models of personality (2) McCrae, 1987 (a.k.a. OCEAN or « Big Five » model): Openness → curiosity; Conscientiousness → organization; Extraversion → exteriorization; Agreeableness → cooperation; Neuroticism → emotional instability. Myers-Briggs Type Indicator (MBTI), based on Jung: Energy → introverted or extroverted; Information collection → sensing or intuitive; Decision → thinking or feeling; Action → judging or perceiving.
  92. 92. One example: ALMA (Gebhard, 2005) Events → emotions → mood, modulated by an OCEAN personality profile.
  93. 93. Emotion models What is an emotion? Darwin, James, Ekman, Scherer, ... Characterizing an emotion: category-based approaches → there are N emotions (N being subject to discussion); dimension-based approaches → an emotion is a point in an N-dimensional space (the dimensions must form a basis).
  94. 94. The PAD model Mehrabian, 1980. Emotion = 3 dimensions: Pleasure (or « valence »), Arousal (or « activation degree »), Dominance. Examples
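A dimension-based model can be sketched by mapping a PAD point to the nearest labelled category; the coordinates below are rough illustrative values, not Mehrabian's published ones:

```python
# PAD sketch: an emotion is a point in (pleasure, arousal, dominance)
# space; categories are located at illustrative coordinates.
PAD_CATEGORIES = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.6,  0.7,  0.3),
    "fear":    (-0.6,  0.6, -0.6),
    "sadness": (-0.6, -0.4, -0.3),
}

def nearest_category(p: float, a: float, d: float) -> str:
    """Map a PAD point to the closest labelled emotion (Euclidean distance)."""
    return min(PAD_CATEGORIES, key=lambda name: sum(
        (c - x) ** 2 for c, x in zip(PAD_CATEGORIES[name], (p, a, d))))

print(nearest_category(-0.5, 0.6, -0.5))  # -> fear
```

Note that going from categories to dimensions this way loses information: every point gets a label, even ones far from all categories.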
  95. 95. PAD (cont.) Plutchik's wheel
  96. 96. The OCC model Ortony, Clore, Collins, 1988. 22 emotion categories: Joy – Distress; Fear – Hope; Relief – Disappointment, Fear-confirmed – Satisfaction; Pride – Shame, Admiration – Reproach; Love – Hate; Happy-for – Resentment, Gloating – Pity; Remorse – Gratification, Gratitude – Anger.
  97. 97. OCC (cont.) Appraisal model based on the individual's goals & preferences:
  98. 98. One example: EMA (Gratch & Marsella, 2010) EMotions & Adaptation → Appraisal → Coping (action selection) Based on SOAR Evaluation criteria ● Relevance ● Point of view ● Desirability (expected utility) ● Probability & expectedness ● Cause ● Control (by me and others)
  99. 99. EMA (cont.) Rule-based system, e.g.: Desirability(self, p) > 0 & Likelihood(self, p) < 1.0 → hope. Coping strategies: perceptions; belief revision (including about other agents' intentions and responsibilities); changing goals. Simulation model; application to serious games.
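The appraisal rule quoted above can be turned into a tiny rule-based sketch; only the hope rule comes from the slide, and the other three rules are symmetric assumptions added for illustration:

```python
# Rule-based appraisal sketch: map the (desirability, likelihood) of a
# proposition to an emotion label. Only the hope rule is from the slide.
def appraise(desirability: float, likelihood: float) -> str:
    """Appraise one proposition; rules are checked in order."""
    if desirability > 0 and likelihood < 1.0:
        return "hope"       # desirable and still uncertain (slide's rule)
    if desirability > 0 and likelihood >= 1.0:
        return "joy"        # desirable and certain (assumed)
    if desirability < 0 and likelihood < 1.0:
        return "fear"       # undesirable and still uncertain (assumed)
    return "distress"       # undesirable and certain (assumed)

print(appraise(0.8, 0.4))   # -> hope
print(appraise(-0.5, 0.9))  # -> fear
```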
  100. 100. Social relations Social behaviour models. Static models: Walker, Rousseau et al., Gratch... Dynamic model: Ochs et al. Several traits: liking, dominance, solidarity (or social distance), familiarity (subject to discussion). → A social relation is unidirectional and not always reciprocal!
  101. 101. Emotions and social relations Influence of emotions on the social relation (examples): Ortony, 91: causing positive emotions → increased liking; Pride, Reproach → increased dominance; Fear, Distress, Admiration → decreased dominance. Influence of the social relation on action and communication: action selection guided by the target social relation; emotional contagion → neighbourhood computation based on social relations.
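The first influence (emotions caused by another agent shift liking for that agent) can be sketched as a simple update rule; the learning rate and the clamping to [-1, 1] are illustrative choices, not Ortony's model:

```python
# Dynamic liking update: emotions caused by another agent move liking
# toward their valence (positive emotions -> increased liking).
def update_liking(liking: float, emotion_valence: float,
                  rate: float = 0.2) -> float:
    """Shift liking in [-1, 1] by a fraction of the caused emotion's valence."""
    liking = liking + rate * emotion_valence
    return max(-1.0, min(1.0, liking))       # keep the trait bounded

liking = 0.0
for valence in (0.5, 0.5, -1.0):   # two pleasant events, one unpleasant
    liking = update_liking(liking, valence)
print(round(liking, 2))  # the unpleasant event cancels the two pleasant ones
```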
  102. 102. One example: OSSE (Ochs et al., 2008) Attitudes: scenario = set of events; events → emotions → social relation dynamics; social roles → initial relation.
  103. 103. Learn more! Books: Rosalind Picard, Affective Computing, MIT Press; Scherer, Bänziger, & Roesch (Eds.), A blueprint for an affectively competent agent: Cross-fertilization between Emotion Psychology, Affective Neuroscience, and Affective Computing, Oxford University Press. Next IVA conference → next September. The humaine-news mailing list! ACII in 2013.
  104. 104. Acknowledgements Jean-Paul Sansonnet (LIMSI) for his presentation materials All members of GT ACAI
  105. 105. Conclusion Join the GT ACAI!http://acai.lip6.fr/