Roles of ECAs

ECAs are “Interactive Virtual Characters” situated in mediated (often distributed) environments. They can play four main roles:
- Assistants: welcome users and help them understand and use the structure and functioning of applications
- Tutors: for students in mediated human-learning environments, or for patients in psychological/pathological monitoring systems
- Partners: for actors in virtual environments: partner or adversary in a game, participant in creative design groups, member of a mixed-initiative community, ...
- Companions: a friend in a long-term relationship
Categories of ECAs

Talking heads (e.g. GRETA, Pélachaud, LTCI):
- Fixed, realistic
- Expressions, lip-synch
- Emotions

Gestural agents (e.g. STEVE, Rickel et al., USC):
- Fixed/floating, moving arms
- Dialogue, deictic gestures, sign language
- Tutoring, assistance

Situated agents (e.g. VTE, Gratch et al., USC):
- Complete mobile characters
- Virtual/augmented reality
- Training, action
Scientific issues

Agent models: interaction performance/efficiency of models
- Interaction models with users: multi-modality, H/M dialogue
- Models of cognitive agents: BDI logics and planning, affective logics, ...
- Task models and user models: symbolic representation and reasoning

Human modelling: ecological relevance of models
- Capture
- Representation of physiology
- Reproduction of human expressions
- Evaluation of behaviours
- Application
Scientific communities

Computer Science and Natural Language Processing:
- Multi-Agent Systems
- Human-Machine Interaction/Interfaces
- Human-Machine Dialogue

Humanities and Social Sciences:
- Psychology of Language Interaction
- Ergonomic psychology
Projects and teams

- European project SEMAINE (http://www.semaine-project.eu/)
- LTCI (France): Greta
- LIMSI (France): MARC
- Bielefeld University (Germany): MAX
- USC (USA): STEVE, VTE
Design of animated characters (ECAs or avatars)

Independent of the embodiment level.
- User: what feeling do non-verbal behaviours (postures, gestures, ...) convey?
  - Theoretical approach (related works)
  - Empirical approach (corpus analysis: capture, representation, reproduction)
- Generated animations: from an intention (triggering action)
- Evaluation: automatic and cyclical

Example of a behaviour configuration:

<configuration>
  <timecode>10</timecode>
  <body>front</body>
  <head>front</head>
  <eyes>open happy</eyes>
  ...
  <leftArm>hello</leftArm>
  <rightArm>hip</rightArm>
  <speech>Hello!</speech>
</configuration>
Models based on visemes

Viseme: elementary unit of the positions of the muscles of the human face (3D model).
Experimental settings:
- Small red balls manually placed on the face
- 3 to 5 cameras
- Training corpus

G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
Transitions between expressions

- Torsion model of the face and the neck
- The positions of the red balls are set manually
- Pattern recognition is used to track ball movements
- Extraction of key frames: automatic and manual

G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
Analysis and synthesis

- Visemes + a 3D model are used to (re-)generate expressions
- 6 to 11 parameters
- Validation by comparison to real expressions

G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
‘Talking head’ project (LIMSI)

- Based on visemes
- Video corpus of 21 utterances of the form « C'est /phoneme/ ici » (e.g. « C'est u ici », "It is u here")
Realistic talking heads

- Greta (Prudence, Obadiah, Poppy and Spike), LTCI – Telecom ParisTech
- MARC, LIMSI
Designation of the DOM objects: KUB-OLO-ZIM, GLA-TEK-ZIM
Deploying environments

- Applications
- Web pages
- Ambient environments
Chat bots on the Internet

Chat bot = “chatterbox” + “robot”: a program capable of NL dialogue.
- Usually keyword spotting
- No dialogue session or context
- No embodiment
- Goal: only chat

ELIZA (J. Weizenbaum, 1966)
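The keyword-spotting idea above can be sketched in a few lines. The rules, patterns and canned replies below are illustrative assumptions, not Weizenbaum's original ELIZA script:

```python
import re

# A minimal ELIZA-style chatbot: keyword spotting with canned responses.
# Note there is no dialogue state -- each utterance is handled in isolation,
# exactly the limitation pointed out on this slide.
RULES = [
    (r"\bmother|father|family\b", "Tell me more about your family."),
    (r"\bI am (.+)", "How long have you been {0}?"),
    (r"\bI feel (.+)", "Why do you feel {0}?"),
]
DEFAULT = "Please go on."

def reply(utterance: str) -> str:
    for pattern, template in RULES:
        m = re.search(pattern, utterance, re.IGNORECASE)
        if m:
            # Echo captured keywords back into the template, if any.
            return template.format(*m.groups()) if m.groups() else template
    return DEFAULT
```

For example, `reply("I am sad")` spots the `I am` pattern and echoes the rest back; anything unmatched falls through to the default prompt.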
Social avatars on the web

The recent evolution of communication over the web has prompted the development of so-called “avatars”: graphical entities dedicated to mediating between humans in distributed virtual environments (e.g. the “meeting services”). Contrary to an ECA, which is the personification of a computer program (an artificial agent), an avatar is the personification of a real human who controls it. (Avatar = mask) ≠ agent.

Second Life
Presented as a « metaverse » by its developer, Linden Lab, Second Life is a web-based virtual simulation of the social life of ordinary people.
- 5,000,000 avatars have been created
- ~200,000 visitors are logged in at any time
Mixed communities and heterogeneous MAS

Metaphor: meetings. Definition: computerised environments that transparently integrate humans and artificial agents within meetings.
- Mixed-initiative communities: brainstorming, design, decision, ...
- Virtual Training Environments (VTE): training, simulation, ...

Deploying environments:
- Smart room: physical place with augmented reality
- Web-based: distributed and mediated environments

Animated characters:
- Avatars: virtual characters driven by humans
- Agents: virtual characters that embody artificial agents
Social interaction in virtual environments

Mixed training environments: avatars + ECAs.
Task: virtual firemen
- Recognition of fire
- Activity reports
Interaction:
- Emotion management
- Gestures and facial expressions
- NL dialogue management

M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 2005.11.15 — www.irit.fr/GRIC/VR/
Realistic behaviour of agents

The study of videos highlighted how deictic gestures and gazes towards items are used concomitantly with natural-language words ("here", "it", "now", ...).

M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 2005.11.15 — www.irit.fr/GRIC/VR/
From assistant agents to ambient environments

Application taken from the DIVA toolkit (LIMSI): transporting the function of assistance from stand-alone applications to room-based ambient environments.
DIVA display devices

- HD TV screen
- Large touch screen
- Portable touch screen

Presentation of any Internet content; presentation of the IRoom controls.
DIVA toolkit + IRoom (LIMSI)
Examples from professional web sites

- Lucie, virtual assistant of SFR
- Question ECA, EDF site

P. Suignard, GT ACA, 2004.10.26, http://www.limsi.fr/aca/
Assistant agents: definition

« An Assistant Agent is a software tool with the capacity to resolve help requests, issuing from novice users, about the static structure and the dynamic functioning of software components or services »
InterViews Project – February 1999, following Patti Maes, MIT, 1994.

- User: person with poor knowledge of the component (novice)
- Request: help demand in natural language (speech/text)
- Component: computer application, web service, ambient appliance
- Agent: rational, assistant, conversational (can be embodied)
- Mediator: symbolic model of the structure and the functioning
The assistant metaphor

(Diagram) A novice user sends a request to an assistant agent; the agent answers by reasoning on a symbolic model of the software component.
The agents of the Daft project

(Diagram) An assistant ECA greets the user (« Hello ! », « Comment je peux ... » — "How can I ...") and linguistic analysis is performed by GRASP in two rule phases (phase 1: 74 rules; phase 2: 9 rules). Example, the French noun phrase « le très petit bouton rouge » ("the very small red button"): each token carries features FLEX (inflected form), LEM (lemma), POS (part of speech: DD, RR, JJ, NN) and KEY (semantic key: THE, INTLARGE, SIZESMALL, BUTTON, RED), and successive rule hits group the tokens into a determiner-adjective-noun structure. On the software side, a rational agent exploits a component model — MODEL[ID[main], PARTS[TITLE, FIELD, GROUP[ID[command], PARTS[BUTTON, BUTTON, BUTTON]]], USAGE["This component is ..."]] — together with dialogue, session, profile and modelling agents.
The Rational Assisting Agent

Example: « How many buttons are there? » → formal request ASK[COUNT[REF(BUTTON)]].

The rational agent reasons on, and modifies, two models: an interaction model (session, profile, task, process) and a component model (structure, actions). Its loop:
1. Read a formal request;
2. Contextualize the formal request: explore the interaction model and the component model;
3. Process the formal request: update the interaction model and the component model;
4. Produce the formal answer.

Example answer: TELL[EQUAL[COUNT[REF(BUTTON)],3]] → « There are three buttons ».
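The four-step loop can be sketched as a toy interpreter. The dictionary component model and the handling of a single ASK/COUNT/REF pattern are simplifying assumptions; the real Daft formal language is far richer:

```python
import re

# Toy component model: widget kinds mapped to their instances (assumed data).
component_model = {"BUTTON": ["b1", "b2", "b3"], "FIELD": ["f1"]}

def process(request: str) -> str:
    # 1. Read the formal request, e.g. "ASK[COUNT[REF(BUTTON)]]".
    m = re.fullmatch(r"ASK\[COUNT\[REF\((\w+)\)\]\]", request)
    if not m:
        return "TELL[UNKNOWN]"
    # 2. Contextualize: resolve the reference in the component model.
    kind = m.group(1)
    instances = component_model.get(kind, [])
    # 3. Process: counting needs no model update here.
    n = len(instances)
    # 4. Produce the formal answer.
    return f"TELL[EQUAL[COUNT[REF({kind})],{n}]]"
```

With the toy model above, `process("ASK[COUNT[REF(BUTTON)]]")` yields the slide's example answer; a downstream generator would then verbalise it as « There are three buttons ».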
Dialogue with components

Java interface for the ECA: DaftLea (K. LeGuern, 2004). Standard GUI that can contain Java applets:
- "Counter" component
- "Hanoi" component
- "AMI web site" component

ECA LEA (Java); speech synthesis: Elan Speech; J-C Martin & S. Abrilian.
Processing of a Daft request

Example: « restart the counter ».
1) The utterance is analyzed to build a formal request: RESTART[REF["COUNTER"]]
2) The reference REF is resolved within the model
3) The restart action is applied on the model, causing the model variable INT of the counter to be incremented
4) An update is sent to the corresponding variable of the application
5) The behavioural answer ACKNOWLEDGE is sent to the virtual agent

When the user enters « restart the counter » in the chatbox, he/she can see:
- the number increments: 8, 10, 12, 14, ...
- LEA says « Done! »
- LEA expresses the emote ACKNOWLEDGE
Usefulness of embodiment

A spectrum of embodiment: eyeballs → 2D cartoon → 3D realistic agents → humans.
Evaluation of ECAs

Objective criteria:
- Efficiency: measures the actual performance of the user-agent couple when carrying out a given task.
- Usability: measures the capacity of the user to understand the functioning of the system, and the fluency of the control.

Subjective criteria:
- Friendliness: measures the "feeling" of the user about the features of the system (attraction, commitment, esthetics, comfort, ...).
- Believability: measures the "feeling" of the user that the agent can understand the user's problems and has the capacity to help with real competence.
- Trust: measures the "feeling" of the user that the agent behaves as a trustworthy and cooperative entity.
Ratio realism / friendliness

(Chart) Friendliness (%) plotted against similarity to humans (%), over eyeballs, 2D cartoons, 3D realistic agents, a toy, a Bunraku puppet and humans: the curve dips into the "uncanny valley" before reaching human level.

Masahiro Mori, Tokyo Robotics Institute, in Jasia Reichardt, "Robots are coming", Thames and Hudson Ltd, 1978.
The Persona Effect of Lester

Experiment: 30 subjects (15 M & 15 F, age ≈ 28 years) saw a presentation either with an agent (an ECA) or without (an arrow). Measured improvement of:
- comprehension performance
- recall (memory) performance
- subjective satisfaction questions

• Lester et al. (1997). The Persona Effect: Affective impact of Animated Pedagogical Agents. CHI'97.
• van Mulken et al. (1998). The Persona Effect: How substantial is it? HCI'98.
Persona effect and memorizing

Experiment: 61 subjects (39 M, 22 F). Stories are told to subjects in three situations (without agent, realistic agent, cartoon agent); recall questions are then asked, and the three situations are compared. Example agent utterance: « Ik ben Annita, de assistent bij dit experiment » ("I am Annita, the assistant in this experiment").
- Measured improvement of story-recall performance
- Subjective evaluation: anthropomorphic questions (e.g. "Did you feel the agent was sensitive?")
Result: an ECA improves the memorizing of information.

Beun et al. (2003). Embodied conversational agents: Effects on memory performance and anthropomorphisation. IVA'03.
Experiments on functional description

Three different agents:
- 2 men (Marco, Jules) with different clothing
- 1 woman (Lea)
Three strategies of cooperation:
- Redundancy
- Complementarity
- Specialization
Three objects to describe:
1. Video recorder remote control
2. Photocopier control panel
3. Application for designing graphic documents
3 × 3 × 3 = 27 experiments.

Stéphanie Buisine, LIMSI 2004
Functional description: evaluation results

Quality of explanation:
- No noticeable effect of the appearance
- Noticeable effect of the multimodal behaviour: redundant explanations are preferred
Sympathy:
- No noticeable effect of the multimodal behaviour
- Noticeable effect of the appearance
Performance:
- Same as for appearance
- But no correlation between sympathy and performance
Impact of the user's gender:
- Men react differentially and prefer redundant explanations
- Women do not react differentially

Stéphanie Buisine, LIMSI 2004
Turing test: conversational intelligence

Turing Test, Alan M. Turing (1912-1954): "the imitation game", in "Computing Machinery and Intelligence", Mind, Vol. 59, No. 236, pp. 433-460, 1950.
The test can be replaced by a Turing-like test on output data instead of during direct interaction.
Wizard of Oz methodology

An experimenter pilots an avatar; the user does not know that he/she is interacting with a human (he/she believes it is a virtual agent).
Human-Agent interaction

Multiple inputs: textual (text boxes), audio (microphone), video (camera), sensors (AmI environment), ...
Multiple outputs: textual, audio, facial expressions, postures, gestures, lighting, ...
Multiple models: conversational, emotional, of the environment, of the user, ...
European project SEMAINE: an ECA architecture http://www.semaine-project.eu/
Example of ECA architecture: Semaine project (Spike)
http://semaine.opendfki.de/wiki/ComponentArchitecture
Human-ECA Dialogue S. Pesty, GT ACA, 2005.11.15, http://www.limsi.fr/aca/
Conversational agents

Non-linguistic:
- Textual / GUI: contextual help (pop-ups, ...), online manuals (hypertext navigation), assistants using a decision tree
- Embodied: state display (emotions, emotes)
Unidirectional, linguistic:
- Textual / GUI: query fields embedded in an application window (keywords, free/open questions)
- Embodied: feature presenter
Conversational (bidirectional, linguistic):
- Textual / GUI: chat bots (Eliza), dialogical assistants (InterViews, Daft, Artemis, ...)
- Embodied: mediators acting as representatives of an application or a service; avatars
The reference among chatbots: A.L.I.C.E.

Author: Richard S. Wallace
Website: http://www.alicebot.org
Documentation: http://alicebot.org/TR/2001/WD-aiml-1.0.1-20011025-006.html
Jabberwacky wins the Loebner Prize 2005

...
PROGRAM: Hello, my name is George and I am the human.
JUDGE: Hi George, cslkjslw weicmslw weos, right?
PROGRAM: Don't repeat your name, csthiang!
JUDGE: If lsjkel lwehlx wouldn't you say?
PROGRAM: No, he's still good.
JUDGE: Who.
PROGRAM: You and me.
JUDGE: If you are human, tell me which is larger, a salt crystal or the Empire State Building.
PROGRAM: It's caring about another person's well being more than your own.
JUDGE: What if I care about another person's well being more than I should?
PROGRAM: How would you do that?
JUDGE: Which is further away from us, Queens or Inner Mongolia?
PROGRAM: HA! HA! That was stupid. Tell me a GOOD joke this time.
JUDGE: You say you are human but you can't answer a simple question.
PROGRAM: Because I am alive.
JUDGE: You are not following the rules. You are supposed to convince me that you are a human, not convince me that you are mysterious.
PROGRAM: I can't very well convince one lacking a mind, can I?
JUDGE: It isn't my mind that is in question, it is yours!
PROGRAM: No my name is not katie.
JUDGE: Which is larger, your brain or Belgium?
PROGRAM: Oh the one on the far left without a doubt.
JUDGE: Francais.
PROGRAM: Yes do you speak it?
JUDGE: Francais.
PROGRAM: Francais?
JUDGE: Francais.
PROGRAM: O well I don't speak french.
JUDGE: Who.
PROGRAM: Mine.
JUDGE: The ljwli is what we want, right?
PROGRAM: I can not do it right now because I'm talking to you a million miles away.
JUDGE: What do you think the ljwli.
PROGRAM: Depends on the person.
JUDGE: But the ljwli.
PROGRAM: So they can chuck more wood?
...
Dialogue systems

The feedback loop is often not taken into account: knowing the state of the dialogue (held by the dialogue manager, DM) can improve:
- the automatic speech recognizer
- natural language understanding
The DM should manage multimodality:
- as input (e.g. gestures, facial expressions, emotions)
- as output (e.g. emphasis, voice modulation, utterance + gesture)
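The feedback idea can be illustrated with a toy NLU that consults the DM state to disambiguate an utterance. The states, intents and the tiny grammar below are invented for the example:

```python
# Toy illustration of the DM feedback loop: the same utterance is interpreted
# differently depending on the dialogue state held by the dialogue manager.
def nlu(utterance: str, dm_state: str) -> str:
    text = utterance.strip().lower()
    if text in ("yes", "ok"):
        # Bare confirmations are ambiguous on their own;
        # the DM state resolves what is being confirmed.
        return f"CONFIRM[{dm_state}]"
    if text.startswith("restart"):
        return "REQUEST[RESTART]"
    return "INFORM"
```

Without access to `dm_state`, "yes" could not be mapped to a concrete act; the same mechanism would let a DM state bias an ASR language model.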
Utterance understanding

- Speech act theory (Searle & Vanderveken, 1985), in accordance with Bratman's theory of intention (Bratman, 1987)
- Illocutionary acts (Vanderveken, 1990); formalisation: F(P)
- 5 classes of speech acts: assertive, commissive, directive, declarative, expressive
- Transposition to MAS: KQML, FIPA ACL
Existing dialogue systems

- Case-based systems using keyword spotting: chatbots
- Communication protocols (FIPA ACL): MAS
- Planning systems (Allen 80, Traum 96)
- POMDPs: call centres (Frampton & Lemon 09)
- Dialogue games (Maudet 00)
- Information-State update (Larsson 00): TrindiKit, GoDiS, IbiS
A scientific deadlock, yet unsolved! (see the Mission Rehearsal Exercise, Swartout et al. 2006)
Ratio Effort/Performance

(Chart) Actual performance (10%, 50%, 100%) vs. effort in code and resources (log scale, 1 to 1000). Chat bots (games, socialization, affects, ...) sit at low effort and low performance; H/M dialogue systems (control, command, assistance, ...) require orders of magnitude more effort for higher performance.
What is it? Affective Computing

- Book by R. Picard (1997)
- Detect, interpret, process and simulate human emotions
How is it connected to ECAs?
- (Krämer et al., 2003) People interacting with an ECA tend to be more polite, more nervous, and to behave socially
- (Hess et al., 1999) Emotions play a crucial role in social interactions
→ Affective Conversational Agents!
The story so far...

(Timeline, 1970s-2010s)
- Emotions: R. Picard → Humaine → ACII
- Virtual agents: CHI → IVA; video games → serious games, simulation, online → ECAs everywhere
- H-M dialogue: Eliza (70s) → Trains (90s)
- Agents: MAS → AAMAS
Links from the community

- IVA (Intelligent Virtual Agents): http://iva2012.soe.ucsc.edu/
- ACII (Affective Computing & Intelligent Interaction): http://www.acii2011.org/
- Humaine:
  - FP6 project (1/1/2004 – 31/12/2007)
  - Association
  - http://emotion-research.net/
Research questions

Detect, interpret, process and simulate human emotions → affects.
Cognitive sciences:
- AI and psychology (for emotions)
- AI and sociology (for social relations)
3 domains:
- Recognition: pattern recognition, image and signal processing, machine learning
- Models: formal models, knowledge representation & reasoning, behaviour models
- Synthesis: models of interactions, synchronisation, fusion, turn taking
Emotions synthesis

Pipeline: from emotion to parameters (parametrization), state transitions, expressing dynamics; generalization from an annotated corpus of emotions.
Evaluation:
- Human recognition of expressed emotions is not reliable, even in H-H interaction (with spontaneous emotions)
Some examples:
- Virtual agents: Greta/Semaine
- Robots: Kismet, iCat, Nao
From annotations to animations

Original GRETA videos: Anger, Desperation (C. Pélachaud, LTCI, and J-C Martin, LIMSI); Joy, Anger, Fear, Surprise, Distress.
A bit of Human Science

« The question is not whether intelligent machines can have any emotion, but whether machines can be intelligent without any emotions. » Marvin Minsky, 1986

Darwin, 1872 (Ed. 2001), The expression of emotions in man and animals → adaptation mechanism + role in communication (Ekman, 1973).
2 schools:
- James, 1887: organism changes → emotion
- Lazarus, 1984; Scherer, 1984: appraisal theory
Appraisal theory

(Diagram) Perceptions (stimuli) from the environment are evaluated in context by the individual, according to his/her goals, preferences and personality; the evaluation produces an emotion, which drives actions on the environment (adaptation, a.k.a. coping).
Appraisal theory (2)

Coping (adaptation strategy) = the process that one puts between oneself and the punishing event:
- Active coping (i.e. problem-focused) → act upon one's environment
- Passive coping (i.e. emotion-focused) → act upon one's emotion
Rousseau & Hayes-Roth, 1997:
- Personality → emotions (Watson & Clark, 1992)
- Emotions → social relations (Walker, 1997)
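The two coping styles can be contrasted in a minimal sketch. The returned action names and the halving factor are illustrative assumptions, not part of any cited model:

```python
# Active (problem-focused) vs. passive (emotion-focused) coping.
def cope(emotion: str, intensity: float, style: str):
    if style == "active":
        # Problem-focused: act upon the environment; the emotion persists
        # until the event itself changes.
        return ("act_on_environment", intensity)
    # Emotion-focused: reappraise and downregulate the emotion itself.
    return ("reappraise_" + emotion, intensity * 0.5)
```

The design point is that the two branches target different things: the active branch outputs an environment action, the passive branch outputs an internal update of the emotional state.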
Architecture

(Diagram) Events and personality determine emotions and mood; the social role sets the initial social relations; emotions, mood and social relations drive the actions.
Models of personality

Personality = the stable component in an individual's behaviour, attitudes, reactions, ...
Computational models:
- Define variables
- Define their value domains
- Define the characteristic values
Eysenck, 1967:
- Extraversion → higher sensitivity to positive emotions (and more expressive)
- Neuroticism → higher sensitivity to negative emotions (and more frequent changes)
Models of personality (2)

McCrae, 1987 (a.k.a. OCEAN or the « Big Five » model):
- Openness → curiosity
- Conscientiousness → organization
- Extraversion → exteriorization
- Agreeableness → cooperation
- Neuroticism → emotional instability
Myers-Briggs Type Indicator (MBTI), based on Jung:
- Energy → introverted or extraverted
- Information collection → sensing or intuitive
- Decision → thinking or feeling
- Action → judging or perceiving
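A computational personality profile can follow the recipe given earlier (variables, value domains, characteristic values), here with the five OCEAN traits in [0, 1]. The gain rule follows Eysenck's observations qualitatively; its exact formula is an assumption for illustration:

```python
from dataclasses import dataclass

@dataclass
class Personality:
    # The five OCEAN traits, each in [0, 1]; 0.5 as a neutral default.
    openness: float = 0.5
    conscientiousness: float = 0.5
    extraversion: float = 0.5
    agreeableness: float = 0.5
    neuroticism: float = 0.5

    def emotion_gain(self, valence: str) -> float:
        # Extraversion amplifies positive emotions,
        # neuroticism amplifies negative ones (after Eysenck, 1967).
        if valence == "positive":
            return 1.0 + self.extraversion
        return 1.0 + self.neuroticism

# A characteristic profile: a stable extravert.
extravert = Personality(extraversion=0.9, neuroticism=0.1)
```

Multiplying appraisal intensities by `emotion_gain` makes the same event feel stronger or weaker depending on the personality, which is how such profiles are typically plugged into an emotion model.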
One example: ALMA (Gebhard, 2005)

(Diagram) Events are appraised into emotions; emotions and the OCEAN personality profile determine the mood.
Emotions models

What is an emotion?
- Darwin, James, Ekman, Scherer, ...
Characterizing an emotion:
- Category-based approaches → there are N emotions (N being subject to discussion)
- Dimension-based approaches → an emotion is a point in an N-dimensional space (the dimensions must form a basis)
The OCC model

Ortony, Clore, Collins, 1988: 20 emotion categories.
- Joy – Distress
- Fear – Hope
- Relief – Disappointment, Fear-confirmed – Satisfaction
- Pride – Shame, Admiration – Reproach
- Love – Hate
- Happy-for – Resentment, Gloating – Pity
- Remorse – Gratification, Gratitude – Anger
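A tiny appraisal function in the OCC spirit: the emotion category depends on the desirability of an event for the agent's goals and on whose fortunes it affects. Only four of the categories are sketched, and the zero threshold is an illustrative assumption:

```python
# Minimal OCC-style appraisal of event consequences.
def appraise(desirability: float, concerns: str) -> str:
    if concerns == "self":
        # Consequences for the agent itself.
        return "joy" if desirability > 0 else "distress"
    # Fortunes-of-others branch of the OCC hierarchy.
    return "happy-for" if desirability > 0 else "pity"
```

A full OCC implementation would branch further on prospect (hope/fear), agency (pride/reproach) and object appraisal (love/hate), following the same pattern.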
OCC (cont.)

Appraisal model based on the individual's goals & preferences.
One example: EMA (Gratch & Marsella, 2010)

EMotions & Adaptation → appraisal → coping (action selection). Based on SOAR.
Evaluation criteria:
- Relevance
- Point of view
- Desirability (expected utility)
- Probability & expectedness
- Cause
- Control (by me and by others)
EMA (cont.)

Rule-based system, e.g.: Desirability(self, p) > 0 & Likelihood(self, p) < 1.0 → hope
Coping strategies:
- Perceptions
- Belief revision (including about other agents' intentions and responsibilities)
- Changing goals
Simulation model; application to serious games.
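The hope rule above can be written directly as executable code. The symmetric fear rule is an assumption added by analogy, not taken from EMA itself:

```python
from typing import Optional

# EMA-style prospect rule: a desirable proposition that is not yet certain
# yields hope; an undesirable uncertain one yields fear (assumed symmetric
# rule). A certain proposition (likelihood == 1.0) triggers neither.
def prospect_emotion(desirability: float, likelihood: float) -> Optional[str]:
    if likelihood < 1.0:
        if desirability > 0:
            return "hope"
        if desirability < 0:
            return "fear"
    return None
```

In a full system such rules fire over the agent's causal interpretation of its plans, and the resulting emotions feed the coping strategies listed above.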
Social relations

Social behaviour models:
- Static models: Walker, Rousseau et al., Gratch, ...
- Dynamic model: Ochs et al.
Several traits:
- Liking
- Dominance
- Solidarity (or social distance)
- Familiarity (subject to discussion)
→ A social relation is unidirectional and not always reciprocal!
Emotions and Social Relations

Influence of emotions on the social relation (examples):
- Ortony, 91: caused positive emotions → + liking
- Pride, Reproach → + dominance; Fear, Distress, Admiration → - dominance
Influence of the social relation on action and communication:
- Action selection guided by the target social relation
- Emotional contagion → neighbours computed on the basis of social relations
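The influence rules listed above can be expressed as a small relation-update function. The directions follow the slide; the step size 0.1 and the grouping of emotions are illustrative assumptions:

```python
# Update a (liking, dominance) social relation towards another agent,
# given an emotion that this other agent caused.
def update_relation(liking: float, dominance: float, emotion: str):
    if emotion in ("joy", "gratitude"):              # caused positive emotion
        liking += 0.1                                # → + liking (Ortony, 91)
    if emotion in ("pride", "reproach"):
        dominance += 0.1                             # → + dominance
    if emotion in ("fear", "distress", "admiration"):
        dominance -= 0.1                             # → - dominance
    return liking, dominance
```

Because each agent keeps its own (liking, dominance) values towards each other agent, the resulting relation is unidirectional and need not be reciprocal, as noted on the previous slide.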
One example: OSSE attitudes (Ochs et al., 2008)

- Scenario = set of events
- Events → emotions → social relation dynamics
- Social roles → initial relation
Learn more!

Books:
- Rosalind Picard, Affective Computing, MIT Press
- Scherer, Bänziger, & Roesch (Eds.), A blueprint for an affectively competent agent: Cross-fertilization between Emotion Psychology, Affective Neuroscience, and Affective Computing. Oxford University Press
Next IVA conference → next September
humaine-news mailing list!
ACII in 2013
Acknowledgements

- Jean-Paul Sansonnet (LIMSI) for his presentation materials
- All members of GT ACAI
Conclusion

Join the GT ACAI!
http://acai.lip6.fr/