1. Caring Computers & Emotional Engines
To what extent can technology have emotions, express emotions and elicit emotions?
Stephen Appleton & Becky Wright
2. Aims:
• Is it possible to create emotions in artificial systems?
• What might it mean for a computer to possess 'emotions'?
• What are the potential problems and limitations?
• Can computers express emotions?
• Can people perceive technology as capable of feelings?
3. Media Perception:
• Suggests emotion is possible in artificial lifeforms
• Can have both negative and positive results, e.g. 'HAL 9000' from '2001: A Space Odyssey', the child android from 'AI', 'Data' from 'Star Trek'
• Computers = rational, logical; emotions = illogical
• Emotions are what separate humans from machines!
4. Definition:
Biological:
• Neurological underpinning
• Neurotransmitters, e.g. dopamine, endorphins
• Hormones, e.g. cortisol, adrenaline
• Also: physiological states, e.g. the 'fear response'
Cognitive (the AI focus; Boden, 1996):
• Functionalist viewpoint: the mental operations of emotions
Conscious:
• Awareness of emotions
• Feelings, qualia
• Breadth and variety of affective experiences (Rolls, 2005)
5. Function of emotions: evolutionary context (Sloman, 1990; Cañamero, 2002)
The organism:
• Physical limits
• Autonomous
• Multiple goals
• Cognitive limits: attention, memory, information-processing
The environment:
• Complex
• Unpredictable
• Which data are relevant?
6. Function of emotions, continued:
• "An emotion is usually caused by a person consciously or unconsciously evaluating an event as relevant to a concern (a goal) that is important: the emotion is felt as positive when a concern is advanced and negative when a concern is impeded." (Oatley & Jenkins, 1996, p.96)
• "What emotions are about is action (or motivation for action) and action control" (Frijda, 1995, p.506)
• Emotions as cognitive appraisers: they alert us to goal-relevant events/stimuli, and do this in parallel for all concerns (Frijda, 1995)
• Emotions as somatic markers: 'gut feelings' that facilitate the decision-making process (Damasio, 1994)
• "An intelligent system… requires mechanisms whose total effect includes the ability to produce emotional states." (Sloman & Croucher, 1981, p.2)
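The appraisal view quoted above lends itself to a simple computational reading. The sketch below is a minimal illustration of that idea only, not any cited author's implementation: an event is evaluated against a set of weighted goals, and the resulting valence is positive when a concern is advanced and negative when it is impeded, echoing Oatley & Jenkins' definition. The class and function names (`Goal`, `appraise`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    """A concern the agent cares about, weighted by its importance."""
    name: str
    importance: float  # 0.0 (trivial) .. 1.0 (vital)

def appraise(event_effects: dict, goals: list) -> dict:
    """Evaluate an event against every goal in parallel.

    event_effects maps a goal name to how much the event advances (+)
    or impedes (-) that goal. The returned valences act as crude
    'emotions': positive when a concern is advanced, negative when impeded.
    """
    valences = {}
    for goal in goals:
        effect = event_effects.get(goal.name, 0.0)
        valences[goal.name] = goal.importance * effect
    return valences

# Example: a battery-low event for a robot whose concerns are exploring and staying charged.
goals = [Goal("explore", 0.4), Goal("stay_charged", 0.9)]
event = {"explore": -0.2, "stay_charged": -0.8}   # the event impedes both goals
print(appraise(event, goals))                      # negative valences ~ 'distress'
```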
7. 'Goals' & 'Beliefs':
Humans:
• The '4 Fs' (Feed, Flight, Fight, Reproduction)
• Natural selection: no purpose or intent behind the design, the 'Blind Watchmaker' (Dawkins, 1986)
Machines:
• Task-oriented goals
• A human programmer, with a specific purpose and intent (Rolls, 2005)
• The 'Chinese Room' problem
If a computer has no goals or concerns of its own, can it really be said to possess emotions about such things? (Sloman, 1990)
8. Computers & 'qualia':
• Must a computer be aware of its own emotive intentions and feelings?
• Some primary emotions can occur prior to conscious awareness (Picard, 1997)
• Animal and infant emotions: lack of awareness/intent (Sloman & Croucher, 1981)
• Clinical examples, e.g. repressors (Derakshan & Eysenck, 1999)
• "Consciousness is a prerequisite only to the emotional experience and not to the generation of the emotion." (Picard, 1997, p.74)
• "We, all of us, overanthropomorphize humans, who are after all mere machines" (Brooks, 2005, cited in Blackmore, 2003, p.215)
• 'General Problem Solvers' (Newell & Simon, 1958, cited in Weizenbaum, 1976)
9. Physiological components of emotions:
• Does a lack of biological simulation prevent the development of artificial emotions?
• 'COG': synthetic hormones (Brooks, 2005)
• James-Lange theory of emotion (James, 1884; Lange, 1885): the bodily response precedes the felt emotion
• But… paraplegics; 'cold' emotions (Sloman, 1990)
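Brooks' mention of synthetic 'hormones' in COG suggests one way to approximate a physiological component without biology. The fragment below is only a guess at the general idea, not COG's actual code: a hormone-like scalar spikes in response to a stimulus and then decays over time, biasing later behaviour the way adrenaline or cortisol might. The class name, `half_life` parameter and update rule are assumptions made for illustration.

```python
import math

class SyntheticHormone:
    """A hormone-like state variable: spikes on stimulation, decays exponentially.

    Purely illustrative; not taken from the COG project.
    """
    def __init__(self, half_life: float = 10.0):
        self.level = 0.0
        self.decay = math.log(2) / half_life  # per-second decay rate

    def stimulate(self, amount: float) -> None:
        self.level += amount

    def step(self, dt: float) -> None:
        self.level *= math.exp(-self.decay * dt)

# A crude 'fear response': a loud noise raises the level, which then fades;
# any behaviour that reads fear.level is biased while it stays high.
fear = SyntheticHormone(half_life=5.0)
fear.stimulate(1.0)          # loud noise
for _ in range(10):
    fear.step(dt=1.0)        # one simulated second per step
print(round(fear.level, 3))  # ~0.25 after 10 s with a 5 s half-life
```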
10. 'ELIZA' the Psychotherapist:
Girl:  My boyfriend made me come here.
ELIZA: Your boyfriend made you come here
Girl:  He says I'm depressed much of the time.
ELIZA: I am sorry to hear you are depressed
Girl:  It's true. I am unhappy.
ELIZA: Do you think coming here will help you not to be unhappy
Girl:  I need some help, that much seems certain.
ELIZA: What would it mean to you if you got some help
Girl:  Perhaps I could learn to get along with my mother.
ELIZA: Tell me more about your family
(Weizenbaum, 1976, pp.3-4)
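The exchange above is produced by simple keyword matching and pronoun reflection rather than by any understanding. The snippet below is a minimal sketch of that technique; the patterns and responses are invented for illustration and are far cruder than Weizenbaum's actual DOCTOR script.

```python
import re

# Reflect first-person words so the reply can mirror the user's phrasing.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are", "i'm": "you are"}

# (pattern, response-template) pairs in the spirit of the DOCTOR script.
RULES = [
    (re.compile(r"(.*)made me (.*)", re.I), "{0} made you {1}"),
    (re.compile(r"i need (.*)", re.I),      "What would it mean to you if you got {0}"),
    (re.compile(r"i am (.*)", re.I),        "I am sorry to hear you are {0}"),
    (re.compile(r".*\bmother\b.*", re.I),   "Tell me more about your family"),
]

def reflect(text: str) -> str:
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in text.split())

def eliza(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(utterance)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."   # default non-directive prompt

print(eliza("My boyfriend made me come here."))  # -> your boyfriend made you come here.
```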
11. Emotional impact of 'ELIZA':
• "I can imagine the development of a network of computer psychotherapeutic terminals, something like arrays of large telephone booths, in which, for a few dollars a session, we would be able to talk with an attentive, tested, and largely nondirective psychotherapist." (Sagan, 1975, cited in Weizenbaum, 1976, p.5)
12. The Media Equation:
• The human tendency to treat computers as if they were real-life, social beings (Reeves & Nass, 2002)
• Natural, often subconscious, with an evolutionary backdrop
• 'Emergent' emotions (Picard, 1997): human examples, e.g. actors, service staff
• Even no communication is a type of communication! (Bartneck et al., 2004); the blank-face analogy…
• 'Ontrack Data Recovery' survey of user responses: 'sweet-talking' the computer, physical abuse of the computer (Sullivan, 2005)
13. 'Natural' emotions:
• 'Baldi', a computer-generated head: its facial expressions are as easy to label as their human equivalents (Massaro et al., 2000)
• Synthetic speech: recognition of emotional content (Cahn, 1990, cited in Brave et al., 2005)
• However, the 'natural' look can be taken too far… (Hara, 2000, cited in Menzel & D'Aluisio, 2000)
• The 'Uncanny Valley' (Mori, 1970)
• To be convincing, an agent's embodiment needs to match its skills (Bartneck, 2001)
• Disturbing if it fails to meet expectations (Picard & Klein, 2002)
14. Kismet:
• Anthropomorphic robotic head, specialised for face-to-face interaction (Breazeal, 2003)
• 'Infant-like' emotional behaviour
• Endowed with a basic 'emotive' system; goals = 'interact with humans', 'play with toys', 'rest'
• Emotional expressions
• Speech: 'babble', with emotive intent conveyed through prosody
• 'Empathy' skills
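Kismet's behaviour arises from homeostatic drives tied to goals such as social interaction, stimulation and rest (Breazeal, 2003). The sketch below is a loose, hypothetical rendering of that idea rather than Kismet's actual architecture: each drive accumulates until the matching behaviour satiates it, and the most pressing drive determines what the robot does next.

```python
class Drive:
    """A homeostatic drive that grows over time and resets when satisfied."""
    def __init__(self, name: str, behaviour: str, growth: float):
        self.name = name
        self.behaviour = behaviour   # behaviour that satiates this drive
        self.growth = growth         # how fast the urge accumulates per tick
        self.level = 0.0

    def tick(self) -> None:
        self.level += self.growth

    def satiate(self) -> None:
        self.level = 0.0

# Three drives loosely modelled on Kismet's stated goals.
drives = [
    Drive("social",      "interact with humans", growth=0.3),
    Drive("stimulation", "play with toys",       growth=0.2),
    Drive("fatigue",     "rest",                 growth=0.1),
]

def select_behaviour(drives: list) -> str:
    """Advance all drives, then act on the most pressing one."""
    for drive in drives:
        drive.tick()
    urgent = max(drives, key=lambda d: d.level)
    urgent.satiate()
    return urgent.behaviour

for _ in range(5):
    print(select_behaviour(drives))  # alternates between the three behaviours
```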
15. Conclusion:
Q. Can computers be programmed with an emotive function?
A. It depends on your definition of 'emotions':
• Emotions are a program within a human 'robot' (e.g. Brooks; Newell & Simon), so it is just a matter of adding that program to artificial robots.
• Emotions occur at more than just an operational level: they are a conscious experience, not a simulated behaviour. Computers can never have emotions because they can never be conscious.
Are computer 'emotions' real, or 'as if' simulations?…
16. References:
• Arbib, M.A. & Fellous, J. (2004) 'Emotions: From Brain to Robot', Trends in Cognitive Sciences, 8(12), p.554-561
• Bartneck, C. (2001) 'How Convincing is Mr. Data's Smile: Affective Expressions of Machines', User Modeling and User-Adapted Interaction, 11, p.279-295
• Bartneck, C., Reichenbach, J. & van Breemen, A. (2004) 'In Your Face, Robot! The Influence of a Character's Embodiment on How Users Perceive Its Emotional Expressions', web page: http://www.bartneck.de/work/bartneckDE2004.pdf
• Blackmore, S. (2003) 'Consciousness: An Introduction', Hodder & Stoughton, London
• Boden, M.A. (ed) (1996) 'The Philosophy of Artificial Life', Oxford University Press, Oxford
• Brave, S., Nass, C. & Hutchinson, K. (2005) 'Computers That Care: Investigating the Effects of Orientation of Emotion Exhibited by an Embodied Computer Agent', Int. J. Human-Computer Studies, 62, p.161-178
• Breazeal, C. (2003) 'Emotion and Sociable Humanoid Robots', Int. J. Human-Computer Studies, 59, p.119-155
• Brooks, R. (2005) 'COG', web page: http://groups.csail.mit.edu/lbr/humanoid-robotics-group/cog/
• Cañamero, L.D. (2002) 'Designing Emotions for Activity Selection in Autonomous Agents', in Trappl, R., Petta, P. & Payr, S. (eds) 'Emotions in Humans and Artifacts', MIT Press, Cambridge, US, p.115-148
• Damasio, A.R. (2004) 'William James and the Modern Neurobiology of Emotion', in Evans, D. & Cruse, P. (eds) 'Emotion, Evolution, and Rationality', Oxford University Press, Oxford, p.3-14
• Damasio, A.R. (1994) 'Descartes' Error: Emotion, Reason and the Human Brain', Putnam, New York
• Dawkins, R. (1986) 'The Blind Watchmaker' (2000 edition), Penguin, London
• Derakshan, N. & Eysenck, M.W. (1999) 'Are Repressors Self-Deceivers or Other-Deceivers?', Cognition & Emotion, 13, p.1-17
• Frijda, N.H. (1995) 'Emotions in Robots', in Roitblat, H.L. & Meyer, J. (eds) 'Comparative Approaches to Cognitive Science', MIT Press, Cambridge, US
17. References (continued):
• Martínez-Miranda, J. & Aldea, A. (2005) 'Emotions in Human and Artificial Intelligence', Computers in Human Behavior, 21, p.323-341
• Menzel, P. & D'Aluisio, F. (2000) 'Robo Sapiens: Evolution of a New Species', MIT Press, Cambridge, US
• Mori, M. (1970) 'The Uncanny Valley', web page: http://www.everything2.com/index.pl?node_id=1687559
• Oatley, K. & Jenkins, J.M. (1996) 'Understanding Emotions', Blackwell Press, Oxford
• Picard, R.W. (1997) 'Affective Computing', MIT Press, Cambridge, US
• Picard, R.W. & Klein, J. (2002) 'Computers That Recognise and Respond to User Emotion: Theoretical and Practical Implications', Interacting With Computers, 14, p.141-169
• Pinker, S. (1997) 'How The Mind Works', Penguin, London
• Reeves, B. & Nass, C. (2002) 'The Media Equation', CSLI Publications, Stanford, US
• Rolls, E.T. (2005) 'Emotion Explained', Oxford University Press, Oxford
• Sloman, A. (1990) 'Motives, Mechanisms and Emotions', in Boden, M.A. (ed) 'The Philosophy of Artificial Intelligence', Oxford University Press, Oxford, p.231-247
• Sloman, A. & Croucher, M. (1981) 'You Don't Need a Soft Skin to Have a Warm Heart: Towards a Computational Analysis of Motives and Emotions', web page: http://www.cs.bham.ac.uk/research/cogaff/sloman-croucher-warm-heart.pdf
• Stern, A. (2002) 'Creating Emotional Relationships with Virtual Characters', in Trappl, R., Petta, P. & Payr, S. (eds) 'Emotions in Humans and Artifacts', MIT Press, Cambridge, US, p.333-362
• Sullivan, B. (2005) 'Drop the Mouse and Step Away from the PC', web page: http://www.msnbc.msn.com/id/7329279
• Weizenbaum, J. (1976) 'Computer Power and Human Reason: From Judgment to Calculation', W.H. Freeman and Company, New York
18. Discussion Questions:
• Is Kismet's basic emotional system an example of real emotions or 'as if' emotions? Could Kismet be said to have a partial subset of emotions, or is it just a simulation model?
• If a machine can ever be said to have its own emotions, it arguably needs to be able to develop its own goals and beliefs. Would we be able to recognise and empathise with a computer's own drives and emotions, or would we end up displaying some form of 'artificial autism'?
• Although the media equation is powerful, it is not infallible: people can override their natural empathic tendencies, as demonstrated by Bartneck in his robot replication of Milgram's electric-shock experiment. If machines were ever capable of emotions, should this change how we treat them?