Emotionally Intelligent Caredroids that Act Morally Good - Talk at Humans in Service: Design Challenges, TU Delft 12-12-2012

We present an integration of rational moral reasoning with emotional intelligence. The moral reasoning system alone could not simulate the different human reactions to the Trolley dilemma and the Footbridge dilemma. However, the combined system can simulate these human moral decision-making processes. The introduction of affect in rational ethics is important when robots communicate with humans in a practical context that includes moral relations and decisions. Moreover, the combination of reason and affect may be useful for applications in which human moral decision-making behavior is simulated, for example when agent systems or robots provide healthcare support.

  • Science + Health Care + Creative Industry triangle: Patient / Care-deliverer / Robot. The robot takes over repetitive tasks, so that the care-deliverer has time for medical and social tasks.
  • All functionalities can be in the same robot; the same functionality can be in different kinds of robots (physical robot, agent, app).
  • Robots are often already very human-like in appearance, but not yet in behavior
  • We develop Caredroids: care-agents and care-robots that assist care-deliverers and patients.
  • Results: the system's behavior matches the decisions of medical-ethical experts.
  • Wallach, Franklin & Allen: reason/logic alone is not enough for ethical behavior towards humans. Silicon Coppelia → emotional intelligence, theory of mind, personalization (through adaptation / learning from interaction).
  • Moral personal = more emotionally engaging
  • Model from media perception, reused to let the medium perceive the user. Walk through the model; also explain emotion regulation. Moral and Affective Decisions, Personalized → User-centered.
  • Robot (my models) vs Human: multiple choice & emotions. What did this character think of you? Participants see no difference; you could call this a passed Turing Test.
  • The Moral Reasoning system alone could not simulate the difference between the trolley dilemma and the footbridge dilemma. Combined with Silicon Coppélia, it could simulate these human moral decision-making processes.
  • 1: The robot tries to convince an elder to take pills. 2: Entertainment → bad can also be interesting.
  • Persuasion: make use of history, social network, gamification, communication styles. Health Care Intervention Models → increase affordances.

    1. Emotionally Intelligent Caredroids that Act Morally Good
       Matthijs Pontier, m.a.pontier@vu.nl
    2. Outline of this presentation
       • SELEMCA
       • Moral reasoning system
       • Emotional Intelligence: Silicon Coppelia
       • Moral reasoning + Silicon Coppelia = Moral Coppelia
       • Future Work
       Delft, 12-12-2012, HUMANS IN SERVICE: DESIGN CHALLENGES
    3. SELEMCA
       • Develop 'Caredroids': Robots or Computer Agents that assist Patients and Care-deliverers
       • Focus on patients who stay in long-term care facilities
    4. Applications: Care Agents
    5. Applications: Care Robots
    6. Possible functionalities
       • Care-broker: find care that matches the needs of the patient
       • Companion: become friends with the patient to prevent loneliness and activate the patient
       • Coach: assist the patient in making healthy choices: exercising, eating healthy, taking medicine, etc.
    7. How people perceive caredroids
       • People perceive caredroids in terms of:
         • Affordances
         • Ethics
         • Aesthetics
         • Realism
    8. Aesthetics / Realism of Caredroids
       • Uncanny valley: too human-like makes it eerie
       • We associate almost-human with death: zombies, etc.
       → Design robots whose realism in appearance matches their realism in behavior
    9. Affordances of Caredroids
       • Make sure the caredroid is a useful tool for the patients and the care-deliverers
       → Make sure the caredroid is an expert in its task
       → Make sure the caredroid personalizes its behavior to the user
    10. Ethics of Caredroids
       • Make the robot behave ethically, so that patients perceive the robot as ethically good
       • Patients are in a vulnerable position
       → Moral behavior of the robot is extremely important
       We focus on Medical Ethics
       • Conflicts between:
         1. Autonomy
         2. Beneficence
         3. Non-maleficence
         4. Justice
    11. Background: Machine Ethics
       • Machines are becoming more autonomous
         Rosalind Picard (1997): "The greater the freedom of a machine, the more it will need moral standards."
       • Machines interact more with people
         We should ensure that machines do not harm us or threaten our autonomy
       • Machine ethics is important to establish perceived trust in users
    12. Moral reasoning system
       We developed a rational moral reasoning system that is capable of balancing between conflicting moral goals.
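A minimal sketch of such a balancing mechanism, under stated assumptions: every weight, ambition level, and action name below is an illustrative invention, not the actual SELEMCA parameters. Each candidate action is scored by how well it satisfies each of the four medical-ethical principles, weighted by the relative importance of each duty, and the best-scoring action wins.

```python
# Hypothetical sketch of duty-weighted moral balancing.
# Weights and duty-satisfaction values are illustrative assumptions.

DUTIES = ["autonomy", "beneficence", "non-maleficence", "justice"]
WEIGHTS = {"autonomy": 0.6, "beneficence": 1.0,
           "non-maleficence": 1.2, "justice": 0.8}

def moral_score(action):
    """Weighted sum of how well an action satisfies each duty (-1..1)."""
    return sum(WEIGHTS[d] * action["duties"][d] for d in DUTIES)

def choose(actions):
    """Pick the action that best balances the conflicting moral goals."""
    return max(actions, key=moral_score)

# Two illustrative actions with made-up duty-satisfaction values:
actions = [
    {"name": "insist on medication",
     "duties": {"autonomy": -0.8, "beneficence": 0.9,
                "non-maleficence": 0.5, "justice": 0.0}},
    {"name": "respect refusal",
     "duties": {"autonomy": 1.0, "beneficence": -0.4,
                "non-maleficence": -0.2, "justice": 0.0}},
]
print(choose(actions)["name"])
```

With these (invented) numbers the beneficence and non-maleficence gains outweigh the autonomy loss, so the system insists; shifting the weights shifts the outcome, which is exactly what "balancing between conflicting moral goals" means here.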
    13. Limitations of moral reasoning
       • Moral reasoning alone results in very cold decision-making, only in terms of rights and duties
       • Wallach, Franklin & Allen (2010): "even agents who adhere to a deontological ethic or are utilitarians may require emotional intelligence as well as other 'supra-rational' faculties, such as a sense of self and a theory of mind"
       • Tronto (1993): "Care is only thought of as good care when it is personalized"
    14. Problem: Not Able to Simulate Trolley Dilemma vs Footbridge Dilemma
       • Greene et al. (2001) find that moral dilemmas vary systematically in the extent to which they engage emotional processing, and that these variations in emotional engagement influence moral judgment.
       • Their study was inspired by the difference between two variants of an ethical dilemma:
         Trolley dilemma (moral impersonal): divert a runaway trolley with a switch, killing one person to save five
         Footbridge dilemma (moral personal): push a person off a bridge to stop the trolley, killing one to save five
       • Most people accept the impersonal sacrifice but refuse the personal one
    15. Solution: Add Emotional Processing
       • Previously, we developed Silicon Coppelia, a model of emotional intelligence
       • This can be projected into others for a Theory of Mind
       • Learns from experience → Personalization
       → Connect Moral Reasoning to Silicon Coppelia
         • More human-like moral reasoning
         • Personalize moral decisions and communication about moral reasoning
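The "learns from experience → personalization" step can be pictured with a toy update rule; the learning rate, the feedback signal, and the update itself are assumptions for illustration, not Silicon Coppelia's actual mechanism. After each interaction, the caredroid nudges its felt involvement and distance towards what the user's feedback suggests.

```python
# Hypothetical sketch of learning felt involvement/distance from
# interaction feedback in [-1, 1]. Rate and rule are illustrative.

def update_affect(involvement, distance, feedback, rate=0.2):
    """Move involvement toward positive feedback, distance away from it."""
    involvement += rate * (feedback - involvement)
    distance += rate * (-feedback - distance)
    clamp = lambda x: max(0.0, min(1.0, x))   # keep both in [0, 1]
    return clamp(involvement), clamp(distance)

inv, dist = 0.5, 0.5                  # neutral starting stance
for fb in [1.0, 1.0, -0.5]:           # two good interactions, one bad
    inv, dist = update_affect(inv, dist, fb)
print(round(inv, 3), round(dist, 3))
```

Repeated positive interactions raise involvement and lower distance, so the same moral decision can later be communicated in a warmer, more personalized way.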
    16. Silicon Coppelia
    17. Speed-dating Experiment
    18. Silicon Coppelia + Moral Reasoning
       Decisions based on:
       1. Rational influences
          • Does the action help me to reach my goals?
       2. Affective influences
          • Does the action reflect the Involvement I feel towards the user?
          • Does the action reflect the Distance I feel towards the user?
       3. Moral reasoning
          • Is this action morally good?
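The three influences above can be blended into a single expected-satisfaction score. This is a sketch under stated assumptions: the weights, the feature names, and all values are illustrative inventions, not Silicon Coppelia's actual parameters.

```python
# Hypothetical blend of rational, affective and moral influences.
# Weights and action attributes are illustrative assumptions.

def expected_satisfaction(action, felt_involvement, felt_distance,
                          weights=(0.4, 0.3, 0.3)):
    """Weighted blend of the three influences on the slide."""
    w_rational, w_affective, w_moral = weights
    rational = action["goal_achievement"]                  # 1. reach my goals?
    affective = (felt_involvement * action["expresses_involvement"]
                 + felt_distance * action["expresses_distance"])  # 2. reflect what I feel?
    moral = action["moral_goodness"]                       # 3. morally good?
    return w_rational * rational + w_affective * affective + w_moral * moral

actions = [
    {"name": "gentle reminder", "goal_achievement": 0.7,
     "expresses_involvement": 0.8, "expresses_distance": 0.1,
     "moral_goodness": 0.6},
    {"name": "stern warning", "goal_achievement": 0.9,
     "expresses_involvement": 0.2, "expresses_distance": 0.7,
     "moral_goodness": 0.3},
]
felt_involvement, felt_distance = 0.8, 0.2   # droid feels close to this user
best = max(actions,
           key=lambda a: expected_satisfaction(a, felt_involvement, felt_distance))
print(best["name"])
```

With these numbers the warmer, more moral option beats the more goal-effective but distant one, because the droid's felt involvement amplifies actions that express involvement.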
    19. Results: Moral Reasoning + Silicon Coppelia

                              Kill 1 to Save 5   Do Nothing
        Moral system
          Trolley                     X
          Footbridge                  X
        Moral Coppelia
          Trolley                     X
          Footbridge                                  X
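The pattern reported on this results slide can be reproduced with a toy calculation; the aversion penalty is an assumed value standing in for the emotional engagement that "moral personal" dilemmas trigger. A purely utilitarian score prefers sacrificing one to save five in both dilemmas; adding the affective term flips only the Footbridge case.

```python
# Toy reproduction of the results pattern. The utilitarian count is the
# rational part; the aversion value 6.0 is an illustrative assumption.

def decide(saved, lost, personal, with_affect):
    utilitarian = saved - lost                     # +4 in both dilemmas
    aversion = 6.0 if (with_affect and personal) else 0.0
    return "kill 1 to save 5" if utilitarian - aversion > 0 else "do nothing"

for system, with_affect in [("Moral system", False), ("Moral Coppelia", True)]:
    for dilemma, personal in [("Trolley", False), ("Footbridge", True)]:
        print(f"{system:14} {dilemma:10} -> {decide(5, 1, personal, with_affect)}")
```

Only the affect-enabled system distinguishes the two dilemmas, mirroring the human judgments Greene et al. observed.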
    20. Discussion
       • The introduction of affect in rational ethics is important when robots communicate with humans in a practical context that includes moral relations and decisions.
       • Moreover, the combination of reason and affect may be useful for applications in which human moral decision-making behavior is simulated, for example when agent systems or robots provide healthcare support, or in entertainment settings.
    21. Future Work
       • Persuasive Technology
         Moral dilemmas about Helping vs Manipulating
       • Integrate the current system with Health Care Intervention models
       • Parametric design to adapt Appearance and Behavior to the User
    22. Thank you!
        Matthijs Pontier
        m.a.pontier@vu.nl
        http://camera-vu.nl/matthijs
        http://www.linkedin.com/in/matthijspontier
        http://crispplatform.nl/projects/selemca
