Moral Coppélia - Combining Ratio with Affect in Ethical Reasoning - Slides IBERAMIA 2012

We present an integration of rational moral reasoning with emotional intelligence. The moral reasoning system alone could not simulate the different human reactions to the Trolley dilemma and the Footbridge dilemma. However, the combined system can simulate these human moral decision making processes. The introduction of affect in rational ethics is important when robots communicate with humans in a practical context that includes moral relations and decisions. Moreover, the combination of ratio and affect may be useful for applications in which human moral decision making behavior is simulated, for example, when agent systems or robots provide healthcare support.

Notes for slides:
  • Care-droids = care-agents and care-robots that assist care-deliverers and patients
  • Silicon Coppelia → emotional intelligence, theory of mind, personalization (through adaptation / learning from interaction)
  • The Moral Reasoning system alone could not simulate the difference between the Trolley dilemma and the Footbridge dilemma. The Moral Reasoning system combined with Silicon Coppélia could simulate these human moral decision making processes
  • 1: Robot tries to convince elder to exercise. 2: Entertainment → bad can also be interesting.
  • Mental integrity, physical integrity, privacy, capability to make autonomous decisions: cognitive functioning, adequate information, reflection

    1. Moral Coppelia: Combining Ratio and Affect in Ethical Reasoning
       Matthijs Pontier, Guy Widdershoven, Johan Hoorn
    2. Outline of this presentation
       • Background
       • Domain
       • Moral reasoning system
       • Silicon Coppelia
       • Moral reasoning + Silicon Coppelia = Moral Coppelia
       • Results
       • Discussion
       • Future Work
       Cartagena, 15-11-2012, IBERAMIA 2012
    3. Background
       • Machines interact more with people
       • Machines are becoming more autonomous
       • Rosalind Picard (1997): "The greater the freedom of a machine, the more it will need moral standards."
       • We should ensure that machines do not harm us or threaten our autonomy
    4. Domain: Medical Ethics
       • Within SELEMCA, we develop care-droids
       • Patients are in a vulnerable position, so the moral behavior of a robot is extremely important. We focus on Medical Ethics
       • Conflicts between:
         1. Beneficence
         2. Non-maleficence
         3. Autonomy
         4. Justice
    5. Previous Work: Moral reasoning system
       We developed a rational moral reasoning system that is capable of balancing conflicting moral goals.
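    The slides do not spell out the balancing rule. A minimal sketch of such balancing, assuming the morality of an action is a weighted sum of how strongly the action promotes or violates each of the four moral goals — the weights and per-action scores below are illustrative, not values from the paper:

    ```python
    # Illustrative sketch: morality of an action as a weighted sum over the
    # four principles of medical ethics. Weights and scores are hypothetical.
    MORAL_WEIGHTS = {
        "beneficence": 1.0,
        "non_maleficence": 1.0,
        "autonomy": 0.8,
        "justice": 0.6,
    }

    def morality(action_scores):
        """action_scores maps each principle to how much the action
        promotes (+) or violates (-) it, in [-1, 1]."""
        return sum(MORAL_WEIGHTS[p] * s for p, s in action_scores.items())

    # Example: persuading a patient to exercise promotes well-being but
    # mildly infringes autonomy (scores are made up for illustration).
    persuade = {"beneficence": 0.8, "non_maleficence": 0.0,
                "autonomy": -0.3, "justice": 0.0}
    do_nothing = {"beneficence": -0.2, "non_maleficence": 0.0,
                  "autonomy": 0.2, "justice": 0.0}

    # The system selects the action with the higher weighted moral score.
    best = max([("persuade", persuade), ("do nothing", do_nothing)],
               key=lambda pair: morality(pair[1]))
    print(best[0])
    ```

    With these example numbers, persuading scores higher because the weighted gain in beneficence outweighs the small loss of autonomy.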
    6. Limitations of moral reasoning
       • Moral reasoning alone results in very cold decision-making, only in terms of rights and duties
       • Wallach, Franklin & Allen (2010): "even agents who adhere to a deontological ethic or are utilitarians may require emotional intelligence as well as other 'supra-rational' faculties, such as a sense of self and a theory of mind"
       • Tronto (1993): "Care is only thought of as good care when it is personalized"
    7. Problem: Not Able to Simulate Trolley Dilemma vs Footbridge Dilemma
       • Greene et al. (2001) find that moral dilemmas vary systematically in the extent to which they engage emotional processing, and that these variations in emotional engagement influence moral judgment.
       • Their study was inspired by the difference between two variants of an ethical dilemma:
         • Trolley dilemma (little emotional processing)
         • Footbridge dilemma (much emotional processing)
    8. Solution: Add Emotional Processing
       • Previously, we developed Silicon Coppelia, a model of emotional intelligence.
       • This can also be projected onto others, for a Theory of Mind
       • Learns from experience → personalization
       Connecting Moral Reasoning to Silicon Coppelia gives:
       • More human-like moral reasoning
       • Personalized moral decisions and communication about moral reasoning
    9. Silicon Coppelia
    10. Affective Decision Making Module: Moral and Affective Decision
        ExpectedSatisfaction(Agent1, Action, Agent2) =
            w_eu * ExpectedUtility
          + w_mor * Morality(action)
          + w_pos * (1 - abs(positivity - biasInvolvement * Involvement))
          + w_neg * (1 - abs(negativity - biasDistance * Distance))
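    The slide's combination rule can be sketched directly in code. Only the weighted-sum structure comes from the slide; all weights and input values below are illustrative assumptions:

    ```python
    # Sketch of the slide's ExpectedSatisfaction rule. The four weighted
    # terms follow the slide; the numeric inputs are hypothetical.
    def expected_satisfaction(expected_utility, morality,
                              positivity, negativity,
                              involvement, distance,
                              w_eu=0.25, w_mor=0.25, w_pos=0.25, w_neg=0.25,
                              bias_involvement=1.0, bias_distance=1.0):
        """ExpectedSatisfaction(Agent1, Action, Agent2) as on the slide."""
        return (w_eu * expected_utility
                + w_mor * morality
                + w_pos * (1 - abs(positivity - bias_involvement * involvement))
                + w_neg * (1 - abs(negativity - bias_distance * distance)))

    # The agent picks the action with the highest expected satisfaction
    # (input values here are invented for illustration).
    actions = {
        "kill 1 to save 5": expected_satisfaction(
            expected_utility=0.9, morality=0.4,
            positivity=0.2, negativity=0.8,
            involvement=0.3, distance=0.7),
        "do nothing": expected_satisfaction(
            expected_utility=0.1, morality=0.2,
            positivity=0.4, negativity=0.3,
            involvement=0.4, distance=0.4),
    }
    print(max(actions, key=actions.get))
    ```

    Because the affective terms enter the sum alongside utility and morality, strong negative feelings toward an action can outweigh its utilitarian appeal — the mechanism behind the different Trolley and Footbridge outcomes.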
    11. Results
                             Kill 1 to Save 5   Do Nothing
        Moral system
          Trolley                   X
          Footbridge                X
        Moral Coppelia
          Trolley                   X
          Footbridge                                X
    12. Discussion
        • The introduction of affect in rational ethics is important when robots communicate with humans in a practical context that includes moral relations and decisions.
        • Moreover, the combination of ratio and affect may be useful for applications in which human moral decision-making behavior is simulated, for example, when agent systems or robots provide healthcare support, or in entertainment settings
    13. Future Work
        • More detailed model of Autonomy
        • In applications, choose actions that:
          • Improve the patient's autonomy
          • Improve the patient's well-being
          • Do not harm the patient
          • Distribute resources equally among patients
        • Persuasive Technology. Moral dilemmas about:
          • Helping vs Manipulating
    14. Applications
    15. Thank you!
        • Matthijs Pontier
        • m.a.pontier@vu.nl
        • http://camera-vu.nl/matthijs
        • http://www.linkedin.com/in/matthijspontier
        • http://crispplatform.nl/projects/selemca