Moral Coppélia: Combining Ratio with Affect in Ethical Reasoning
Matthijs Pontier
matthijspon@gmail.com
Good AIfternoon, Radboud University Nijmegen, 18-11-2013
We present a moral reasoner, Moral Coppélia, which combines connectionism, utilitarianism, and ethical theory about the moral duties autonomy, non-maleficence, beneficence, and justice with affective states and personality traits. We, moreover, treat human autonomy in the sense of self-determination as well as making a meaningful choice. Our system combines bottom-up with top-down approaches, calculating the effect of an act on the total moral utility in the world. Moral Coppélia can reproduce the verdicts of medical ethicists and health judges in real-life cases and can handle the emotional differences between logically identical problems such as the Trolley and Footbridge dilemma. It also deals with properties of character and personality such as honesty and humility to explain why logic reasoning is not always descriptive of actual human moral behavior. Apart from simulating known cases, we performed a split-half experiment with the responses of 153 participants in a criminal justice experiment. While fine-tuning the parameters to the first half of the data, the encompassing version of Moral Coppélia was capable of forecasting criminal decisions, leading to a better fit with the second half of the data than either of the loose component parts did. In other words, we found empirical support for the integral contribution of ratio, affect, and personality to moral decision making, which, additionally, could be acceptably simulated by our extended version of the Moral Coppélia system.

We present an integration of rational moral reasoning with emotional
intelligence. The rational moral reasoning system can match the
behavior of medical ethical experts. The moral reasoning system alone
could not simulate the different human reactions to the Trolley
dilemma and the Footbridge dilemma. However, the combined system can
simulate these human moral decision making processes. Moreover, the
system can explain and predict human criminal choices. The
introduction of affect in rational ethics is important when robots
communicate with humans in a practical context that includes moral
relations and decisions. Moreover, the combination of ratio and affect
may be useful for applications in which human moral decision making
behavior is simulated, for example, when agent systems or robots
provide healthcare support.

Moral Coppélia - Combining Ratio with Affect in Ethical Reasoning - Good AIfternoon, Radboud University Nijmegen

  1. Moral Coppélia: Combining Ratio with Affect in Ethical Reasoning
     Matthijs Pontier, matthijspon@gmail.com
  2. Overview of this presentation
     • SELEMCA
     • Moral Reasoning
     • Silicon Coppelia: Model of Emotional Intelligence
     • Moral Reasoning + Silicon Coppelia = Moral Coppelia
     • Predicting Crime with Moral Coppelia
     • Other Applications
     • Discussion
     • Future Work
     Nijmegen, 18-11-2013 - Good AIfternoon
  3. SELEMCA
     • Develop ‘Caredroids’: Robots or Computer Agents that assist Patients and Care-deliverers
     • Focus on patients who stay in long-term care facilities
  4. Possible functionalities
     • Care-broker: Find care that matches the patient's needs
     • Companion: Become friends with the patient to prevent loneliness and activate the patient
     • Coach: Assist the patient in making healthy choices: exercising, eating healthily, taking medicine, etc.
  5. Applications: Care Agents
  6. Applications: Care Robots
  7. How people perceive caredroids
     People perceive caredroids in terms of:
     • Affordances
     • Ethics
     • Aesthetics
     • Realism
  8. Aesthetics / Realism of Caredroids
     • Uncanny valley: too human-like makes it eerie
     • We associate almost-human with death: zombies, etc.
     → Design robots whose realism in appearance matches the realism of their behavior
  9. Affordances of Caredroids
     • Make sure the caredroid is a useful tool for the patients and the care deliverers
     → Make sure the caredroid is an expert in its task
     → Make sure the caredroid personalizes its behavior to the user
  11. Ethics of Caredroids
      • Make the robot behave ethically well, so patients perceive the robot as ethically good
      • Patients are in a vulnerable position → the robot's moral behavior is extremely important
      • We focus on Medical Ethics; conflicts between:
        1. Autonomy
        2. Beneficence
        3. Non-maleficence
        4. Justice
  12. Background: Machine Ethics
      • Machines are becoming more autonomous
      → Rosalind Picard (1997): ‘‘The greater the freedom of a machine, the more it will need moral standards.’’
      • Machines interact more with people
      → We should ensure that machines do not harm us or threaten our autonomy
      • Machine ethics is important to establish perceived trust in users
  13. Domain: Medical Ethics
      • Within SELEMCA, we develop caredroids
      • Patients are in a vulnerable position; the robot's moral behavior is extremely important
      → We focus on Medical Ethics; conflicts between:
        1. Beneficence
        2. Non-maleficence
        3. Autonomy
        4. Justice
  14. Moral reasoning system
      We developed a rational moral reasoning system that is capable of balancing conflicting moral goals.
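Balancing of this kind can be sketched as a weighted sum over the four medical-ethics duties listed on the previous slides. This is a minimal illustration under stated assumptions: the equal duty weights, the [-1, 1] satisfaction values, and all function and action names are hypothetical, not the system's actual parameters.

```python
# Duties from the deck's medical-ethics slides; equal weights are an assumption.
DUTY_WEIGHTS = {"autonomy": 0.25, "beneficence": 0.25,
                "non_maleficence": 0.25, "justice": 0.25}

def moral_score(action_duties, weights=DUTY_WEIGHTS):
    """Score an action by how well it satisfies each moral duty
    (values assumed in [-1, 1]), weighted by the duty's importance."""
    return sum(weights[d] * v for d, v in action_duties.items())

def most_moral(actions):
    """Pick the action whose duty profile yields the highest moral score."""
    return max(actions, key=lambda a: moral_score(a["duties"]))
```

For example, a hypothetical "respect refusal" action that strongly satisfies autonomy can outweigh a "force treatment" action that serves beneficence but violates autonomy, so the balance favors respecting the refusal.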
  15. Limitations of rational moral reasoning
      • Moral reasoning alone results in very cold decision-making, only in terms of rights and duties
      • Wallach, Franklin & Allen (2010): “Ethical agents require emotional intelligence as well as other ‘suprarational’ faculties, such as a sense of self and a ‘Theory of Mind’”
      • Tronto (1993): “Care is only thought of as good care when it is personalized”
  16. Problem: Not Able to Simulate the Trolley Dilemma vs the Footbridge Dilemma
      • Greene et al. (2001) find that moral dilemmas vary systematically in the extent to which they engage emotional processing, and that these variations in emotional engagement influence moral judgment
      • Their study was inspired by the difference between two variants of an ethical dilemma:
        - Trolley dilemma (moral-impersonal)
        - Footbridge dilemma (moral-personal)
  17. Solution: Add Emotional Processing
      • Previously, we developed Silicon Coppelia, a model of emotional intelligence
      • It can be projected onto others for a Theory of Mind
      • It learns from experience → personalization
      → Connect Moral Reasoning to Silicon Coppelia:
      • More human-like moral reasoning
      • Personalized moral decisions and communication about moral reasoning
  18. Silicon Coppelia
  19. Silicon Coppelia
      • We developed Silicon Coppelia with the goal of creating emotionally human-like robots
      • Simulation experiments: the system behaves consistently with theory and intuition
      • Compare the model's performance with that of a real human in a speed-dating experiment
  20. Turing Test
      • The Turing Test was originally text-based
      • We enriched the test with affect-laden communication: facial expressions showing emotions, capable of vocal speech
      • Afterwards, a questionnaire: How do you think Tom perceived you?
      → Measure made continuous and more elaborate than a simple yes/no
      • Analysis: Bayesian structural equation modeling
  21. Speed-dating Experiment
  22. Results
      • Participants did not detect differences on single variables
      • Participants did not recognize significant differences in cognitive-affective structure
      • A model in which the conditions (1: human, 2: robot) were assumed equal explained the data better than a model in which the conditions were assumed different
  23. Conclusions Speed-Date
      • We created a simulation of affect so natural that young women could not discern dating a robot from dating a man
      • Important for:
        - Understanding human affective communication
        - Developing communication technologies
        - Developing emotionally human-like robots
  24. Silicon Coppelia + Moral Reasoning
      Decisions based on:
      1. Rational influences
         • Does the action help me to reach my goals?
      2. Affective influences
         • Does the action lead to desired emotions?
         • Does the action reflect the Involvement I feel towards the user?
         • Does the action reflect the Distance I feel towards the user?
      3. Moral reasoning
         • Is this action morally good?
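The three kinds of influence above can be sketched as a single weighted action-selection rule. The weights, dictionary keys, and example actions below are illustrative assumptions; the deck does not specify how Silicon Coppelia actually combines the three appraisals.

```python
def choose_action(actions, w_rat=0.4, w_aff=0.3, w_mor=0.3):
    """Pick the action with the highest weighted blend of rational,
    affective and moral appraisals (each assumed to lie in [0, 1])."""
    def score(a):
        return (w_rat * a["goal_achievement"]  # rational: does it reach my goals?
                + w_aff * a["affective_fit"]   # affective: desired emotions, involvement/distance
                + w_mor * a["morality"])       # moral: is the action good?
    return max(actions, key=score)
```

With these hypothetical numbers, an action that scores well affectively and morally can beat one that is slightly more goal-effective but emotionally and morally poor.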
  25. Results: Trolley & Footbridge

                                      Kill 1 to Save 5    Do Nothing
      Moral system - Trolley                 X
      Moral system - Footbridge              X
      Moral Coppelia - Trolley               X
      Moral Coppelia - Footbridge                             X
  26. Background: Criminology Study
      • Substantial evidence that emotions are fundamental in criminal decision making
      • But emotions rarely appear in criminal choice models
      → Study the relation Ratio + Emotions + Morals
      → Apply Moral Coppelia to criminology data
      → Predict participants' criminal decisions
  27. Adding Expected Emotional State Affect to Moral Coppelia
      • ExpectedEmotion(action, emotion) = (1 - b) * AEB(action, emotion) + b * current_emotion
      • EESA(action) = 1 - Σ_i (Desired(emotion_i) - ExpectedEmotion(action, emotion_i))
      • ExpectedSatisfaction(action) = w_mor * Morality(action) + w_rat * ExpectedUtility(action) + w_emo * EESA(action)
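These update rules can be sketched in Python. This is a minimal sketch under stated assumptions: the deviation term in EESA is taken here as a weighted mean absolute difference (the slide does not show the exact normalization), and the anticipated emotion AEB(action, emotion) is supplied as a plain number.

```python
def expected_emotion(aeb, current_emotion, b):
    """Blend the action's anticipated emotion (AEB) with the current
    emotional state; b controls how strongly the current state persists."""
    return (1 - b) * aeb + b * current_emotion

def eesa(desired, expected, weights=None):
    """Expected Emotional State Affect: 1 minus the weighted mean
    deviation between desired and expected emotion intensities.
    The mean-absolute normalization is an assumption."""
    n = len(desired)
    if weights is None:
        weights = [1.0 / n] * n
    deviation = sum(w * abs(d - e) for w, d, e in zip(weights, desired, expected))
    return 1 - deviation

def expected_satisfaction(morality, utility, eesa_val, w_mor, w_rat, w_emo):
    """Weighted blend of the moral, rational and affective appraisals."""
    return w_mor * morality + w_rat * utility + w_emo * eesa_val
```

Here the closer the action's expected emotions come to the desired ones, the closer EESA gets to 1, raising the action's overall expected satisfaction.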
  28. Matching data to model
      Match:
      • Honesty/Humility → Weight_morality
      • Perceived Risk → Expected Utility
      • Negative State Affect → EESA
      Parameter tuning:
      1. Find optimal fits for the initial sample
      2. Predict decisions for the holdout sample
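The two-step tuning procedure (fit the initial sample, then predict the holdout) can be sketched as a grid search over weights that sum to one. The `predict` callback, the grid step, and the use of R² as the goodness-of-fit measure are assumptions for illustration, not the study's actual fitting method.

```python
import itertools

def r_squared(y_true, y_pred):
    """Coefficient of determination between observed and predicted choices."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def tune_weights(predict, initial_x, initial_y, step=0.1):
    """Grid-search weights (w_mor, w_rat, w_emo), constrained to sum to 1,
    that maximize R^2 on the initial sample. Assumes predict(w, x) returns
    the model's choice score for one participant's data x."""
    grid = [i * step for i in range(int(1 / step) + 1)]
    best, best_fit = None, float("-inf")
    for w_mor, w_rat in itertools.product(grid, grid):
        w_emo = 1 - w_mor - w_rat
        if w_emo < 0:
            continue  # weights must stay a valid convex combination
        w = (w_mor, w_rat, w_emo)
        preds = [predict(w, x) for x in initial_x]
        fit = r_squared(initial_y, preds)
        if fit > best_fit:
            best, best_fit = w, fit
    return best, best_fit
```

The returned weights would then be frozen and scored once against the holdout sample, mirroring the split-half design of the study.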
  29. Predicting Criminal Choices

                    Ratio    Emo     R+E     Moral   M+R     M+E     M+R+E
      morcc         0        0       0       0.68    0.42    0.453   0.435
      wmor          0        0       0       1.00    0.96    0.97    0.87
      partrat       1        0       0.34    0       1       0       0.64
      partemo       0        1       0.66    0       0       1       0.36
      R2 (initial)  0.7553   0.8792  0.9222  0.9336  0.9871  0.9798  0.9881
      R2 (holdout)  0.7192   0.9060  0.9323  0.9281  0.9803  0.9778  0.9821
  30. Conclusions Criminology Study
      • Validation of Moral Coppélia
      • Adds to criminological theory
      • Useful in applications
  31. Applications: Virtual Storytelling, (Serious) Games
  32. Virtual Coach
  33. Caredroids
  34. Companion Robots
  36. House Robots
  37. House Robots
  38. Conclusions
      • We created an affective moral reasoning system
      • The system matches the decisions of medical ethical experts
      • The system can simulate the trolley and footbridge dilemmas
      • The system can predict human criminal choices
  39. Discussion
      • The introduction of affect in rational ethics is important when robots communicate with humans
      • The combination Ratio + Affect + Morals is useful for applications that simulate human decision making, for example when agent systems or robots provide healthcare support, or in entertainment settings
  40. Future Work
      • More detailed model of Autonomy
      • In applications, choose actions that:
        - Improve the patient's autonomy
        - Improve the patient's well-being
        - Do not harm the patient
        - Distribute resources equally among patients
      • Persuasive Technology: moral dilemmas about Helping vs Manipulating
      • Integrate the current system with Health Care Intervention models
  41. Thank you!
      Matthijs Pontier
      m.a.pontier@vu.nl
      http://camera-vu.nl/matthijs
      http://www.linkedin.com/in/matthijspontier
      http://crispplatform.nl/projects/selemca
