A Computational Model of Affective Moral Decision Making
that predicts Human Criminal Choices

Matthijs Pontier, matthijspon@gmail.com
Jean-Louis Van Gelder
Reinout E. de Vries

PRIMA 2013, Dunedin, 04-12-2013
Overview of this presentation
• SELEMCA
• Moral Reasoning
• Silicon Coppelia: Model of Emotional Intelligence
• Moral Reasoning + Silicon Coppelia = Moral Coppelia
• Predicting Crime with Moral Coppelia
• Conclusions
• Future Work
SELEMCA
• Develop ‘Caredroids’: Robots or Computer Agents that assist Patients and Care-deliverers
• Focus on patients who stay in long-term care facilities
Possible functionalities
• Care-broker: Find care that matches the patient's needs
• Companion: Become friends with the patient to prevent loneliness and activate the patient
• Coach: Assist the patient in making healthy choices: exercising, eating healthy, taking medicine, etc.
Applications: Care Agents
(image slide: examples of care agents)

Applications: Care Robots
(image slide: examples of care robots)
Background Machine Ethics
• Machines are becoming more autonomous
  → Rosalind Picard (1997): "The greater the freedom of a machine, the more it will need moral standards."
• Machines interact more with people
  → We must ensure that machines do not harm us or threaten our autonomy
• Machine ethics is important to establish perceived trust in users
Moral reasoning system
We developed a rational moral reasoning system that is capable of balancing conflicting moral goals.
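As an illustration of what "balancing conflicting moral goals" can look like, the minimal Python sketch below scores an action as a weighted sum over moral goals. The goal names, weights and scores are assumptions for the example, not the formulation from the paper.

```python
# Minimal sketch of "balancing conflicting moral goals" as a weighted sum.
# Goal names, weights and scores are illustrative, not taken from the paper.

MORAL_GOAL_WEIGHTS = {
    "non-maleficence": 1.0,
    "beneficence": 0.8,
    "autonomy": 0.7,
    "justice": 0.6,
}

def morality(action_contributions):
    """Morality(action): weighted sum of the action's contribution (in [-1, 1])
    to each moral goal, so that conflicting goals are traded off explicitly."""
    return sum(MORAL_GOAL_WEIGHTS[goal] * contribution
               for goal, contribution in action_contributions.items())

# Example: an action that protects the patient but overrides their wishes.
print(morality({"non-maleficence": 0.9, "beneficence": 0.5,
                "autonomy": -0.6, "justice": 0.0}))
```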
Limitations of rational moral reasoning
• Moral reasoning alone results in very cold decision-making, purely in terms of rights and duties
• Wallach, Franklin & Allen (2010): "Ethical agents require Emotional Intelligence as well as other 'supra-rational' faculties, such as a sense of self and a Theory of Mind"
• Tronto (1993): "Care is only thought of as good care when it is personalized"
Solution: Add Emotional Processing
• Previously, we developed Silicon Coppelia, a model of Emotional Intelligence
• It can be projected onto others for a Theory of Mind
• It learns from experience → Personalization
→ Connect Moral Reasoning to Silicon Coppelia:
  • More human-like moral reasoning
  • Personalized moral decisions and communication about moral reasoning
Silicon Coppelia
(image slide: overview of the Silicon Coppelia model)

Speed-dating Experiment
(image slide)
Improving Emotion Regulation in Moral Coppelia
• ExpectedEmotion(action, emotion) = (1 - β) * AEB(action, emotion) + β * current_emotion
• EESA(action) = 1 - Σi (Desired(emotion(i)) - ExpectedEmotion(action, emotion(i)))
• ExpectedSatisfaction(action) = wmor * Morality(action) + wrat * ExpectedUtility(action) + wemo * EESA(action)
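Read literally, the three formulas compose into a small amount of code. The sketch below is illustrative only: the parameter values (β, wmor, wrat, wemo) are assumptions, and the EESA sum is implemented exactly as written on the slide.

```python
# Illustrative translation of the three formulas above into code.
# The parameter values are assumptions, not those fitted in the study.

BETA = 0.3                              # emotion-regulation parameter beta (assumed value)
W_MOR, W_RAT, W_EMO = 0.4, 0.3, 0.3     # wmor, wrat, wemo (assumed values)

def expected_emotion(aeb, current_emotion, beta=BETA):
    """ExpectedEmotion(action, emotion) = (1 - beta) * AEB(action, emotion) + beta * current_emotion"""
    return (1 - beta) * aeb + beta * current_emotion

def eesa(desired, expected):
    """EESA(action) = 1 - sum_i (Desired(emotion_i) - ExpectedEmotion(action, emotion_i))"""
    return 1 - sum(desired[e] - expected[e] for e in desired)

def expected_satisfaction(morality, expected_utility, eesa_value):
    """ExpectedSatisfaction(action) = wmor*Morality + wrat*ExpectedUtility + wemo*EESA"""
    return W_MOR * morality + W_RAT * expected_utility + W_EMO * eesa_value
```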
Moral Coppelia
Decisions are based on:
1. Rational influences
   • Does the action help me to reach my goals?
2. Affective influences
   • Does the action lead to desired emotions?
   • Does the action reflect the Involvement I feel towards the user?
   • Does the action reflect the Distance I feel towards the user?
3. Moral reasoning
   • Is this action morally good?
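Combining these three influences, a decision can be sketched as choosing the candidate action with the highest ExpectedSatisfaction. The candidate actions, component scores and weights below are made up for illustration.

```python
# Illustrative only: candidate actions, component scores and weights are made up.

W_MOR, W_RAT, W_EMO = 0.4, 0.3, 0.3

candidates = {
    "remind_patient_gently": {"morality": 0.8, "utility": 0.6, "eesa": 0.7},
    "insist_on_medication":  {"morality": 0.5, "utility": 0.9, "eesa": 0.3},
}

def satisfaction(scores):
    # wmor*Morality + wrat*ExpectedUtility + wemo*EESA, as on the previous slide
    return (W_MOR * scores["morality"]
            + W_RAT * scores["utility"]
            + W_EMO * scores["eesa"])

best_action = max(candidates, key=lambda a: satisfaction(candidates[a]))
print(best_action)  # "remind_patient_gently" with these made-up numbers
```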
Background Criminology
• There is substantial evidence that emotions are fundamental in criminal decision making
• But emotions rarely figure in criminal choice models
  → Study the relation between Ratio, Emotions and Morality
  → Apply Moral Coppelia to criminology data
  → Predict the criminal decisions of participants
Matching data to model
Match:
• Honesty/Humility to Weight_morality (wmor)
• Perceived Risk to Expected Utility
• Negative State Affect to EESA

Parameter tuning:
1. Find optimal fits for the initial sample
2. Predict decisions for the holdout sample
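One way to read the two-step parameter tuning is as a search for the weight parameters on the initial sample, followed by an R² check on the holdout sample. The sketch below is an assumed implementation with a hypothetical predict() function and data layout, not the code used in the study.

```python
import itertools
import numpy as np

# Assumed implementation of the two-step tuning, not the code used in the study:
# step 1 searches for weights on the initial sample, step 2 scores the holdout.

def predict(weights, features):
    """Hypothetical model: weighted sum of (rational, emotional, moral) features."""
    w_rat, w_emo, w_mor = weights
    return w_rat * features[:, 0] + w_emo * features[:, 1] + w_mor * features[:, 2]

def r_squared(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def tune(initial_X, initial_y, grid=np.linspace(0.0, 1.0, 11)):
    """Step 1: grid search for the weight triple that best fits the initial sample."""
    return max(itertools.product(grid, repeat=3),
               key=lambda w: r_squared(initial_y, predict(np.array(w), initial_X)))

# Step 2 would then be:
#   weights = tune(initial_X, initial_y)
#   r2_holdout = r_squared(holdout_y, predict(np.array(weights), holdout_X))
# where the X arrays hold the rational, emotional and moral predictors per participant.
```

Under this reading, the R² on the holdout sample is what the table on the next slide reports.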
Predicting Criminal Choices
(R = rational, E = emotional, M = moral model components; R² reported for the initial and holdout samples)

                Ratio     Emo       R+E       Moral     M+R       M+E       M+R+E
morcc           0         0         0         0.68      0.42      0.453     0.435
wmor            0         0         0         1.00      0.96      0.97      0.87
partrat         1         0         0.34      0         1         0         0.64
partemo         0         1         0.66      0         0         1         0.36
R² (initial)    0.7553    0.8792    0.9222    0.9336    0.9871    0.9798    0.9881
R² (holdout)    0.7192    0.9060    0.9323    0.9281    0.9803    0.9778    0.9821
Conclusions
• Validation of Moral Coppelia: Moral Coppelia can be used to predict human criminal choices
• Adds to criminological theory
• Useful in applications:
  • Serious games: virtual crook
  • Entertainment
  • Theory of Mind
  • Predicting crime?
Future Work
• Publish book: "Machine Medical Ethics"
• More detailed model of Autonomy
• Develop applications that require Affective Moral Behavior, such as:
  • Persuasive Technology: moral dilemmas about Helping vs. Manipulating
  • Integrating the current system with Health Care Intervention models
Thank you!

Matthijs Pontier
matthijspon@gmail.com
http://camera-vu.nl/matthijs
http://www.linkedin.com/in/matthijspontier
http://crispplatform.nl/projects/selemca
Speaker notes
  • Science + Health Care + Creative Industry
    Triangle: Patient / Care-deliverer / Robot
    Robot: repetitive tasks, so that the Care-deliverer has time for medical + social tasks
  • The functionalities can all be in the same robot
    The same functionality can be in different kinds of robots (physical robot, agent, app)
  • Results: the system is able to match the decisions of medical-ethical experts
    The behavior of the system matches expert medical ethics
  • Wallach, Franklin & Allen: Ratio / Logic alone is not enough for ethical behavior towards humans
    Silicon Coppelia  emotional intelligence, theory of mind, personalization (through adaptation / learning from interaction)
  • Model from media perception, used to let the medium perceive the user
    Walk through the model; also explain emotion regulation
    Moral and Affective Decision
  • Robot (my models) vs. Human
    Multiple choice & emotions
    What did this little guy actually think of you?
    Participants see no difference
    You could speak of a passed Turing Test
  • Mental integrity, Physical integrity, Privacy, Capability to make autonomous decisions: Cognitive Functioning, Adequate Information, Reflection