Greene Presentation


Presentation to the neuroscience seminar

Published in: Technology, News & Politics


  1. The Neural Bases of Cognitive Conflict and Control in Moral Judgment <ul><li>Joshua D. Greene, Leigh E. Nystrom, Andrew D. Engell, John M. Darley and Jonathan D. Cohen </li></ul><ul><li>Neuron, Vol. 44, 389-400, October 14, 2004 </li></ul>
  2. Results <ul><li>“...brain regions associated with abstract reasoning and cognitive control...are recruited to resolve difficult personal moral dilemmas in which utilitarian values require ‘personal’ moral violations, violations that have previously been associated with increased activity in emotion-related brain regions.” </li></ul>
  3. Stroop Task
  4. Moral Dilemmas
  5. Moral Dilemmas: Impersonal Moral Dilemma v. Personal Moral Dilemma
  6. Definitions <ul><li>moral v. nonmoral </li></ul><ul><ul><li>as determined by pilot participants </li></ul></ul><ul><li>personal (v. impersonal) </li></ul><ul><ul><li>action could reasonably lead to serious bodily harm to person(s) where harm is not the result of a deflected threat onto a different party </li></ul></ul><ul><li>cognitive </li></ul><ul><ul><li>information processing (“cognitive” in contrast with affective processing) </li></ul></ul><ul><li>difficult (v. easy) </li></ul><ul><ul><li>longer RT and less consensus </li></ul></ul><ul><li>utilitarian </li></ul><ul><ul><li>preference for the greatest good for the greatest number over the long term </li></ul></ul>
  7. The Analyses
  8. Current Study <ul><li>41 subjects (inc. 9 previous subjects) </li></ul><ul><li>60 practical dilemmas </li></ul><ul><ul><li>nonmoral v. moral (personal v. impersonal) </li></ul></ul><ul><li>presented in a series of 12 blocks of 5 trials </li></ul><ul><ul><li>three screens; self-paced (max. 46 s per dilemma) </li></ul></ul><ul><ul><li>14 s intertrial interval (“+”) </li></ul></ul><ul><li>fMRI processing </li></ul><ul><ul><li>consistent with 2001 study </li></ul></ul><ul><ul><li>included whole-brain analysis </li></ul></ul>
  9. Previous Study* <ul><li>Hypothesis: “...moral judgment in response to [moral] violations familiar to our primate ancestors (personal violations) are driven by social-emotional responses while moral judgment in response to distinctively human (impersonal) moral violation is (or can be) more ‘cognitive’.” </li></ul><ul><li>Predicted and found: (a) more activity in social-emotional brain regions for dilemmas involving personal moral violations, and (b) longer reaction times (RT) for subjects whose response was incongruent with the emotional response. </li></ul>* “An fMRI Investigation of Emotional Engagement in Moral Judgment.” Greene, et al. Science, Vol. 293, September 14, 2001
  10. Previous Study
  11. Personal v. Impersonal
  12. Analysis 1 (difficult v. easy) <ul><li>Hypothesis: “...increased RT in response to personal moral dilemmas results from the conflict associated with competition between a strong prepotent response and a response supported by abstract reasoning and the application of cognitive control.” </li></ul><ul><li>Predicted and found: for personal moral dilemmas with longer RTs there would be increased activity in the anterior cingulate cortex (ACC, which monitors conflict), the dorsolateral prefrontal cortex (DLPFC, used in cognitive control), and the inferior parietal lobes (associated with working memory). </li></ul>
  13. Brain ROI <ul><li>Anterior Cingulate Cortex (ACC) </li></ul><ul><li>Dorsolateral Prefrontal Cortex (DLPFC) </li></ul><ul><li>Inferior Parietal Lobes </li></ul><ul><li>Posterior Cingulate </li></ul><ul><li>Anterior Insula </li></ul>
  14. Brain ROI (difficult/high-RT judgments): ACC, precuneus, posterior cingulate cortex, R&L middle frontal gyrus
  15. Table 2
  16. Analysis 2 (utilitarian v. nonutilitarian) <ul><li>Hypothesis: for difficult personal moral dilemmas the “control processes work against the social-emotional judgments...and in favor of utilitarian judgments.” </li></ul><ul><li>Predicted and found: for difficult personal moral dilemmas there should be increased activity in the DLPFC regions in subjects who judged personal moral violations to be appropriate (as opposed to judging them inappropriate). </li></ul>
  17. Utilitarian Judgments: RT
  18. Utilitarian Judgments: DLPFC: anterior DLPFC (BA 10/46), whole-brain analysis
  19. Utilitarian Judgments: ROI
  20. Utilitarian Judgments: ROI, Whole-brain Analysis
  21. Questions <ul><li>What does this tell us about moral judgment in general (including impersonal judgments, reference to benefit and fairness)? Are the dilemmas fully representative? </li></ul><ul><li>What does this tell us about personal judgments in general (including moral judgments)? Only one “up close and personal” nonmoral dilemma. How “up close” must it be? What of a third-person POV? </li></ul><ul><li>How small are the samples for the “nonutilitarian” judgments? </li></ul><ul><li>Alternatives to conflict between emotion and utilitarianism </li></ul><ul><ul><li>What about nonutilitarian/rational/deontological induced conflicts (e.g., whether a duty is present or a rule can be applied)? </li></ul></ul><ul><ul><li>Difference from moral prototypes/frames </li></ul></ul>