The Neural Bases of Cognitive Conflict and Control in Moral Judgment
Joshua D. Greene, Leigh E. Nystrom, Andrew D. Engell, John M. Darley, and Jonathan D. Cohen. Neuron, Vol. 44, 389-400, October 14, 2004
Results: “...brain regions associated with abstract reasoning and cognitive control...are recruited to resolve difficult personal moral dilemmas in which utilitarian values require ‘personal’ moral violations, violations that have previously been associated with increased activity in emotion-related brain regions.”
Stroop Task
Moral Dilemmas
Moral Dilemmas
- Impersonal Moral Dilemma
- Personal Moral Dilemma
Definitions
- moral v. nonmoral: as determined by pilot participants
- personal (v. impersonal): the action could reasonably lead to serious bodily harm to person(s), where the harm is not the result of deflecting a threat onto a different party
- cognitive: information processing (“cognitive” in contrast with affective processing)
- difficult (v. easy): longer RT and less consensus (see the sketch below)
- utilitarian: preference for the greatest good for the greatest number over the long term
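A minimal sketch of the “difficult v. easy” criterion above, assuming per-dilemma mean RT and response consensus have already been computed; the dilemma names, numbers, and the median-split rule are illustrative assumptions, not the authors' procedure.

```python
from statistics import median

# Placeholder per-dilemma summaries: mean reaction time (s) and the
# proportion of participants giving the majority answer (consensus).
# These numbers are illustrative, not data from the study.
dilemmas = {
    "crying_baby":    {"mean_rt_s": 6.1, "consensus": 0.55},
    "footbridge":     {"mean_rt_s": 3.4, "consensus": 0.88},
    "infanticide":    {"mean_rt_s": 3.1, "consensus": 0.92},
    "sophies_choice": {"mean_rt_s": 5.8, "consensus": 0.60},
}

rt_cutoff = median(d["mean_rt_s"] for d in dilemmas.values())
consensus_cutoff = median(d["consensus"] for d in dilemmas.values())

def difficulty(d):
    """Label a dilemma 'difficult' if RT is above and consensus below the median."""
    if d["mean_rt_s"] > rt_cutoff and d["consensus"] < consensus_cutoff:
        return "difficult"
    return "easy"

for name, d in dilemmas.items():
    print(f"{name}: {difficulty(d)}")
```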
The Analyses
Current Study
- 41 subjects (including 9 from the previous study)
- 60 practical dilemmas: nonmoral v. moral (personal v. impersonal)
- presented in a series of 12 blocks of 5 trials (trial timing sketched below)
- three screens per dilemma; self-paced (max. 46 s per dilemma)
- 14 s intertrial interval (fixation “+”)
- fMRI processing consistent with the 2001 study; included whole-brain analysis
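A minimal sketch of the trial structure listed above (12 blocks of 5 self-paced dilemmas with a 14 s fixation interval); the timing constants come from the slide, while the script itself is only an illustration, not the authors' stimulus-presentation code.

```python
# Trial structure from the slide: 12 blocks x 5 self-paced dilemmas
# (up to 46 s each, shown over three screens), each followed by a
# 14 s fixation ("+") intertrial interval.

N_BLOCKS = 12
TRIALS_PER_BLOCK = 5
MAX_TRIAL_S = 46   # self-paced ceiling per dilemma
ITI_S = 14         # fixation-cross intertrial interval

def build_schedule():
    """Return a flat list of (event, block, trial, max_duration_s) tuples."""
    events = []
    for block in range(N_BLOCKS):
        for trial in range(TRIALS_PER_BLOCK):
            events.append(("dilemma", block, trial, MAX_TRIAL_S))
            events.append(("fixation", block, trial, ITI_S))
    return events

schedule = build_schedule()
n_dilemmas = sum(1 for e in schedule if e[0] == "dilemma")
worst_case_min = sum(e[3] for e in schedule) / 60
print(f"{n_dilemmas} dilemmas; worst-case task length ~ {worst_case_min:.0f} min")
```

With these numbers the schedule contains 60 dilemmas and a worst-case task length of about an hour, consistent with the 46 s ceiling and 14 s intertrial interval on the slide.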
Previous Study*
Hypothesis: “...moral judgment in response to [moral] violations familiar to our primate ancestors (personal violations) are driven by social-emotional responses while moral judgment in response to distinctively human (impersonal) moral violation is (or can be) more ‘cognitive’.”
Predicted and found: (a) more activity in social-emotional brain regions for dilemmas involving personal moral violations, and (b) longer reaction times (RT) for those whose response was incongruent with the emotional response.
* “An fMRI Investigation of Emotional Engagement in Moral Judgment.” Greene et al., Science, Vol. 293, September 14, 2001
Previous Study
Personal v. Impersonal
Analysis 1 (difficult v. easy)
Hypothesis: “...increased RT in response to personal moral dilemmas results from the conflict associated with competition between a strong prepotent response and a response supported by abstract reasoning and the application of cognitive control.”
Predicted and found: for personal moral dilemmas with longer RTs, there would be increased activity in the anterior cingulate cortex (ACC, which monitors conflict), the dorsolateral prefrontal cortex (DLPFC, used in cognitive control), and the inferior parietal lobes (associated with working memory).
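A minimal sketch of the logic of this contrast, assuming per-subject ROI estimates (e.g., for the ACC) have already been extracted; the simulated values, the paired t-test, and the effect direction are illustrative assumptions, not the paper's actual fMRI analysis or results.

```python
import numpy as np
from scipy import stats

# Sketch of the difficult-vs-easy contrast: compare mean ROI activity
# (placeholder ACC parameter estimates per subject) between difficult
# (high-RT) and easy (low-RT) personal moral dilemmas.
rng = np.random.default_rng(0)
n_subjects = 41  # sample size reported on the "Current Study" slide

acc_difficult = rng.normal(0.6, 0.3, n_subjects)  # simulated, arbitrary units
acc_easy = rng.normal(0.3, 0.3, n_subjects)

t, p = stats.ttest_rel(acc_difficult, acc_easy)
print(f"ACC, difficult vs. easy: t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")
```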
Brain ROI
- Anterior Cingulate Cortex (ACC)
- Dorsolateral Prefrontal Cortex (DLPFC)
- Inferior Parietal Lobes
- Posterior Cingulate
- Anterior Insula
Brain ROI (difficult/high-RT judgments)
- ACC
- precuneus
- posterior cingulate cortex
- right and left middle frontal gyri
Table 2
Analysis 2 (utilitarian v. nonutilitarian)
Hypothesis: for difficult personal moral dilemmas, the “control processes work against the social-emotional judgments...and in favor of utilitarian judgments.”
Predicted and found: for difficult personal moral dilemmas, increased DLPFC activity when subjects judged the personal moral violation to be appropriate (the utilitarian response) rather than inappropriate.
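A minimal sketch of the grouping behind this analysis, assuming trial-level RT and anterior DLPFC estimates are available; every value and column name below is a placeholder for illustration, not data from the study or the authors' pipeline.

```python
import pandas as pd

# Within difficult personal moral dilemmas, split trials by the judgment
# given ("appropriate" = utilitarian, "inappropriate" = nonutilitarian)
# and compare mean RT and mean anterior DLPFC (BA 10/46) signal.
trials = pd.DataFrame({
    "judgment": ["appropriate", "inappropriate", "appropriate",
                 "inappropriate", "appropriate", "inappropriate"],
    "rt_s":   [6.2, 5.1, 7.0, 4.8, 6.5, 5.4],   # placeholder reaction times (s)
    "dlpfc":  [0.71, 0.32, 0.65, 0.28, 0.80, 0.35],  # placeholder ROI estimates
})

summary = trials.groupby("judgment")[["rt_s", "dlpfc"]].mean()
print(summary)
```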
Utilitarian Judgments: RT
Utilitarian Judgments: DLPFC
- anterior DLPFC (BA 10/46)
- whole-brain analysis
Utilitarian Judgments: ROI
Utilitarian Judgments: ROI (Whole-brain Analysis)
Questions
- What does this tell us about moral judgment in general (including impersonal judgments and references to benefit and fairness)? Are the dilemmas fully representative?
- What does this tell us about personal judgments in general (including moral judgments)? Only one “up close and personal” nonmoral dilemma was used. How “up close” must it be? What of a third-person POV?
- How small are the samples for the “nonutilitarian” judgments?
- Alternatives to a conflict between emotion and utilitarianism: what about nonutilitarian/rational/deontological induced conflicts (e.g., whether a duty is present or a rule can be applied)? How does this differ from moral prototypes/frames?
