Overview of Behavioral Economics

Published in: Technology, Economy & Finance
  • Social distance is measured by the degree to which people feel connected to one another through information about each other or through common social cues.
  • Trusting behavior and positive reciprocity
  • Conciliatory behavior and negative reciprocity
    Modal offer: (5,5).
    Offers of less than 20% of the pie have about a 50% chance of being rejected.
    It is striking that people actively turn down offers that would leave them better off than nothing.
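Rejections of positive offers are what inequity-aversion models are built to capture. A minimal sketch of a Fehr-Schmidt-style utility; the alpha and beta values below are illustrative assumptions, not taken from the slides:

```python
def fehr_schmidt_utility(own, other, alpha=0.9, beta=0.3):
    """Fehr-Schmidt (1999) inequity-averse utility for two players:
    own payoff, minus a penalty for disadvantageous inequality (alpha),
    minus a smaller penalty for advantageous inequality (beta)."""
    envy = alpha * max(other - own, 0)
    guilt = beta * max(own - other, 0)
    return own - envy - guilt

# Responder facing a (9, 1) split of $10:
accept = fehr_schmidt_utility(1, 9)   # 1 - 0.9*8 = -6.2
reject = fehr_schmidt_utility(0, 0)   # 0
print(accept < reject)  # True: rejecting the unfair offer is rational
```

With a sufficiently large envy parameter, rejecting a lopsided offer yields higher utility than accepting it, which is consistent with the observed rejection rates.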
  • Excessive preference for membership contracts in health clubs: people overestimate their future attendance (U.S. data; DellaVigna et al., 2006).
    Homework and deadlines: although standard theory says the optimal deadline is the last day of the semester, performance improved under early deadlines. Deadlines are a good commitment device (Ariely et al., 2002).
    Default effects in 401(k)s: people tend to stick with the default option. Changing the default from non-participation to participation with a 3% contribution to a money market fund raised participation to 86% in the treatment group versus 49% in the control group (Madrian et al., 2001). Choi et al. (2004) generalize these findings across companies and industries. In Sweden, 91.5% of people chose the default option out of 436 available options (Cronqvist et al., 2004).
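These commitment-device and default findings are standardly modeled with quasi-hyperbolic (beta-delta) discounting. A toy calculation, with illustrative parameter values (not from the slides), showing the preference reversal that makes early deadlines valuable:

```python
def beta_delta_value(reward, delay, beta=0.7, delta=0.99):
    """Quasi-hyperbolic discounted value: an immediate reward is
    undiscounted; every future reward gets the usual exponential
    discount delta**delay plus an extra present-bias factor beta."""
    if delay == 0:
        return reward
    return beta * (delta ** delay) * reward

# Viewed from today, $110 in 31 days beats $100 in 30 days...
print(beta_delta_value(110, 31) > beta_delta_value(100, 30))  # True

# ...but 30 days later, $100 now beats $110 tomorrow: a preference
# reversal that a binding early deadline would have prevented.
print(beta_delta_value(100, 0) > beta_delta_value(110, 1))    # True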
  • People are risk averse over gains and risk seeking over losses.
    Expected values: (A) sure gain of $2.4 < (B) 0.25 × $10 = $2.5; (C) sure loss of $7.5 = (D) 0.75 × (−$10) = −$7.5.
    49% chose (A) over (B); 68% chose (D) over (C).
    A risk-neutral chooser would pick (B) together with either (C) or (D).
    Under this framing, 25% overall chose the combination A&D, even though B&C dominates it: B&C yields a 25% chance of +$2.5 and a 75% chance of −$7.5, better in every state than A&D's 25% chance of +$2.4 and 75% chance of −$7.6. B&C overwhelmingly won.
    When the two decisions were combined into a single choice, no one chose A&D.
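The dominance of the B&C combination can be verified by comparing the two combined lotteries state by state. A small sketch using the payoffs above, pairing outcomes on the same 25%/75% chance event:

```python
# Decision 1: (A) sure gain of $2.4; (B) 25% chance of +$10, else $0.
# Decision 2: (C) sure loss of $7.5; (D) 75% chance of -$10, else $0.
# Combined payoff distributions (probability -> total payoff),
# resolving both gambles on the same 25/75 event:
a_and_d = {0.25: 2.4 + 0.0, 0.75: 2.4 - 10.0}   # 25%: +2.4, 75%: -7.6
b_and_c = {0.25: 10.0 - 7.5, 0.75: 0.0 - 7.5}   # 25%: +2.5, 75%: -7.5

# B&C pays strictly more than A&D in every state, so it dominates.
dominates = all(b_and_c[p] > a_and_d[p] for p in (0.25, 0.75))
print(dominates)  # True
```

Framing the two decisions separately hides this dominance, which is why a quarter of subjects ended up holding the dominated pair.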
  • Herd behaviour describes how individuals in a group can act together without planned direction. Humans are social animals: they depend on each other when they make decisions. We need to understand human nature in order to harness the wisdom of crowds.
  • Social proof: the tendency to assume that if lots of people are doing or believing something, there must be a good reason why.
    People will do things that they see other people doing. For example, in one experiment, one or more confederates would look up into the sky; bystanders would then look up to see what they were seeing. At one point the experiment had to be aborted, as so many people were looking up that they stopped traffic.
  • Information cascade: a situation in which every subsequent actor, based on the observations of others, makes the same choice independent of her private signal. Everyone is individually acting rationally, yet even if all members as a group have overwhelming information in favor of the correct decision, each and every participant may take the wrong action.
    The problem stems from the fact that people make decisions not all at once but in sequence.
    The information people have is not perfect, so to supplement their own information, people look at what others are doing. An information cascade is not the result of mindless trend-following, conformity, or peer pressure: people fall in line because they believe they are learning something important from the example of others.
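This sequential logic can be sketched with a toy Bikhchandani-Hirshleifer-Welch-style model: each agent combines a private binary signal with the net count of signals inferred from earlier choices, and once the public evidence outweighs any single signal, a cascade starts and later choices reveal nothing. The signal sequences below are illustrative:

```python
def run_cascade(signals):
    """Sequential binary choice. Each agent sees its private signal
    (0 or 1) plus all earlier choices. While |net| < 2, a choice still
    reveals the chooser's signal; once |net| >= 2, the private signal
    cannot tip the balance, so everyone herds and net stops updating."""
    choices = []
    net = 0  # inferred 1-signals minus inferred 0-signals so far
    for s in signals:
        if abs(net) >= 2:                 # cascade: ignore own signal
            choice = 1 if net > 0 else 0
        else:
            total = net + (1 if s == 1 else -1)
            # ties broken by following one's own signal
            choice = 1 if total > 0 else 0 if total < 0 else s
            net += 1 if choice == 1 else -1
        choices.append(choice)
    return choices

# Two initial 1-signals start an up-cascade: later agents ignore
# their (here uniformly contrary) private signals.
print(run_cascade([1, 1, 0, 0, 0, 0]))  # [1, 1, 1, 1, 1, 1]
```

Note that the cascade after the first two agents is "wrong" if the true state is 0 and the four later signals were accurate: the group's pooled information favored 0, but no individual acted irrationally.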
  • Why independence matters: 1) it prevents correlation among the mistakes people make; in other words, it prevents the amplification of errors; 2) independence does not imply rationality or impartiality: you can be biased or irrational, but as long as you are independent you will not make the group any dumber.
    Independence prevents correlation of individuals' mistakes and encourages the inflow of new information.
    So collective decisions are most likely to be good ones when they are made by people with diverse opinions reaching independent conclusions, relying primarily on private information.
  • Groupthink symptoms: close-mindedness, stereotyped views of enemies.
    Remedy? Institutions that expose hidden profiles, encourage counterarguments, and create alternatives.
  • Monkeys show a sense of justice: they will protest if they see another monkey get paid more for the same task.
    Researchers taught brown capuchin monkeys to swap tokens for food. Usually they were happy to exchange this "money" for cucumber. But if they saw another monkey getting a grape, a more-liked food, they took offence: some refused to work, others took the food and refused to eat it. The scientists say this work suggests that humans' sense of justice is inherited rather than a social construct.
  • We could not have survived without it.
  • The human brain is a mammalian brain with a larger cortex.
  • The amygdala regulates social behavior: arousal, autonomic responses associated with fear, emotional responses, and hormonal secretions.
  • Political decisions are not rational.
    Bush and Kerry video speech.
    No matter the issue under discussion, both sides are equally convinced that the evidence overwhelmingly supports their position.
    This surety is called the confirmation bias, whereby we seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirmatory evidence. Now a functional magnetic resonance imaging (fMRI) study shows where in the brain the confirmation bias arises and how it is unconscious and driven by emotions. Psychologist Drew Westen led the study, conducted at Emory University, and the team presented the results at the 2006 annual conference of the Society for Personality and Social Psychology.
    During the run-up to the 2004 presidential election, while undergoing an fMRI brain scan, 30 men--half self-described as "strong" Republicans and half as "strong" Democrats--were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Not surprisingly, in their assessments Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own candidate off the hook.
    The neuroimaging results, however, revealed that the part of the brain most associated with reasoning--the dorsolateral prefrontal cortex--was quiescent. Most active were the orbital frontal cortex, which is involved in the processing of emotions; the anterior cingulate, which is associated with conflict resolution; the posterior cingulate, which is concerned with making judgments about moral accountability; and--once subjects had arrived at a conclusion that made them emotionally comfortable--the ventral striatum, which is related to reward and pleasure.
    Emory study lights up the political brain
    When it comes to forming opinions and making judgments on hot political issues, partisans of both parties don't let facts get in the way of their decision-making, according to a new Emory University study. The research sheds light on why staunch Democrats and Republicans can hear the same information, but walk away with opposite conclusions. The investigators used functional neuroimaging (fMRI) to study a sample of committed Democrats and Republicans during the three months prior to the U.S. Presidential election of 2004. The Democrats and Republicans were given a reasoning task in which they had to evaluate threatening information about their own candidate. During the task, the subjects underwent fMRI to see what parts of their brain were active. What the researchers found was striking.
    "We did not see any increased activation of the parts of the brain normally engaged during reasoning," says Drew Westen, director of clinical psychology at Emory who led the study. "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts." Westen and his colleagues will present their findings at the Annual Conference of the Society for Personality and Social Psychology Jan. 28.
    Once partisans had come to completely biased conclusions -- essentially finding ways to ignore information that could not be rationally discounted -- not only did circuits that mediate negative emotions like sadness and disgust turn off, but subjects got a blast of activation in circuits involved in reward -- similar to what addicts receive when they get their fix, Westen explains.
    "None of the circuits involved in conscious reasoning were particularly engaged," says Westen. "Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones."
    During the study, the partisans were given 18 sets of stimuli, six each regarding President George W. Bush, his challenger, Senator John Kerry, and politically neutral male control figures such as actor Tom Hanks. For each set of stimuli, partisans first read a statement from the target (Bush or Kerry). The first statement was followed by a second statement that documented a clear contradiction between the target's words and deeds, generally suggesting that the candidate was dishonest or pandering.
    Next, partisans were asked to consider the discrepancy, and then to rate the extent to which the person's words and deeds were contradictory. Finally, they were presented with an exculpatory statement that might explain away the apparent contradiction, and asked to reconsider and again rate the extent to which the target's words and deeds were contradictory.
    Behavioral data showed a pattern of emotionally biased reasoning: partisans denied obvious contradictions for their own candidate that they had no difficulty detecting in the opposing candidate. Importantly, in both their behavioral and neural responses, Republicans and Democrats did not differ in the way they responded to contradictions for the neutral control targets, such as Hanks, but Democrats responded to Kerry as Republicans responded to Bush.
    While reasoning about apparent contradictions for their own candidate, partisans showed activations throughout the orbital frontal cortex, indicating emotional processing and presumably emotion regulation strategies. There also were activations in areas of the brain associated with the experience of unpleasant emotions, the processing of emotion and conflict, and judgments of forgiveness and moral accountability.
    Notably absent were any increases in activation of the dorsolateral prefrontal cortex, the part of the brain most associated with reasoning (as well as conscious efforts to suppress emotion). The finding suggests that the emotion-driven processes that lead to biased judgments likely occur outside of awareness, and are distinct from normal reasoning processes when emotion is not so heavily engaged, says Westen.
    The investigators hypothesize that emotionally biased reasoning leads to the "stamping in" or reinforcement of a defensive belief, associating the participant's "revisionist" account of the data with positive emotion or relief and elimination of distress. "The result is that partisan beliefs are calcified, and the person can learn very little from new data," Westen says.
    The study has potentially wide implications, from politics to business, and demonstrates that emotional bias can play a strong role in decision-making, Westen says. "Everyone from executives and judges to scientists and politicians may reason to emotionally biased judgments when they have a vested interest in how to interpret 'the facts,' " Westen says.
    Coauthors of the study include Pavel Blagov and Stephan Hamann of the Emory Department of Psychology, and Keith Harenski and Clint Kilts of the Emory Department of Psychiatry and Behavioral Sciences.
  • Why do we make bad money decisions?
  • The expectation was that unfair offers in the Ultimatum Game induce conflict in the responder between cognitive ("accept") and emotional ("reject") motives.
    Activation of the bilateral anterior insula in response to unfair offers from human partners is particularly interesting in light of this region's oft-noted association with negative emotional states. Anterior insula activation is consistently seen in neuroimaging studies of pain and distress, hunger and thirst, and autonomic arousal.
    In contrast to the insula, the DLPFC has usually been linked to cognitive processes such as goal maintenance and executive control.
    An unfair offer is more difficult to accept, as indicated by the higher rejection rates of these offers, and hence higher cognitive demands may be placed on the participant in order to overcome the strong emotional tendency to reject the offer.
    The ACC has been implicated in the detection of cognitive conflict, and activation here may reflect the conflict between cognitive and emotional motivations in the Ultimatum Game.
  • DLPFC activity remains relatively constant across unfair offers, perhaps reflecting the steady task representation of money maximization, with anterior insula activity scaling monotonically with the degree of unfairness, reflecting the emotional response to the offer.
    Therefore, our results provide direct empirical support for economic models that acknowledge the influence of emotional factors on decision-making behavior.
    Models of decision-making cannot afford to ignore emotion as a vital and dynamic component of our decisions and choices in the real world.

    1. Soomi Lee, 2009. Behavioral Economics: Psychology and Economics
    2. BEHAVIORAL ASSUMPTIONS IN ECONOMICS: perfect information, unbounded rationality, Gandhi-like willpower, unbounded self-interest
    3. “Live long and prosper.” In models in economics: max Σ_t δ^t Σ_s p(s) U(x_t^i | s_t)
    4. Social Preference. “Utility U(x_i|s) depends only on own payoff x_i.” • Charitable giving: 240.9 billion dollars in 2002 in the US, 2% of GDP • Reciprocity
    5. Dictator Game (Forsythe et al., 1994). Player 1 offers x: offer x=0 → (10, 0); offer x>0 → (10−x, x)
    6. Dictator Game (Forsythe et al., 1994). Best response: x=0 (chosen by 18%)
    7. Dictator Game (Forsythe et al., 1994). Offers: x=0: 18%; x>0: 82%; x≥4: 32%
    8. Best friends? Mom and Dad? Wife? Husband? Daughter? Son? Girlfriend? The problem of “SOCIAL DISTANCE” (Hoffman et al., 1996)
    9. Inverse relationship between social distance and the offer x (Hoffman et al., 1996)
    10. Investment Game (Berg et al., 1995). Player 1 invests x (tripled to 3x) or invests 0 → (10, 10); Player 2 returns y → (10−x+y, 10+3x−y) or keeps all → (10−x, 10+3x)
    11. Trust Game. Player 1: Don’t Trust → (10, 10); Trust → Player 2 Returns → (15, 25) or Keeps all → (0, 40)
    12. Oxytocin: Trust and Generosity
    13. Ultimatum Game (Güth et al., 1982). Player 1 makes an offer x; Player 2 accepts → (10−x, x) or rejects → (0, 0)
    14. Ultimatum Game (Güth et al., 1982). SPNE: (9, 1)
    15. Ultimatum Game (Güth et al., 1982). SPNE: (9, 1); modal proposal: (5, 5)
    16. Sanfey et al., 2003
    17. SELF-CONTROL PROBLEMS. “Discount factor δ is time invariant.” • Excessive preference for membership contracts in health clubs (DellaVigna et al., 2006) • Homework and deadlines: deadlines are a good commitment device (Ariely et al., 2002) • Default effects in 401(k)s: people tend to go with the default option
    18. REFERENCE-DEPENDENT PREFERENCE. “Individuals maximize a global utility function U(x|s).” • Excessive aversion to small risks in the laboratory • Endowment effect for inexperienced traders • The reluctance to sell houses at a loss • Equity premium puzzle in asset returns • The tendency to sell “winners” rather than “losers” • Target earnings in labor supply decisions • The tendency to insure against all risks • Effort in the employment relationship
    19. Given U(x|s) and p(s): (1) FRAMING • Judgments are comparative, and changes in framing can affect a decision if they change the nature of the comparison, even if they do not affect the underlying economic trade-offs. • Make a pair of concurrent decisions. • Choose between: (A) a sure gain of $2.4 and (B) a 25% chance to gain $10 and a 75% chance to gain $0. • Choose between: (C) a sure loss of $7.5 and (D) a 75% chance to lose $10 and a 25% chance to lose $0.
    20. Given U(x|s) and p(s): (2) LIMITED ATTENTION. Attention is a limited resource; people simplify complex decisions by using shortcuts and heuristics. • Inattention to nontransparent taxes (i.e., indirect state taxes): when tax-inclusive prices were posted on price tags, sales dropped (Chetty et al., 2009) • Inattention to complex information in rankings: college applicants respond to differences in ranks among colleges (Pope, 2007) • Information overload makes incorporation of information into stock prices slower by 20% (Hirshleifer et al., 2009)
    21. SOCIAL PROOF. “How can most people be wrong?” Milgram, Bickman, and Berkowitz’s experiment (1969)
    22. Information Cascades
    23. Why is it important? 1) prevents systematic bias 2) promotes new information
    24. Given U(x|s) and p(s): (3) MENU EFFECT • Excessive diversification • Preference for the familiar • Preference for the salient • Choice avoidance • Confusion in implementing choices
    25. Given U(x|s) and p(s): (4) SOCIAL PRESSURE AND PERSUASION. Groupthink
    26. Solomon Asch (1951) Social Conformity Experiment: “Opinions and Social Pressure”
    27. Evolutionary Basis
    28. Evolutionary Economics: the study of the economy as an evolutionary, complex adaptive system, grounded in a human nature that evolved as a functional adaptation to survival as a social primate species in the Paleolithic epoch
    29. Evolutionary Time Line
    30. Evolutionary Time Line: Homo sapiens
    31. Evolutionary Time Line: Homo sapiens; the Neolithic Revolution
    32. The Evolution of Human Groups (time before present; group; number of individuals): 100,000-10,000 years: bands, 10s-100s; 10,000-5,000 years: tribes, 100s-1,000s; 5,000-3,000 years: chiefdoms, 1,000s-10,000s; 3,000-1,000 years: states, 10,000s-100,000s; 1,000 years: empires, 100,000s-1,000,000s
    33. Anthropological Research: an Integrated System of Reciprocal Altruism among the Aché: food acquisition, hunting, food sharing, child rearing, mobility, camp construction, and defense
    34. Animal Research: Brosnan (2003); also see De Waal (2009)
    35. Fairness evolved as a stable strategy for maintaining social harmony. Cooperation is enforced and free-riding is punished.
    36. NEUROECONOMICS
    37. Neuroeconomics: the use of data on brain processes to suggest new underpinnings for economic theories
    38. HUMAN BRAIN = BLACK BOX?
    39. How to measure feelings? William S. Jevons (1835-1882) wrote, “I hesitate to say that men will ever have the means of measuring directly the feelings of the human heart.”
    40. Now, feelings and thoughts can be measured!
    41. How to measure? Animal models, psychological reactions, brain imaging, single-neuron measurement, psychopathology, brain damage in humans
    42. PASSION + REASON. “Human behavior will generally be a compromise between highly-evolved animal emotions and instincts, and more recently-evolved human deliberation and foresight.” (Loewenstein, 1996) The process of choice is not only logical but also emotional.
    43. Brain function. Frontal lobe: reasoning, problem solving, emotional control. Parietal lobe: cognition. Occipital lobe: vision. Temporal lobe: hearing, memory acquisition
    44. Amygdala: “The Quick and Dirty Route”
    45. Neuro-scientific Basis of Confirmation Bias: we seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirmatory evidence (Drew Westen, 2006)
    46. Dopamine: the Prediction Addiction and Drug Addiction. Scoring financially is almost indistinguishable from scoring a hit off an addictive drug.
    47. fMRI study: The Neural Basis of Economic Decision-Making in the Ultimatum Game (Sanfey et al., 2003, Science). Activated by an unfair offer
