Insights from N. Taleb's Fooled by Randomness - Tim Perzyk (2007)

Nassim Taleb (www.fooledbyrandomness.com) is a stock-market thought leader who distills complex statistical phenomena for the masses. His thoughts on "black swan theory" (the unpredictability of extraordinary, high-impact events) inspired this presentation during my time as an MBA candidate at Harvard Business School. Any misrepresentations of his insights are my own, and I apologize for such shortcomings.


Transcript

  • 1. Schooled by Randomness: Topics from Nassim Taleb’s Fooled by Randomness
  • 2.
    • Author Nassim Nicholas Taleb is a trader and Wharton MBA with a brain full of cross-disciplinary knowledge.
    • Taleb rightly observes that the human mind is poorly equipped to deal with chance and probability; his book addresses this shortcoming in the context of securities markets.
    • What follows is a selection of his observations, many of which we’ve addressed (directly or indirectly) this semester. Note that Taleb would probably hate this “MBA-palatable” style of summary.
    1 The Kilimanjaro View of Things
  • 3.
    • Ultra-prosperous Lydian king Croesus consults the Athenian sage and legislator Solon. Croesus wants Solon to assert that the king is indeed the “happiest man of all,” but Solon instead warns of the “uncertain future”: “… [O]nly to whom the divinity has [guaranteed] continued happiness until the end we may call happy.”
    2 Richer Than Croesus… Eventually
    • The point? Neglecting the role of luck in an iterative game (here, the game of life) is foolish. One could simply be enjoying a lucky run, and the past may poorly predict the future.
  • 4.
    • Many who have succeeded on the back of chance presume their own actions—something distinctive about their “strategy”—have brought about their good fortune.
    • Underlying this thinking is survivorship bias, which keeps a visibly successful subset in view while a pool of losers remains invisible. Falling prey to this bias leads well-meaning people to heap praise on survivors for what is really the benefit of randomness.
    3 I Will Survive
  • 5.
    • Two factors determine how dangerous this bias is: (1) the probability of success, i.e., the inherent randomness of the game or field; and (2) the sample size.
    4 Every 720 Minutes
    • Low-probability survivors in small pools may in fact be deserving of praise.
    • However, sample sizes in competitive labor markets are rarely small, and even a broken clock is right twice a day (see the simulation sketch below).
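    • A minimal simulation sketch (not from the deck) makes the sample-size point concrete: in a large enough pool of purely random “traders,” some will compile flawless records by luck alone. The pool size, horizon, and 50% yearly success rate below are assumptions chosen only for illustration.

        # Hypothetical illustration: survivorship in a pool of skill-free traders.
        # Pool size, horizon, and per-year success probability are assumed values.
        import random

        random.seed(42)

        n_traders = 10_000   # size of the labor pool (assumed)
        n_years = 5          # evaluation window (assumed)
        p_good_year = 0.5    # each year is a coin flip; no trader has any edge

        survivors = sum(
            1 for _ in range(n_traders)
            if all(random.random() < p_good_year for _ in range(n_years))
        )

        print(f"Traders with a perfect {n_years}-year record: {survivors}")
        # Expected count: 10_000 * 0.5**5 ~ 312 "stars" produced by chance alone,
        # which is exactly what survivorship bias then invites us to praise.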
  • 6.
    • Humans tend to fixate on the frequency (i.e., probability) of an event rather than on its expected outcome (i.e., the probability-weighted average of its potential payoffs).
    5 Great Expectations
    • Example: Lasik eye surgery has a skewed payoff; 99% of patients end up near-perfectly corrected, while 1% are left blind or otherwise impaired.
    • P(Success) vs. P(Success) × Payoff + P(Failure) × Loss (worked through in the sketch below)
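    • A short worked example (not from the deck) shows how the two views diverge for the Lasik case. The 99%/1% split comes from the slide; the payoff values are hypothetical units chosen only to illustrate the skew.

        # Frequency view vs. expected-outcome view for the Lasik example.
        # The 99%/1% split is from the slide; the payoff units are hypothetical.
        p_success, p_failure = 0.99, 0.01
        payoff_success = 1        # modest gain: corrected vision (assumed units)
        payoff_failure = -1_000   # catastrophic loss: blindness (assumed units)

        frequency_view = p_success
        expected_outcome = p_success * payoff_success + p_failure * payoff_failure

        print(f"P(success)       = {frequency_view:.2f}")     # 0.99: "it almost always works"
        print(f"Expected outcome = {expected_outcome:+.2f}")   # -9.01: negative despite 99% success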
  • 7.
    • This week’s Time magazine cover story, “Why We Worry about the Wrong Things,” addresses the human tendency to ignore both frequency and expected outcomes when comparing “life-or-death” scenarios.
    • If we consider the payoff to living to be fixed and assume that all deaths are equal, this failure to assess either frequency or expected outcome seems odd…
    6 However…
  • 8. 7 Dread Locks
    • … But the reality is that humans don’t perceive all deaths to be equal—certain fates (e.g., death in a plane crash) are subject to greater dread than others (e.g., death on the road).
    • In fact, it may be that humans ARE comparing expected outcomes of different events, but are weighting the payoffs (life vs. death) by dread.
    • P(Car death) = 0.01 vs. P(Plane death) = 0.00002
    • Dread-weighted payoff: Car death = -1 vs. Plane death = -500
    • EV(Car) = 0.01 × -1 = -0.01 vs. EV(Plane) = 0.00002 × -500 = -0.01 (spelled out in the sketch below)
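    • The slide’s arithmetic, spelled out with its own illustrative figures (these are not real mortality statistics):

        # Dread-weighted expected values, using the slide's illustrative figures.
        p_car, p_plane = 0.01, 0.00002
        dread_car, dread_plane = -1, -500   # a plane-crash death carries ~500x the dread

        ev_car = p_car * dread_car          # -0.01
        ev_plane = p_plane * dread_plane    # -0.01

        print(f"EV(car)   = {ev_car:.2f}")
        print(f"EV(plane) = {ev_plane:.2f}")
        # Equal dread-weighted expected values: the two risks "feel" comparable
        # even though their raw probabilities differ by a factor of 500.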
  • 9. 8 Ugly In-duck-tion
    • Which of the following two statements might you be able to prove?
      • A: No swan is black, because I looked at four thousand swans and found none.
      • B: Not all swans are white.
    • This is the problem of induction:
    • No amount of data can prove a proposition, but a single piece of data can disprove it.
  • 10. 9 Without Precedent
    • Taleb focuses on statements of the variety, “The market’s never done that before.” These statements embody the problem of induction.
    • The 4,000 swans I inspected (or the 100 years of securities trading I have witnessed) did not include a black swan (or a catastrophic but rare market collapse). Therefore, no black swans exist (a catastrophic but rare market collapse will never occur).
  • 11.
    • So what’s the real trouble with humans and probability?
    10 Satisfiction Guaranteed?
    • Computer science and psychology scholar Herbert Simon posited that humans “satisfice” (a blend of “satisfy” and “suffice”) by condensing complex reasoning much the way an MP3 file compresses a WAV: certain information is lost, but the end result is largely representative of the source.
    • Simon’s theory suggests that the human brain is imperfect (i.e., sometimes works based on guesstimates).
  • 12.
    • But that’s not the whole story. Consider the following.
    11 Flawed vs. Imperfect
    • Say you prefer apples to oranges and oranges to pears. We would expect you to prefer apples to pears as well. However, we know this may not be the case: the human brain does not always follow a strict series of logical rules with checks and balances. It is more than “rough around the edges” or imperfect; it is flawed.
  • 13.
    • Daniel Kahneman and Amos Tversky codified the quirks of our flawed brains in a series of biases and heuristics: “quick and dirty” laws that explain our deviations from rational, outcome-optimizing thinking.
    12 Brains Behaving Badly
    • Recall the Time magazine article on the human trouble with risk. Dread was the “irrational” component there. Miscalculating risk in this manner may demonstrate the affect bias, in which emotions cloud our ability to calculate probability. (Taleb would probably suggest that all deaths ARE equal but that we fudge expected outcomes with overstated doomsday probabilities.)
  • 14.
    • Consider the following case presented to physicians. (Over 80% failed.)
    13 Dr. Doolittle
    • A test of a disease presents a rate of 5% false positives. The disease strikes 1 in 1000 of the population. People are tested at random, regardless of whether they are suspected of having the disease. A patient’s test is positive. What is the probability of the patient being stricken with the disease? (Assume no false negatives—everyone who truly does have it will be identified.)
    • The correct answer is roughly 2%: the ratio of true positives (the truly afflicted) to all positives, true and false alike (see the calculation sketch below).
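    • A minimal sketch of that calculation, applying Bayes’ rule to the numbers stated in the problem:

        # Bayes' rule with the slide's numbers: prevalence 1/1000,
        # 5% false-positive rate, and no false negatives.
        prevalence = 0.001
        p_false_positive = 0.05
        p_true_positive = 1.0   # no false negatives: every sick patient tests positive

        p_positive = prevalence * p_true_positive + (1 - prevalence) * p_false_positive
        p_disease_given_positive = prevalence * p_true_positive / p_positive

        print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")  # ~0.020, i.e. roughly 2%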
  • 15.
    • Where does the “flawed brain” come from? One theory, posited by evolutionary psychologists, is that the prehistoric human brain had very little information to deal with relative to our brains today. Our ancestors met far fewer other humans and encountered fewer distinct environmental stimuli given that many never migrated far.
    14 130,000 Years Ago (“I wonder if she’s on MySpace…”)
  • 16.
    • “I never said that every rich man is an idiot and every unsuccessful person unlucky, only that in absence of much additional information it is preferable to reserve one’s judgment. It is safer.”
    15 Choice Words
    • “… CEOs are often empty suits. In the ‘quant world,’ the designation empty suit applies to the category of persons who are good at looking the part but nothing more. More appropriately, what they have is skill in getting promoted within a company rather than pure skills in making optimal decisions—we call that ‘corporate political skill.’ These are mostly people trained at using PowerPoint presentations.”