COGNITIVE BIASES IN
PROJECT DECISION-MAKING
Aim
 Cognitive biases of project managers can lead to perceptual distortion and inaccurate judgment, affecting business and economic decisions and human behavior in general.
 Subjecting these biases to scientific investigation and independently verifiable facts is the aim of this presentation.
Ignoring Regression to the Mean
 The tendency to expect extreme events to be followed by similar extreme events.
 In reality, an extreme event is most likely to be followed by a more average one.
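A quick way to see this is to simulate it. The following minimal sketch (my own illustration, not from the deck) models each trial as fixed skill plus random noise, so nothing compensates for an extreme result, yet the trial after an extreme one is, on average, close to the mean:

# Regression to the mean: illustrative simulation (assumed model:
# performance = constant skill + independent Gaussian noise).
import random

random.seed(42)
skill = 50.0
trials = [skill + random.gauss(0, 10) for _ in range(100_000)]

# Take the trials in the top 5% and look at what immediately follows.
cutoff = sorted(trials)[int(0.95 * len(trials))]
extremes = [t for t in trials if t >= cutoff]
followers = [trials[i + 1] for i in range(len(trials) - 1)
             if trials[i] >= cutoff]

print(f"mean of extreme trials  : {sum(extremes) / len(extremes):.1f}")
print(f"mean of following trials: {sum(followers) / len(followers):.1f}")
# The first mean is ~70, the second ~50: extreme results are followed
# by average ones, with no memory or compensation involved.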
Anecdote
 Daniel Kahneman was awarded the Nobel Prize in Economic Sciences in 2002.
 The notable thing about this Nobel Prize was that Kahneman was not an economist but a psychologist.
Confirmation bias
 The tendency to search for or interpret information in a way that confirms one's preconceptions.
Gambler's fallacy
 The tendency to think that future probabilities are altered by past events.
 This bias results from an erroneous conceptualization of the law of large numbers (Bernoulli's theorem, which concerns the number of trials needed for observed frequencies to approach their expected values).
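The fallacy is easy to check empirically. A minimal sketch (my own illustration, not part of the deck): simulate a long run of fair-coin tosses and tally what happens immediately after five heads in a row; the next toss is still heads about half the time:

# Gambler's fallacy check: does a fair coin "owe" tails after a streak?
import random

random.seed(1)
tosses = [random.choice("HT") for _ in range(1_000_000)]

# Outcomes that immediately follow a run of five heads.
after_streak = [tosses[i + 5] for i in range(len(tosses) - 5)
                if tosses[i:i + 5] == list("HHHHH")]

print(f"streaks found: {len(after_streak)}")
print(f"P(heads | five heads just happened) ~ "
      f"{after_streak.count('H') / len(after_streak):.3f}")
# ~0.500: past tosses do not alter future probabilities.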
Experimenter's or Expectation
bias
 The tendency for experimenters to believe data that agree with their expectations for the outcome, and to disbelieve or downgrade data that appear to conflict with those expectations.
Framing Effect
 Drawing different conclusions from
the same information, depending on
how or by whom that information is
presented.
Knowledge Bias
 The tendency of people to choose the
option they know best rather than the
best option.
Normalcy Bias
 The failure to plan for a disaster which
has never happened before.
Outcome Bias
 The tendency to judge a decision by its eventual outcome instead of by the quality of the decision at the time it was made.
 This bias manifests in the review of
project decisions.
Well Travelled Road Effect
 Underestimation of the duration taken
to traverse oft-traveled routes and
overestimation of the duration taken to
traverse less familiar routes.
Half-Truths
In ancient Roman law, two half proofs constituted a complete proof.
False Positives
 The probability that ‘A’ will occur if ‘B’ occurs will generally differ from the probability that ‘B’ will occur if ‘A’ occurs.
 The probability that one event will occur given that another event has occurred is called conditional probability.
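The speaker notes (see the Editor's Notes below) retell Leonard Mlodinow's HIV-test story, in which a doctor read a 1-in-1,000 test inaccuracy as a 99.9% chance of infection. A hedged Bayes' theorem sketch of that calculation (the perfect sensitivity and the 1-in-10,000 prevalence are assumptions chosen to reproduce the ~9% figure in the notes; they are not stated on the slide):

# Conditional probability via Bayes' theorem: P(infected | positive).
false_positive_rate = 1 / 1_000   # P(positive | not infected), from the notes
sensitivity = 1.0                 # P(positive | infected), assumed perfect
prevalence = 1 / 10_000           # assumed base rate for the tested group

p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_infected = sensitivity * prevalence / p_positive

print(f"P(infected | positive) = {p_infected:.1%}")
# ~9.1%, not 99.9%: P(A given B) is not P(B given A).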
Hindsight Bias
 The tendency to see past events as being
predictable at the time those events
happened.
 In day-to-day life the past often seems obvious even when we could not have predicted it.
 This bias manifests in the review of project
decisions.
Anecdote
 Army Chief of Staff General George Marshall was faulted by a U.S. Congressional committee for having missed all the “signs” of the coming attack on Pearl Harbor.
 “False Positive” - tested positive but HIV negative
 “True Positive” - tested positive and HIV positive
 “True Negative” - tested negative and HIV negative
 “False Negative” - tested negative but HIV positive
As per the rule of
compounding probabilities, no
finite number of partial proofs
will ever add up to a certainty.
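The arithmetic behind this rule, restated from the speaker notes as a short sketch: independent partial proofs combine as one minus the product of their individual error probabilities, which approaches but never reaches certainty:

# Compounding partial proofs: combined certainty = 1 - product of the
# probabilities that each partial proof is wrong.
def combined_certainty(error_probs):
    wrong = 1.0
    for p in error_probs:
        wrong *= p                 # chance that all of them are wrong
    return 1.0 - wrong

print(combined_certainty([0.5, 0.5]))   # two half proofs -> 0.75, not 1.0
print(combined_certainty([0.5] * 20))   # twenty half proofs -> ~0.999999
# Same arithmetic as the weekend-rain example in the notes:
# P(rain on Sat or Sun) = 1 - 0.5 * 0.5 = 0.75, i.e. 75%, not 100%.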
Doctor’s Confusion
The doctor confused the chances that
“he would test positive if he was not HIV infected”
with the chances that
“he would not be HIV infected if he tested positive”
The study of randomness tells
us that the crystal ball view of
events is possible,
unfortunately, only after they
happen.
If you get 45 heads in the first 100 tosses, the coin does not develop a bias towards tails for the rest of the tosses to catch up!
Conclusion
Thank You
Editor's Notes
  1. Introduction I became interested in this subject as I observed, during the course of various assignments, varied interpretations of events associated with accomplishments and failures. I observed that failed projects, missions and failed engineering systems were subjected to enquiries and investigations, and I felt that investigating successful missions or an operating system would give the same or more information, if preventing future failures were the aim. I was amused when I read in the newspaper recently that an expert team was subjecting the burnt-down bogie of the Tamil Nadu Express (from an accident) to serious examination. It occurred to me that, to prevent train bogies from catching fire in future, one could subject any other randomly selected bogies of a good sample size to investigation, not necessarily the one which caught fire. I have compiled some information and interesting facts on judgments based on our perception and intuition, called cognitive biases, prevalent amongst us, so that we can recognize them when we practice or encounter them. Coming to the brief presentation I have prepared: there are two distinct views in project management practice, the rational view, which focuses on management techniques, and the behavioral view. The difference between the two is significant: one looks at how projects should be managed, the other at what actually happens on projects. The gap between the two can sometimes spell the difference between project success and failure. In many failed projects, the failure can be traced back to poor decisions, and the decisions themselves traced back to cognitive biases, i.e. errors in judgment based on perceptions.
  2. I would like to explain the issue with an anecdote. Daniel Kahneman, a psychologist, worked on the psychology of judgment and decision-making, behavioral economics and hedonic psychology. He was awarded the Nobel Prize in Economic Sciences in 2002 for his work on Prospect Theory, a behavioral economic theory. His interest in the subject was the result of a curious incident in his life. In the 1960s Kahneman, then a junior psychology professor at Hebrew University, lectured to a group of Israeli air force flight instructors on the conventional wisdom of behavior modification and its application to the psychology of flight training. Kahneman drove home the point that rewarding positive behavior works but punishing mistakes does not. One of the flight instructors said that his experience contradicted this: “When I praise people for well-executed maneuvers, the next time they always do worse. And when I scream at people for badly executed maneuvers, by and large the next time they improve. Don’t tell me that reward works and punishment doesn’t; my experience contradicts it.” The other flight instructors agreed. Kahneman pondered over the apparent paradox. The answer lies in a phenomenon called regression towards the mean: in a series of random events, an extraordinary event is most likely to be followed by a more ordinary one, purely by chance. Any exceptionally good or poor performance was thus mostly a matter of luck. To the instructors it appeared that their actions contributed to the pilots’ performance; in reality it made no difference. This error in intuition spurred Kahneman’s thinking on people’s misjudgments when faced with uncertainty. His research over the next thirty years showed that even amongst sophisticated subjects, whether in the military, business or medical professions, people’s intuition very often fails them when it comes to random processes. The inconsistency between the logic of probability and people’s assessments of uncertain events became the subject of Kahneman’s study.
  3. We have a tendency to interpret information and preferentially seek evidence that confirms our own preconceived notions and, worse, to interpret ambiguous evidence in favor of our ideas. It has been observed that if a set of research or experimental data without evidence for a conclusive outcome is provided to two groups for analysis, each group invariably interprets the pattern as compelling evidence for its preconceived notions. We should therefore learn to spend as much time looking for evidence that we are wrong as we spend searching for reasons why we are correct.
  4. Another mistaken notion connected with the law of large numbers is the idea that an event is more or less likely to occur because it has or has not happened recently. The idea that the odds of an event with a fixed probability increase or decrease depending on recent occurrences of the event is called the ‘Gambler’s Fallacy’, the root of ideas such as “his luck has run out” and “he is due”. For example, if you get 45 heads in the first 100 tosses, the coin does not develop a bias towards tails to catch up! People, however, expect good luck to follow bad luck, or worry that bad will follow good.
  5. It is one of those contradictions of life that although measurements always carry uncertainty, the uncertainty is rarely disclosed when measurements are quoted. I had personally quoted fine measurements of the order of 0.01 mm in many bore alignments without indicating their tolerances and associated uncertainties.
  6. Kahneman demonstrated systematic reversals of preference when the same problem is presented in different ways.
  7. A self-evident bias we are all victims of, but rarely acknowledge.
  8. A case in point: our national disaster management team visited the Andaman and Nicobar Islands and sensitized the local population to measures for safeguarding themselves during disasters, without covering tsunamis. This was a year before the tsunami struck. We do the same all the time in our own projects.
  9. If a decision results in a negative outcome, this does not mean that the decision was wrong. A decision has to be weighed for its value when it is taken, not later when the results are out!
  10. When you examine a PERT chart, you should know where to apply moderation based on who prepared it, if you know him.
  11. I heard a weather forecast indicating that the probability of rain on a Saturday was 50% and that of the next day, Sunday, was again 50%; and someone concluded that it would surely rain that weekend. The ancient Romans in their justice system employed a concept of ‘half proofs’, which applied to evidence and testimony for which there was no compelling reason to believe or disbelieve. In their law, two half proofs constituted a complete proof for conviction. This might sound reasonable to a mind unaccustomed to quantitative thought, but the correct application of the law of probability is this: the chance of two independent half proofs both being wrong is 1 in 4, so two half proofs constitute one minus one quarter, i.e. three-quarters of a proof, not a whole proof. The Romans added where they should have multiplied. Similarly, the correct probability of rain during the weekend is one minus the probability of no rain on both days (25%), which gives 75%. Therefore, as per the rule of compounding probabilities, no finite number of partial proofs will ever add up to a certainty.
  12. The probability that ‘A’ will occur if ‘B’ occurs will generally differ from the probability that ‘B’ will occur if ‘A’ occurs. This conditional probability may appear evident and commonsensical; however, applying it in real-life situations is rather tricky. To understand this concept I will use a real-life incident I have read about. Leonard Mlodinow, a theoretical physicist, tested positive for HIV during a routine blood test for life insurance. On his query about the odds of the test, his doctor informed him that the statistical inaccuracy of the test was 1 in 1,000, i.e. that the chance of his being HIV infected was 99.9%. After overcoming the initial shock, Mlodinow reasoned that the doctor had confused the chances that “he would test positive if he was not HIV infected” with the chances that “he would not be HIV infected if he tested positive”. If this appears tricky, consider this: the probability that a randomly chosen English-speaking person is British is very different from the probability that a randomly chosen Briton speaks English. This example may make sense intuitively. The physicist calculated that the actual chance of his being HIV infected was about 9%, in contrast to the doctor’s assessment of 99.9%. This is a case of a “False Positive”: tested positive but HIV negative. The other cases in conditional probability are “True Positive” (tested positive and HIV positive), “True Negative” (tested negative and HIV negative) and “False Negative” (tested negative but HIV positive).
  13. This is also called the “I-knew-it-all-along” effect. In any complex string of events in which each event unfolds with some element of uncertainty, there is a fundamental asymmetry between past and future. An anecdote I have chosen to highlight this bias comes from an incident during WW-II. In October 1941 a message from Tokyo to a Japanese spy was intercepted and decoded by the U.S.; it asked that Pearl Harbor be divided into five areas where the U.S. warships were concentrated, with specific emphasis on destroyers and carriers. A few days later, the U.S. lost track of radio communication from all Japanese carriers and therefore of their whereabouts. In November and early December of the same year, the U.S. noticed that Japanese warships had changed call signs for the second time in a month, whereas call signs were normally changed every six months. This made knowing the whereabouts of Japanese ships harder for the U.S. Two days later a message from Tokyo to all diplomatic posts across many countries, including London and Washington, instructing them to destroy their codes and important documents immediately, was intercepted and decoded by the U.S. On 5 December, the FBI intercepted a telephone call from a cook at the Japanese consulate in Hawaii reporting with great excitement that the officials were burning documents. On 7 December Pearl Harbor was attacked, crippling the U.S. Navy. Army Chief of Staff General George Marshall was severely faulted by a U.S. Congressional committee for having missed all the “signs” of the coming attack on Pearl Harbor. General Marshall was no fool, but neither did he have a crystal ball. In addition to the quoted reports, Marshall was also privy to a huge amount of intelligence reports, each bringing alarming or mysterious messages obscuring a clear vision of the future. The study of randomness tells us that the crystal-ball view of events is possible, unfortunately, only after they happen.
  14. Cognitive biases play a major role in our decision-making. Researchers have concluded that people have a very poor conception of randomness and uncertainty: they do not recognize it when they see it, they cannot produce it when they try, and, what’s worse, we routinely misjudge the role of chance in our lives and make decisions that are demonstrably misaligned with our own best interests. It is easy to believe that ideas that worked were good ideas, that plans that succeeded were well designed, and that ideas and plans that did not were ill conceived. While ability does not guarantee achievement, nor is achievement proportional to ability, it is important to keep in mind the role of chance in our project management. With all your wisdom, at times you may be required to go along with the bias: I understand that you will find it hard to stop the firing of a project manager who has failed repeatedly by saying that he just happened to be on the “wrong end of a Bernoulli series”, nor will you earn friends if you term a project manager’s repeated success just a “random fluctuation”.