Making better project decisions more often (from the PMI-CIC 12th Annual PDD, 2014)
1. PMI – Central Indiana Chapter
Professional Development Day
Eric Wright, Ph.D., PMP
DeVry University, Indianapolis, IN
October 3, 2014
2. Introduction
The Problem
Interactive Quiz
PMBOK Process Groups Framework
Interactive Grading
Conclusion
Question & Answer
References
3. Process Groups: Initiating, Planning, Executing, Monitoring & Controlling, Closing
Planning is the effort that formalizes our decision making activities. (Adapted from Mintzberg, 1994)
Knowledge Areas: Integration MGMT, Scope MGMT, Time MGMT, Cost MGMT, Quality MGMT, HR MGMT, Communications MGMT, Risk MGMT, Procurement MGMT, Stakeholder MGMT (Adapted from Table 3-1, PMBOK, 2013, p. 61)
www.tutkus.com
4. savingtosuitorsclub.net
www.picswallpaper.com
www.glogster.com
5. When I say go, take 5 seconds and
compute the product of this sequence:
1 × 2 × 3 × 4 × 5 × 6 × 7 × 8
Now, when I say go again, I want you to take
another 5 seconds and compute the product
of this second sequence:
8 × 7 × 6 × 5 × 4 × 3 × 2 × 1
6. Our government is preparing for the outbreak of
an unusual disease, which is expected to kill 600
people. Two alternative programs to combat the
disease have been proposed, and you must
choose which one you think is better. These are
the estimates of the outcomes for each program:
Program A: 200 people will be saved.
Program B: There’s a 1/3 chance that 600 people will be saved, and a 2/3 chance that no people will be saved.
Now, suppose that instead of the two previous
programs, you’ve been presented with these
two programs instead. As in the previous
situation, pick the one you think is better.
Program C: 400 people will die.
Program D: There’s a 1/3 chance that nobody will die, and a 2/3 chance that 600 people will die.
7. You and a colleague are avid sports fans, and
you are both planning to travel individually 40
miles to see a basketball game. You paid for
your ticket; your colleague did not. However,
on game night, a blizzard is announced.
Which of you is more likely to brave the
blizzard to go see the game?
You, Your colleague, Neither?
8. I offer you a gamble on the toss of a coin:
If the coin shows tails, you lose $100;
If the coin shows heads, you win $150.
Would you accept or decline this gamble?
9. Paul owns shares of stock in company A. During the past
year he considered switching to stock in company B, but he
decided against it. He now learns that he would have been
better off by $1,200 if he had switched to the stock of
company B.
George on the other hand, owned shares in company B.
During the past year he switched to stock in company A. He
now learns that he would have been better off by $1,200 if
he had kept his stock in company B.
Who feels greater regret? Paul, George, Neither?
10. You’ve just met Steve. He seems very
helpful, but also is a bit shy and
withdrawn. It also appears that he likes
things orderly and has a passion for detail.
What do you think is Steve’s occupation?
Farmer or Librarian?
11. You are a basketball coach, and a player
with a 50% free throw average just missed
10 free throws in a tournament game. Do
you:
Praise her performance?
Punish her performance (i.e. bench
her)?
Ignore her performance?
12. Do you think a professional basketball
player is more likely to make a basket if he
just made the last three shots than if he
missed the last three shots?
Yes or No?
13. A. Poor automobile driver?
B. Average automobile driver?
C. Above average automobile driver?
14. Which line is longer? A, B, or neither?
A.
B.
C. Neither
15. Common Biases:
1. Anchoring
2. Framing
3. Sunk Cost Fallacy
4. Loss Aversion
5. Regret
6. Base Rate Neglect
7. Regression to Mean
8. Hot Hand
9. Optimism Bias
10. Illusion of Control
Process Groups: Initiating, Planning, Executing, Monitoring & Controlling, Closing
16. Ask: where did the number come from? Could it be wrong? Consider opposite/alternative numbers.
www.cer.org.uk
www.thezerosbeforetheone.com
17. Correct if you picked the same program in both framings.
Look at more than two perspectives; emphasize the positive; reflect on your risk posture – averse or seeking?
18. Neither; stay home, the money is already spent.
Base decisions on expected future performance, not on the history of time, effort, and spending; ask: would we do this if we didn’t have the cost/historical data?
http://lawschoollies.com/?p=843
19. Accept the gamble, as it maximizes expected wealth.
Employ decision trees to minimize loss aversion’s grip; reframe the issue; evaluate pros and cons.
www.nytimes.com
20. Neither; both are in the same position.
Make the best decision possible, note it, and sleep on it.
http://galleryhip.com/regret.html
21. Farmer
Define population, characteristics, or performance objectives; investigate initial thoughts for validity; avoid hasty conclusions.
http://theconversation.com/farmer-suicide-isnt-just-a-mental-health-issue-9381
http://mrlibrarydude.wordpress.com/2013/07/24/image-public-perception-and-lego-librarians/
22. Ignore – performance will return to average irrespective of your actions.
Explain the role of regression to stakeholders/team members; encourage consistent results and raise the average over time vs. praise and punishment.
23. No
Make and document all assumptions; vet predictions with experts; coincidence is real, but don’t plan on it; events are independent.
24. Average (but why?) (Because we all can’t be above average)
Insurance; audits; contingency plans/budgets; PERT
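The PERT technique named on the slide can be sketched as a quick three-point estimate. The task numbers below are hypothetical, chosen only to illustrate the standard formula E = (O + 4M + P) / 6:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """PERT (beta-distribution) three-point estimate: weighted mean and std dev."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical task: 4 days best case, 6 days most likely, 14 days worst case.
expected, sd = pert_estimate(4, 6, 14)
print(f"Expected duration: {expected:.1f} days (std dev {sd:.2f})")
```

Weighting the most likely value four times as heavily as either extreme pulls the estimate away from whatever single number was anchored first.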
25. A.
B.
C. Neither
26. Score / Result / Description
8-10 – Super Star Decision-maker: Your Deliberate System (DELSYS) is engaged and making correct judgments of the accurate intuitions your Automatic System (AUTOSYS) is passing it.
5-7 – Star Decision-maker: Your AUTOSYS is passing relatively accurate intuitions to your relatively logical, engaged DELSYS.
1-4 – Average Decision-maker: Quick intuitions generated by your active AUTOSYS are passing relatively unchallenged through a quiet DELSYS and into your decisions.
27. All project managers are susceptible to
impaired decision-making because of
predictable mental and emotional biases,
especially when under uncertainty and risk;
However, we can identify and predict these
biases and their effects;
Which can help us reduce their impact on
important decisions, which helps us make
better project decisions more often.
28. (plus contact & additional information for Eric)...
◦ 317.414.8781
◦ doctorwright2012@yahoo.com
◦ http://www.linkedin.com/in/pmdoctor/
◦ http://pmworldjournal.net/
◦ http://www.pinterest.com/doctorwright012/
29. 1. Carter, Kaufmann, & Michel. (2007). Behavioral Supply Management: A Taxonomy of Judgment and Decision-making Biases.
2. Virine & Trumper. (2007). Project Decisions: The Art and Science.
3. Kahneman. (2011). Thinking, Fast and Slow.
Editor's Notes
Good afternoon everybody! As you can see on the slide behind me, my name is Eric. As you can also see on the slide, in addition to being a project manager like you, I am also a social scientist; I teach and study financial and project management, with a particular interest in why our decisions deviate from rational models of decision-making. Anyway, this role fits me to a T; I LOVE doing experiments!
In fact, I have several planned for you this afternoon! But let’s do our first one now so we can set the stage for our topic…
First, how many of you have ever had to make, or are making now at work, a tough project decision? Ok, great, that’s about X of Y of us.
Second, how many of you have ever had to make, or are making now, a project decision under risk, i.e. you knew the possible outcomes; uncertainty, i.e. you know you don’t know all of the possible outcomes; or time, your stakeholder needs a decision now!? Wow, that’s about X of Y of us.
Third, when you think about your ability to make good project decisions, I want you to think about where you stand relative to the rest of the project manager population. Are you at bottom 25%? OK, that’s only about X of Y! Are you in the top 25%? Wow! That’s about X of Y of us; that’s most of us here!
OK, so the results of our first experiment are in, and they’re pretty conclusive. Our topic is universal in application to this audience; we all make tough, time-bound decisions under risk, uncertainty, and time constraints. It’s the reality of the project environment. And, we’ve uncovered a statistical improbability, we can’t all be better than everyone else at making good project decisions; but we think we are! That’s what psychologists call a bias, it’s called the optimism bias, and neuroscientists studying it say 80% of us have it. And that, in a nutshell, is our topic today! What mental and emotional heuristics and biases, or what I like to call decision traps, are we all susceptible to when we are making decisions under risk, uncertainty, and time constraints, and how can we mitigate their impact once we are aware of them? Because, if we get good at doing that, it means we’ll make better project decisions more often.
Here is our framework for today’s exploration…READ SLIDE
So we know our topic, decision-making under risk, uncertainty, and time, is applicable to us right; we just established that.
But before we look at it, I think it’s also worthwhile to also establish up front its importance. In my mind, there are several key facts that can do that for us:
First, it affects all of us! How do I know? We all have brains (CLICK) that we use to make our decisions, and we all know that our decision-making is influenced by our emotions, which we all also have.
Second, we know that decisions are what; choices between alternatives right. Decision making is a continuous string of choosing between alternatives. In fact, MGMT guru Henry Mintzberg connects decision making and planning like this; planning is the process we use to formalize our decisions.
And Third, we all know how important project planning is right! In fact, from the perspective of the recent edition of the PMBOK, it is the lion’s share of our project Process Group management focus, activity and energy. How? Well, the five process groups and ten knowledge areas now contain 47 processes, and 24 of them lie within the Planning Process Group. That’s 51%! And, if we make the small leap that keeping a project on planned performance and intent is also decision intensive, then we can add the 11 processes from the Monitoring & Controlling process group as well. This brings us to the fact that between the Planning and M&C process groups, decision-making touches 74% of the project processes we carry out! No wonder the PMBOK identifies decision making as the third most used interpersonal skill of project managers!
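A minimal arithmetic check of the percentages cited here (24 planning processes and 11 monitoring-and-controlling processes out of 47):

```python
# PMBOK 5th-edition counts cited in the note: 47 processes total,
# 24 in Planning, 11 in Monitoring & Controlling.
planning, monitoring_controlling, total = 24, 11, 47
print(f"Planning share: {planning / total:.0%}")
print(f"Planning + M&C share: {(planning + monitoring_controlling) / total:.0%}")
```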
Fourth, the PMBOK goes on to say that not only is decision making key to project performance, it’s key to project governance as well.
Now, with all of that said, if you’re a skeptic like me, and you’re waiting for the full story, it’s what the PMBOK doesn’t tell us, and what you may not know, that interests me. 59 years of research from cognitive psychology, decision science, neuroscience, behavioral economics, and behavioral finance makes an airtight case that when we make decisions under risk, uncertainty, and time constraints, our brains use mental shortcuts called heuristics, to reduce the complexity and cognitive load of the decision, and to shorten the time it takes to make the decision. Basically, we are using rules of thumb to form our judgments, which can be confounded by associated biases in those judgments. The evidence also suggests that most of this processing is invisible to us, it happens automatically without you knowing it. But, considering we do it so much, and that it is so important to our livelihoods, wouldn’t you like to have a bit more control over it? Wouldn’t you like to understand it a bit better since it’s so important that we get it right? If your answer is yes, you’re in the right session this time slot, and probably today.
And that begging question brings us to the heart of this topic today, the problem. This topic, decision making under risk and uncertainty, started in 1955 (with Simon) and was established in 1974 by two psychologists in a groundbreaking article on making decisions under uncertainty. They showed that although many times our intuition is correct, when it goes haywire, there can be some serious consequences on the outcomes of our decisions.
Daniel Kahneman and Amos Tversky followed up their groundbreaking article with a second groundbreaking article in 1979 about decision making under risk. In that seminal article, they proposed the Prospect Theory, which ran counter to Economics’ fundamental utility theory, which states we’re all rational agents making optimal decisions that maximize our overall utility, or wealth, or happiness, and so on. This was key because the science of Economics assumes we’re all rational.
However, it’s the definition of rational that most economical models get wrong. It’s based on unbounded rationality, which states that we have a complete understanding of all outcomes, their size and timing of pay-offs, and their likelihood of occurrence.
Through construct and evolution, our brains aren’t wired to work that way though! We aren’t good at intuitively calculating odds and probabilities, avoiding hasty conclusions, appreciating the role of luck and coincidence, and correctly assessing our skills and knowledge for completeness. Add to this that our emotions confound our decision making, and we end up with a situation where we are beset by numerous decision traps if you will. Our rationality is bounded, we don’t completely know what we know in most cases, and we don’t know what we don’t know. This means we make our best intuitive decision based on our best judgment and our own experience.
So, regarding construct, here’s how Nobel Prize winning psychologist Daniel Kahneman, one of those two I was telling you about, describes how it works...
Our mind has 2 systems if you will, one for intuition, and one for rationalization. System I is automatic, invisible, and quick: 2 + 2 = ? System I passed an answer to your System II immediately, right; what was it? Right, 4. System I is also associative, when I say “Your mother”, you immediately had an image of this person in your mind. There was also most likely some emotional thoughts attached, along with even sights, sounds, smells, memories. Now, when I say to you “17 X 24”, something happens right? What [AND WAIT FOR THEIR RESPONSES AND REPEAT]. Right, you knew intuitively it was multiplication, you probably knew you could solve it, but you knew you’d need a calculator, etc. Your mind knew it would have to use System II to solve this one, which you could do with time and tools; the answer’s 408 by the way…
Now, most of the time, SYS I and SYS II work together in harmony; most times when you drive, it’s automatic, it’s SYSI, but when you enter a middle lane to make a left turn in traffic, SYS II takes over, the complexity went up, the stakes went up, you intuitively know you have to pay more attention right? So, most of the time, SYS I does a fine job, and SYS II checks its outputs and stamps them OK. However, sometimes it doesn’t, and when it doesn’t, it’s invisible to us, SYS II stamps the belief or intuition as valid and we make a decision based on the automatic information passed to us.
The gist is that SYS I uses mental shortcuts, called heuristics, and intuitions, what we think we know and feel about a person, task or situation, before our SYS II even stamps it OK. Sometimes, like when we’re making decisions under risk, uncertainty, and time constraints, we accept the truncated version of the information upon which SYS II makes a decision. We’ve potentially committed an error and we don’t even know it!
Why? Because we don’t know we should check our assumptions, prejudices, conclusions, calculations, i.e. the invisible yet predictable systematic biases that we know SYS I is making. That’s the topic for today; what are some of the mental and emotional systematic biases, or decision traps, that we fall into, and how can we reduce their impact when making important decisions, so that we can make better project decisions more often.
So, with the cards dealt, and the antes in, you ready for some experiments! YEAH! LET’S GET TO THE PROVING PART OF THIS WRIGHT! I mean hey, after all, this is fun and all, but none of this applies to me, I’m rational!
[READ SLIDE AND THEN NEXT SLIDE]
“For question 4, [READ SLIDE]” and “make sure to jot down your program selection, A or B”.
[THEN SHOW SECOND CHOICE SET]…”Make sure to jot down your second program selection for this set too”.
[NEXT SLIDE]
[READ SLIDE AND THEN NEXT SLIDE]
[READ SLIDE AND THEN NEXT SLIDE]
[READ SLIDE AND THEN NEXT SLIDE]
[READ SLIDE AND THEN NEXT SLIDE]
“Looking at this graphic of the driving population, I want you to think about where you stand relative to the rest of them in relation to your driving ability, and select your answer”
[READ SLIDE AND THEN NEXT SLIDE]
All right! So how was that gang? Are you feeling confident about your results! YEAH!
OK, so remember, in a standard rational model of economics, we would only take risks because we have favorable odds, and complete information; we would accept some probability of project failure but only because the probability of success outweighs it right.
If we don’t do it that way, we’re using heuristics, mental shortcuts, which opens us up to decision traps. The most frequently discussed that directly relate to Planning are shown here, they’re the biases illustrated in our quiz.
That can lead us down a path where, unbeknownst to ourselves, we’ll make decisions based on our optimism and overconfidence, rather than on the rational weighting of gains, losses, and probabilities. This is called the planning fallacy: a state where we overestimate benefits, chances (neglect base rate), our skills (optimism, causal role of skill, neglect chance), our accuracy (framing, anchoring and adjustment) and we underestimate our losses. We also overlook mistakes and miscalculations because we don’t even know we’ve made them, and when we do realize it, we escalate commitment because we fear losses and want to save face (escalation, sunk costs, regret). All of this leads us to feel like we’re in control (illusion of control).
Exists when “decision makers use reference points for difficult evaluation tasks. Adjustments from this initial position (anchor) can be insufficient”.1 This bias allows us to focus on a small amount of info at a time and make decisions quickly; we pay attention to initial data/info, draw a conclusion, and adjust from there. Think about your last car or refrigerator purchase...did you get a ‘good’ deal? Did you haggle from the sticker price, or from the actual value of the item?
So what’s happening under our hoods here...Well, because I didn’t give you enough time to calculate the full answer, you had to make an estimate after your first few multiplications, your anchor; and it happened automatically! You then extrapolated that initial number to a final estimate. But what if your anchor is incorrect; what happens to all of your adjustments?
For example, your first multiplication effort gave you a small number right. When Kahneman and Tversky conducted this study, the median estimate was 512; what was yours?
Conversely, when you started the second sequence with the larger numbers first, your initial estimate was quite a bit higher right...for the study, the median estimate was 2,250! Again, don’t raise your hands, but what was yours?
THE ACTUAL ANSWER IS 18 TIMES THE SECOND LARGER ESTIMATE! So, please give yourself a point if you came up with the correct answer of 40,320. So, what’s the takeaway? Think for a second about your last project’s ECD? Its budget? The cost of a task? Were your estimates complete, thorough, calculated, or were they provided to you by a stakeholder, i.e. your boss walks into your office and says, Hey! I need project alpha in 4 months.
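For anyone following along in code, here is a sketch of the product task; the ascending and descending sequences are the classic Kahneman and Tversky stimuli, consistent with the 40,320 answer and the median estimates quoted above:

```python
from functools import reduce
from operator import mul

ascending = [1, 2, 3, 4, 5, 6, 7, 8]   # sequence shown first
descending = ascending[::-1]           # sequence shown second

product = reduce(mul, ascending)
# Multiplication is commutative, so both sequences have the same product...
assert product == reduce(mul, descending) == 40320
# ...yet the median 5-second estimates differed: 512 ascending vs. 2,250 descending.
print(f"True product: {product}")
```

The only thing that changed between the two trials was the anchor, the first few partial products; the arithmetic was identical.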
So how do we mitigate this bias’ effect on us?
First, through awareness, which you now have. Look for possible instances of it by asking where’d we get this number? How’d we get this number? Also, employ a disciplined, standardized approach to the planning process so you can sidestep the effects of anchoring and adjusting.
This bias exists when our choice varies with the presentation of information, even though the information is the same.
A project example: framing the same initiative as a 90% chance of success versus a 10% chance of missing.
PUT THIS INTO MY WORDS…
Which programs did you pick for the two questions above? (Hint: Most people pick A and D.)
This question was asked in a famous experiment by Tversky and Kahneman (which led to a Nobel Prize for Kahneman), with 72% of participants choosing option A over B in the “save lives” frame, but only 22% choosing C over D in the “lose lives” frame.
Well, I don’t know about you, but for me these are astonishing results!
Why?
In case you didn’t notice, programs A and C are identical, as are programs B and D. They’re objectively the same — the same number of people live and die, with the same odds — but they’re presented — or framed — in different ways!
If people were to act consistently, it would be expected they would pick either A-C or B-D. But the change in wording alone was enough for people to shift their choices from the first option to the second. Many people chose inconsistently compared with their previous choice.
And that’s how powerful framing is.
No matter how “rational” we think we are, emotions and mental images play a large part in our decisions.
When information is presented to you, remove the framing from it; restate the results from as many different perspectives as possible, being sure to include all possible relevant information. You want to dissect what is really going on. For example, if 7/10 people think John is kind, does that necessarily mean 3/10 don’t? Maybe they just don’t have an opinion. Slow down and remove the frame.
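One way to strip a frame is to compute the expected outcomes directly. This sketch redoes the disease-program quiz from the slides and shows that A and C (and likewise B and D) are numerically identical:

```python
from fractions import Fraction

def expected_deaths(outcomes):
    """Expected deaths over a list of (probability, deaths) outcome pairs."""
    return sum(p * d for p, d in outcomes)

POP = 600
program_a = [(Fraction(1), POP - 200)]                    # "200 will be saved"
program_b = [(Fraction(1, 3), 0), (Fraction(2, 3), POP)]  # "1/3 chance 600 saved"
program_c = [(Fraction(1), 400)]                          # "400 will die"
program_d = [(Fraction(1, 3), 0), (Fraction(2, 3), POP)]  # "1/3 chance nobody dies"

for name, prog in [("A", program_a), ("B", program_b),
                   ("C", program_c), ("D", program_d)]:
    print(f"Program {name}: expected deaths = {expected_deaths(prog)}")
```

All four programs come out to 400 expected deaths; only the wording differs.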
Question 3 represented the sunk cost fallacy, and this is one many of you have probably seen, probably even spotted right away.
I know I did, but I didn’t realize it was a predictable, systematic error in my decision-making! We commit this fallacy when we continue to pour investment into failing projects, reasoning the entire time: I can’t stop now, otherwise what I’ve invested so far will be lost. This is true, of course, but irrelevant to whether we should continue to invest in the project. Everything we’ve invested is lost regardless. If there is no hope for success in the future from the investment, then the fact that one has already lost a bundle should lead one to the conclusion that the rational thing to do is to withdraw from the project.
We do this for several reasons, to save face, to avoid loss, or we’re overly optimistic. Regardless, it’s still irrational to throw good money after bad as they say.
Did you commit the sunk cost fallacy here in question 3? Give yourself a point if you said neither is more likely to drive; they should both stay home, because the rational decision is not to brave the blizzard – it’s not a smart idea. The fact that the money for the ticket is gone is simply that, a fact. It should have no bearing on this decision because there is no way to get the money back.
This bias, loss aversion, exists because losses loom larger than gains in most of our minds.
When we take a gamble, we are trying to win or gain something. If the gamble doesn’t pay off, we are more risk seeking to take another gamble and get the lost resource back. It’s not about winning, or gaining, the second time, it’s about avoiding the loss. We regret the outcome of the prior decision, because we are now worse off in our minds.
And, once we have something, we don’t like to lose it. Ever wonder what a test drive is really for? A 30-day money back guarantee on that newest software/computer?
So what are some ways to avoid it? One is to remove our feelings and bias…READ SLIDE
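A rough way to “remove our feelings” from the coin-toss gamble is to compute its expected value, then broaden the frame by bundling many plays. The simulation below is an illustrative sketch, not something from the talk; the repetition count is arbitrary:

```python
import random

random.seed(42)

def gamble():
    """One coin toss: heads wins $150, tails loses $100."""
    return 150 if random.random() < 0.5 else -100

expected_value = 0.5 * 150 - 0.5 * 100   # +$25 per play, yet many decline
print(f"Single-play expected value: ${expected_value:.0f}")

# Broad framing: bundle 100 independent plays; the chance of a net loss shrinks.
trials = [sum(gamble() for _ in range(100)) for _ in range(10_000)]
losing_runs = sum(1 for total in trials if total < 0)
print(f"Runs of 100 plays ending in a net loss: {losing_runs / 10_000:.1%}")
```

Seen one toss at a time, the possible $100 loss looms large; seen as a portfolio, the same gamble is clearly favorable.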
In question 5, we see our old friend regret again. This is an example of an emotional bias that influences our decision-making. Regret is an emotion arising from the difference between the actual payoff and the payoff that would have been obtained if we had taken a different course of action. Hindsight is always 20/20 right; those were the droids you were looking for.
But here’s the rub, the research shows if we take action, we’re more likely to feel regret about it if it doesn’t work out like we planned! Why? Because WE ‘made’ it happen vs. it happened to us, i.e. ‘it was meant to be’ if we hadn’t acted. The anticipation of regret can cloud our judgment and affect the decision we are about to make, regardless of the rationality of the decision or not.
For example, the correct answer here is Neither, both men are in the same position financially. However, when respondents answer this question, 8% say Paul and 92% say George will feel more regret. Why? Because we would! We took the action that caused the loss. We overestimate our influence and control, and look for causation.
The only difference here is George acted, and Paul didn’t.
In question 6, Steve is an example of base-rate neglect. We make a judgment about something based on its representativeness, does Steve sound like a farmer or not, or does he sound more like a librarian or not?
If we just stop and think about this question for just a second or two longer, we arrive at hmmmmm, there are probably a lot more farms than libraries. It follows that the chances of Steve being a meek, tidy farmer are a lot greater than the chance of him being a male librarian.
This information wasn’t supplied, so it wasn’t considered. We didn’t know the populations of the two groups, shown here, but we know stereotypes, so most of us used our stereotypes of a farmer and a librarian.
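A back-of-the-envelope Bayes calculation makes the base-rate point concrete. Every number below is an assumption chosen for illustration, not data from the slide: suppose male farmers vastly outnumber male librarians, and the “meek and tidy” description fits most librarians but only a few farmers.

```python
# Assumed population sizes and description likelihoods (illustrative only).
farmers, librarians = 2_000_000, 30_000
p_desc_given_farmer = 0.10       # assumed: few farmers fit the sketch
p_desc_given_librarian = 0.80    # assumed: most librarians fit it

# Bayes via relative likelihood mass in each group.
meek_farmers = farmers * p_desc_given_farmer          # 200,000
meek_librarians = librarians * p_desc_given_librarian #  24,000
p_farmer = meek_farmers / (meek_farmers + meek_librarians)
print(f"P(farmer | description) = {p_farmer:.0%}")    # base rates dominate
```

Even with the description strongly favoring librarians, the sheer number of farmers makes “farmer” the better bet.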
Decision making researcher Gilovich said “The regressive fallacy is the failure to take into account natural and inevitable fluctuations of things when ascribing causes to them”. What does that mean, pun intended…? Punishment probably won’t be the stimulus for better performance, and praise in a situation where someone has done exceptionally well sets us up for a letdown…at some point, their performance returns to their average ability/output. So this bias occurs when we forget that averages play into performance and results.
The researchers came across this one by accident. They were studying student pilot performance, and realized the instructors believed their punishment or praise was the cause of increased or decreased performance. They punished, i.e. screaming, shouting, dressing down, a lot more and praised a lot less. They thought the performance hinged on them, not the student’s mean performance.
So, give yourself a point if you said ignore…And, I want to add here, I’m not saying as the project manager you don’t have a role in your team’s and team members’ performance, you do; we just have to think about it in another way. Your job is to help each member raise their ‘average’…if their average is 5 out of 10, practice, train, drill, help them get it to 6 or 7 out of 10.
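Regression to the mean can be demonstrated with a short simulation sketch: a true 50% shooter who just went 0-for-10 still averages about 5 makes out of 10 in her next outing, whatever the coach does. The sample sizes here are arbitrary:

```python
import random

random.seed(7)

def free_throws(p_make, attempts):
    """Makes out of `attempts` shots for a shooter with true skill p_make."""
    return sum(random.random() < p_make for _ in range(attempts))

# A true 50% shooter just missed 10 straight. Simulate 10,000 "next games":
# her results cluster right back around her mean of 5-for-10.
next_games = [free_throws(0.5, 10) for _ in range(10_000)]
average = sum(next_games) / len(next_games)
print(f"Average makes next game: {average:.2f} (true mean is 5.00)")
```

Whatever followed the cold streak, praise, punishment, or silence, would have looked like it “worked.”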
Number 8 is very similar to number 7…it’s the “hot hand fallacy”. It strikes during those exceptional streaks of performance we were just talking about, when we think the streak, or the player, or her mindset, or something – a cause – has something to do with the next independent event. We look for patterns to help us describe causation. However, research has proven this is a flawed belief; a person who has experienced success with a previous event does not have a greater chance of further success in additional attempts – each attempt is independent.
In 1985, the psychologist and mathematician team of Gilovich, Vallone and Tversky disproved this theory, finding that each shot was completely random – and that people have an inability to understand and accept randomness. 91% of the basketball fans polled believed in the hot hand fallacy, and 75% of the 80-81 76ers believed in it. We want there to be cause…
OK, so how do we explain the Michael Jordans or the Tiger Woodses then, Wright? Their means are higher than their peers’. Jordan’s average of made shots is higher than Pippen’s; Woods’ average of birdie putts from 15 feet is higher than Duval’s.
The take-away? Give yourself a point if you selected the correct answer of No.
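The independence claim behind that “No” answer can be checked with a small simulation, assuming each shot is an independent coin flip for a 50% shooter (the shot count is arbitrary):

```python
import random

random.seed(1)

# A 50% shooter with no "hot hand": every shot is an independent coin flip.
shots = [random.random() < 0.5 for _ in range(200_000)]

after_makes, after_misses = [], []
for i in range(3, len(shots)):
    last3 = shots[i - 3:i]
    if all(last3):
        after_makes.append(shots[i])     # shot following three straight makes
    elif not any(last3):
        after_misses.append(shots[i])    # shot following three straight misses

rate_hot = sum(after_makes) / len(after_makes)
rate_cold = sum(after_misses) / len(after_misses)
print(f"P(make | three makes):  {rate_hot:.3f}")
print(f"P(make | three misses): {rate_cold:.3f}")
```

Both conditional rates land near 0.5: streaks appear constantly in independent data, but they carry no predictive power.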
We subconsciously answer easier questions in place of harder questions…the first question is easy…you’re a good driver. However, the second question takes cognitive work right, you have to think about what is an average driver? You have to do that so you can arrive at what is an above average driver? You start to think about that and simply decide that since you know you’re a good driver, which is above average, you’re an above average driver. The only problem though, is like our decision-making opener, we all can’t be above average! It’s not statistically probable!
However, this optimism bias affects a lot of us; neuroscientist Tali Sharot says 80% of us have this bias! And, that’s the average professional respondent. I would suspect this number is higher for us project personnel, it has to be, or we wouldn’t continue doing what we do in the face of such adversity, stress, and failure. It’s higher in entrepreneurs…
Now, this bias doesn’t mean we think everything will turn out OK, we just think we, as opposed to the PM next to us, have the knowledge, skills, and abilities to make it so.
So, for this answer, give yourself a point if you said average.
And here we are at #10, the illusion of control. Here, I used a visual illusion as a metaphor for rationality. I got this experiment from Dan Ariely, and here’s why it’s powerful: think about our vision. We use it ALL THE TIME! It follows we ought to be pretty good at it then with so much practice. However, in this illusion, the lines are the same length! The fins throw most of us off! Give yourself a point if you chose neither line is longer, C.
Now, here’s a deeper point about this illusion, look at it when I take the lines back away…what happens? RIGHT! Most of us see B as longer again, even though you know it’s not true! It’s like we haven’t learned anything in the last minute…The point is that even when we are aware of the bias, it’s still hard to avoid its influence; we believe in it. Our intuition is fooling us in predictable, repeatable ways. And it’s doing it with something we are really good at and practice a lot. Do we practice our decision-making as much?
Another deep point that you’ve probably picked up on intuitively, because I know you’re seeking patterns, is that many of these decision traps are closely related. So, even if you successfully avoid one – you don’t fund that project because you avoided your feelings of regret – but then you DO fund it because of your optimism (“we’ll turn it around”), you’ve still reached an irrational decision because your rationality was trapped; it was bounded.
Now, we make tons of decisions every day, and many of them are right. I don’t want you stopping at green lights to ensure no one is running their red light right. We make the assumption they’ll stop, and we get on with it. But, what we do need to realize is that when we are making decisions that count, that will ground a plan, or set the trajectory or value of something, we need to slow down. We need to ask ourselves how we are feeling and thinking, and we need to use some tools to combat the effects of these traps. If we are aware, and we simply question our assumptions and valuations and thoughts and feelings about the decision, we reduce the impact of our biases and emotional thinking on our decision-making, which can help us make better project decisions more often.
So what was your score? READ SLIDE…
The message: these decision traps are Real, Latent, and Persistent! Even after you know they’re there, you still have to consciously check for them vs. intuit them.
Additionally, they’re predictable, so we can realize they’re there, which means we can avoid them or reduce their impact with just a bit of effort.
So that’s the challenge I leave you with today; the real test…can you go out, slow down when it’s warranted because you realize you need to be deliberate instead of intuitive, and because you are aware of these decision traps, and consciously make better project decisions more often when it counts?
Thank you!
With that, here is my contact information, and what are your questions Colleagues?