How much information is
needed to make better
decisions?
WHAT’S THE TRUTH HERE?
There are so many views on the amount and type of information required to make a good
decision. Conventional wisdom tells us we need as much information as we can get, and
that we should keep our options open for as long as possible. But what about ‘analysis
paralysis’? What part does intuition play? And how long should we wait before finally
making that important decision?
With the surge in research into behavioural economics, we now know that there are
several myths that prevail around decision making. These are worth exploring in order to
understand the answer to some of these questions.
SOME COMMON MYTHS OF DECISION MAKING
I have highlighted five relatively common misconceptions about information and decision
making. They address the areas of:
- how much information is needed
- who makes the decision
- how the decision is made.
In each case, we discuss conventional wisdom, and then explore some of the behavioural
implications.
© Norman Chorn 2020 • norman.chorn@brainlinkgroup.com • (0416) 239 824 • Page 1
Dr Norman Chorn
Myth 1: It is best to collect as much information as possible and keep
options open: Research shows that large amounts of information and multiple options
cause confusion, loss of focus and inefficiency in cognitive functioning. Furthermore, we
know that the resultant confusion creates stress and an inability to consider all the
information in an optimal way.
Even if we score a list of weighted criteria against each option, we are left with a sense of
unease that the decision is sub-optimal.
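To make the scoring approach concrete, here is a minimal sketch of scoring a list of weighted criteria against each option. The criteria, weights and ratings are hypothetical, purely for illustration:

```python
# Hypothetical decision criteria with weights that sum to 1.0
criteria_weights = {"cost": 0.5, "speed": 0.3, "risk": 0.2}

# Each option is rated 1-10 against every criterion
options = {
    "Option A": {"cost": 7, "speed": 5, "risk": 8},
    "Option B": {"cost": 5, "speed": 9, "risk": 6},
}

def weighted_score(ratings, weights):
    """Combine the ratings into one score: sum of rating * weight."""
    return sum(ratings[c] * w for c, w in weights.items())

scores = {name: weighted_score(r, criteria_weights) for name, r in options.items()}
best = max(scores, key=scores.get)   # the highest-scoring option
```

Even with a tidy score like this (here Option A edges out Option B, 6.6 to 6.4), the point of Myth 1 stands: the number alone rarely removes the sense that the decision may be sub-optimal.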
Myth 2: People make irrational decisions by not considering all the
information: When looking back at previous decisions made by others, we are often
surprised by the ‘irrationality’ of these decisions, and the fact that key information was
overlooked. However, we know that most people’s rationality is ‘bounded’ by the
information to which they have access.
These ‘boundaries’ are defined by their role and situation, since we all have to be selective
in the information we are able to consider. In this sense, most decisions ARE rational in
terms of what information has been considered.
Myth 3: Experts make the best decisions: This seemingly logical conclusion was
overturned during the early stages of behavioural research into decision making.
Startlingly, evidence suggests that while experts may be good at establishing the criteria
against which decisions can be made in complex situations, algorithms and artificial
intelligence (AI) are far more accurate and efficient in making decisions when using these
criteria.
Experts often tend to ignore their own criteria, or apply them inconsistently, when making
complex decisions.
Myth 4: It is often best to rely on intuition when situations are complex and
uncertain: Malcolm Gladwell’s ‘Blink’ made popular the notion that ‘thin slices’ of
information can be non-consciously processed at lightning speed by so-called intuition,
particularly where situations are complex and uncertain. This process uses pattern
recognition and association to draw conclusions far more quickly than conscious cognitive
processes.
Despite the popularity of this belief, we need to remember that the use of intuition is
ineffective in situations in which the decision maker has little experience, or when the
‘system’ is not subject to regular patterns of behaviour. Furthermore, stress, heightened
emotions and arousal will render intuition less reliable or accurate.
Myth 5: People use past evidence to judge the probability of a future event
occurring: Many people rely on the rather naive notion of the ‘law of averages’ - ie that if
heads repeatedly comes up in a coin-toss, tails must follow, since the two sides of the coin
should be equally represented in the long run. This ‘law’ is not really helpful when judging
the probability of random events, since future events have no ‘memory’ of past events.
Notwithstanding this, people regularly rely on what is known as ‘frequentist’ (classical)
statistics when judging the probability of a future event (“lightning never strikes twice in
the same place”). Instead, we know that the probability of a future event changes as we
gather new data or evidence about that event. This is the basis of ‘Bayesian’ statistics, an
approach in statistics used to modify probability in the light of recent events and evidence.
And behavioural research demonstrates that we usually ignore this important process - ie
we fail to update our beliefs about future probability on the basis of recent events. (We are
NOT Bayesians in the way we make decisions about the probability of future events).
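The coin-toss point can be checked with a short simulation: tosses have no ‘memory’, so heads remains just as likely after a run of heads. This is a sketch only; the streak length, trial count and seed are arbitrary choices:

```python
import random

rng = random.Random(0)  # seeded so the simulation is repeatable

def next_after_streak(streak=3, trials=200_000):
    """Empirical P(heads) on the toss immediately after `streak` heads.
    The 'law of averages' would predict this is well below 0.5."""
    heads_after, occasions = 0, 0
    run = 0  # length of the current run of heads
    for _ in range(trials):
        toss = rng.random() < 0.5        # True = heads
        if run >= streak:                # we just saw `streak` heads in a row
            occasions += 1
            heads_after += toss
        run = run + 1 if toss else 0
    return heads_after / occasions
```

Running this gives a value very close to 0.5: tails is never ‘due’, no matter how many heads precede it.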
SO, HOW CAN WE IMPROVE OUR DECISION MAKING?
Four approaches are outlined to address these myths - and make a difference to the
decision-making process in times of uncertainty.
They relate to:
1. How to know when you have collected sufficient information to make a decision
2. How to orientate yourself in a decision-making situation
3. How to update the probability of an event occurring, based on experience or recent
information
4. How and when to trust your intuition in complex and uncertain situations.
1. Know the ‘optimal stopping point’ (in data collection)
Scylla and Charybdis were mythical sea monsters in ancient Greek mythology. They
reputedly guarded opposite sides of the Strait of Messina between Sicily and Calabria.
Navigating the strait was likened to being caught in the horns of a dilemma - sailors had to
choose between two equally dangerous extremes in order to survive.
This is the challenge often faced when choosing between collecting more information
(leaving no stone unturned) or making a decision too late (the one that got away). How do
you find the appropriate balance between ‘look’ vs ‘leap’?
Research in mathematics and statistics has established that there is an ‘optimal stopping
point’ - the stage during information collection at which it becomes optimal to make a
decision [1]. The answer is 37%. This is where you have optimised the amount of
information collected against the time taken to make a key decision [2].
So, if there are a number of cases to study or people to interview before making a decision,
you can stop collecting once you have studied or interviewed 37% of them.
[1] Chow, Y.S.; Robbins, H.; Siegmund, D. (1971). Great Expectations: The Theory of Optimal Stopping. Boston: Houghton Mifflin.
[2] Ferguson, Thomas S. (2007). Optimal Stopping and Applications. UCLA.
Use what you have seen in that first 37% as a benchmark, and select the first subsequent
candidate or option that beats it. This represents the best balance between leaving no
stone unturned (collecting the necessary information) and making the decision too late
(the one that got away) [3].
2. Orientate yourself in the situation
Colonel John Boyd [4], an American air force officer, developed the ‘OODA’ acronym to
describe the four-stage process for making good decisions in difficult and uncertain
situations. Observe, orientate, decide and act is the somewhat obvious process referred
to here.
However, it is the ‘orientate’ stage in which we are interested, since this is the stage
where the decision-maker seeks to check on their confirmation bias - ie: are you simply
seeing what you want to see? The goal during the orientation phase is to find potential
mismatches between what you have observed, and the judgement you may have made.
How do you overcome confirmation bias during the ‘orientate’ phase?
- Recognise that simply being aware of your bias is not sufficient to deal with it - but it
is a good starting point
- Try not to jump to conclusions too readily - take in all the information
- Embrace and welcome the surprise caused by the mismatch between what you’ve
observed and your ‘usual’ diagnosis - use it to rethink your initial hypothesis
- Seek to prove yourself wrong - why was my initial hypothesis incorrect?
Recognising any mismatch between your observation and the judgement you have made
is the critical focus in the ‘orientate’ phase. When this occurs, it is essential to schedule
time for reflection and to consider the points above.
3. Rethink the probability of a future event
Thomas Bayes - an English statistician and theologian - pioneered the approach that
allows us to update our views on the probability of an event based on recent experience
and new evidence. This is in contrast to classical statistics [5], which assumes that the
probability of a future event is based on the overall average frequency of that event in the
long run - eg: “this type of flood occurs once in a hundred years”. So classical statistics
would suggest a low probability for the same flood occurring two years in a row.
Classical statistics is strongly underpinned by conservatism bias - the tendency we have to
insufficiently revise our beliefs in the light of new evidence. We generally overweight our
[3] Hill, Theodore P. (2009). “Knowing When to Stop”. American Scientist. 97: 126–133.
[4] Boyd, John R. (3 September 1976). Destruction and Creation. U.S. Army Command and General Staff College.
[5] Everitt, B.S. (2002). The Cambridge Dictionary of Statistics. Cambridge University Press.
understanding of the ‘average frequency’ of the event, and underweight any new
information or evidence [6].
Bayesian statistics, on the other hand, recognises that the probability of a future event
should be updated by new data or evidence about that event. It would appear, therefore,
that the Bayesian approach is likely to give us a better understanding of the probability of
random events, particularly in uncertain situations.
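A minimal sketch of such a Bayesian update, using the flood example with made-up numbers: a Beta prior encodes the ‘once in a hundred years’ belief, and is then updated after observing floods in two consecutive years:

```python
def beta_posterior_mean(prior_a, prior_b, floods, no_floods):
    """Mean of the Beta posterior after observing new flood data.
    The Beta(a, b) prior is the standard conjugate prior for a
    yearly flood/no-flood probability; its mean is a / (a + b)."""
    a = prior_a + floods
    b = prior_b + no_floods
    return a / (a + b)

# Prior belief: roughly 1-in-100 years, i.e. Beta(1, 99) with mean 0.01
prior_mean = beta_posterior_mean(1, 99, 0, 0)

# After floods in each of the last two years, the belief shifts
posterior_mean = beta_posterior_mean(1, 99, 2, 0)   # 3/102, nearly triple
```

The classical ‘once in a hundred years’ estimate stays at 1% regardless of recent events; the Bayesian estimate nearly triples after two flood years, which is exactly the updating that conservatism bias prevents us from doing.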
Sadly, as mentioned above, human decision making does not adopt a complete Bayesian
approach due to the conservatism bias and the so-called anchoring effect. Humans are
slow to adjust their decision making about future events in the light of new data or
evidence [7].
How can we overcome this natural tendency to place a low value on new data and
evidence?
- Recognise that your previously held theories may be similar to ‘sunk costs’. It’s not
important what you have spent or invested in developing them - it’s more important to
think of what it may cost to hold onto them
- ‘Rent’ rather than ‘buy’. The point here is that you could benefit by holding your
current ideas lightly (renting), rather than ‘owning’ them (buying). The cost of
ownership is high, as you have much invested in proving them correct
- Be wary of overvaluing ideas or information that is easy to understand. Sometimes
the difficult and complex information may be more valuable
- Use a variety of ‘trip wires’ to remind you that you may be overvaluing certain
information. This includes the use of checklists (to consider all the information) and
algorithms that apply sound principles consistently
- Make time to re-think and re-test your base assumptions and theories at regular
intervals - particularly if the recent data and evidence does not support these.
4. Trust your intuition in specific situations
Complex problems are very different to complicated problems. In a complicated problem,
there is a recognised ‘best practice’ solution, and this is achieved by way of a logical
reductionist process (ie: we work through the problem methodically by analysing the
various elements).
Complex problems, on the other hand, are characterised by multiple viewpoints - often in
conflict with one another. Furthermore, there is no recognised ‘best practice’ solution, and
the problem has to be solved by studying the whole system and allowing the solution to
[6] Edwards, Ward. “Conservatism in Human Information Processing (excerpted)”. In Kahneman, D., Slovic, P. and Tversky, A. (eds) (1982). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.
[7] Kahneman, D., Slovic, P. and Tversky, A. (1982). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.
‘emerge’. It is clear why an intuitive approach, which relies only on ‘thin slices’ of
information, might be attractive in these instances.
However, as we know, intuition can often fool us, given that its reliability can be affected by
anchoring, priming and stress. So when can we use this powerful tool that is made
available through the immense capacity and speed of the brain’s non-conscious
functioning? Research reveals three situations where this can be used [8]:
- Where you are an expert in a given situation - and have been recognised as such by
others (eg: an international chess master)
- Where the situation in which you are operating is subject to regular patterns (eg: the
development of software)
- Where you are able to get immediate feedback as to the correctness of your decision
(eg: bomb disposal).
Admittedly, this is a limited set of conditions under which you can trust your intuition. But it
seems sensible if we consider that intuition is based on a fast recollection of previous
patterns you have experienced, and that these can only be trusted if you have
experienced regular success under these conditions, and the conditions are relevant to
your current situation.
________________________________________________________________________
Dr Norman Chorn is a strategist and organisation development practitioner with the
BrainLink Group. He uses principles of neuroscience to address the challenges of
developing strategy in a complex and uncertain environment. His particular areas of
focus are strategy in conditions of uncertainty; organisational and cultural alignment;
and strategic leadership.
[8] Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

More Related Content

Similar to Busting the myths of decision making

Week 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docx
Week 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docxWeek 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docx
Week 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docx
cockekeshia
 
Lesson Eight Moral Development and Moral IntensityLesson Seve.docx
Lesson Eight Moral Development and Moral IntensityLesson Seve.docxLesson Eight Moral Development and Moral IntensityLesson Seve.docx
Lesson Eight Moral Development and Moral IntensityLesson Seve.docx
smile790243
 
Summary Perception and Individual Decision Making
Summary Perception and Individual Decision MakingSummary Perception and Individual Decision Making
Summary Perception and Individual Decision Making
Deni Triyanto
 
Risk Taking To Get Ahead
Risk Taking To Get AheadRisk Taking To Get Ahead
Risk Taking To Get Ahead
Chelse Benham
 
Risk taking to get ahead
Risk taking to get aheadRisk taking to get ahead
Risk taking to get aheadChelse Benham
 
PBH.815 Week 2 Lecture
PBH.815 Week 2 LecturePBH.815 Week 2 Lecture
PBH.815 Week 2 Lecture
Gina Crosley-Corcoran
 
The Hidden Traps in Decision Making
The Hidden Traps in Decision MakingThe Hidden Traps in Decision Making
The Hidden Traps in Decision Making
Ankit Saxena
 
Electronic Health Record Privacy - CIO's Perspective
Electronic Health Record Privacy - CIO's PerspectiveElectronic Health Record Privacy - CIO's Perspective
Electronic Health Record Privacy - CIO's Perspective
Health Informatics New Zealand
 
Cognitive bias
Cognitive bias Cognitive bias
Cognitive bias
Akshay Kumar
 
GE372 Week Two
GE372 Week TwoGE372 Week Two
GE372 Week TwoComp Class
 
Dodgy argumentsextendeda4
Dodgy argumentsextendeda4Dodgy argumentsextendeda4
Dodgy argumentsextendeda4Junji Kai
 
COGNITIVE PSYCHOLOGY.pptx
COGNITIVE PSYCHOLOGY.pptxCOGNITIVE PSYCHOLOGY.pptx
COGNITIVE PSYCHOLOGY.pptx
CherryjoyOlaso
 
How Risky is it, Really?
How Risky is it, Really?How Risky is it, Really?
How Risky is it, Really?
Business Book Summaries
 
Applied bayesian statistics
Applied bayesian statisticsApplied bayesian statistics
Applied bayesian statisticsSpringer
 
Dean R Berry Evaluating Conspiracy Theories
Dean R Berry Evaluating Conspiracy Theories Dean R Berry Evaluating Conspiracy Theories
Dean R Berry Evaluating Conspiracy Theories
Riverside County Office of Education
 
Cognitive Biases in Data Interpretation-2
Cognitive Biases in Data Interpretation-2Cognitive Biases in Data Interpretation-2
Cognitive Biases in Data Interpretation-2Vijay Kotu
 
Management of Organizations-Part I
Management of Organizations-Part IManagement of Organizations-Part I
Management of Organizations-Part I
Ramnath Srinivasan
 
Learning session 2nd
Learning session 2ndLearning session 2nd
Learning session 2nd
gauravsharmamba
 
The Failure of Skepticism: Rethinking Information Literacy and Political Pol...
 The Failure of Skepticism: Rethinking Information Literacy and Political Pol... The Failure of Skepticism: Rethinking Information Literacy and Political Pol...
The Failure of Skepticism: Rethinking Information Literacy and Political Pol...
Chris Sweet
 

Similar to Busting the myths of decision making (20)

Week 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docx
Week 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docxWeek 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docx
Week 3 - Instructor GuidanceWeek 3 Inductive ReasoningThis we.docx
 
Lesson Eight Moral Development and Moral IntensityLesson Seve.docx
Lesson Eight Moral Development and Moral IntensityLesson Seve.docxLesson Eight Moral Development and Moral IntensityLesson Seve.docx
Lesson Eight Moral Development and Moral IntensityLesson Seve.docx
 
Summary Perception and Individual Decision Making
Summary Perception and Individual Decision MakingSummary Perception and Individual Decision Making
Summary Perception and Individual Decision Making
 
Risk Taking To Get Ahead
Risk Taking To Get AheadRisk Taking To Get Ahead
Risk Taking To Get Ahead
 
Risk taking to get ahead
Risk taking to get aheadRisk taking to get ahead
Risk taking to get ahead
 
PBH.815 Week 2 Lecture
PBH.815 Week 2 LecturePBH.815 Week 2 Lecture
PBH.815 Week 2 Lecture
 
The Hidden Traps in Decision Making
The Hidden Traps in Decision MakingThe Hidden Traps in Decision Making
The Hidden Traps in Decision Making
 
Electronic Health Record Privacy - CIO's Perspective
Electronic Health Record Privacy - CIO's PerspectiveElectronic Health Record Privacy - CIO's Perspective
Electronic Health Record Privacy - CIO's Perspective
 
Cognitive bias
Cognitive bias Cognitive bias
Cognitive bias
 
GE372 Week Two
GE372 Week TwoGE372 Week Two
GE372 Week Two
 
Biases april 2012
Biases april 2012Biases april 2012
Biases april 2012
 
Dodgy argumentsextendeda4
Dodgy argumentsextendeda4Dodgy argumentsextendeda4
Dodgy argumentsextendeda4
 
COGNITIVE PSYCHOLOGY.pptx
COGNITIVE PSYCHOLOGY.pptxCOGNITIVE PSYCHOLOGY.pptx
COGNITIVE PSYCHOLOGY.pptx
 
How Risky is it, Really?
How Risky is it, Really?How Risky is it, Really?
How Risky is it, Really?
 
Applied bayesian statistics
Applied bayesian statisticsApplied bayesian statistics
Applied bayesian statistics
 
Dean R Berry Evaluating Conspiracy Theories
Dean R Berry Evaluating Conspiracy Theories Dean R Berry Evaluating Conspiracy Theories
Dean R Berry Evaluating Conspiracy Theories
 
Cognitive Biases in Data Interpretation-2
Cognitive Biases in Data Interpretation-2Cognitive Biases in Data Interpretation-2
Cognitive Biases in Data Interpretation-2
 
Management of Organizations-Part I
Management of Organizations-Part IManagement of Organizations-Part I
Management of Organizations-Part I
 
Learning session 2nd
Learning session 2ndLearning session 2nd
Learning session 2nd
 
The Failure of Skepticism: Rethinking Information Literacy and Political Pol...
 The Failure of Skepticism: Rethinking Information Literacy and Political Pol... The Failure of Skepticism: Rethinking Information Literacy and Political Pol...
The Failure of Skepticism: Rethinking Information Literacy and Political Pol...
 

More from The BrainLink Group

Dr Norman Chorn profile 2021
Dr Norman Chorn profile 2021Dr Norman Chorn profile 2021
Dr Norman Chorn profile 2021
The BrainLink Group
 
Separating Rhinos from Swans - resilience might be the key
Separating Rhinos from Swans - resilience might be the keySeparating Rhinos from Swans - resilience might be the key
Separating Rhinos from Swans - resilience might be the key
The BrainLink Group
 
2020 is the year to challenge the NORM
2020 is the year to challenge the NORM2020 is the year to challenge the NORM
2020 is the year to challenge the NORM
The BrainLink Group
 
Does culture really eat strategy?
Does culture really eat strategy?Does culture really eat strategy?
Does culture really eat strategy?
The BrainLink Group
 
Thinking under pressure
Thinking under pressureThinking under pressure
Thinking under pressure
The BrainLink Group
 
Strategic accretion
Strategic accretionStrategic accretion
Strategic accretion
The BrainLink Group
 
Boost your strategic thinking
Boost your strategic thinkingBoost your strategic thinking
Boost your strategic thinking
The BrainLink Group
 
Strategy is alive and well... and living in UNCERTAINTY
Strategy is alive and well... and living in UNCERTAINTYStrategy is alive and well... and living in UNCERTAINTY
Strategy is alive and well... and living in UNCERTAINTY
The BrainLink Group
 
Listen to your leadership metronome
Listen to your leadership metronomeListen to your leadership metronome
Listen to your leadership metronome
The BrainLink Group
 
Norman Chorn profile
Norman Chorn profileNorman Chorn profile
Norman Chorn profile
The BrainLink Group
 
This is the year to challenge the norm
This is the year to challenge the normThis is the year to challenge the norm
This is the year to challenge the norm
The BrainLink Group
 
We're mindful - why isn't our organisation?
We're mindful - why isn't our organisation?We're mindful - why isn't our organisation?
We're mindful - why isn't our organisation?
The BrainLink Group
 
Appendix Neuroscience of mindfulness
Appendix   Neuroscience of mindfulnessAppendix   Neuroscience of mindfulness
Appendix Neuroscience of mindfulness
The BrainLink Group
 
Action is the enemy of thought
Action is the enemy of thoughtAction is the enemy of thought
Action is the enemy of thought
The BrainLink Group
 
Why can't my people be more strategic?
Why can't my people be more strategic? Why can't my people be more strategic?
Why can't my people be more strategic?
The BrainLink Group
 
Brain new world invitation
Brain new world invitationBrain new world invitation
Brain new world invitation
The BrainLink Group
 
Our brain new world - organisations and their development
Our brain new world - organisations and their developmentOur brain new world - organisations and their development
Our brain new world - organisations and their development
The BrainLink Group
 
The brain new world - insights for organisations and strategy
The brain new world - insights for organisations and strategyThe brain new world - insights for organisations and strategy
The brain new world - insights for organisations and strategy
The BrainLink Group
 
Napoleon was a neuroscientist
Napoleon was a neuroscientistNapoleon was a neuroscientist
Napoleon was a neuroscientist
The BrainLink Group
 
Want to compete like napoleon - think neurostrategy
Want to compete like napoleon -  think neurostrategy Want to compete like napoleon -  think neurostrategy
Want to compete like napoleon - think neurostrategy
The BrainLink Group
 

More from The BrainLink Group (20)

Dr Norman Chorn profile 2021
Dr Norman Chorn profile 2021Dr Norman Chorn profile 2021
Dr Norman Chorn profile 2021
 
Separating Rhinos from Swans - resilience might be the key
Separating Rhinos from Swans - resilience might be the keySeparating Rhinos from Swans - resilience might be the key
Separating Rhinos from Swans - resilience might be the key
 
2020 is the year to challenge the NORM
2020 is the year to challenge the NORM2020 is the year to challenge the NORM
2020 is the year to challenge the NORM
 
Does culture really eat strategy?
Does culture really eat strategy?Does culture really eat strategy?
Does culture really eat strategy?
 
Thinking under pressure
Thinking under pressureThinking under pressure
Thinking under pressure
 
Strategic accretion
Strategic accretionStrategic accretion
Strategic accretion
 
Boost your strategic thinking
Boost your strategic thinkingBoost your strategic thinking
Boost your strategic thinking
 
Strategy is alive and well... and living in UNCERTAINTY
Strategy is alive and well... and living in UNCERTAINTYStrategy is alive and well... and living in UNCERTAINTY
Strategy is alive and well... and living in UNCERTAINTY
 
Listen to your leadership metronome
Listen to your leadership metronomeListen to your leadership metronome
Listen to your leadership metronome
 
Norman Chorn profile
Norman Chorn profileNorman Chorn profile
Norman Chorn profile
 
This is the year to challenge the norm
This is the year to challenge the normThis is the year to challenge the norm
This is the year to challenge the norm
 
We're mindful - why isn't our organisation?
We're mindful - why isn't our organisation?We're mindful - why isn't our organisation?
We're mindful - why isn't our organisation?
 
Appendix Neuroscience of mindfulness
Appendix   Neuroscience of mindfulnessAppendix   Neuroscience of mindfulness
Appendix Neuroscience of mindfulness
 
Action is the enemy of thought
Action is the enemy of thoughtAction is the enemy of thought
Action is the enemy of thought
 
Why can't my people be more strategic?
Why can't my people be more strategic? Why can't my people be more strategic?
Why can't my people be more strategic?
 
Brain new world invitation
Brain new world invitationBrain new world invitation
Brain new world invitation
 
Our brain new world - organisations and their development
Our brain new world - organisations and their developmentOur brain new world - organisations and their development
Our brain new world - organisations and their development
 
The brain new world - insights for organisations and strategy
The brain new world - insights for organisations and strategyThe brain new world - insights for organisations and strategy
The brain new world - insights for organisations and strategy
 
Napoleon was a neuroscientist
Napoleon was a neuroscientistNapoleon was a neuroscientist
Napoleon was a neuroscientist
 
Want to compete like napoleon - think neurostrategy
Want to compete like napoleon -  think neurostrategy Want to compete like napoleon -  think neurostrategy
Want to compete like napoleon - think neurostrategy
 

Recently uploaded

Case Analysis - The Sky is the Limit | Principles of Management
Case Analysis - The Sky is the Limit | Principles of ManagementCase Analysis - The Sky is the Limit | Principles of Management
Case Analysis - The Sky is the Limit | Principles of Management
A. F. M. Rubayat-Ul Jannat
 
SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....
SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....
SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....
juniourjohnstone
 
Senior Project and Engineering Leader Jim Smith.pdf
Senior Project and Engineering Leader Jim Smith.pdfSenior Project and Engineering Leader Jim Smith.pdf
Senior Project and Engineering Leader Jim Smith.pdf
Jim Smith
 
TCS AI for Business Study – Key Findings
TCS AI for Business Study – Key FindingsTCS AI for Business Study – Key Findings
TCS AI for Business Study – Key Findings
Tata Consultancy Services
 
Leadership Ethics and Change, Purpose to Impact Plan
Leadership Ethics and Change, Purpose to Impact PlanLeadership Ethics and Change, Purpose to Impact Plan
Leadership Ethics and Change, Purpose to Impact Plan
Muhammad Adil Jamil
 
Founder-Game Director Workshop (Session 1)
Founder-Game Director  Workshop (Session 1)Founder-Game Director  Workshop (Session 1)
Founder-Game Director Workshop (Session 1)
Amir H. Fassihi
 
一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证
一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证
一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证
gcljeuzdu
 
Training- integrated management system (iso)
Training- integrated management system (iso)Training- integrated management system (iso)
Training- integrated management system (iso)
akaash13
 
Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...
Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...
Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...
CIOWomenMagazine
 
W.H.Bender Quote 65 - The Team Member and Guest Experience
W.H.Bender Quote 65 - The Team Member and Guest ExperienceW.H.Bender Quote 65 - The Team Member and Guest Experience
W.H.Bender Quote 65 - The Team Member and Guest Experience
William (Bill) H. Bender, FCSI
 

Recently uploaded (10)

Case Analysis - The Sky is the Limit | Principles of Management
Case Analysis - The Sky is the Limit | Principles of ManagementCase Analysis - The Sky is the Limit | Principles of Management
Case Analysis - The Sky is the Limit | Principles of Management
 
SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....
SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....
SOCIO-ANTHROPOLOGY FACULTY OF NURSING.....
 
Senior Project and Engineering Leader Jim Smith.pdf
Senior Project and Engineering Leader Jim Smith.pdfSenior Project and Engineering Leader Jim Smith.pdf
Senior Project and Engineering Leader Jim Smith.pdf
 
TCS AI for Business Study – Key Findings
TCS AI for Business Study – Key FindingsTCS AI for Business Study – Key Findings
TCS AI for Business Study – Key Findings
 
Leadership Ethics and Change, Purpose to Impact Plan
Leadership Ethics and Change, Purpose to Impact PlanLeadership Ethics and Change, Purpose to Impact Plan
Leadership Ethics and Change, Purpose to Impact Plan
 
Founder-Game Director Workshop (Session 1)
Founder-Game Director  Workshop (Session 1)Founder-Game Director  Workshop (Session 1)
Founder-Game Director Workshop (Session 1)
 
一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证
一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证
一比一原版杜克大学毕业证(Duke毕业证)成绩单留信认证
 
Training- integrated management system (iso)
Training- integrated management system (iso)Training- integrated management system (iso)
Training- integrated management system (iso)
 
Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...
Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...
Oprah Winfrey: A Leader in Media, Philanthropy, and Empowerment | CIO Women M...
 
W.H.Bender Quote 65 - The Team Member and Guest Experience
W.H.Bender Quote 65 - The Team Member and Guest ExperienceW.H.Bender Quote 65 - The Team Member and Guest Experience
W.H.Bender Quote 65 - The Team Member and Guest Experience
 

Busting the myths of decision making

  • 1. How much information is needed to make better decisions? WHAT’S THE TRUTH HERE? There are so many views on the amount and type of information required to make a good decision. Conventional wisdom tells us we need as much information as we can get, and that we should keep our options open for as long as possible. But, what about ‘analysis paralysis’? What part does intuition play? And how long should we wait before finally making that important decision? With the surge in research into behavioural economics, we now know that there are several myths that prevail around decision making. These are worth exploring in order to understand the answer to some of these questions. SOME COMMON MYTHS OF DECISION MAKING I have highlighted five relatively common misconceptions about information and decision making. They address the areas of: - how much information is needed - who makes the decision - how the decision is made. In each case, we discuss conventional wisdom, and then explore some of the behavioural implications. © Norman Chorn 2020 • norman.chorn@brainlinkgroup.com • (0416) 239 824 • Page 1 Dr Norman Chorn
These ‘boundaries’ are defined by their role and situation, since we all have to be selective in the information we are able to consider. In this sense, most decisions ARE rational in terms of what information has been considered.

Myth 3: Experts make the best decisions: This seemingly logical conclusion was overturned during the early stages of behavioural research into decision making. Startlingly, the evidence suggests that while experts may be good at establishing the criteria against which decisions can be made in complex situations, algorithms and artificial intelligence (AI) are far more accurate and efficient in making decisions using those criteria. Experts often tend to ignore their own criteria, or apply them inconsistently, when making complex decisions.

Myth 4: It is often best to rely on intuition when situations are complex and uncertain: Malcolm Gladwell’s ‘Blink’ popularised the notion that ‘thin slices’ of information can be processed non-consciously at lightning speed by so-called intuition, particularly where situations are complex and uncertain.
This process uses pattern recognition and association to draw conclusions far more quickly than conscious cognitive processes. Despite the popularity of this belief, we need to remember that intuition is ineffective in situations where the decision maker has little experience, or where the ‘system’ is not subject to regular patterns of behaviour. Furthermore, stress, heightened emotions and arousal render intuition less reliable and less accurate.

Myth 5: People use past evidence to judge the probability of a future event occurring: Many people rely on the rather naive notion of the ‘law of averages’ - ie that if heads repeatedly comes up in a coin toss, tails must follow, since the two sides of the coin should be equally represented in the long run. This ‘law’ is not really helpful when judging the probability of random events, since future events have no ‘memory’ of past events. Notwithstanding this, people regularly rely on what is known as ‘frequentist’ (classical) statistics when judging the probability of a future event (“lightning never strikes twice in the same place”).

Instead, we know that the probability of a future event changes as we gather new data or evidence about that event. This is the basis of ‘Bayesian’ statistics, an approach used to modify probability in the light of recent events and evidence. And behavioural research demonstrates that we usually ignore this important process - ie we fail to update our beliefs about future probability on the basis of recent events. (We are NOT Bayesians in the way we make decisions about the probability of future events.)

SO, HOW CAN WE IMPROVE OUR DECISION MAKING?

Four approaches are outlined to address these myths - and make a difference to the decision-making process in times of uncertainty. They relate to:
1. How to know when you have collected sufficient information to make a decision
2. How to orientate yourself in a decision-making situation
3. How to update the probability of an event occurring, based on experience or recent information
4. How and when to trust your intuition in complex and uncertain situations.

1. Know the ‘optimal stopping point’ (in data collection)

Scylla and Charybdis were sea monsters in ancient Greek mythology. They reputedly guarded opposite sides of the Strait of Messina between Sicily and Calabria, and navigating the strait was likened to being caught on the horns of a dilemma - sailors had to choose between two equally dangerous extremes in order to survive. This is the challenge often faced when choosing between collecting more information (leaving no stone unturned) and deciding too late (the one that got away).
How do you find the appropriate balance between ‘look’ and ‘leap’? Research in mathematics and statistics has established that there is an ‘optimal stopping point’ - the stage during information collection at which it becomes optimal to make a decision [1]. The answer is 37%. This is the point at which you have optimised the amount of information collected against the time taken to make a key decision [2].

So, if there are a number of cases to study or people to interview before making a decision, you can make your decision based on 37% of the cases studied or people interviewed.

[1] Chow, Y.S.; Robbins, H.; Siegmund, D. (1971). Great Expectations: The Theory of Optimal Stopping. Boston: Houghton Mifflin.
[2] Ferguson, Thomas S. (2007). Optimal Stopping and Applications. UCLA.
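The 37% rule comes from the classic ‘secretary problem’: observe the first 37% of candidates without committing, then choose the first later candidate who beats everyone seen so far. A quick simulation (a sketch in Python; candidate scores are random numbers, and the function names are my own) shows how often this strategy finds the overall best:

```python
import random

def best_found(n, stop_fraction):
    """Observe the first stop_fraction of candidates, then pick the first
    later candidate who beats all of them. Return True if that pick is
    the overall best candidate."""
    candidates = [random.random() for _ in range(n)]
    cutoff = int(n * stop_fraction)
    benchmark = max(candidates[:cutoff]) if cutoff else float("-inf")
    for score in candidates[cutoff:]:
        if score > benchmark:
            return score == max(candidates)
    return candidates[-1] == max(candidates)  # forced to take the last one

def success_rate(n=100, stop_fraction=0.37, trials=20000):
    """Estimate the probability that stopping at stop_fraction wins."""
    return sum(best_found(n, stop_fraction) for _ in range(trials)) / trials

print(f"success rate: {success_rate():.3f}")  # theory predicts roughly 0.37
```

Curiously, the same 37% appears twice: look at 37% of the options, and you end up picking the single best option about 37% of the time - far better than the 1% a random pick would achieve over 100 candidates.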
You can select the best candidate based on what you have seen at that point. This represents the best balance between leaving no stone unturned (collecting the necessary information) and making the decision too late (the one that got away) [3].

[3] Hill, Theodore P. (2009). “Knowing When to Stop”. American Scientist. 97: 126–133.

2. Orientate yourself in the situation

Colonel John Boyd [4], an American air force officer, developed the ‘OODA’ acronym to describe a four-stage process for making good decisions in difficult and uncertain situations. Observe, orientate, decide and act is the somewhat obvious process referred to here. However, it is the ‘orientate’ stage in which we are interested, since this is the stage where the decision maker checks their confirmation bias - ie: are you simply seeing what you want to see? The goal during the orientation phase is to find potential mismatches between what you have observed and the judgement you may have made.

How do you overcome confirmation bias during the ‘orientate’ phase?
- Recognise that simply being aware of your bias is not sufficient to deal with it - but it is a good starting point
- Try not to jump to conclusions too readily - take in all the information
- Embrace and welcome the surprise caused by the mismatch between what you’ve observed and your ‘usual’ diagnosis - use it to rethink your initial hypothesis
- Seek to prove yourself wrong - why was my initial hypothesis incorrect?

Recognising any mismatch between your observation and the judgement you have made is the critical focus of the ‘orientate’ phase. When this occurs, it is essential to schedule time for reflection and to consider the points above.

[4] Boyd, John R. (3 September 1976). Destruction and Creation. U.S. Army Command and General Staff College.

3. Rethink the probability of a future event

Thomas Bayes - an English statistician and theologian - pioneered the approach that allows us to update our views on the probability of an event based on recent experience and new evidence.
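Bayes’ theorem makes this updating concrete: a prior belief is revised by how much more likely the new evidence is if the belief is true than if it is false. A small worked example (a sketch in Python; the prior and likelihood numbers are invented for illustration, not taken from the article):

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * likelihood_if_true
    evidence = numerator + (1 - prior) * likelihood_if_false
    return numerator / evidence

# Illustrative (assumed) numbers: prior belief that a region is unusually
# flood-prone is 1%; a severe flood is 20x more likely there (20% vs 1%).
prior = 0.01
posterior = bayes_update(prior, likelihood_if_true=0.20, likelihood_if_false=0.01)
print(round(posterior, 3))   # prints 0.168 - one flood raises 1% to ~17%

# A second flood the following year updates the belief again.
posterior2 = bayes_update(posterior, 0.20, 0.01)
print(round(posterior2, 3))  # prints 0.802 - two floods make it ~80%
```

The point is that each new observation moves the estimate, sometimes dramatically; conservatism bias is precisely our failure to move it far enough.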
This is in contrast to classical statistics [5], which assumes that the probability of a future event is based on the overall average frequency of that event in the long run - eg: “this type of flood occurs once in a hundred years”. So classical statistics would suggest a low probability of the same flood occurring two years in a row.

Classical statistics is strongly underpinned by conservatism bias - our tendency to insufficiently revise our beliefs in the light of new evidence. We generally overweight our understanding of the ‘average frequency’ of the event, and underweight any new information or evidence [6].

[5] Everitt, B.S. (2002). The Cambridge Dictionary of Statistics. Cambridge: CUP.
[6] Edwards, Ward (1982). “Conservatism in Human Information Processing (excerpted)”. In D. Kahneman, P. Slovic and A. Tversky (eds), Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.

Bayesian statistics, on the other hand, recognises that the probability of a future event should be updated with new data or evidence about that event. It would appear, therefore, that the Bayesian approach is likely to give us a better understanding of the probability of random events, particularly in uncertain situations. Sadly, as mentioned above, human decision making does not adopt a fully Bayesian approach, due to conservatism bias and the so-called anchoring effect. Humans are slow to adjust their decisions about future events in the light of new data or evidence [7].

How can we overcome this natural tendency to place a low value on new data and evidence?
- Recognise that your previously held theories may be similar to ‘sunk costs’. What you have spent or invested in developing them is not important - it is more important to think of what it may cost to hold onto them
- ‘Rent’ rather than ‘buy’. The point here is that you could benefit by holding your current ideas lightly (renting), rather than ‘owning’ them (buying). The cost of ownership is high, as you have much invested in proving them correct
- Be wary of overvaluing ideas or information that are easy to understand. Sometimes the difficult and complex information may be more valuable
- Use a variety of ‘trip wires’ to remind you that you may be overvaluing certain information. These include checklists (to consider all the information) and algorithms that apply sound principles consistently
- Make time to rethink and retest your base assumptions and theories at regular intervals - particularly if the recent data and evidence do not support them.

[7] Kahneman, D.; Slovic, P.; Tversky, A. (1982). Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press.

4. Trust your intuition in specific situations

Complex problems are very different from complicated problems. In a complicated problem, there is a recognised ‘best practice’ solution, and this is reached by way of a logical, reductionist process (ie: we work through the problem methodically by analysing its various elements). Complex problems, on the other hand, are characterised by multiple viewpoints - often in conflict with one another. Furthermore, there is no recognised ‘best practice’ solution, and the problem has to be solved by studying the whole system and allowing the solution to ‘emerge’.

It is clear why an intuitive approach, which relies only on ‘thin slices’ of information, might be attractive in these instances. However, as we know, intuition can often fool us, given that its reliability can be affected by anchoring, priming and stress. So when can we use this powerful tool, made available through the immense capacity and speed of the brain’s non-conscious functioning? Research reveals three situations in which it can be trusted [8]:
- Where you are an expert in a given situation - and have been recognised as such by others (eg: an international chess master)
- Where the situation in which you are operating is subject to regular patterns (eg: the development of software)
- Where you are able to get immediate feedback on the correctness of your decision (eg: bomb disposal).

Admittedly, this is a limited set of conditions under which you can trust your intuition. But it seems sensible when we consider that intuition is based on a fast recollection of patterns you have previously experienced, and that these can only be trusted if you have enjoyed regular success under those conditions, and the conditions are relevant to your current situation.

[8] Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

________________________________________________________________________

Dr Norman Chorn is a strategist and organisation development practitioner with the BrainLink Group. He uses principles of neuroscience to address the challenges of developing strategy in a complex and uncertain environment. His particular areas of focus are strategy in conditions of uncertainty; organisational and cultural alignment; and strategic leadership.