A brief and general account of selected potential cognitive biases in drug discovery and development, along with some suggestions on how to avoid them.
You're not so smart - Cognitive Biases, by Odair Faléco
We think we are smart, but understanding cognitive biases shows how limited our perception of reality and of the information around us really is.
In this presentation I explain and give some real examples of the most common biases used in marketing, the web and UX.
There are many kinds of cognitive biases that influence individuals differently, but their common characteristic is that they lead to judgment and decision-making that deviates from rational objectivity.
While making judgments and decisions about the world around us, we like to think that we are objective, logical, and capable of taking in and evaluating all the information that is available to us. The reality is that our judgments and decisions are often riddled with errors and influenced by a wide variety of biases.
The human brain is both remarkable and powerful, but certainly subject to limitations.
One type of fundamental limitation on human thinking is known as a cognitive bias.
Important concepts around how we all make decisions. This presentation introduces the work of Nobel prize winner Daniel Kahneman on cognitive biases, and helps you understand why we make errors in judgement, and how to look for signs you're about to make one.
As thinking human beings and team leaders or architects we can benefit from knowing more about how we think, deliberate and decide. Most teams rely on trust, transparency, collaboration, and collective decision-making. “Thinking, Fast and Slow,” by Daniel Kahneman explains two systems that drive how we think. System 1 thinking is fast, intuitive, and emotional; System 2 is slow, deliberate, and logical.
In this presentation you learn how fast and slow thinking affects your reactions, behaviors, and decision-making. You’ll explore how several common development practices (with an emphasis on some agile practices) can amplify and exploit your thinking abilities, and where they might lead you astray.
Fast thinking works pretty well in a well-known context. You save time when you don’t have to deliberate over details and nuances in order to make informed decisions. But fast thinking can lead to extremely poor decisions. You might jump to conclusions, be wildly optimistic, or greatly under-assess risks and rewards. You need to exploit both fast and slow thinking and be acutely aware of when fast thinking is tripping you up.
In psychology, decision-making is regarded as the cognitive process resulting in the selection of a belief or a course of action among several alternative possibilities. Every decision-making process produces a final choice, which may or may not prompt action.
In the past four decades, behavioral economists and cognitive psychologists have discovered many cognitive biases human brains fall prey to when thinking and deciding. Cognitive biases are tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment. These biases arise from errors of memory, social attribution, and miscalculations such as statistical errors or a false sense of probability. Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them.
Bayesian reasoning offers a way to improve on the native human reasoning style. Reasoning naively, we tend not to seek alternative explanations, and sometimes underrate the influence of prior probabilities in Bayes' theorem.
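The point about underrating priors can be made concrete with a short calculation. A minimal sketch in Python (the disease-screening numbers are illustrative, not taken from any of the talks):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' theorem: P(hypothesis | positive evidence) for a binary test."""
    p_evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_evidence

# A test that is 99% sensitive with a 5% false-positive rate sounds
# conclusive, yet with a 1-in-1000 prior the posterior stays tiny.
print(round(posterior(prior=0.001, sensitivity=0.99,
                      false_positive_rate=0.05), 3))  # 0.019
```

Naive reasoning fixates on the 99% sensitivity and ignores the prior; running the numbers shows a positive result still leaves under a 2% posterior probability.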
Credits: Wikipedia, LessWrong.org
Presented at CodeMash 2015. By Joseph Ours
Joseph's presentation is based on the book "Thinking, Fast and Slow", where Nobel Prize winner Daniel Kahneman introduces two mental systems, one fast and the other slow. Together they shape our impressions of the world around us and help us make choices. System 1 is largely unconscious and makes snap judgments based upon memories of similar events and our emotions. System 2 is painfully slow, and is the process by which we consciously check the facts and think carefully and rationally. System 2 is easily distracted. System 1 is wrong quite often. Real-world examples of how the two systems work: pro golfers putt more accurately for par than for birdie regardless of distance, and people buy more cans of soup when there is a sign on the display that says "limit 12 per customer".
Hello Everyone,
A big thank you for all the interest in this study guide. It was originally created as a fun introduction that took the Cognitive Bias wiki and tried to make it easier to memorize.
However, the authors of the wiki article have expressed some concern over the accuracy of certain entries. The document was taken down until that could be corrected.
But, people started asking that I release a new version with a warning. In response, a new "Beta version" of the document has been uploaded with a very strong warning label up front and improved citations. I make it clear that all the text is based on an evolving wiki page and that some of the cognitive biases in there might be incorrect wiki entries. My hope is that this will continue to get people interested in pitching in to help fix the Cognitive Bias wiki pages. :) When the wiki is in a good place, I will take the document out of Beta, and will remove the warning label.
If you are a cognitive expert, join “Operation Fix The Cognitive Bias Wiki!” Add your suggestion to the conversation here: http://en.wikipedia.org/wiki/Talk:List_of_cognitive_biases
Thanks for your interest!
Eric
P.S. The images have been updated for better remixing and sharing rights. Rather than using permission-based images, all the images are now public domain or free for non-commercial use by anyone.
Cognitive Biases and Effects You Should Know About, by Kevlin Henney
Presented at NDC 2011 in Oslo (8th June 2011)
Video available at http://www.everytalk.tv/talks/678-NDC-Cognitive-Biases-and-Effects-You-Should-Know-About
In software development, developers, architects and managers often like to think of themselves as rational and clear thinking, not prone to the chaotic and contradictory thinking they see at home, in politics or in the world of business. Although it is possible to get further from the truth than this, it is not likely.
Those involved in software development are just as human as people in other walks of life, and are just as subject to the cognitive biases and effects that skew, truncate and bypass clear thinking. The effects on rationality affect everything from testing to estimation, from programming to project delivery. It is easier to see and react to these effects in yourself and others when you know what some of them are.
I've discussed the various ways our brain makes illogical judgments and then makes errors in thinking. I've also discussed the difference between logical thought and how the brain thinks automatically. There is some content on logic as seen in animals too.
Here is a special post I've made about the Survivorship bias
https://cognitiontoday.com/what-you-need-to-know-about-success-stories-survivorship-bias/
Here is one on overcoming thinking biases
https://cognitiontoday.com/8-powerful-ways-to-overcome-thinking-errors-and-cognitive-biases/
Here is one on a few more cognitive biases
https://cognitiontoday.com/4-cognitive-biases-you-should-be-aware/
This is a presentation that covers the basic concepts of the book Nudge, by Richard Thaler and Cass Sunstein. We read this book at our UX Book Club meeting, and I presented an introduction to it at the LA IxDA meeting.
This deck accompanied Kelly Baron's SXSW talk on 3/13/17. Nudge theory is about hacking human nature using subtle, context-driven interventions. We all sometimes buy into the shampoo commercial dream that our products can make us into better people, but what if that were true?
Thanks to IoT, we’re designing products that make and break our habits. We applied nudge theory to our healthcare wearable, Under Currents, to prevent billions of dollars’ worth of medical errors and save lives. When common sense fails, common sensors help us be the best version of ourselves.
Join Kelly Baron, a business designer from Fjord Austin, as she talks about how to apply nudge theory to digital experiences.
Short presentation on Decision making.
Decision making variables, Types of managerial decision, Decision making process and Techniques for Stimulating Creativity
Stuart Lane takes saying sorry seriously. Seriously seriously. To the extent that he's nearly finished his PhD on it. Listen to this fantastic talk, watch the slides and add your comments on www.intensivecarenetwork.com.
Decisions, decisions, decisions — we make them all the time! The life we are living right now, is the outcome of decisions we made in the past. Our future depends on the decisions that we will make today or tomorrow. To put it simply, decision making can be defined as a choice of action in an uncertain environment. There are routine and life-altering decisions; individual and group decisions; short-term and long-term decisions; high and low-stake decisions, and decisions that can be changed as well as decisions that are cast in stone — the list can go on indefinitely. Those who are not allowed to make decisions are not happy about it, and those who are entrusted to make decisions may feel burdened by the responsibility. Decisions are sometimes made by scribbling on the back of a paper napkin and at other times with the help of massive computer programs. Some decisions need to be made on the spot — in the blink of an eye — with other decisions it is often recommended that we sleep on it.
The scope of decision making is very wide, the process sometimes starkly complex and individual styles vary considerably. It is such a fundamental part of our existence and growth; however, many of us may not give enough thought to our approach to decision making — does my decision making need any tweaking or am I happy with it as it is? In order to understand our own style of decision making, we need to consider what are the major dimensions of decision making and where do we stand on each of the dimensions.
Here are 5 important dimensions of decision making:
Decision Making for European managers in public organisations, by Erwin van de Pol
For and with an audience of managers from many European public organisations, we discussed the issue that leaders in the public service are faced with difficult decisions affecting the public services they deliver, the electors, employees and councillors. The question is: "How do they persuade all parties to expect less in the future?" And are European leaders and managers in public organisations well equipped for making difficult decisions?
Persuasion architectures: Nudging People to do the Right Thing, by User Vision
Review of some of the most popular commercial and public sector persuasion methodologies. Plus some reasons why they may not work and some criticisms, and a comparison of how supermarkets persuade us, offline.
Apply the science of decision making to improve the effectiveness of your communications. This is helpful for web sites, brochures, political campaigns, and all forms of advertising and communication. Get a competitive advantage in your communications.
2. What are cognitive biases?
• Most people think they behave rationally
• It turns out that people are susceptible to limitations in
thinking, judgement and decision making
• Stems from several areas of cognition
• Memory
• Perception
• Feelings
• Misapplication of statistical reasoning
• …
• Introduced in 1972 by Amos Tversky and Daniel
Kahneman
• The list of identified cognitive biases runs into the hundreds
3. Where do the biases come from?
• At some point in our evolutionary history these biases
were useful adaptations
• Mostly they help make decisions with limited information
• Useful if your only task for the day is survival, and speed of
decision is more important than accuracy
• However, in today’s world the opposite is true
• Now that we need to do more than just survive, they can get us
into trouble
• Parts of our lives and society rely on sound decisions
• Finding a partner
• Making a purchase
• Juries!!!
• …
4. Some examples – Confirmation Bias
• We like things that match our view of the world
• To the extent that we search out things that agree with us,
whilst ignoring conflicting information
• It is a short circuit to keeping away from things that may
cause us harm
• But…
• Makes it difficult to let go of entrenched positions
• Makes people open to scams such as psychic readings
• The uncomfortable feeling of anxiety we get when the real
world does not fit in with our world view is called cognitive
dissonance.
5. Fundamental attribution error
• A quirk in the way we reason about causality
• Attributing an aspect of a person’s behaviour to their
fundamental character rather than the situation
• If I lose my keys I am just unlucky, if someone else
loses their keys they are careless
• What other examples are there?
6. Framing
• People react differently to a choice depending on how the
information is framed
• Loss versus gain
• Positive versus negative
• Is it better to say a medical intervention has 90% chance
of success or a 10% chance of death?
• It depends on what you desire as an outcome
• This bias is the cornerstone of marketing and political spin
7. Anchoring
• Humans tend to rely on the first piece of information
offered when making decisions. This is the anchor.
• Subsequent decisions are made by adjusting away from
the anchor
• So in negotiations the first price offered sets the anchor
• Related to a similar effect called priming
8. Illusory Superiority
• People tend to overestimate their positive qualities
relative to other people or groups
• This is widespread in all aspects of life
• Intelligence, sporting ability, academic performance, job
performance, popularity, confidence, …
• We even struggle to understand that in most cases it is a
nonsensical premise
• Even at Cambridge around half of the students’ academic abilities
are below average for their cohort
• This is linked to the Dunning-Kruger effect
9. Survivorship Bias
• It is easy to see the things that survived – they are all
around us
• We often overlook the things that didn’t because they are
not visible
• Could be actual people – medical trials not taking into account the
people that dropped out
• Focusing on what leading businesses / business leaders did rather
than understanding what the countless others who failed did or did
not do
• This bias was faced for real during WWII when
investigating the cause of bomber losses
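The dropout example above can be simulated: if patients who fare badly are more likely to leave a trial unrecorded, averaging only those who remain inflates the apparent result. A toy sketch (the outcome scale and dropout rule are invented for illustration):

```python
import random

random.seed(42)

# Simulated patient outcomes on a 0-100 scale; the true mean is 50.
outcomes = [random.gauss(50, 15) for _ in range(10_000)]

# Dropout rule: the worse the outcome, the less likely the patient
# stays in the trial long enough to be measured.
survivors = [x for x in outcomes if random.random() < min(1.0, x / 60)]

true_mean = sum(outcomes) / len(outcomes)
survivor_mean = sum(survivors) / len(survivors)
print(f"all patients:   {true_mean:.1f}")
print(f"survivors only: {survivor_mean:.1f}")
# The survivor-only mean comes out several points above the true mean.
```

Nothing about the treatment changed; only who was left to be counted.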
10. Semmelweis Reflex
• This is the dismissal of new evidence because it does not
comply with the established norms of the day
• Named after Ignaz Semmelweis, who found a causal link
between childbed fever and the mortality rates of new
mothers
• Demonstrated washing hands could reduce death rates
• Ignored by his peers as he could find no acceptable
scientific explanation and his contemporaries simply
refused to believe him
• Does this happen a lot in IT?
11. Regression to the mean
• Not exactly a cognitive bias, but a similar failure in
statistical reasoning
• If a variable is extreme in the first measurement, it will
tend to be closer to the average for the second
measurement
• Gives rise to the idea that punishment is effective
• If someone performs badly and is punished it is more likely that the
next time their performance will be closer to the average
• Likewise with praise
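The punishment/praise illusion is easy to demonstrate in a simulation: model performance as stable skill plus random luck, select the worst performers on a first trial, then retest them with no intervention at all. A minimal sketch (the skill and noise figures are invented for illustration):

```python
import random

random.seed(1)

def observed(skill):
    # One measured performance: stable skill plus luck on the day.
    return skill + random.gauss(0, 10)

skills = [random.gauss(50, 5) for _ in range(10_000)]
first = sorted((observed(s), s) for s in skills)

# "Punish" the worst 10% on the first trial, then simply retest them.
worst = first[:1000]
first_scores = [score for score, _ in worst]
second_scores = [observed(s) for _, s in worst]

print(f"worst 10%, first trial: {sum(first_scores) / 1000:.1f}")
print(f"same people, retested:  {sum(second_scores) / 1000:.1f}")
# The retest mean drifts back toward the population average even though
# nothing was done to them: regression to the mean, not the punishment.
```

The same run with the top 10% shows the mirror image, which is why praise appears to "make people worse".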
12. So now we know about them we are ok?
• Well, not so much
• Knowing about them does not mean that you will be able
to spot them all the time. It does help.
• If a decision is important you must explicitly call out the
biases to make sure you are not being tricked
13. Further reading
• Wikipedia’s list of cognitive biases
http://en.wikipedia.org/wiki/List_of_cognitive_biases
• Thinking Fast and Slow – Daniel Kahneman
http://www.amazon.co.uk/Thinking-Fast-Slow-Daniel-Kahneman/dp/0141033576/
• Bad Science – Ben Goldacre
http://www.amazon.co.uk/Bad-Science-Ben-Goldacre/dp/000728487X/
• Loads more…
Editor's Notes
What are cognitive biases?Most people think they behave rationally<This is the premise that most of current economic theory is bases>It turns out that people are susceptible to limitations in thinking, judgement and decision making<and that on the most part we are completely unaware of it>Stems from several areas of cognitionMemoryPerceptionFeelingsMisapplication of statistical reasoning<There are lots more…>Introduced in 1972 by Amos Tversky and Daniel KahnmanThe list of identified cognitive biases runs into the hundreds
Where do the biases come from?At some point in our evolutionary history these biases were useful adaptationsMostly they help make decisions with limited information<Heuristics, rules of thumb>Useful if your only task for the day is survival, and speed of decision is more important that accuracy<If you think you are going to get eaten, it is better to act quickly than enter into a thorough analysis of the situation>However in today’s world the opposite is true<The world we live in today is incredibly complex, and now in most cases it is better to be accurate than fast in our decision making>Now we need to do more than just survive they can get us into trouble<They can in the most serious cases get us killed>Parts of our lives and society rely on sound decisionsFinding a partnerMaking a purchaseJuries!!!
<I have just picked a few interesting examples; they are not necessarily the most common, nor do they have the largest impact on our lives.> Some examples – Confirmation bias. We like things that match our view of the world <we like people who are like us and share our interests; this likely comes from our evolutionary need to form social groups>, to the extent that we seek out things that agree with us while ignoring conflicting information <it is possible for two people to interpret the same information differently depending on their world view> <fitting the terrain to the map rather than the other way around>. It is a shortcut for keeping away from things that may cause us harm <after all, if things are similar to things we already know and like, then they are probably all right>. But it makes it difficult to let go of entrenched positions, and makes people open to scams such as psychic readings <what other examples?>. The uncomfortable feeling of anxiety we get when the real world does not fit our world view is called cognitive dissonance <this is an actual physiological response to a psychological feeling>.
Fundamental attribution error: a quirk in the way we reason about causality. We attribute an aspect of a person's behaviour to their fundamental character rather than to the situation. If I lose my keys I am just unlucky; if someone else loses their keys, they are careless <they would see things the opposite way around>. What other examples are there? <We attribute our own success to our hard work and skill, and other people's success to luck.> <When I don't write tests I have a good reason for doing so; if someone else doesn't write tests, they are a bad developer.>
Framing: people react differently to a choice depending on how the information is framed. Loss versus gain; positive versus negative. Is it better to say a medical intervention has a 90% chance of success or a 10% chance of death? It depends on what you desire as an outcome <do I want people to take the intervention or not?>. This bias is the cornerstone of marketing and political spin <it has a very powerful effect on our decision-making abilities>.
Anchoring: humans tend to rely on the first piece of information offered when making decisions. This is the anchor. Subsequent decisions are made by adjusting away from the anchor, so in negotiations the first price offered sets the anchor <are anchoring effects present in estimation?>. Related to a similar effect called priming <priming is subtly, or not so subtly, providing information that will focus people towards a decision, stance or worldview> <so in sprint planning, when I say a story "feels small" I am priming (subconsciously or consciously) towards a biased estimate>.
Illusory superiority: people tend to overestimate their positive qualities relative to other people or groups. This is widespread in all aspects of life: intelligence, sporting ability, academic performance, job performance, popularity, confidence, … We even struggle to understand that in most cases "everyone above average" is a nonsensical premise <unless the distribution is very skewed, e.g. the average number of legs for a human; in cases like this the median is a more suitable measure>. Even at Cambridge, around half of the students' academic abilities are below average for their cohort. This is linked to the Dunning-Kruger effect <people who lack knowledge tend to overestimate their abilities, often to the detriment of their more able peers>.
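The legs example from the note above can be made concrete with a tiny sketch (the population numbers are made up for illustration): in a skewed distribution, "most people are above average" is not a contradiction, which is why the median is the more suitable measure here.

```python
# Hypothetical population: almost everyone has two legs, a few people
# have fewer, and nobody has more -- a left-skewed distribution.
legs = [2] * 997 + [1, 1, 0]  # 1000 people

mean = sum(legs) / len(legs)            # pulled below 2 by the tail
median = sorted(legs)[len(legs) // 2]   # exactly 2

above_mean = sum(1 for n in legs if n > mean)
print(f"mean = {mean}, median = {median}")
print(f"{above_mean} of {len(legs)} people have an above-average number of legs")
```

Here the mean is 1.996, so 997 of the 1000 people really are "above average" — the median (2) describes the typical person far better.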
Survivorship bias: it is easy to see the things that survived, because they are all around us. We often overlook the things that didn't, because they are not visible. These could be actual people, e.g. medical trials not taking into account the people that dropped out. Or focusing on what leading businesses and business leaders did, rather than understanding what the countless others who failed did or did not do. The bomber problem was a real issue faced during WWII <it involved a statistician named Abraham Wald>. http://youarenotsosmart.com/2013/05/23/survivorship-bias/
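The trial-dropout point above can be simulated in a few lines. This is a hypothetical sketch, not real trial data: participants with worse outcomes are assumed to be more likely to drop out before the final measurement, so the average computed over "survivors" overstates the true average.

```python
import random

random.seed(0)  # deterministic, purely illustrative numbers

# True outcomes for the whole trial population.
population = [random.gauss(50, 15) for _ in range(10_000)]

# Dropout probability rises as the outcome falls: the worse a
# participant does, the less likely they are to still be present
# at the final measurement.
survivors = [x for x in population if random.random() < (x / 100)]

true_mean = sum(population) / len(population)
observed_mean = sum(survivors) / len(survivors)
print(f"true mean: {true_mean:.1f}, observed (survivors only): {observed_mean:.1f}")
```

The observed mean is reliably higher than the true mean, even though nothing about the intervention changed — only who was left to be measured.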
Semmelweis reflex: the dismissal of new evidence because it does not comply with the established norms of the day. Named after Ignaz Semmelweis, who found a causal link between childbed fever and the mortality rates of new mothers, and demonstrated that washing hands could reduce death rates. He was ignored by his peers because he could find no acceptable scientific explanation, and his contemporaries simply refused to believe him <he was driven mad by his desire to have his theory taken seriously, and died in an asylum>. Does this happen a lot in IT? <Dismissal of new approaches, technologies and techniques?>
Regression to the mean: not exactly a cognitive bias, but a similar failure in statistical reasoning. If a variable is extreme on the first measurement, it will tend to be closer to the average on the second measurement. This gives rise to the idea that punishment is effective: if someone performs badly and is punished, it is more likely that next time their performance will be closer to the average. Likewise with praise <it is not that punishment works and praise fails; it is just statistics> <what about things like improvements in agile, or the effectiveness of planning meetings, etc.?> <it can be hard to separate actual improvements from regression to the mean>.
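The "punishment seems to work" effect can be demonstrated with a minimal simulation (all numbers hypothetical): each score is a stable skill plus random luck, and no punishment or praise is applied between rounds, yet the worst performers in round one still improve in round two.

```python
import random

random.seed(1)  # deterministic, purely illustrative numbers

# Each person's observed score = fixed skill + fresh random luck.
skills = [random.gauss(100, 5) for _ in range(1_000)]
round1 = [s + random.gauss(0, 15) for s in skills]
round2 = [s + random.gauss(0, 15) for s in skills]

# Pick the ten worst performers from round one...
worst = sorted(range(len(round1)), key=lambda i: round1[i])[:10]

# ...and compare their averages across the two rounds.
avg1 = sum(round1[i] for i in worst) / len(worst)
avg2 = sum(round2[i] for i in worst) / len(worst)
print(f"worst ten, round 1: {avg1:.1f}; same people, round 2: {avg2:.1f}")
```

The group improves simply because their round-one scores combined low skill with unusually bad luck, and the luck does not repeat — exactly the pattern that gets misread as punishment working.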