Devternity 2016 "Thinking Fast and Slow with Software Development"

In the international bestseller 'Thinking, Fast and Slow', Daniel Kahneman explains how we as human beings think and reason, and perhaps surprisingly how our thought processes are often fundamentally flawed and biased. This talk explores the ideas presented in the book in the context of professional software development. As software developers we all like to think that we are highly logical, and make only rational choices, but after reading the book I'm not so sure. Here I'll share my thinking on thinking. Topics that will be discussed include: the 'Availability Heuristic', which can lead developers to choose the 'latest and greatest' technology without proper evaluation; 'Optimistic Bias', which can blind architects to the 'unknown unknowns' within a project; and more!

Devternity 2016 "Thinking Fast and Slow with Software Development"

  1. Thinking, Fast and Slow… With Software Development Daniel Bryant Chief Scientist, OpenCredo daniel.bryant@opencredo.com @danielbryantuk
  2. Think More Deliberately • Our decision making can be flawed… • Apply process and models (as appropriate) • Collaborate more (and better) • Plan, do, check, act… 03/12/2016 @danielbryantuk
  3. @danielbryantuk • London Java Community Associate • InfoQ Editor, DZone MVB, O’Reilly • Conference regular… 03/12/2016 @danielbryantuk • Chief Scientist at OpenCredo • Technical/digital transformation • Java, Golang, CI/CD, DevOps • Microservices, cloud and containers
  4. Pointy-Haired Decision Making… 03/12/2016 @danielbryantuk dilbert.com/strip/2010-08-24
  5. Our Thinking: A Tale of Two Systems… System 1: fast, instinctive, emotional, subconscious System 2: slower, deliberate, reasoning, conscious 03/12/2016 @danielbryantuk
  6. 2 + 2 = ? 03/12/2016 @danielbryantuk
  7. (24 / 2) * (1 / 3) = ? 03/12/2016 @danielbryantuk
  8. 2 + 2 = 4 (24 / 2) * (1 / 3) = 4 03/12/2016 @danielbryantuk
  9. 03/12/2016 @danielbryantuk
  10. 03/12/2016 @danielbryantuk
  11. 03/12/2016 @danielbryantuk
  12. Our Thinking: A Tale of Two Systems… System 1: fast, instinctive, emotional, subconscious Rapid, associative, and has systemic errors System 2: slower, deliberate, reasoning, conscious Lazy, and causal (not statistical) 03/12/2016 @danielbryantuk
  13. Heuristics / biases affecting software delivery 03/12/2016 @danielbryantuk
  14. Availability Heuristic “If something can be recalled, it must be important” ‘Hipster-itis’ e.g. the ‘best’ architectural style 03/12/2016 @danielbryantuk
  15. Microservices • The current flavour of the month! – Great ideas, but probably over-hyped • Frameworks and products emerging – Virtuous (vicious?) circle – “The bandwagon effect” – Have we found the silver bullet?… 03/12/2016 @danielbryantuk
  16. Whatever is Available When all you have is a hammer… …everything looks like a nail 03/12/2016 @danielbryantuk
  17. When all you have is a SOA… …everything looks like a service 03/12/2016 @danielbryantuk
  18. When all you have is a Jenkins… …everything looks like a Jenkins Job 03/12/2016 @danielbryantuk
  19. When all you have is a Java Spring Framework… …everything looks like an AbstractSingletonProxyFactoryBean 03/12/2016 @danielbryantuk
  20. Availability: Think Professionally • Stop… engage system 2 • Spike/prototype, experiment, evaluate… • Constant learning – Find trusted mentors – Cultivate blogs – Read the classics 03/12/2016 @danielbryantuk
  21. Fundamentals 03/12/2016 @danielbryantuk
  22. Evaluation “I will postpone using this shiny new framework until my peers have validated the proposed benefits with rigorous scientific experiments” - Said by no programmer …ever 03/12/2016 @danielbryantuk
  23. Raible’s Comparison Matrix 03/12/2016 @danielbryantuk Matt Raible comparison matrix (bit.ly/OxUzad) (a rough weighted-scoring sketch follows this transcript)
  24. Optimistic Bias “People tend to be overconfident, believing that they have substantial control in their lives” I know what our customers want… …how could I possibly be wrong? 03/12/2016 @danielbryantuk
  25. Four Factors of Optimistic Bias • Desired end state – Self-enhancement, perceived control • Cognitive mechanisms – Representativeness heuristic, singular target focus • Information about self vs target • Overall mood 03/12/2016 @danielbryantuk
  26. Optimism: Think Professionally • Define (and share) clear goals • Build, measure, learn… • Remove uncertainty early (bit.ly/1mAb6o4) – “Patterns of Effective Delivery” by Dan North • Software is inherently collaborative… 03/12/2016 @danielbryantuk
  27. The Three Amigos 03/12/2016 @danielbryantuk www.slideshare.net/chassa/2012-1130alm-dayviennaslideshare
  28. Pair Mob Programming 03/12/2016 @danielbryantuk benjiweber.co.uk/blog/2014/11/30/the-unruly-mob/ probablyfine.co.uk/papers/mob-programming.pdf www.infoq.com/interviews/zuill-mob-programming
  29. DevOps: Share the Pain... 03/12/2016 @danielbryantuk
  30. “Developer-on-call” An occasional spike to the head is a good thing... ...metaphorically speaking • You build it, you run it – Shared responsibility – Communication 03/12/2016 @danielbryantuk
  31. Remove (or Limit) Uncertainty 03/12/2016 @danielbryantuk
  32. Check the HiPPO 03/12/2016 @danielbryantuk bit.ly/1xseeXM
  33. Planning Fallacy “A phenomenon in which predictions about how much time will be needed to complete a future task display an optimistic bias.” Was your last project completed on time? …and on budget? 03/12/2016 @danielbryantuk
  34. IT Track Record… • Sainsbury’s Supply Chain Management System – $526m bit.ly/160SnAj • NHS patient record system – £10bn bit.ly/XBzFuV • HealthCare.gov – onforb.es/1k7egyb 03/12/2016 @danielbryantuk
  35. Most Common Factors for Failure • Unrealistic or unarticulated project goals • Inaccurate estimates of needed resources • Badly defined system requirements • Poor reporting of the project's status • Unmanaged risks • Poor communication among customers, developers, and users • Use of immature technology • Inability to handle the project's complexity • Sloppy development practices • Poor project management • Stakeholder politics • Commercial pressures Source: spectrum.ieee.org/computing/software/why-software-fails 03/12/2016 @danielbryantuk
  36. At the Start: Feedback & Federation • Determine goal or hypothesis – Define ‘metrics of success’ • Plan, do, check (showcase), act • Divide and conquer – SOA, microservices, modules… (systems thinking) – Squads, guilds, chapters 03/12/2016 @danielbryantuk
  37. Spotify 03/12/2016 @danielbryantuk
  38. A Side Note Even if your upfront planning is good… 03/12/2016 @danielbryantuk
  39. Things Inevitably Go Wrong… 03/12/2016 @danielbryantuk
  40. At the End: It’s All About People 03/12/2016 @danielbryantuk www.infoq.com/news/2015/06/too-big-to-fail
  41. From Macro to Micro… 03/12/2016 @danielbryantuk
  42. Accept Unknown Unknowns… 03/12/2016 @danielbryantuk dilbert.com/strips/comic/1995-11-10/
  43. Sunk Cost Fallacy “Any past cost that has already been paid and cannot be recovered should not figure into the decision making process.” When did you last remove a framework? …or a library? 03/12/2016 @danielbryantuk
  44. Why Are We Reluctant? • We don’t like being wrong… • Existing effort appears wasted – Endowment effect • Loss aversion – Twice as powerful, psychologically, as gains? 03/12/2016 @danielbryantuk
  45. Try not to ‘Sink Costs’ 03/12/2016 @danielbryantuk
  46. Retrospect Regularly • Did we make the right choice? • When was the ‘last responsible moment’? • What can we learn? • How can we get better? 03/12/2016 @danielbryantuk
  47. Anchoring Bias “Common tendency to rely too heavily on the first piece of information offered when making decisions.” How does your manager ask for estimates? …is it an unbiased question? 03/12/2016 @danielbryantuk
  48. 03/12/2016 @danielbryantuk
  49. Anchoring: Learn to “Upwardly Manage” • Learn to say no… – Provide explanations and alternatives • Make sure goals/user stories are well-defined – Collaboration – Feedback • Apply PERT estimations (bit.ly/1mGzuoe) 03/12/2016 @danielbryantuk
  50. A Little More #NoEstimates 03/12/2016 @danielbryantuk ronjeffries.com/xprog/articles/the-noestimates-movement/
  51. Read Your Way to Tech Leadership (?) 03/12/2016 @danielbryantuk
  52. Ok, let’s wrap this up… 03/12/2016 @danielbryantuk
  53. System 1 is Awesome! • It saves us from many dangers • Very valuable in many situations – ‘gut feel’, instincts, intuition – Expertise plays a role – Can be used for good • But, I see too much System 1 in IT… 03/12/2016 @danielbryantuk
  54. We All Make Mistakes… 03/12/2016 @danielbryantuk
  55. Have a Little Empathy… 03/12/2016 @danielbryantuk
  56. “You will be the same person in five years as you are today except for the people you meet and the books you read.” - Charlie “Tremendous” Jones (bit.ly/1LAdQkv) 03/12/2016 @danielbryantuk
  57. Awesome Conferences and Books 03/12/2016 @danielbryantuk
  58. 03/12/2016 @danielbryantuk
  59. Summary • Apply process and models (as appropriate) – Listen to system 1, but engage system 2… • Collaborate more (and better) • Learn, do, retrospect, (teach,) repeat - “Think” more deliberately - 03/12/2016 @danielbryantuk
  60. Thanks for Listening! Comments and feedback are welcomed… daniel.bryant@opencredo.com @danielbryantuk www.infoq.com/author/Daniel-Bryant www.parleys.com/author/daniel-bryant www.muservicesweekly.com 03/12/2016 @danielbryantuk
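
A rough sketch of the weighted-scoring idea behind slide 23's comparison matrix (the frameworks, criteria, weights, and scores below are invented for illustration, not Matt Raible's actual data): score each candidate against agreed criteria, weight the criteria by how much they matter to your context, and compare totals rather than relying on gut feel.

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ComparisonMatrix {
        public static void main(String[] args) {
            // Hypothetical criteria and weights (higher = more important to us)
            Map<String, Double> weights = new LinkedHashMap<>();
            weights.put("Developer productivity", 1.0);
            weights.put("Documentation", 0.8);
            weights.put("Community / hiring", 0.6);
            weights.put("Operational maturity", 0.9);

            // Hypothetical scores (0-10) per framework, in the same order as the criteria above
            Map<String, double[]> scores = new LinkedHashMap<>();
            scores.put("Shiny new framework", new double[]{8, 4, 3, 2});
            scores.put("Boring old framework", new double[]{6, 9, 9, 8});

            // Weighted total per framework
            scores.forEach((framework, s) -> {
                double total = 0;
                int i = 0;
                for (double w : weights.values()) {
                    total += w * s[i++];
                }
                System.out.printf("%s: %.1f%n", framework, total);
            });
        }
    }

The exact numbers matter less than the fact that the criteria and trade-offs are written down, shared, and open to challenge, which is exactly the System 2 evaluation the talk is arguing for.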

Editor's Notes

  • Many explanations for the optimistic bias come from the goals that people want and the outcomes they wish to see. People tend to view their own risks as lower than others' because they believe this is what other people want to see. These explanations include self-enhancement, self-presentation, and perceived control.

    Self-enhancement suggests that optimistic predictions are satisfying and that it feels good to think that positive events will happen.
    People tend to focus on finding information that supports what they want to see happen, rather than what will actually happen to them.

    Representativeness heuristic: individuals tend to think in stereotypical categories rather than about their actual targets when making comparisons. The estimates of likelihood associated with the optimistic bias are based on how closely an event matches a person's overall idea of the specific event.

    Individuals know a lot more about themselves than they do about others. Because information about others is less available, people can draw specific conclusions about their own risk but have a harder time judging the risks of others. This difference in judgments widens the optimistic bias.
  • In behavioral economics, the endowment effect (also known as divestiture aversion) is the hypothesis that people ascribe more value to things merely because they own them.

    In a classic 1990 experiment, participants were given a mug and then offered the chance to sell it or trade it for an equally priced alternative good (pens). Kahneman et al. found that the amount participants required as compensation for the mug once their ownership had been established ("willingness to accept") was approximately twice as high as the amount they were willing to pay to acquire the mug ("willingness to pay").

    Loss aversion implies that one who loses $100 will lose more satisfaction than another person will gain satisfaction from a $100 windfall. (A rough numeric illustration appears at the end of these notes.)
  • delay commitment until the last responsible moment, that is, the moment at which failing to make a decision eliminates an important alternative. If commitments are delayed beyond the last responsible moment, then decisions are made by default, which is generally not a good approach to making decisions.

    The key is to make decisions as late as you can responsibly wait because that is the point at which you have the most information on which to base the decision.
  • PERT: Program Evaluation and Review Technique, a three-point estimation technique.
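
    A quick worked sketch of the PERT-style three-point estimate referenced on slide 49 (the numbers are hypothetical): given an optimistic (O), most likely (M), and pessimistic (P) estimate, the expected value weights the most likely case most heavily.

        E  = (O + 4M + P) / 6
        SD ≈ (P - O) / 6

        Example: O = 4 days, M = 6 days, P = 14 days
        E  = (4 + 4*6 + 14) / 6 = 42 / 6 = 7 days
        SD ≈ (14 - 4) / 6 ≈ 1.7 days

    Quoting the estimate as a range (roughly 5 to 9 days here) rather than a single number also helps to counter the anchoring effect discussed on slides 47 to 49.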
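
    As a rough numeric illustration of the "twice as powerful" point in the loss aversion note above, here is the prospect theory value function with the parameters Tversky and Kahneman estimated in 1992 (roughly α ≈ 0.88 and λ ≈ 2.25); treat the exact numbers as indicative only.

        v(x) = x^0.88              for gains  (x >= 0)
        v(x) = -2.25 * (-x)^0.88   for losses (x < 0)

        v(+100) ≈ +57.5
        v(-100) ≈ -129.5, roughly 2.25 times the magnitude of the matching gain

    In other words, under these assumptions losing $100 is felt more than twice as strongly as gaining $100, which is why removing a framework feels so much harder than adding one.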