Using Simulation to Manage Software Delivery Risk

Session slides for LSSC12 (Lean Software and Systems Conference), Boston, MA, 2012. The session discusses how to model and simulate software projects.



  1. Using Simulation to Manage Software Delivery Risk: Effectively Modeling and Simulating Kanban and Scrum Projects using Monte Carlo Techniques. Troy Magennis, Troy.magennis@focusedobjective.com, @AgileSimulation
  2. Schedule Risk: Are we meeting our commitments?
  3. Staff Risk: What team skillset additions (or losses) have the biggest impact?
  4. Risks: What are the top three risks jeopardizing delivery?
  5. My Mission: Arm my teams (and yours) with the tools and techniques to answer these questions, and manage risk more effectively.
  6. Currently: Founder and CTO, Focused Objective. Previously: Vice President of Technology (Architecture), Travelocity and Lastminute.com; Director of Architecture, Corbis; various roles in automotive and banking. Contact: @AgileSimulation and @t_magennis, FocusedObjective.com, Troy.Magennis@focusedobjective.com
  7. What, when, who, why: DEFINITIONS, HISTORY AND USE
  8. Definition: Model. A model is a tool used to mimic a real-world process; a tool for low-cost experimentation.
  9. Definition: Simulation. A technique of using a model to determine a result given a set of input conditions.
  10. Definition: Monte Carlo Simulation. Performing a simulation of a model multiple times using random input conditions and recording the frequency of each result occurrence.
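That definition fits in a few lines of Python. This is a minimal sketch, not the tool used in the talk; the three stages and their day ranges are invented for illustration:

```python
import random
from collections import Counter

def project_duration():
    """One simulated project: three sequential stages, each taking a
    random number of days drawn uniformly from its estimate range."""
    design = random.randint(1, 2)
    develop = random.randint(1, 5)
    test = random.randint(1, 2)
    return design + develop + test

random.seed(42)
# Monte Carlo: run the model many times with random inputs and record
# the frequency of each result occurrence.
results = Counter(project_duration() for _ in range(10_000))
for days in sorted(results):
    print(f"{days} days: {results[days]} runs")
```

Plotting those frequencies gives exactly the kind of result-versus-frequency histogram shown later in the deck.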
  11. Simple to more complex model and simulation of a software project. DEMO: VISUAL MODEL SIMULATION. DEMO: MONTE CARLO SIMULATION. In case of demo disaster, press here…
  12. History: Stan Ulam holding the FERMIAC. Credits: Wikipedia
  13. When to use Monte Carlo Simulation: when there is no single correct answer (knowable in advance), or when the time/effort taken to compute an answer is beyond realistic.
  14. When to use Monte Carlo Simulation: when a range of input conditions can MASSIVELY alter the final outcome.
  15. Who uses Monte Carlo Simulation: high-risk industries such as natural resource exploration, insurance, finance, banking, pharmaceutical… Software Development == High Risk! Just look at our reputation, and our on-time, on-budget success rate…
  16. APPLYING MONTE CARLO METHODS TO SOFTWARE DEV
  17. Why? To answer tough questions: date and cost forecasts, impact of staff hire/loss, cost of defects, cost of blocking events… and my three 1:1 questions each week!
  18. But doesn't it require estimates? Yes, but very few. MUST: estimate major risks. SHOULD: column cycle-times and story counts.
  19. We need to estimate risk events. Major risk events play the predominant role in deciding where delivery actually occurs, yet we spend all our time estimating the routine work (1, 2, 3).
  20. Is it accurate? 1. Garbage-in still equals garbage-out. 2. It doesn't suffer from the "Flaw of Averages".
  21. Flaw of Averages (diagram: 50% of possible outcomes fall on each side of the average).
  22. (Histogram: frequency of result versus delivery date. The average release date sits near the developer estimates, but one major risk event, e.g. a race condition or third-party component failure, shifts the delivery shape to the right.)
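The "Flaw of Averages" can be demonstrated in a few lines: plugging average inputs into a plan does not yield the average outcome for any nonlinear model. In this invented example, the release ships when the later of two parallel tasks finishes, so a plan built from average durations is systematically optimistic:

```python
import random

def release_day():
    # Two parallel tasks; the release ships when the *later* one finishes.
    task_a = random.uniform(5, 15)  # averages 10 days
    task_b = random.uniform(5, 15)  # averages 10 days
    return max(task_a, task_b)

random.seed(1)
plan_from_averages = max(10.0, 10.0)  # plug average inputs into the plan
runs = [release_day() for _ in range(100_000)]
average_outcome = sum(runs) / len(runs)

print(f"plan built from average inputs: {plan_from_averages:.1f} days")
print(f"average simulated outcome:     {average_outcome:.1f} days")
```

The simulated average lands near 11.7 days, not 10, so a commitment made at the "average" plan date is late more often than not.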
  23. We need to estimate risk events: major risk events play the predominant role in deciding where delivery actually occurs, yet we spend all our time estimating the routine work (1, 2, 3). See model example…
  24-27. Risk likelihood changes constantly (animated sequence: the 95th confidence interval shifts as the likelihood of risks 1, 2 and 3 changes).
  28. DEMO: FORECASTING (DATES & COST). DEMO: SENSITIVITY (COST OF DEFECTS). DEMO: STAFF IMPACT (STAFF RISK). In case of demo disaster or no internet, press here…
  29. BEST PRACTICES AND TIPS
  30. The Model Creation Cycle: model (a little), visually test, Monte Carlo test, sensitivity test, repeat.
  31. The Experiment Cycle: baseline, make a single change, compare results, make informed decision(s), repeat.
  32. Best Practice 1: Start simple and add ONE input condition at a time. Visually and Monte Carlo test each input to verify it works.
  33. Best Practice 2: Find the likelihood of major events and estimate their delay, e.g. vendor dependencies, performance/memory issues, third-party component failures.
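One way to encode this practice in a model: give each major risk a likelihood of occurring and a delay range, and let each simulation run sample whether the risk fires. The risk names, probabilities and delays below are illustrative assumptions, not figures from the talk:

```python
import random

# Hypothetical major risks: (name, probability of occurring, delay range in days).
RISKS = [
    ("vendor dependency slips", 0.30, (5, 20)),
    ("performance/memory issue", 0.15, (10, 30)),
    ("third-party component fails", 0.05, (20, 60)),
]

def risk_delay():
    """Extra days contributed by risk events in one simulated run."""
    delay = 0
    for _name, probability, (low, high) in RISKS:
        if random.random() < probability:       # does this risk fire?
            delay += random.randint(low, high)  # if so, sample its delay
    return delay

random.seed(7)
runs = [risk_delay() for _ in range(50_000)]
print(f"runs with no risk delay: {runs.count(0) / len(runs):.0%}")
print(f"mean risk delay: {sum(runs) / len(runs):.1f} days")
```

Adding `risk_delay()` to a project-duration model is what produces the right-shifted delivery shape shown earlier.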
  34. Best Practice 3: Only obtain and add detailed estimates and opinion to a model if sensitivity analysis says that input is material.
  35. Best Practice 4: Use a uniform random input distribution UNTIL sensitivity analysis says that input is influencing the output.
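Best Practices 3 and 4 both rest on a simple form of sensitivity analysis: perturb one input at a time, re-run the simulation, and see how far the output moves. Inputs that barely move the output do not deserve detailed estimation. The three-column model below is an invented stand-in:

```python
import random
import statistics

def simulate(ranges, runs=20_000):
    """Mean total days when each stage draws uniformly from its range."""
    return statistics.fmean(
        sum(random.randint(low, high) for low, high in ranges)
        for _ in range(runs)
    )

random.seed(3)
baseline = [(1, 2), (1, 5), (1, 2)]  # Design, Develop, Test day ranges
base_mean = simulate(baseline)

# Sensitivity: double each stage's upper bound in turn and measure the shift.
for i, name in enumerate(["Design", "Develop", "Test"]):
    widened = list(baseline)
    low, high = widened[i]
    widened[i] = (low, high * 2)
    shift = simulate(widened) - base_mean
    print(f"{name}: doubling the upper bound shifts the mean by {shift:+.2f} days")
```

Develop moves the output far more than the other two stages, so it is the input worth estimating carefully.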
  36. Best Practice 5: Educate your managers about risk. They will still want a "single" date for planning, but let them decide on the 75th or 95th confidence level (average is NEVER an option).
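Reading a 75th or 95th percentile commitment date off the simulation output is straightforward once the runs are sorted. The duration model here is a stand-in; the percentile logic is the point:

```python
import random

random.seed(11)
# Stand-in model: total days for one simulated project run.
runs = sorted(random.randint(1, 5) + random.randint(1, 5)
              for _ in range(10_000))

def percentile(sorted_runs, level):
    """Smallest result that at least `level` percent of runs finish within."""
    index = min(len(sorted_runs) - 1, int(len(sorted_runs) * level / 100))
    return sorted_runs[index]

# Commit to a date you can defend at a chosen confidence, never the average.
print("75th percentile:", percentile(runs, 75), "days")
print("95th percentile:", percentile(runs, 95), "days")
```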
  37. Q1. Are we meeting our commitments? Is the likelihood of the model's forecast date increasing or decreasing? Q2. What are the top three risks jeopardizing on-time delivery? The top three items in the sensitivity report. Q3. What skillsets do your next three hires need to have? Skills applicable to the top three WIP-limit increases that cause the biggest reduction in the forecast.
  38. Call to action: read these books; download the software from FocusedObjective.com; follow @AgileSimulation; learn: http://strategicdecisions.stanford.edu/
  39. Questions? My contact details, and where to get these slides, the software or the book used in this session: FocusedObjective.com. Me: Troy.magennis@FocusedObjective.com. Follow: @AgileSimulation and @t_magennis
  40. BASICS OF MODELING AND SIMULATION. Return to main presentation…
  41. Manual Kanban Model & Simulation. Board: Backlog, Design (1–2 days), Develop (1–2 days), Test (1–2 days), Deployed. PLUS: for this manual example, at least 1 defect, blocking event and scope-creep item.
  42-52. Days 1-11 (step-by-step board walkthrough): each card entering a column gets a day count picked at random from that column's cycle-time range (Design 1–2 days, Develop 1–5 days, Test 1–2 days); cards move from Backlog through the columns to Deployed, an "Added Scope" card enters the backlog partway through and progresses across the board, and the walkthrough ends on Day 11.
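The walkthrough above can be automated. The sketch below keeps the same columns and ranges, draws a random cycle time per card per column, and enforces a WIP limit of one card per column; it ignores defects, blocking and added scope for brevity, and lets cards queue between columns:

```python
import random

# (column name, low days, high days), as in the manual example.
COLUMNS = [("Design", 1, 2), ("Develop", 1, 5), ("Test", 1, 2)]

def simulate_board(card_count):
    """Days until all cards reach Deployed, one card at a time per column."""
    free_at = [0] * len(COLUMNS)  # first day each column is free (WIP = 1)
    finish = 0
    for _card in range(card_count):
        done = 0  # day this card leaves the previous column
        for i, (_name, low, high) in enumerate(COLUMNS):
            start = max(done, free_at[i])             # wait for card AND column
            done = start + random.randint(low, high)  # random cycle time
            free_at[i] = done
        finish = done
    return finish

random.seed(5)
runs = [simulate_board(card_count=5) for _ in range(1_000)]
print(f"5 cards deployed in {min(runs)}-{max(runs)} days "
      f"(mean {sum(runs) / len(runs):.1f})")
```

Running `simulate_board` many times and tallying the results produces the histograms on the next slides.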
  53-55. Result versus Frequency at 50, 250 and 1000+ runs (histograms, result values in days): with more runs the frequency-of-result curve smooths out, and rare long-tail results (out to 60 days at 1000+ runs) start to appear.
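The effect of run count can be checked directly: with a stand-in two-stage model, few runs produce a noisy histogram that misses rare results, while many runs settle into a stable shape:

```python
import random
from collections import Counter

def result():
    # Stand-in model: two stages of 1-5 days each, total 2-10 days.
    return random.randint(1, 5) + random.randint(1, 5)

random.seed(2)
for runs in (50, 250, 5_000):
    counts = Counter(result() for _ in range(runs))
    mode_share = max(counts.values()) / runs  # share of the most common result
    print(f"{runs:>5} runs: {len(counts)} distinct results seen, "
          f"mode share {mode_share:.2f}")
```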
  56. Central Limit Theorem. Return to main presentation…
  57. Flaw of Averages (diagram: 50% of possible outcomes fall on each side of the average). Return to main presentation…
  58. Software Development Model, built up in layers: 1. Columns & WIP model; 2. Staff work; 3. Added work; 4. Blocking events; 5. Defects; 6. Vacations; 7. … Return to main presentation…
  59. Return to main presentation…
  60. SIMULATION EXAMPLES. Return to main presentation…
  61. Forecasts (chart: likelihood from certain to unlikely). Return to main presentation…
  62. Forecasts with the 50%/50% possible-outcomes overlay (chart: likelihood from certain to unlikely). Return to main presentation…
  63. Sensitivity Report: actively manage the top items; ignore the rest for the moment. Return to main presentation…
  64. Staff Skill Impact Report: explore what staff changes have the greatest impact. Return to main presentation…
  65. Return to main presentation…
  66. MULTI-MODAL RESULT MODEL
  67. Return to main presentation…
  68. Return to main presentation…
     <setup>
       <backlog type="custom">
         <deliverable name="work">
           <custom count="10">Build website</custom>
         </deliverable>
         <deliverable name="performance issues, add caching" skipPercentage="50">
           <custom count="10">Rework: Performance Issues</custom>
         </deliverable>
       </backlog>
       <columns>
         <column id="1" estimateLowBound="1" estimateHighBound="5" wipLimit="1">Develop</column>
         <column id="2" estimateLowBound="1" estimateHighBound="5" wipLimit="1">Test</column>
       </columns>
     </setup>
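The second deliverable's skipPercentage="50" is what makes this model multi-modal: half of the simulated runs include ten rework items, half skip them, producing two distinct humps in the result histogram. The rough re-implementation below parses the same configuration and sums cycle times per item; it is my sketch of the idea, not the tool's actual engine, which also honors WIP limits:

```python
import random
import xml.etree.ElementTree as ET

CONFIG = """\
<setup>
  <backlog type="custom">
    <deliverable name="work">
      <custom count="10">Build website</custom>
    </deliverable>
    <deliverable name="performance issues, add caching" skipPercentage="50">
      <custom count="10">Rework: Performance Issues</custom>
    </deliverable>
  </backlog>
  <columns>
    <column id="1" estimateLowBound="1" estimateHighBound="5" wipLimit="1">Develop</column>
    <column id="2" estimateLowBound="1" estimateHighBound="5" wipLimit="1">Test</column>
  </columns>
</setup>
"""

setup = ET.fromstring(CONFIG)
columns = [(int(c.get("estimateLowBound")), int(c.get("estimateHighBound")))
           for c in setup.find("columns")]

def one_run():
    """Total days for one run: every item passes through every column."""
    days = 0
    for deliverable in setup.find("backlog"):
        skip = int(deliverable.get("skipPercentage", "0"))
        if random.random() * 100 < skip:
            continue  # this run skips the whole deliverable
        item_count = sum(int(c.get("count")) for c in deliverable)
        days += sum(random.randint(low, high)
                    for _ in range(item_count)
                    for low, high in columns)
    return days

random.seed(9)
runs = [one_run() for _ in range(5_000)]
with_rework = sum(1 for r in runs if r >= 90)
print(f"runs in the rework hump: {with_rework / len(runs):.0%}")
```

One hump centers near 60 days (no rework) and the other near 120 (rework included), which is why an average of such a distribution describes almost no actual outcome.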
  69. Return to main presentation…
