Simple measurements

Experiential report on using Kanban boards and introducing quantification into process and deliverables

  1. Simple Measurements. Schalk W. Cronjé, ysb33r @ gmail.com, @ysb33r. Agile Cambridge 2011 © Schalk W. Cronjé
  2. What is Agile? Why are you not measuring your process today? How do you know when your delivery was successful?
  3. “What is Agile?” - Responses
      ● “Continuous development done in small iterations”
      ● “Adapting to rapid change, predictable periodic delivery of usable software”
      ● “Developers and testers work together from beginning to end”
      ● “You'd only get a cynical answer from me”
  4. What is Agile, Fundamentally? Delivering value to stakeholders: customers, users, developers & testers, governments & legislators, societal institutions. That value takes forms such as:
      ● Increased revenue
      ● Reduction in costs
      ● Faster response times
      ● Less rework
  5. Opposing Forces? Software craftsmanship vs. software engineering; accuracy vs. precision.
  6. Software Engineering
      ● Implies measurement
      ● Implies Kolb / Shewhart / Deming cycles
      ● Is not code craftsmanship
      ● SE receives bad press
        – Perceived difficulty in measurement
        – Lack of measurement in S/W development
  7. Accuracy: accuracy is to precision as engineering is to mathematics.
  8. Accuracy: we need to be accurate enough to get there, but no more accurate than that.
  9. Properties of a Kanban System
  10. Taking an Economic View
      ● It helps to quantify the effects of multiple interacting variables
      ● It helps us understand that the customer is not the only judge of value
      ● Using an economic framework allows us to maximise value, including
        – cycle time
        – product cost
        – development expense
      ● It helps to communicate with non-technical decision makers
  11. Background: 3 teams, 3 locations, 18 months
  12. David Anderson's Six Steps
      ● Focus on quality
      ● Reduce work-in-progress
      ● Deliver often
      ● Balance demand against throughput
      ● Prioritise
      ● Attack sources of variability to improve predictability
  13. Visualisation
  14. Visualisation
      ● Honesty!
      ● Makes it transparent to everyone what is going on
      ● Visual cue for bottlenecks
      ● Manual board and easy-to-use online system
  15. Business Value Increments
      ● Delivery is in business value increments (BVIs)
      ● Only “done” when the increment is delivered end-to-end
      ● No burning down of tasks! (false economy)
      ● Each increment must have at least one metric of the value it will deliver
  16. Batch Development (diagram): Features 1 to 4 each pass through Dev and then QA, accumulating into a single delivered build.
  17. Batch Development (diagram): Total time = dev time (features 1..4) + QA time (features 1..4) + fix delay + fix time + retest time. Defects found in QA trickle into a bug database, and the fixes are retested alongside QA of features 5+ in a further delivered build.
  18. Time Factors (diagram): the same batch flow, annotated with where the time goes: understanding specs, building test infrastructure, basic build verification of each delivered build, the time from raising a defect until a fix is available for testing, and the time to re-test.
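
The total-time formula on slide 17 can be made concrete with a few numbers. The sketch below is only illustrative: the durations are invented, not taken from the deck, and only the shape of the formula comes from the slides.

```python
# Illustrative only: the slide-17 batch total time with made-up durations (in days).
dev_times = [3, 2, 4, 3]   # devt1..devt4: development time per feature
qa_times = [2, 1, 2, 2]    # qat1..qat4: QA time per feature
fix_delay = 3              # wait until defect fixes reach a testable build
fix_time = 2               # time spent fixing the defects
retest_time = 2            # time to re-test the fixed build

total_time = sum(dev_times) + sum(qa_times) + fix_delay + fix_time + retest_time
print(f"Total batch time: {total_time} days")   # 12 + 7 + 3 + 2 + 2 = 26 days
```
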
  19. 19. "If you only quantify one thing, quantify the cost of delay" Don Reinertsen Agile Cambridge 2011 © Schalk W. Cronjé
  20. Cost of Delay - Quality Issue
      ● One small feature, not properly tested
      ● Found post-release
      ● Time wasted by users having to do manual work, quantified at £35,000:
        TotalManualTimeSpent x AvgCostToCompanyPerUser + TimeToRework x AvgCostToCompanyPerDev
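
As a hedged sketch of how a figure like the £35,000 above could be produced, the example below plugs invented numbers into the slide's formula; the hours and rates are assumptions chosen so the result lands on £35,000, not the deck's actual data.

```python
# Minimal sketch of the slide-20 cost-of-delay calculation; all inputs are illustrative.
total_manual_hours = 1000        # TotalManualTimeSpent: hours users spent on manual workarounds
avg_cost_per_user_hour = 30.0    # AvgCostToCompanyPerUser, in £/hour
rework_hours = 100               # TimeToRework: hours needed to fix the feature properly
avg_cost_per_dev_hour = 50.0     # AvgCostToCompanyPerDev, in £/hour

cost_of_delay = (total_manual_hours * avg_cost_per_user_hour
                 + rework_hours * avg_cost_per_dev_hour)
print(f"Cost of delay: £{cost_of_delay:,.0f}")   # £35,000 with these example numbers
```
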
  21. Limit Work-in-Progress (board columns: Ready | Spec | Spec Complete | Dev | Dev Complete | QA | QA Complete | Released)
      ● Use a pull system
      ● Managing queues is easier than managing timelines
      ● Allows for better throughput utilisation
      ● Identifies bottlenecks in the system
      ● Measuring cycle time allows better prediction than traditional estimation
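
The pull behaviour described on this slide can be illustrated in a few lines. This is a minimal sketch under my own assumptions (the class, column names, and limits are invented); the deck describes the practice, not an implementation.

```python
# Minimal sketch of a WIP-limited pull board (illustrative, not from the deck).
class KanbanBoard:
    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                       # column -> WIP limit
        self.columns = {name: [] for name in wip_limits}   # column -> cards, in flow order

    def add(self, card, column="Ready"):
        self.columns[column].append(card)

    def pull(self, card, into):
        """Pull a card into `into` only if that column is under its WIP limit."""
        if len(self.columns[into]) >= self.wip_limits[into]:
            return False          # downstream column is full: do not push more work
        for cards in self.columns.values():
            if card in cards:
                cards.remove(card)
                self.columns[into].append(card)
                return True
        return False              # card was not on the board

board = KanbanBoard({"Ready": 10, "Spec": 2, "Dev": 3, "QA": 2, "Released": 999})
board.add("BVI-1")
print(board.pull("BVI-1", into="Spec"))   # True while the Spec column has capacity
```
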
  22. Policies (board columns as on slide 21)
      ● Each stage has a policy
      ● The policy describes the generic activities required
      ● A “Definition of Done” for each stage
  23. Feedback
      ● Faster feedback makes learning faster and more efficient
      ● Faster feedback provides a sense of control
      ● Large batches lead to slower feedback
  24. Measuring Cycle Time
      ● Measure the average time each BVI spends in the system
      ● Less time is spent on estimating future delivery times
      ● Historic data automatically takes disruptions into account, such as
        – Operational issues
        – Days off sick
      Time = (AvgTimeInSystem x NoOfBVIs) / Concurrency
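
A minimal sketch of the prediction formula above. The historic cycle times, backlog size, and concurrency are invented for illustration; only the formula comes from the slide.

```python
# Time = (AvgTimeInSystem x NoOfBVIs) / Concurrency, with illustrative inputs.
historic_cycle_times = [14, 22, 18, 26]   # days each recently delivered BVI spent in the system
avg_time_in_system = sum(historic_cycle_times) / len(historic_cycle_times)   # 20 days

number_of_bvis = 6    # BVIs still to deliver
concurrency = 3       # BVIs the team works on in parallel

predicted_time = avg_time_in_system * number_of_bvis / concurrency
print(f"Predicted delivery time: {predicted_time:.0f} days")   # 20 * 6 / 3 = 40 days
```
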
  25. Cadences
      ● Delivery happens at regular cadences
      ● What is ready is delivered; what is not is left out
      ● No artificial time-boxing or break points as in Scrum-like approaches
      ● Cycle time leads to better predictability of what can actually be delivered
  26. Historic Data vs Estimation
  27. Simple Flow Model: Specify → Develop → QA, mapped onto a Kanban board with the columns Ready | Spec | Spec Complete | Dev | Dev Complete | QA | QA Complete | Released.
  28. Reducing Learning Time
      ● Writing test specifications after development is costly in time
        – Knowledge decay
        – Leads to incomplete designs
      ● Move the test specification up front, before any development starts
        – This is part of requirements discovery
        – It broadens the perspective of the programmer
        – Allows us to distinguish between automated checks and human testing
  29. Reducing Learning Time (diagram): with the test specification moved into the Specify stage, the flow Specify → Develop → QA gains time to specify and time to develop, but loses time to re-work and time to re-test.
  30. Change the Flow Model: Specify → Develop & Check → Verify
      ● Ensure that all automated checks are included in the development stage
      ● Free up the Verify stage for exploratory testing and for validating that the process has been fulfilled
  31. Change the Flow Model (diagram): Specify → Develop & Check → Verify. +Time to develop, -Time to re-work; +More testing people, -Time to re-test.
  32. Real-world Measurements (stages: Specify, Develop & Check, Verify; figures are average cycle days per feature, "1h" = one hour)
                                  S1     S2     S3     S4
      READY                        4      6      5     1h
      SPEC                         1      0.5    6     0.5
      SPEC / Completed             2      0.5    1     0.5
      DEV + Check                  2      1     13     4
      DEV + Check / Completed      3      6      4    18
      VERIFY                       3      7      8     1
      VERIFY / Completed           2      5      6    18
      Blockages                    0.5   1h      2    1h
      Total                       17     26     43    42
  33. What is wrong? (same table as slide 32)
  34. What is wrong? Excessively long time between feature availability and testing (see the 18-day DEV + Check / Completed queue for S4; same table as slide 32). Is there enough testing bandwidth? Or is this team just cheating? A small flow-efficiency check on these numbers follows below.
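
One way to see the problem is to split each team's cycle time into working and waiting portions. The sketch below does this for the slide-32 numbers under two assumptions of mine: READY and the "Completed" columns count as waiting, and the slide's Total row excludes blockages (the figures only add up that way), so blockages are left out here too. The "1h" entries are approximated as 0.05 days.

```python
# Rough flow-efficiency check on the slide-32 table (assumptions noted above).
stages = ["READY", "SPEC", "SPEC/Completed", "DEV + Check",
          "DEV + Check/Completed", "VERIFY", "VERIFY/Completed"]
teams = {
    "S1": [4,    1,   2,   2,   3,  3,  2],
    "S2": [6,    0.5, 0.5, 1,   6,  7,  5],
    "S3": [5,    6,   1,   13,  4,  8,  6],
    "S4": [0.05, 0.5, 0.5, 4,  18,  1, 18],
}
WAITING = {"READY", "SPEC/Completed", "DEV + Check/Completed", "VERIFY/Completed"}

for team, days in teams.items():
    waiting = sum(d for name, d in zip(stages, days) if name in WAITING)
    total = sum(days)
    print(f"{team}: {total:.0f} days total, {waiting:.1f} waiting "
          f"({waiting / total:.0%} of the cycle time)")
```

With these assumptions S4 spends roughly 87% of its cycle time in queues, which is what the slide is pointing at.
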
  35. Quantifying BVIs
  36. Quantifying BVIs
      ● Know what you want to achieve
      ● Establish a scale for measurement
      ● Decide how to measure
      ● Make constraints explicit
      ● Communicate
  37. Quantify BVIs - Example: For all customer submissions that meet the following criteria (Criteria A, Criteria B, etc.), 80% should be handled by an automation system instead of a human, with resolution achieved within 30 minutes.
  38. Planguage
      ● Created by Tom Gilb
      ● Provides a structured way of quantifying requirements
      ● Allows for reuse of specifications
  39. Quantify BVI - Planguage
      Name: Customer submissions resolved by automation systems
      Scale: Percentage of submissions resolved by automation in relation to all submissions
      Meter: Query on the database using the following statement: SELECT … FROM … WHERE ...
      Goal: 80% [Criteria A, Criteria B]
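
The Planguage fields translate naturally into code. The sketch below is an assumption-laden illustration: the Submission record and the in-memory meter stand in for the database query the slide elides, and the meter folds in the 30-minute criterion from slide 37.

```python
# Illustrative Planguage-style requirement: Scale as a meter function, Goal as a threshold.
from dataclasses import dataclass

@dataclass
class Submission:
    resolved_by_automation: bool
    minutes_to_resolve: float

def automation_meter(submissions):
    """Scale: percentage of submissions resolved by automation, relative to all submissions."""
    resolved = [s for s in submissions
                if s.resolved_by_automation and s.minutes_to_resolve <= 30]
    return 100.0 * len(resolved) / len(submissions)

GOAL = 80.0   # Goal: 80% [Criteria A, Criteria B]

sample = [Submission(True, 12), Submission(True, 45),
          Submission(False, 90), Submission(True, 20)]
level = automation_meter(sample)
print(f"Meter reads {level:.0f}% against a goal of {GOAL:.0f}%")   # 50% here: goal not met
```
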
  40. Applying the Economic Model
      ● Compare with the average human time to respond and the volume covered
      ● Extrapolate to time savings or cost savings
      ● This allows for two measures of success
        – Technical
        – Financial
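
A rough sketch of the extrapolation, with every input invented for illustration: it converts the technical measure (automation coverage) into hours and money saved compared with human handling.

```python
# Illustrative only: turn automation coverage into time and cost savings.
submissions_per_month = 2000
automation_coverage = 0.80            # the slide-39 goal, if it is met
human_minutes_per_submission = 25     # assumed average human time to respond
cost_per_human_hour = 30.0            # assumed fully loaded cost, £/hour

hours_saved = submissions_per_month * automation_coverage * human_minutes_per_submission / 60
cost_saved = hours_saved * cost_per_human_hour
print(f"~{hours_saved:.0f} hours/month saved, roughly £{cost_saved:,.0f}/month")
```
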
  41. "Kanban fails when people don't want to face the truth." - Hillel Glazer
  42. It all fell apart ...
      ● Restructure: 3 teams + 3 managers
      ● 2 managers rejected Kanban
        – 1 manager wanted ScrumWorks
        – 1 manager used Microsoft Project
      ● 1 manager was forced not to use Kanban
        – Switched to iteration-based deliveries
      ● All teams were forced back to time-based estimation
  43. It fell apart even more ...
      ● 1 team decided on 4 weeks of development + 4 weeks of testing
        – Quality was worse than before
        – Cycle time was at least 8 calendar weeks per BVI
      ● 1 team did not deliver for over 5 months
        – Finally delivered something the user did not want
      ● None of the teams has any measurements of cycle time, so none can predict how long delivery will take
  44. What did the team members say?
      ● “I have lost visualisation. I have no idea what is going on”
      ● “We are not using Kanban, because the company forced us to SCRUM”
      ● “If I was given a choice, my team and I would definitely go with Kanban”
      ● “It is not the company standard, as such you need to sell the process frequently”
  45. Lessons Learnt
  46. Apply Lean Principles to your Team (Specify → Develop & Check → Verify)
      ● Make the system pull-based
      ● Reduce the batch size
      ● Limit the work-in-progress
      ● Work one feature end-to-end
      ● If the flow breaks, fix it immediately
      ● Measure cycle time
      ● Quantify requirements
      ● Use Cost of Delay where applicable
  47. Paired Teams
      ● Pair up people with different skills to work on the same feature
        – Feature teams are not a new concept
      ● Could be perceived to have a higher person cost per feature
        – Instead of distributing the person cost over multiple queues, the cost is combined into a single queue
        – Can actually lead to reduced cycle time (see the sketch below)
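
A back-of-the-envelope illustration of the single-queue argument. The durations and the assumption that a pair can run test work concurrently with development are mine, not the deck's; the point is only that combining skills into one queue can shorten elapsed time per feature even if the person cost per feature looks higher.

```python
# Illustrative comparison of hand-off vs. paired work on one feature (all numbers assumed).
dev_days, test_days = 3, 2
handoff_wait = 2        # feature sits in a "Dev Complete" queue until a tester is free
defect_roundtrip = 1    # defect found late goes back to the developer and is re-tested

# Separate dev and test queues: work is sequential, with waits and a rework loop.
handoff_cycle = dev_days + handoff_wait + test_days + defect_roundtrip

# Paired team: test specification and checking proceed alongside development, so the
# elapsed time is roughly the longer of the two activities plus a short joint wrap-up.
paired_cycle = max(dev_days, test_days) + 1

print(f"Hand-off cycle time: {handoff_cycle} days")   # 8 days
print(f"Paired cycle time:   {paired_cycle} days")    # 4 days
```
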
  48. Simple measurements are a foundation of software engineering, and not for measuring humans.
