Simple measurements
Experiential report on using Kanban boards and introducing quantification into process and deliverables


Presentation Transcript

• Simple Measurements
  Schalk W. Cronjé, ysb33r@gmail.com, @ysb33r
  Agile Cambridge 2011 © Schalk W. Cronjé
• What is Agile?
  ● Why are you not measuring your process today?
  ● How do you know when your delivery was successful?
• “What is Agile?” - Responses
  ● “Continuous development done in small iterations”
  ● “Adapting to rapid change, predictable periodic delivery of usable software”
  ● “Developers and testers work together from beginning to end”
  ● “You'd only get a cynical answer from me”
• What is Agile Fundamentally?
  ● Increased revenue
  ● Reduction in costs
  ● Faster response times
  ● Less rework
  Delivering value to stakeholders:
  ● Customers
  ● Users
  ● Developers & Testers
  ● Governments & Legislators
  ● Societal Institutions
• Opposing Forces? Software Craftsmanship vs Software Engineering, Accuracy vs Precision
• Software Engineering
  ● Implies measurement
  ● Implies Kolb / Shewhart / Deming cycles
  ● Is not code craftsmanship
  ● SE receives bad press
    – Perceived difficulty in measurement
    – Lack of measurement in software development
• Accuracy
  Accuracy is to precision as engineering is to mathematics
• Accuracy
  We need to be accurate enough to get there, but no more accurate than that
• Properties of a Kanban System
• Taking an Economic View
  ● It helps to quantify the effects of multiple interacting variables
  ● It helps us to understand that the customer is not the only judge of value
  ● An economic framework allows us to maximise value, including:
    – cycle time
    – product cost
    – development expense
  ● It helps to communicate with non-technical decision makers
• Background
  3 teams – 3 locations – 18 months
• David Anderson's Six Steps
  ● Focus on quality
  ● Reduce work-in-progress
  ● Deliver often
  ● Balance demand against throughput
  ● Prioritise
  ● Attack sources of variability to improve predictability
• Visualisation
• Visualisation
  ● Honesty!
  ● Makes it transparent to everyone what is going on
  ● Visual cue for bottlenecks
  ● Manual board and easy-to-use online system
• Business Value Increments
  ● Delivery is in business value increments (BVIs)
  ● Only “done” when the increment is delivered end-to-end
  ● No burning down of tasks! (false economy)
  ● Each increment must have at least one metric of the value it will deliver
• Batch Development (diagram)
  Features 1–4 each flow through Dev and then QA; a single delivered build follows once all four are complete.
• Batch Development (diagram, continued)
  Total time = dev_t1 + … + dev_t4 + qa_t1 + … + qa_t4 + fix delay + fix time + retest time
  Defects found in QA trickle into the bug DB while Feature 5+ development proceeds; the fixes are retested before the next delivered build.
• Time Factors (diagram)
  The batch flow above is annotated with where the time goes: building test infrastructure, understanding specs, basic build verification of the delivered build, the delay from raising a defect until a fix is available for testing, and the time to re-test.
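A minimal sketch of the total-time arithmetic on the two slides above. Every duration below is an invented placeholder, not a figure from the talk; it only shows how the delayed fix-and-retest tail adds to a batch delivery.

```python
# Illustrative batch-delivery time calculation (all numbers are made up).
# Total time = sum of dev times + sum of QA times + fix delay + fix time + retest time.

dev_days = [3, 2, 4, 3]   # devt1..devt4 (assumed)
qa_days = [2, 1, 2, 2]    # qat1..qat4 (assumed)
fix_delay = 5             # days defects wait before a fix build is available (assumed)
fix_time = 3              # days spent fixing (assumed)
retest_time = 4           # days re-testing the fixes (assumed)

total_days = sum(dev_days) + sum(qa_days) + fix_delay + fix_time + retest_time
print(f"Batch total time: {total_days} days")  # 31 days for these made-up inputs
```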
    • "If you only quantify one thing, quantify the cost of delay" Don Reinertsen Agile Cambridge 2011 © Schalk W. Cronjé
• Cost of Delay – Quality Issue
  ● One small feature, not properly tested
  ● Found post-release
  ● Time wasted by users having to do the work manually, quantified => £35,000
  Cost of delay = TotalManualTimeSpent × AvgCostToCompanyPerUser + TimeToRework × AvgCostToCompanyPerDev
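A hedged sketch of the slide's cost-of-delay formula. The hours and rates below are invented and merely chosen so the arithmetic lands on the £35,000 mentioned; the actual inputs were not given in the talk.

```python
# Cost of delay for the quality issue, per the slide's formula:
#   TotalManualTimeSpent * AvgCostToCompanyPerUser + TimeToRework * AvgCostToCompanyPerDev
# All inputs below are assumptions for illustration only.

total_manual_hours = 800       # hours users spent doing the work manually (assumed)
avg_cost_per_user_hour = 40.0  # GBP per user-hour to the company (assumed)
rework_hours = 60              # developer hours to rework the feature (assumed)
avg_cost_per_dev_hour = 50.0   # GBP per developer-hour (assumed)

cost_of_delay = (total_manual_hours * avg_cost_per_user_hour
                 + rework_hours * avg_cost_per_dev_hour)
print(f"Cost of delay: £{cost_of_delay:,.0f}")  # £35,000 with these illustrative numbers
```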
• Limit Work-in-Progress
  Board columns: Ready | Spec | Spec Complete | Dev | Dev Complete | QA | QA Complete | Released
  ● Use a pull system
  ● Managing queues is easier than managing timelines
  ● Allows for better throughput utilisation
  ● Identifies bottlenecks in the system
  ● Measuring cycle time allows better prediction than traditional estimation
• Policies
  Board columns: Ready | Spec | Spec Complete | Dev | Dev Complete | QA | QA Complete | Released
  ● Each stage has a policy
  ● Describes the generic activities required
  ● A “Definition of Done” for each stage
• Feedback
  ● Faster feedback makes learning faster and more efficient
  ● Faster feedback provides a sense of control
  ● Large batches lead to slower feedback
• Measuring Cycle Time
  ● Measure the average time in the system of each BVI
  ● Less time is spent on estimating future delivery times
  ● Historic data automatically takes into account any disruptions, such as:
    – Operational issues
    – Days off sick
  Time = (AvgTimeInSystem × NoOfBVIs) / Concurrency
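A minimal sketch of the forecast formula above, using historic cycle times instead of estimates. The historic values, the number of remaining BVIs, and the concurrency are all assumed for illustration.

```python
# Forecast delivery time from historic cycle times rather than estimation.
# Time = AvgTimeInSystem * NoOfBVIs / Concurrency

historic_cycle_days = [18, 25, 40, 37]  # days each past BVI spent in the system (assumed)
remaining_bvis = 6                      # BVIs still to deliver (assumed)
concurrency = 2                         # BVIs worked on in parallel (assumed)

avg_time_in_system = sum(historic_cycle_days) / len(historic_cycle_days)
forecast_days = avg_time_in_system * remaining_bvis / concurrency
print(f"Average time in system: {avg_time_in_system:.1f} days")
print(f"Forecast for {remaining_bvis} BVIs: {forecast_days:.0f} days")
```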
• Cadences
  ● Delivery is done at regular cadences
  ● What is ready is delivered; what is not is left out
  ● No artificial time-boxing or break points as in SCRUM-like approaches
  ● Cycle time leads to better predictability of what can actually be delivered
• Historic Data vs Estimation
• Simple Flow Model (diagram)
  Flow: Specify → Develop → QA
  Kanban board columns: Ready | Spec | Spec Complete | Dev | Dev Complete | QA | QA Complete | Released
• Reducing Learning Time
  ● Writing test specifications after the development is costly in time
    – Knowledge decay
    – Leads to incomplete designs
  ● Move the test specification up-front, before any development starts
    – This is part of requirements discovery
    – It broadens the perspective of the programmer
    – Allows us to distinguish between automated checks and human testing
• Reducing Learning Time (diagram)
  With the test specification moved into the Specify stage (Specify → Develop → QA): +time to specify and +time to develop, but -time to re-work and -time to re-test.
• Change the Flow Model
  Flow: Specify → Develop & Check → Verify
  ● Ensure that all automated checks are included in the development stage
  ● Free up the Verify stage for exploratory testing and for validating that the process has been fulfilled
• Change the Flow Model (diagram)
  Specify → Develop & Check → Verify: +time to develop and +more testing people, but -time to re-work and -time to re-test.
• Real-world measurements (Specify → Develop & Check → Verify)
  Average cycle days per feature (values in days unless marked h for hours):

  Stage                     S1    S2    S3    S4
  READY                     4     6     5     1h
  SPEC                      1     0.5   6     0.5
  SPEC / Completed          2     0.5   1     0.5
  DEV + Check               2     1     13    4
  DEV + Check / Completed   3     6     4     18
  VERIFY                    3     7     8     1
  VERIFY / Completed        2     5     6     18
  Blockages                 0.5   1h    2     1h
  Total                     17    26    43    42
• What is wrong?
  (Same table as above.)
• What is wrong?
  Excessively long time between feature availability and testing. Is there enough testing bandwidth? Or is this team just cheating?
  (Same table as above.)
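A hedged sketch of how per-stage averages like the table above could be produced from a board's movement log. The event format, card names, and durations below are assumptions for illustration, not the author's tooling or data.

```python
# Compute the average time spent in each board stage from card-movement events.
# Event format (card_id, stage, entered_day, left_day) is a simplifying assumption.
from collections import defaultdict

events = [
    # card, stage,         entered, left   (days since start; illustrative only)
    ("F1", "READY",        0,    4),
    ("F1", "SPEC",         4,    5),
    ("F1", "DEV + Check",  5,    7),
    ("F1", "VERIFY",       7,    10),
    ("F2", "READY",        0,    6),
    ("F2", "SPEC",         6,    6.5),
    ("F2", "DEV + Check",  6.5,  7.5),
    ("F2", "VERIFY",       7.5,  14.5),
]

durations_per_stage = defaultdict(list)
for _, stage, entered, left in events:
    durations_per_stage[stage].append(left - entered)

for stage, durations in durations_per_stage.items():
    print(f"{stage:12s} avg {sum(durations) / len(durations):.1f} days")
```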
• Quantifying BVIs
• Quantifying BVIs
  ● Know what you want to achieve
  ● Establish a scale for measurement
  ● Decide how to measure
  ● Make constraints explicit
  ● Communicate
• Quantify BVIs - Example
  For all customer submissions that meet the following criteria, 80% should be handled by an automation system instead of a human, with resolution achieved within 30 minutes.
  ● Criteria A
  ● Criteria B
  ● etc.
• Planguage
  ● Created by Tom Gilb
  ● Provides a structured way of quantifying requirements
  ● Allows for reuse of specifications
• Quantify BVI - Planguage
  Name: Customer submissions resolved by automation systems
  Scale: Percentage of submissions resolved by automation in relation to all submissions
  Meter: Query on database using the following statement: SELECT … FROM … WHERE ...
  Goal: 80% [Criteria A, Criteria B]
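A minimal sketch of evaluating the Planguage Scale against the Goal above. The two counts stand in for the results of the elided SELECT statement and are assumed values, not real measurements.

```python
# Evaluate the Planguage goal: >= 80% of qualifying submissions resolved by automation.
# The two counts would come from the (elided) database query; here they are assumptions.

resolved_by_automation = 412  # submissions matching the criteria handled by automation (assumed)
total_submissions = 500       # all submissions matching the criteria (assumed)
goal_percent = 80.0           # Goal from the Planguage specification

scale_percent = 100.0 * resolved_by_automation / total_submissions
print(f"Scale: {scale_percent:.1f}% resolved by automation (Goal: {goal_percent:.0f}%)")
print("Goal met" if scale_percent >= goal_percent else "Goal not met")
```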
• Applying the Economic Model
  ● Compare to average human time to respond and volume covered
  ● Extrapolate to time savings or cost savings
  ● This allows for two measures of success:
    – Technical
    – Financial
• Kanban fails when people don't want to face the truth. - Hillel Glazer
• It all fell apart ...
  ● Restructure – 3 teams + 3 managers
  ● 2 managers rejected Kanban
    – 1 manager wanted ScrumWorks
    – 1 manager used Microsoft Project
  ● 1 manager was forced not to use Kanban
    – Switched to iteration-based deliveries
  ● All teams were forced back to time-based estimation
• It fell apart even more ...
  ● 1 team decided on 4 weeks of development + 4 weeks of testing
    – Quality was worse than before
    – Cycle time was at least 8 calendar weeks per BVI
  ● 1 team did not deliver for over 5 months
    – Finally delivered something the user did not want
  ● None of the teams has any measurements of cycle time, so they cannot predict how long delivery will take
• What did the team members say?
  ● “I have lost visualisation. I have no idea what is going on.”
  ● “We are not using Kanban, because the company forced us to SCRUM.”
  ● “If I was given a choice, my team and I would definitely go with Kanban.”
  ● “It is not the company standard, so you need to sell the process frequently.”
• Lessons Learnt
• Apply Lean Principles to your Team (Specify → Develop & Check → Verify)
  ● Make the system pull-based
  ● Reduce the batch size
  ● Limit the work-in-progress
  ● One feature end-to-end
  ● If the flow breaks, fix it immediately
  ● Measure cycle time
  ● Quantify requirements
  ● Use Cost of Delay where applicable
• Paired Teams
  ● Pair up people with different skills to work on the same feature
    – Feature teams are not a new concept
  ● Could be perceived to have a higher person cost per feature
    – Instead of distributing the person cost over multiple queues, the cost is combined into a single queue
    – Can actually lead to reduced cycle time
• Simple measurements are a foundation of software engineering, not a tool for measuring humans.