Sagi Smolarski ITG - Enterprise Metrics on Agile
Slide notes
  • Photo by Daniel Goodwin - http://www.flickr.com/photos/dgoodphoto/
  • Transition of a team includes 3 days of training, followed by 2 days at the end of the first iteration.
  • Sten [CC-BY-SA-3.0-2.5-2.0-1.0 (www.creativecommons.org/licenses/by-sa/3.0) or GFDL (www.gnu.org/copyleft/fdl.html)], via Wikimedia Commons
  • William Thomson, 1st Baron Kelvin, helped establish the first and second laws of thermodynamics.
  • How many of you relate to these needs? How many of you are collecting process/quality metrics?
  • Inspect and Adapt is key and we want to make the most of it. Teams have a hard time interpreting charts, especially in situations where they are not very good or very bad. Teams often don’t take the time to look at the charts. An easy indication of good/bad can at least tell them that there might be a problem.
  • I’m going to focus on the quantitative side
  • The first "computer bug": a moth found trapped between points at Relay #70, Panel F, of the Mark II Aiken Relay Calculator while it was being tested at Harvard University, 9 September 1947. The operators affixed the moth to the computer log with the entry: "First actual case of bug being found". They put out the word that they had "debugged" the machine, thus introducing the term "debugging a computer program". In 1988 the log, with the moth still taped by the entry, was in the Naval Surface Warfare Center Computer Museum at Dahlgren, Virginia, which erroneously dated it 9 September 1945. The Smithsonian Institution's National Museum of American History and other sources have the correct date of 9 September 1947 (Object ID: 1994.0191.01); the Harvard Mark II computer was not complete until the summer of 1947. Photo in the public domain.
  • Is this good or bad?
  • What is the problem here
  • What is the problem here
  • Is this good or bad?
  • Transcript

    • 1. Enterprise Metrics on Agile
      Sagi Smolarski, Director, Software Development
      Sagi.Smolarski@itg.com
      [CC] Daniel Goodwin via Wikimedia Commons
    • 2. Disclaimers
      The information contained in this presentation is provided for informational purposes only.
      All trademarks, service marks, and trade names not owned by ITG are owned by their respective owners.
      …in other words, the data presented has been modified and does not represent real data for ITG teams.
    • 3. Agenda
      Why do we need metrics for agile?
      How do we generate those metrics?
      Which metrics do we look at?
      Pros and cons of looking at those metrics.
    • 4. Investment Technology Group (NYSE: ITG, www.itg.com)
      Leading provider of trading solutions: trading front-ends, trading algorithms, research (fundamental analytics, pre/post-trade), trading desks, electronic trading connectivity, liquidity pool.
      Development operation: 300 developers, 100 QA analysts, ~60 teams.
    • 5. ITG’s Agile Transition Timeline (2000–2011)
      XP pilot on ITG Triton (Best Execution Management System) around 2000, followed by years of informal, spotty, inconsistent adoption.
      Enterprise rollout plan: Rally, +Scrum, +Lean; a massive transition, accompanied by a process baseline, iteration metrics, and quality metrics.
      By 2011, 90% of teams have been trained and are using Rally.
    • 6. Our Process Baseline – How We Expect Teams to Work (excerpt)
    • 7. The journey to Agile is a long, winding road… Are we moving forward?
      [CC] Sten via Wikimedia Commons
    • 8. Why Measure?
      “If you cannot measure it you cannot improve it.” – Lord Kelvin
    • 9. Why Metrics?
      Teams (Inspect & Adapt): Are we doing okay? How can we improve? Are we doing what the company expects from us?
      Coaches, PMO (Train and Coach): Are teams using what we taught? Which teams need our help?
      Executives (Govern & Steer): What are we getting out of the agile transition? Are teams sticking to our process baseline and to enterprise initiatives? Is productivity/quality improving?
    • 13. Teams Process Health: data is readily available in Rally
      Team A vs. Team B: velocity and burndown/story-acceptance charts compared side by side. Isn’t this enough?
    • 14. Why Metrics? Dumbing down
      This is too complex for team members to interpret and monitor… Are the charts okay?
    • 15. Why Metrics? Scaling up
      How do we watch 80 teams? Product lines? The whole enterprise?
    • 16. Types of Metrics
      Qualitative:
        Satisfaction of stakeholders: Product Management, Client Services, Product Support (deployment, service), delivery team.
        Teams’ adherence to practices: agility questionnaire.
      Quantitative (we’ll be focusing on these):
        Quality metrics, process health metrics, time-to-market.
    • 17. What Would We Want to Measure?
      Productivity, quality, ROI, satisfaction.
      Some of these are not easy to measure, so we have to find proxies.
    • 18. What We Are Actually Measuring
      Satisfaction: stakeholder surveys.
      Quality: production defects, Defects Debt.
      Process (as a partial proxy for productivity): work-in-progress, velocity stability, full completion of work, gradual acceptance of work, churn, small stories, practices radar map.
    • 19. How We Generate Metrics
      How do we obtain data? Rally’s Web Services API:
        REST and SOAP; access to virtually all data in the tool (users, releases, iterations, stories, tasks, defects, test cases…).
        Ruby toolkit… or any WS library; we use mostly C#.
        Automated generation, monthly.
      How do we make sense out of mounds of data?
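The deck pulls raw data over Rally's Web Services API and post-processes it (in C#, per the slide). As an illustrative sketch only, not the presenters' actual code, here is how a WSAPI-style JSON query result might be unpacked in Python; the envelope field names (`QueryResult`, `Results`, `TotalResultCount`) follow Rally's documented v2.0 conventions but should be treated as assumptions:

```python
import json

# Made-up sample payload shaped like a Rally WSAPI v2.0 query result
# for open defects. Field names are assumptions for illustration.
SAMPLE_RESPONSE = """
{
  "QueryResult": {
    "TotalResultCount": 2,
    "Results": [
      {"FormattedID": "DE101", "State": "Open", "Priority": "High"},
      {"FormattedID": "DE102", "State": "Open", "Priority": "Low"}
    ]
  }
}
"""

def open_defect_ids(raw_json: str) -> list:
    """Extract the FormattedIDs of defects from a WSAPI-style payload."""
    result = json.loads(raw_json)["QueryResult"]
    return [d["FormattedID"] for d in result["Results"]]

print(open_defect_ids(SAMPLE_RESPONSE))  # → ['DE101', 'DE102']
```

A monthly job along these lines can walk each team's iterations and defects and feed the metric calculations below.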
    • 20. The Metrics
    • 21. Quality Metrics
    • 22. Quality Metrics: Defects Found in Production (Metric #1)
      Monthly production-defect counts from Jan 2010 through Feb 2011, tracked against a goal line. Team level & enterprise level.
    • 23. Quality Metrics: Fix-Bugs-First (Metric #2)
      If you are fixing defects first (as part of existing development, within the iteration; before new development for regression defects), then:
        The number of defects in your backlog should be small.
        The age of open defects should be low.
      Together: Defects Debt should be low.
      Defects Debt = [# open defects] x [avg age of open defects]
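The Defects Debt formula on slide 23 is simple enough to sketch directly. A minimal illustration, assuming each open defect carries its opened date and that age is measured in days (the sample dates are made up):

```python
from datetime import date

def defects_debt(open_defect_dates: list, today: date) -> float:
    """Defects Debt = [# open defects] x [avg age of open defects].
    Ages are measured in days; an empty backlog has zero debt."""
    if not open_defect_dates:
        return 0.0
    ages = [(today - opened).days for opened in open_defect_dates]
    # count x average age reduces to the total age of the backlog
    return len(ages) * (sum(ages) / len(ages))

today = date(2011, 3, 1)
# Defects opened 10, 20, and 30 days ago
backlog = [date(2011, 2, 19), date(2011, 2, 9), date(2011, 1, 30)]
print(defects_debt(backlog, today))  # → 60.0
```

Because count times average age equals the summed age, the metric penalizes both a large backlog and old defects, exactly the two symptoms the slide says fix-bugs-first should keep low.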
    • 24. Quality Metrics: Project-Level Defects Debt (Metric #2)
    • 25. Quality Metrics: Enterprise Defects Debt (Metric #2)
    • 26. Process Health Metrics
    • 27. Team Process Health: Agility Questionnaire
    • 28. Scope Creep: Churn (Metric #3)
      Iteration cumulative flow with no scope creep: plan estimates stay flat across the 14-day iteration. Churn (scope variation) = 0.
    • 29. Work in Progress & Scope Creep (Metric #3)
      Iteration cumulative flow where plan estimates grow mid-iteration. Churn = 30%.
    • 30. Work in Progress & Scope Creep (Metric #3)
      Scope creep: the team cannot commit; it is disruptive (task switching?) and less efficient. Churn = 30%.
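The deck reports churn as a percentage but does not spell out the formula. One plausible reading, assumed in this sketch, is the total scope added or removed during the iteration relative to the day-1 plan:

```python
def churn(daily_plan_estimates: list) -> float:
    """Scope variation: sum of absolute day-over-day changes in the
    total planned estimates, as a fraction of the day-1 plan.
    0 means no scope creep; higher values mean the plan kept moving."""
    baseline = daily_plan_estimates[0]
    changes = sum(abs(nxt - cur)
                  for cur, nxt in zip(daily_plan_estimates,
                                      daily_plan_estimates[1:]))
    return changes / baseline

flat = [40] * 14             # plan untouched all iteration
crept = [40] * 7 + [52] * 7  # 12 points added mid-iteration
print(churn(flat))   # → 0.0
print(churn(crept))  # → 0.3
```

Under this definition the two example iterations reproduce the slide's 0% and 30% figures.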
    • 31. Work in Progress: WIP Average (Metric #4)
      Iteration cumulative flow, no scope creep: only a small slice of the scope (planned / in progress / accepted) is in progress on any given day. WIP average = 10%.
    • 32. Work in Progress & Scope Creep (Metric #4)
      Most of the scope sits in progress for most of the iteration. WIP average = 70%.
    • 33. Work in Progress & Scope Creep (Metric #4)
      Large WIP = long cycle time and risk of not completing work within the iteration. WIP average = 70%.
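Again the exact formula is not given on the slide; one straightforward interpretation, assumed here, is the in-progress share of the iteration's scope averaged over all days:

```python
def wip_average(in_progress_by_day: list, total_scope: float) -> float:
    """Average fraction of the iteration's scope that is in progress
    on any given day. Small values suggest work flows one story at a
    time; large values suggest everything is open at once."""
    shares = [wip / total_scope for wip in in_progress_by_day]
    return sum(shares) / len(shares)

lean = [4] * 14      # ~10% of a 40-point plan in flight each day
swamped = [28] * 14  # ~70% in flight: long cycle time, end-of-iteration risk
print(round(wip_average(lean, 40), 2))     # → 0.1
print(round(wip_average(swamped, 40), 2))  # → 0.7
```

The daily in-progress counts would come from the same cumulative-flow data the slides plot.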
    • 34. Full Acceptance – Final Acceptance % (Metric #5)
      Iteration burn-up: accepted points climb steadily to the plan estimate. Acceptance rate on the last day = 100%.
    • 35. Full Acceptance (Metric #5)
      Iteration burn-up: acceptance stalls short of the plan. Acceptance rate on the last day = 55%.
    • 36. Full Acceptance (Metric #5)
      Partial completion: hard to plan, greater cycle time, inefficient. Acceptance rate on the last day = 55%.
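Metric #5 falls directly out of the burn-up data: the fraction of planned points accepted on the iteration's last day. A minimal sketch with made-up day-by-day numbers:

```python
def final_acceptance(accepted_by_day: list, planned_points: float) -> float:
    """Fraction of the planned points accepted by the last day of the
    iteration. 1.0 means the team fully completed its plan."""
    return accepted_by_day[-1] / planned_points

# Cumulative accepted points over a 14-day iteration, 40 points planned
finished = [0, 2, 5, 9, 14, 18, 22, 25, 28, 31, 34, 36, 38, 40]
stalled = [0, 0, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 22]
print(final_acceptance(finished, 40))  # → 1.0
print(final_acceptance(stalled, 40))   # → 0.55
```

The two samples mirror the slides' healthy (100%) and partially completed (55%) iterations.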
    • 37. Gradual Acceptance – 50% Day (Metric #6)
      Iteration burn-up: more than 50% of points are accepted by day 8. Day of 50% acceptance = 8 days / 14 days = 57%.
    • 38. Gradual Acceptance vs. Full Acceptance (Metric #6)
      All acceptance lands on the last day. Day of 50% acceptance = 14/14 = 100%.
    • 39. Gradual Acceptance (Metric #6)
      Late acceptance: high risk of not completing work, late feedback on work. Day of 50% acceptance = 14/14 = 100%.
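Metric #6 asks how early the team crossed the halfway mark. A sketch assuming daily cumulative accepted points are available (the sample data is made up to match the slide's 8/14 = 57% example):

```python
def day_of_half_acceptance(accepted_by_day: list,
                           planned_points: float) -> float:
    """First day on which at least 50% of planned points are accepted,
    as a fraction of the iteration length. Lower is better; 1.0 means
    the halfway mark was only reached on the last day (or never)."""
    for day, accepted in enumerate(accepted_by_day, start=1):
        if accepted >= planned_points / 2:
            return day / len(accepted_by_day)
    return 1.0

gradual = [2, 5, 8, 12, 15, 17, 19, 21, 25, 30, 34, 38, 40, 40]
lastday = [0] * 13 + [40]
print(round(day_of_half_acceptance(gradual, 40), 2))  # → 0.57
print(day_of_half_acceptance(lastday, 40))            # → 1.0
```

Together with final acceptance %, this distinguishes teams that flow stories through the iteration from teams that big-bang everything at the end.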
    • 40. Velocity Stability: Velocity Variation (Metric #7)
      Velocity over iterations 23–32, split into points accepted within the iteration, accepted after the iteration, or never accepted. Velocity is fairly stable: variation = 7%.
    • 41. Velocity Stability: Velocity Variation (Metric #7)
      Velocity is unstable: variation = 87%.
    • 42. Velocity Stability (Metric #7)
      Unstable velocity: low predictability, hard to plan, inefficient? Variation = 87%.
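The deck reports a single "variation" percentage per team without defining it; the coefficient of variation (standard deviation of per-iteration velocity over its mean) is one plausible definition, assumed in this sketch:

```python
from statistics import mean, pstdev

def velocity_variation(velocities: list) -> float:
    """Coefficient of variation of velocity across iterations:
    population standard deviation divided by the mean. Near 0 means a
    stable, plannable velocity; large values mean low predictability."""
    return pstdev(velocities) / mean(velocities)

stable = [48, 50, 52, 49, 51, 50, 50, 49, 52, 48]
unstable = [10, 55, 20, 60, 15, 50, 8, 58, 12, 52]
print(round(velocity_variation(stable), 2))    # → 0.03
print(round(velocity_variation(unstable), 2))  # → 0.63
```

Whatever the exact formula, the point of the metric is the same: a team whose velocity swings wildly cannot be used for release planning.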
    • 43. Our Process Baseline… and the metrics we can collect
      Defects Debt, Velocity Stability, Churn, % Completion, WIP, Day of 50% Acceptance, Story Sizing.
    • 44. Team Dashboard: all together now…
      The per-metric charts above are combined into a single dashboard per team.
    • 45. Teams Process Health: Enterprise Process Health Map
      A map of projects (Project A, Proj B, Proj C, … Proj א, Proj ב, …) grouped into five lines of business and ordered from (probably) healthy process down to (possibly) unhealthy process.
    • 46. Problems with Metrics
      Metric-driven dysfunctions: teams misreporting defects; not tracking unpredictable work in Rally. We can limit these by not using the metrics to reward or reprimand.
      False positives, false negatives: some metrics are sensitive to how Rally is being used, and these metrics don’t cover some important aspects of teams’ process. Metrics should be treated as an indication that further examination is needed.
      “Hey, I just figured out how we can double our quarterly sales. From now on, each quarter will last six months.”
    • 47. Looking at Next…
      Small stories, time to market, trends.
      By Simon A. Eugster (own work) [GFDL (www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (www.creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
    • 48. Q&A
      How are you measuring yourselves?
