
Measuring Agility: Top 5 Metrics And Myths

Pete presented a webinar on Measuring Agility for VersionOne. The presentation examines four fundamental dimensions of measuring agility: predictability, value, productivity, and quality, and evaluates ten effective measurements along with several measurement myths.



  1. Measuring Agility: Top 5 Metrics and Myths. Pete Behrens, Agile Organization & Process Coach. pete@trailridgeconsulting.com, 303.819.1809. © 2009 Trail Ridge Consulting, LLC
  2. Pete Behrens, Agile Organization & Process Coach. Certified Scrum Trainer and Certified Scrum Coach; guides enterprise organizations in transitioning to an agile organization and implementing agile methods; offers services for agile assessment, alignment, training, and coaching. Previous experience: led development of the requirements management solution RequisitePro, a core product in the IBM Rational product line, using the Rational Unified Process (RUP); consulted with EDS, leading development of large data warehouse solutions using Rapid Application Development (RAD).
  3. Measurement Dimensions: Predictability, Value, Quality, Productivity.
  4. How are projects measured today? Outcomes: Succeeded 28%, Challenged 49%, Failed 23%. Median overrun: cost 50%, schedule 100%. Here "succeeded" means on time, on budget, and with all initially planned features. The average project costs 50% more and takes twice as long as planned. Source: Chaos Report, Standish Group, 2001.
  5. Traditional project visibility is often too late. [Chart: features delivered over time for planned vs. actual analysis & requirements, development, and test; the gap between plan and actual surfaces as a surprise near the end of the project.]
  6. Agile seeks transparency from the outset of the project. [Chart: the same plan vs. actual view, broken into Sprints 1-4; agile visibility exposes the plan-vs.-actual gap from the first sprint instead of at the end.]
  7. Predictability Metric: Velocity. [Chart: story points completed per sprint across Sprints 1-8, settling around an average of 19.] Teams will tend toward a consistent velocity after a few sprints if the team and domain stay consistent.
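A minimal sketch of how this velocity average could be computed, assuming you track completed story points per sprint; the per-sprint totals below are illustrative stand-ins consistent with the chart's average of 19, not data from the slide:

```python
# Average velocity from completed story points per sprint.
# The per-sprint totals are hypothetical stand-ins.
completed_points = [14, 23, 17, 21, 19, 20, 18, 20]

def average_velocity(points, window=None):
    """Mean completed points per sprint; optionally only over the last `window` sprints."""
    recent = points[-window:] if window else points
    return sum(recent) / len(recent)

print(f"Overall average velocity: {average_velocity(completed_points):.1f}")    # 19.0
print(f"Last-3-sprint average:    {average_velocity(completed_points, 3):.1f}") # 19.3
```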
  8. Sprints drive predictability. [Diagram: a traditional project timeline runs from predictable through uncertain to unpredictable; an agile project divides the same timeline into a series of timeboxed sprints.] Definition: Sprint = Iteration = Timebox.
  9. Velocity: Advanced Burn Down. Measures team velocity (work complete per sprint), measures scope change over time, and guides release-level decision making.
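One way such a burn-down guides release-level decisions is to project how many sprints remain from average velocity and net scope change. A minimal sketch, with all figures assumed for illustration:

```python
import math

def sprints_remaining(backlog_points, avg_velocity, scope_growth_per_sprint=0.0):
    """Project sprints to completion; the net burn rate must be positive to converge."""
    net_burn = avg_velocity - scope_growth_per_sprint
    if net_burn <= 0:
        raise ValueError("scope is growing at least as fast as the team burns it down")
    return math.ceil(backlog_points / net_burn)

# 150 points left, velocity 19, scope growing 4 points/sprint -> 10 sprints
print(sprints_remaining(backlog_points=150, avg_velocity=19, scope_growth_per_sprint=4))
```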
  10. Predictability Metric: On-Time Delivery. Since March 2007, every Salesforce.com agile release has been deployed on time, down to the exact minute (contrast with the last non-agile release). Source: Scrum Gathering 2008, Salesforce.com Keynote Address.
  11. Agile is value-driven. A predictive process (waterfall) is plan-driven: requirements are the fixed constraint, and the plan creates cost and schedule estimates. An adaptive process (agile) is value/vision-driven: cost and schedule are the fixed constraints, and the vision creates feature estimates.
  12. What is valued? Actual use of requested features in predictive projects: Always 7%, Often 13%, Sometimes 16%, Rarely 19%, Never 45%. Results: 64% of features are rarely or never used, while 20% are frequently used. Source: Standish Group study presented at XP2002 by Jim Johnson.
  13. The Value of Time. A traditional project delivers releases 1-5 as a single release at the end; an agile project delivers increments 1 through 5 as each is completed. The value accrued by shipping earlier is the value gap.
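To make the value gap concrete, here is a worked example, a sketch under stated assumptions: total value splits evenly across five increments and accrues linearly from the period an increment ships. None of these figures come from the slide.

```python
total_value = 100.0   # arbitrary value units for the whole release
periods = 5           # delivery periods, as on the slide

# Big bang: everything ships in period 5, so all value accrues for one period.
big_bang = total_value * 1

# Incremental: increment i ships in period i and accrues for (periods - i + 1) periods.
incremental = sum((total_value / periods) * (periods - i + 1) for i in range(1, periods + 1))

print(incremental - big_bang)   # value gap in value-unit-periods: 200.0
```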
  14. Value Metric: Feature Delivery. A 94% increase in feature requests from 2006 to 2007, and a 38% increase in feature requests delivered per developer. Source: Scrum Gathering 2008, Salesforce.com Keynote Address.
  15. Value Metric: Customer Survey. Ask your customers! Set a baseline and measure quarterly, both qualitatively and quantitatively. Questions cover responsiveness, quality of features, support provided, delivery timeliness, feature value, and more.
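A minimal sketch of the quantitative side, comparing quarterly scores to the baseline; the questions and 1-5 scores below are hypothetical:

```python
baseline = {"responsiveness": 3.1, "feature quality": 3.4, "delivery timeliness": 2.8}
latest_quarter = {"responsiveness": 3.6, "feature quality": 3.5, "delivery timeliness": 3.3}

for question, score in latest_quarter.items():
    delta = score - baseline[question]   # positive deltas mean improvement over baseline
    print(f"{question:20s} {score:.1f} ({delta:+.1f} vs. baseline)")
```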
  16. Quality Metric: Running Tested Features (RTF). Measures the number of automated unit and functional tests for a team/product over time; measures quality as a leading indicator; measures productivity with respect to complexity better than other measures. Source: http://www.xprogramming.com/xpmag/jatRtsMetric.htm
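In practice RTF is just a time series of passing automated tests. A minimal sketch, assuming you already collect a passing-test count (e.g., from your CI's test report); the dates and counts are invented:

```python
from datetime import date

rtf_samples = [                 # (sample date, passing automated tests)
    (date(2009, 1, 5), 120),
    (date(2009, 1, 12), 134),
    (date(2009, 1, 19), 131),   # a dip (broken or deleted tests) is worth discussing
    (date(2009, 1, 26), 150),
]

for day, passing in rtf_samples:
    print(f"{day}  {passing:4d}  {'#' * (passing // 10)}")
```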
  17. RTF Example: Israeli Air Force, 2005. Increased confidence in the team and in management; enabled accurate and effective decision making; motivated writing tests, and writing smaller, more adaptable tests; the number of tests generally reflected complexity better than other methods (e.g., SLOC or function points). Source: http://www.cs.huji.ac.il/~davidt/papers/Agile_Metrics_AgileUnited05.pdf
  18. Quality Metric: Issue/Defect Costs. Measure the number of product issues and defects multiplied by the cost of addressing them. This measures quality as a lagging indicator and captures the support-cost impact of quality.
  19. Issue/Defect Cost at IBM. [Chart: expected vs. actual defect and ticket counts for 2008.] Cost per defect = $16,000; cost per ticket = $500; savings = $2.6M. Source: Sue McKinney, IBM, "Economics of Agile Development," Agile2008 Conference case study.
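The metric itself is simple arithmetic: counts avoided times unit costs. A minimal sketch; the unit costs are from the slide, but the expected/actual counts are illustrative stand-ins, since the chart's values do not survive in the transcript:

```python
COST_PER_DEFECT = 16_000   # from the slide
COST_PER_TICKET = 500      # from the slide

expected = {"defects": 168, "tickets": 1_056}   # hypothetical counts
actual   = {"defects": 32,  "tickets": 792}     # hypothetical counts

savings = ((expected["defects"] - actual["defects"]) * COST_PER_DEFECT
           + (expected["tickets"] - actual["tickets"]) * COST_PER_TICKET)
print(f"Estimated savings: ${savings:,}")   # $2,308,000 with these stand-in counts
```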
  20. Myth: Metrics drive team performance. Metrics are not inherently good or bad; it is the use of a metric that drives team dysfunction.
  21. Myth: Velocity measures productivity (or value). Story points are relative, so velocity cannot be compared across teams; all teams, products, environments, constraints, and dependencies are different, and some stories are more valued than others.
  22. Myth: 100% committed vs. actual drives estimation accuracy. [Chart: committed vs. actual story points for Sprints 1-5: 32 vs. 25 (78%), 28 vs. 18 (64%), 22 vs. 20 (91%), 20 vs. 20 (100%), 20 vs. 24 (120%); predictability improves across sprints even as productivity varies.]
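The percentages on the chart are simply actual divided by committed. A minimal sketch reproducing them from the slide's committed/actual pairs:

```python
sprints = [(32, 25), (28, 18), (22, 20), (20, 20), (20, 24)]   # (committed, actual)

for i, (committed, actual) in enumerate(sprints, start=1):
    print(f"Sprint {i}: {actual}/{committed} = {actual / committed:.0%}")
# Sprint 1: 78%, Sprint 2: 64%, Sprint 3: 91%, Sprint 4: 100%, Sprint 5: 120%
```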
  23. Commitment vs. Actual. [Chart: completed vs. remaining points by month, January through September.] Share commitment vs. actual as a fact to drive discussion: 1. Why didn't we get it done? 2. What are we doing about it? 3. What is the impact on the release goals?
  24. Myth: Higher velocity is always a good thing. Technical debt is bad; it is any "not-quite-right" code left unfixed (e.g., bugs, deferred refactors, workarounds). Pushing too hard on new product value and velocity tends to increase technical debt, so measure and limit technical debt accumulation.
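One way to "measure and limit" debt accumulation is a running total in points against a team-chosen ceiling. A minimal sketch; the ceiling, items, and estimates are all assumptions:

```python
DEBT_CEILING = 40   # points of tolerated debt; a team-specific policy choice

debt_log = [   # ("not-quite-right" item, estimated points to fix)
    ("workaround in billing export", 8),
    ("deferred refactor of auth module", 13),
    ("known bug: stale cache on logout", 5),
    ("skipped tests around reporting", 21),
]

total = sum(points for _, points in debt_log)
print(f"Technical debt: {total} points (ceiling {DEBT_CEILING})")
if total > DEBT_CEILING:
    print("Over the ceiling: schedule debt paydown before taking on new velocity work")
```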
  25. Myth: Sprints "Fail". "Maximum information is generated when the probability of failure is 50%, not when hypotheses are always correct. It is necessary to have a reasonable failure rate in order to generate a reasonable amount of new information." (Reinertsen, Managing the Design Factory)
  26. Treat Sprints as Practice. Sprints allow teams to practice the skill of delivering high-quality software on time. Preventing failure in sprints limits team learning, growth, discipline, empowerment, and productivity.
  27. Top 5 (or 6) Agile Metrics. Predictability: 1. Velocity, 2. On-time delivery. Value: 3. Customer surveys, 4. Number of features or value delivered. Quality/Productivity: 5. Running Tested Features, 6. Issue/defect cost.
  28. Top 5 (or 6) Agile Metric Myths: 1. Metrics drive team performance. 2. Velocity measures productivity. 3. Achieving 100% commitment to actual increases estimation accuracy. 4. Increasing velocity is always a good thing. 5. Sprints "fail". 6. An agile tool will make you agile.
  29. VersionOne (V1) provides metrics and an agile framework; you guide agility.
  30. Extending the Metrics: 7. Velocity --> investment, $/sprint, $/story point. 8. Features delivered --> earned value. 9. Customer surveys --> employee surveys. 10. Quality/productivity --> technical debt.
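The first extension is a simple division of team cost by velocity. A minimal sketch; both figures are illustrative assumptions:

```python
team_cost_per_sprint = 60_000   # fully loaded team cost per sprint (assumed)
average_velocity = 19           # points per sprint, echoing the earlier velocity slide

cost_per_point = team_cost_per_sprint / average_velocity
print(f"Investment: ${team_cost_per_sprint:,}/sprint, ${cost_per_point:,.0f}/story point")
```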
