
Agile Metrics

Covers the background of measurement and metric usage in traditional waterfall projects, the psychology of measuring, the agile response to traditional metrics, and suggested agile metrics.


  1. Agile Metrics. Or: How I Learned to Stop Worrying and Love Agile. Erik Weber, @erikjweber, Slidesha.re/AgileM
  2. ABOUT CENTARE. Agile/ALM, Mobile, Cloud. Microsoft: 2011 Partner of the Year Finalist, ALM Gold Competency, Azure Circle / Cloud Accelerate. Apple / Java / Scrum: iOS (iPhone/iPad), Android. Scrum.org Partner: Certified Professional Scrum Trainers.
  3. AGENDA: Background, Why Metrics?, The Psychology of Metrics, Agile Response, Examples of Agile Metrics, Sources.
  4. ABOUT ME. Work stuff: Healthcare, Finance, Green Buildings; huge conglomerates, small employee-owned companies, Fortune 500; Tester -> Developer -> Automation Dude -> QA Manager -> Project Manager -> Scrum Master -> Scrum Product Owner -> Scrum Coach; consulting and FTE; passionate about Agile. Me stuff: huge foodie and amateur cook; wearer of bowties; homebrewer and beverage imbiber; passionate about Agile (I have multiple kanban boards up in my living room).
  5. WHY DO WE USE METRICS?
  6. WE NEED TANGIBLES. As gauges or indicators: for status, quality, doneness, cost, etc. As predictors: what can we expect in the future? As decision-making tools: can we release yet? As a visual way to peer into a mostly non-visual world: because we don't completely understand what's going on in the software/project, and we need to.
  7. HISTORY TELLS US TO USE METRICS. Tons of research, mostly from the '80s and '90s and based upon industrial metrics. Tons of implementation at companies. Research + implementation has grown exponentially. It hasn't really affected project success (what a metric!). [Chart: metrics usage (papers, books, companies, etc.) vs. software project success rate, 1980-2010. *Chaos Report from 1995 to 2010]
  8. WATERFALL IS SCARY WITHOUT THEM. "Metrics are used in waterfall because we had no idea what was happening, so we tried to measure anything." – Ken Schwaber, ALM Chicago Keynote, 2012. Because the system is complex and intangible, we worry. So we want a way to peer into the system and make predictions. So we take measurements to try to create a window. But we still worry. EVERYTHING STILL FEELS RISKY.
  9. THE PSYCHOLOGY OF METRICS
  10. THE MEASUREMENT PARADOX. "Not everything that can be counted counts, and not everything that counts can be counted." – Albert Einstein. Software development is a complex system. Metrics used in isolation probably don't measure what you think they do. Beware "low-hanging fruit": Value of Measurement = 1 / Ease of Measuring.
  11. REAL LIFE EXAMPLE. [Chart: number of test cases per month, December through March] In reality, we had just started focusing on cleaning up old test cases.
  12. THE HAWTHORNE EFFECT. Measuring something will change people's behavior: when you measure something, you influence it. You can exploit this effect in a positive way, but most traditional metrics have a negative Hawthorne effect. Gaming = Hawthorne Effect * Deliberate Personal Gain. "Tell me how you will measure me and I will tell you how I will behave." – Goldratt
  13. "Test case TC8364 has failed; the customer settings page doesn't work in Chrome." vs. "Tests: Passed. But I wrote a bug for not being able to use the customer settings page in Chrome." REAL LIFE EXAMPLE: same tester, same test, one sprint before a test pass/fail percentage metric was put in place and one sprint after.
  14. MEASURING AT THE WRONG LEVEL. You get what you measure, and only what you measure. Austin Corollary: you tend to lose the things you cannot measure: collaboration, creativity, happiness, dedication to customer service … This suggests "measuring up": measure the team, not the individual; measure the business, not the team. It helps keep focus on outcomes, not output.
  15. REAL LIFE EXAMPLE. Defects per person-hour went down! We met our quality goal! Customer complaints went up. Oops. (Pankaj Jalote, Software Project Management in Practice, Tsinghua University Press, 2004, pages 90-92.)
  16. AGILE RESPONSE
  17. EVERYTHING STILL FEELS RISKY. Is it still risky in Agile?
  18. INCREMENTS ARE GAME CHANGERS. Agile projects produce potentially shippable Increments every few weeks: the system is no longer intangible, so there is no need for tons of predictive metrics. Reviewing the Increment (sprint review) enables quick adaptation to customer needs, market concerns, quality issues, etc.
  19. SCRUM BUILDS QUALITY IN. [Diagram: Definition of Done + Acceptance Criteria, Sprint Review + Stakeholder Feedback, and Customer Feedback all driving Quality up]
  20. SCRUM APPROACH. The only metric that really matters is what I say about your product.
  21. DOES THAT MEAN … no metrics?! Well, OK: no metrics are better than bad metrics.
  22. OUR AGILE METRICS MANIFESTO. We no longer view or use metrics as isolated gauges, predictors, or decision-making tools; rather, they indicate a need to investigate something and have a conversation, nothing more. We realize now that the system is more complex than could ever be modeled by a discrete set of measurements; we respect this. We understand there are behavioral psychology concepts associated with measuring [the product of] people's work; we respect this.
  23. SUGGESTED AGILE METRICS
  24. CONSIDERATIONS. What really matters? Listen to the customer. Understand and respect the complex system. Trends over static numbers. Are we measuring at the right level? How can we make this measurement a bit less isolated? How can we ensure only the correct audience sees it? Measure up! What behaviors are we trying to nurture (or avoid)? Will this help us be more agile? There is no single prescription.
  25. WORKING SOFTWARE. Can everybody confidently give the "thumbs up" to the increment?
  26. SPRINT BURNDOWN. Shows team progress within the sprint.
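
The slide only names the chart, so here is a minimal sketch of the arithmetic behind a burndown, assuming remaining work is snapshotted at each daily scrum; the function name and data shapes are illustrative, not from the deck.

```python
def sprint_burndown(remaining_by_day, sprint_days):
    """Pair each day's actual remaining work with the ideal linear burn.

    remaining_by_day: remaining estimate (hours or points) at each daily scrum.
    sprint_days: working days in the sprint.
    Both argument shapes are assumptions; the deck does not prescribe them.
    """
    start = remaining_by_day[0]
    ideal = [start - start * day / sprint_days for day in range(len(remaining_by_day))]
    return list(zip(remaining_by_day, ideal))

# Example: the first five daily scrums of a 10-day sprint.
for actual, ideal in sprint_burndown([40, 38, 37, 30, 26], 10):
    print(f"actual {actual:5.1f}  ideal {ideal:5.1f}")
```

Plotting actual against the ideal line is what makes the trend, rather than any single day's number, the conversation starter.
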
  27. VELOCITY. Forecasts what can get to DONE in a Sprint. Measures throughput, not capacity.
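
The usual arithmetic behind this: average the story points that reached DONE in recent sprints, then divide the remaining backlog by that average. A minimal sketch; the three-sprint window and all names are assumptions rather than anything the deck specifies.

```python
from statistics import mean

def velocity_forecast(done_points_per_sprint, backlog_points, window=3):
    """Average recent throughput and forecast sprints remaining.

    done_points_per_sprint: story points that reached DONE in each past sprint.
    backlog_points: points still in the release backlog.
    window: recent sprints to average (3 is a common, assumed, choice).
    """
    velocity = mean(done_points_per_sprint[-window:])  # throughput, not capacity
    return velocity, backlog_points / velocity

# Example: 21, 18, 24 points DONE in the last three sprints, 120 points left.
v, sprints_left = velocity_forecast([15, 21, 18, 24], 120)
print(f"velocity ~{v:.1f} points/sprint, ~{sprints_left:.1f} sprints remaining")
```
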
  28. RELEASE BURNDOWN
  29. SUMMARY. Waterfall makes me anxious. Agile inherently limits risk and renders many traditional metrics moot: the increment is a game changer. Measuring people influences their behavior. There are useful metrics in agile: beware traditional metrics and low-hanging fruit; leverage the Hawthorne effect; measure up; promote Agile/Lean/XP/good development practices.
  30. SOURCES. These people are much smarter than I; please read what they have to say! Scrum.org Professional Scrum Product Owner Course, http://bit.ly/xOccnM; Mike Griffiths, Leading Answers: "Smart Metrics", http://bit.ly/yfV643; Elisabeth Hendrickson, Test Obsessed: "Question from the Mailbox: What Metrics Do You Use in Agile?", http://bit.ly/xtSDdg; Jason Montague, Observations of a Reflective Commuter: "Systems Thinking and Brain Surgery", http://bit.ly/ylBxIn; Ian Spence, Measurements for Agile Software Development Organizations: "Better Faster Cheaper Happier", http://bit.ly/y4UKIt; N.E. Fenton, "Software Metrics: Successes, Failures & New Directions", http://bit.ly/ybwUzA; "Statistics over IT projects failure rate", http://bit.ly/xjBRv0; Chad Albrecht, Ballot Debris: "Simple Scrum Diagram", http://bit.ly/yc7yFW; Robert Austin, "Measuring and Managing Performance in Organizations", http://amzn.to/wTfgx3; Mary Poppendieck, Lean Software Development: "Measure Up", http://bit.ly/zppVTC; Jeff Sutherland, Scrum Log: "Happiness Metric – The Wave of the Future", http://bit.ly/xO8ETS.
  31. THANK YOU. Erik Weber, @erikjweber, Erik.Weber@Centare.com
  32. UNIT TEST COVERAGE. Encourages teams to write unit tests, a good XP/agile development practice. Doesn't guarantee GOOD tests – careful! [Chart: unit test coverage percentage for Teams 1-3 across Sprints 1-4]
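
Since the chart tracks coverage per sprint and the deck favors trends over static numbers, here is an illustrative sketch that flags sprint-over-sprint coverage drops; the threshold and names are assumptions, not part of the deck.

```python
def coverage_drops(percent_by_sprint, threshold=5.0):
    """Flag sprints where unit test coverage fell by at least `threshold` points.

    The absolute number says nothing about whether the tests are GOOD;
    a sudden drop is just a prompt for a conversation.
    """
    return [
        (sprint + 1, percent_by_sprint[sprint - 1] - percent_by_sprint[sprint])
        for sprint in range(1, len(percent_by_sprint))
        if percent_by_sprint[sprint - 1] - percent_by_sprint[sprint] >= threshold
    ]

print(coverage_drops([72.0, 75.5, 68.0, 69.5]))  # -> [(3, 7.5)]
```
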
  33. CONTINUOUS INTEGRATION STATUS. Current build status, red/green. How long has it been broken?
  34. TEST CASE LIVELIHOOD. Trend of new or changing test cases. Shows if tests are keeping up with growing/changing software. Encourages teams to keep the tests up to date. [Charts: percent of new/changed test cases per sprint for Team 2 and Team 3, Sprints 1-4]
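
The deck doesn't define the exact ratio behind the plotted percentages, so this sketch assumes the value is (new + changed) test cases over the total suite size for the sprint; treat that as an assumption.

```python
def test_case_livelihood(new_tests, changed_tests, total_tests):
    """Percent of the suite that is new or changed this sprint (assumed ratio)."""
    return 100.0 * (new_tests + changed_tests) / total_tests

# Example: 12 new and 30 changed tests in a 500-test suite.
print(f"{test_case_livelihood(12, 30, 500):.1f}% of the suite touched this sprint")
```
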
  35. HAPPINESS
  36. ACCEPTED FEATURES
  37. CUSTOMER REPORTED DEFECTS. Make these visible! Related: Customer Happiness and Net Promoter Score.
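
Net Promoter Score does have a standard definition: the percent of promoters (scoring 9-10) minus the percent of detractors (scoring 0-6) on a 0-10 "how likely are you to recommend us?" question. A minimal sketch of that computation:

```python
def net_promoter_score(ratings):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example: 4 promoters and 2 detractors out of 8 responses -> NPS 25.0.
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))
```
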
  38. STRATEGIC ALIGNMENT INDEX. Are the features we're implementing really the highest value? Are the projects we're running really the best ROI?
  39. USAGE INDEX. Are the features we've implemented being used? Where should we focus our attention? [Chart: Feature Usage Index, percent of users using each of features A through K]
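
The chart plots the fraction of users using each feature; here is an illustrative computation, assuming a log of (user, feature) usage events. The data shape is an assumption, not something the deck specifies.

```python
def feature_usage_index(usage_events, total_users):
    """Fraction of users who used each feature at least once.

    usage_events: iterable of (user_id, feature) pairs (assumed shape).
    """
    users_by_feature = {}
    for user, feature in usage_events:
        users_by_feature.setdefault(feature, set()).add(user)
    return {f: len(users) / total_users for f, users in users_by_feature.items()}

events = [(1, "A"), (2, "A"), (3, "A"), (1, "B"), (2, "C")]
print(feature_usage_index(events, total_users=3))  # A: 1.0, B and C: ~0.33
```
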
  40. CYCLE TIMES. How long does To-do to Done take?
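
Cycle time here is simply the elapsed time from an item entering To-do to it reaching Done; a minimal sketch, assuming each work item carries those two dates.

```python
from datetime import date

def cycle_times(items):
    """Days from entering To-do to reaching Done for each work item."""
    return [(done - started).days for started, done in items]

# Example: two items with assumed start/finish dates.
items = [(date(2012, 3, 1), date(2012, 3, 6)), (date(2012, 3, 2), date(2012, 3, 12))]
times = cycle_times(items)
print(f"cycle times {times}, average {sum(times) / len(times):.1f} days")
```
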
