Agile Base Camp - Agile metrics

Agile Metrics by Serge Kovaleff

http://kiev.agilebasecamp.org/sergej-kovalev/

  1. Agile Metrics. facebook.com/ SergeKovaleff @gmail.com linkedin.com/in/ .blogspot.com
  2. What is a Metric?
  3. Process Improvement Cycle: Define Process → Execute Process → Measure Process → Control Process → Improve Process. PDCA (Plan-Do-Check-Act) by Dr. W. Edwards Deming; DMAIC (Define-Measure-Analyze-Improve-Control) in Six Sigma.
  4. A lot of different Metrics...
     • Business metrics: RTF (Running Tested Features), Earned Business Value (EBV), Net Present Value (NPV), Internal Rate of Return (IRR), Return on Investment (ROI)
     • Process metrics: agile practice maturity, impediments cleared per iteration, impediments carried over to the next iteration, user stories done per iteration, user stories carried over to the next iteration, defects carried over to the next iteration, team member loading, velocity of development, backlog size
     • Code metrics: cyclomatic complexity, best-practice violations, coding standards violations, possible bugs, code duplication, code coverage, dead code, test quality, trends in code metrics
     • Automation metrics: number of builds per day, time taken per build, number of failed/successful builds
     • Design metrics: code dependencies (incoming / afferent coupling, outgoing / efferent coupling), abstractness (number of abstract classes and interfaces, number of concrete classes)
     • Testing metrics: acceptance tests per story, defect count per story, test time to run, test run frequency, manual tests per story, automation percent, time to fix tests
     (This slide is not for human reading.)
  5. A thermometer measures the temperature of the thermometer.
  6. There is no spoon. There is no "THE METRIC".
  7. The single metric to control
     • Quality
     • Speed
     • Productivity
     • Customer satisfaction
     • Etc.
     ... is still being looked for :)
  8. Myth: Metrics improve productivity?
  9. Myth: bigger velocity is better.
  10. Myth: Will the accuracy of estimates increase if we require committed = 100% delivered?
  11. Choosing a GOOD metric
  12. Encourages the target behaviour
  13. Measures a trend
  14. Easy to collect and measure
  15. Inspires useful discussions (what have we learnt?)
  16. Gives periodic feedback
  17. Lets you draw conclusions and fix the process
  18. Measurement effect: "Tell me how you will measure me and I'll tell you how I will behave." Eli Goldratt, "The Goal" (2004)
  19. Leading vs. Lagging
  20. Leading vs. Lagging (continued)
  21. What to measure?
      • Productivity metrics
      • Predictability metrics
      • Quality metrics
      • Value metrics
  22. And now... The Metrics!
  23. Productivity metrics
  24. Velocity (Leading, Productivity, Myth)
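Velocity, as commonly defined, is the sum of story points completed in a sprint, often smoothed over the last few sprints for planning. A minimal Python sketch; all numbers and function names are illustrative, not from the deck:

```python
# Hypothetical sketch: velocity as the sum of story points
# completed in each sprint, plus a rolling average for planning.
def velocity(completed_points_per_sprint):
    """Velocity of each sprint: sum of points of completed stories."""
    return [sum(points) for points in completed_points_per_sprint]

def rolling_average(velocities, window=3):
    """Average over the last `window` sprints, a common planning input."""
    recent = velocities[-window:]
    return sum(recent) / len(recent)

sprints = [[3, 5, 8], [5, 5, 3, 2], [8, 3]]  # points per completed story
v = velocity(sprints)                         # [16, 15, 11]
print(v, rolling_average(v))                  # [16, 15, 11] 14.0
```

Note that the result is in the team's own relative units, which is exactly why comparing velocity across teams (the "bigger velocity is better" myth) is meaningless.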
  25. Risk of Story Points inflation
  26. Truth: Story points are relative
  27. Work In Progress (Leading, Productivity)
  28. Story Cycle Time (Leading, Productivity)
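Story cycle time is the elapsed time from when work on a story starts to when it is done, and it is linked to Work In Progress by Little's law (average cycle time = average WIP / throughput). A minimal sketch with made-up dates:

```python
# Hypothetical sketch: story cycle time (done - started) and its
# relation to WIP via Little's law. Dates are invented for illustration.
from datetime import date

stories = [
    {"started": date(2011, 5, 2), "done": date(2011, 5, 6)},
    {"started": date(2011, 5, 3), "done": date(2011, 5, 9)},
    {"started": date(2011, 5, 4), "done": date(2011, 5, 6)},
]

cycle_times = [(s["done"] - s["started"]).days for s in stories]  # [4, 6, 2]
avg_cycle_time = sum(cycle_times) / len(cycle_times)              # 4.0 days

# Little's law: avg cycle time = avg WIP / throughput.
# E.g. 6 stories in progress, 1.5 stories finished per day -> 4 days.
avg_wip, throughput = 6, 1.5
print(avg_cycle_time, avg_wip / throughput)  # 4.0 4.0
```

This is why limiting WIP (slide 27) directly shortens cycle time at a given throughput.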
  29. Predictability metrics
  30. Sprint Burn Down Chart (Leading, Predictability)
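A burn-down chart plots remaining work per day against an ideal straight line from the sprint's total down to zero. A minimal sketch of the underlying data (numbers are invented):

```python
# Hypothetical sketch: sprint burn-down data -- remaining points per day
# compared with an ideal straight line from total work down to zero.
def ideal_line(total_points, sprint_days):
    """Ideal remaining work at the start of each day (day 0 .. sprint_days)."""
    return [total_points * (sprint_days - d) / sprint_days
            for d in range(sprint_days + 1)]

total, days = 40, 10
ideal = ideal_line(total, days)  # [40.0, 36.0, ..., 0.0]
actual = [40, 40, 36, 33, 33, 28, 22, 18, 12, 5, 0]

# A point above the ideal line means the team is behind plan on that day.
behind = [d for d, (a, i) in enumerate(zip(actual, ideal)) if a > i]
print(behind)  # days 1 through 9
```

A release burn-down or burn-up chart (slides 31-32) is the same idea at release scope, with burn-up also showing scope changes.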
  31. Release Burn Down Chart (Leading, Predictability)
  32. Burn Up Chart (Leading, Predictability)
  33. ROI (Return on Investment)
      • Backlog item: Business Value
      • Value in $
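ROI and the related NPV (mentioned among the business metrics on slide 4) reduce to simple formulas once backlog value is expressed in dollars. A minimal sketch; all figures are made up for illustration:

```python
# Hypothetical sketch: ROI and NPV for a backlog item whose business
# value is expressed in dollars. All figures are invented.
def roi(gain, cost):
    """Return on Investment: net gain relative to cost."""
    return (gain - cost) / cost

def npv(cash_flows, rate):
    """Net Present Value of yearly cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

print(roi(gain=150_000, cost=100_000))  # 0.5, i.e. 50% return
# Invest 100k now, receive 60k in each of the next two years, at 10%:
print(round(npv([-100_000, 60_000, 60_000], rate=0.10), 2))
```

NPV discounts later cash flows, so a feature that pays back sooner scores higher than one with the same nominal value later.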
  34. Quality metrics
  35. Technical Debt (Leading, Quality)
      • Backlog: inner vs. external quality, testing automation, reengineering, maintainability, documentation
  36. Running Automated Tests (Leading, Quality)
      • Amount of tests: functional, unit
      • Code coverage
      • Must be increasing
      • Tests must pass
  37. Post Sprint Defect Arrival (Lagging, Quality)
  38. Post Release Defect Arrival (Lagging, Quality): the same, but after a global Release
  39. The ONLY valid Quality metric (Lagging, Quality)
  40. Value metrics
  41. Customer Satisfaction Survey (Lagging, Value): the best way to find out whether your customers are satisfied is to ask them.
  42. Basic satisfaction. How satisfied are you with
      • your purchase?
      • the service you received?
      • our company overall?
  43. Customer loyalty. How likely are you to
      • buy from us again?
      • recommend our product/service to others?
      • recommend our company to others?
  44. Net Promoter Score: (Promoters - Detractors) / Total
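The standard Net Promoter calculation takes 0-10 answers to "How likely are you to recommend us?", counts 9-10 as promoters and 0-6 as detractors, and reports the difference as a percentage of all respondents. A minimal sketch with invented survey answers:

```python
# Hypothetical sketch: Net Promoter Score from 0-10 survey answers.
# Promoters score 9-10, detractors 0-6; passives (7-8) only dilute.
def nps(scores):
    """NPS as a percentage: %promoters minus %detractors."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

answers = [10, 9, 9, 8, 7, 7, 6, 4, 10, 3]
print(nps(answers))  # 10.0 (40% promoters - 30% detractors)
```

The score ranges from -100 (all detractors) to +100 (all promoters), so it is easy to track as a trend.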
  45. Customer Satisfaction Survey (Lagging, Value)
  46. How often? (Lagging, Value) "So often that you get more information, but not so often that you annoy."
  47. Key Drivers of satisfaction:
      • Conflicting goals
      • What the Client feels is important
      • Scatter diagram (Key Driver Chart)
  48. Employee Satisfaction Survey (Lagging, Value)
  49. (Lagging, Value)
  50. Short Summary
      • Goal: Why do you want to measure? What would be the result of the intervention?
      • Classification by time: Leading, Lagging
      • Classification by subject:
        – Productivity: Story Cycle Time
        – Predictability: Burn Up Chart
        – Quality: Tech Debt
        – Value: Net Promoters
  51. Question time
  52. Links
      • http://agilebasecamp.org/
      • http://blog.scrumtrek.ru/2010/01/wip-story-cycle-time.html
      • http://blog.scrumtrek.ru/2010/01/velocity.html
      • http://www.scrum.org.za/uploads/2009/09/measuring-for-results-2-small.pdf
      • http://scrumorlando09.pbworks.com/Scrum-Metrics-and-Myths
      • http://www.slideshare.net/petebehrens/measuring-agility-top-5-metrics-and-myths
      • http://www.slideshare.net/alimenkou/agile-metrics-2725666
      • http://management.about.com/od/competitiveinfo/a/CustomerSatSurv.htm
      • http://en.wikipedia.org/wiki/Net_Promoter
