
Metrics For Agile @CSI SPIN Mumbai Mar2011

Metrics for Agile, presented at the Computer Society of India (CSI) SPIN Mumbai chapter on 23 Mar 2011.


  1. METRICS FOR AGILE, CSI SPIN Mumbai Chapter 2011. Priyank, email: priyankdk@gmail.com. © Cybercom Datamatics Information Solutions.
  2. ABOUT US
  3. Measure, Metrics
  4. Qualitative, Quantitative. © Cybercom Datamatics
  5. FULLY SUPERVISED, PARTIALLY SUPERVISED. © Cybercom Datamatics
  6. DEFINITIONS. Effort: the actual hours required to write the software. Defect: unaccepted functionality, ideally identified by a test case; per a common industry definition, a flaw in a component or system that can cause it to fail to perform its required function. Schedule/Duration: the calendar time to get something done. Cost: strongly correlated with effort, though duration also plays a role. Size: something that can be counted or measured, ideally representative of effort. Plan/Estimate: our educated guess; a probability. Actual: the measured result. Quality: a delight.
  7. METRICS FOR AGILE: Effort, Top-Line, Velocity, Burn-Down; Cost; Schedule, Time to Market, Cycle Time; Defects; Technical Debt. WHY THESE METRICS: they can help you understand Scrum performance; track Scrum progress, productivity, and predictability; analyze quality and value; find pain points and improvement areas; gauge motivation and performance. Context: simple Scrum (time-boxed continuous iterations and releases).
  8. MANIFESTO FOR AGILE. © Agile Alliance, http://agilemanifesto.org
  9. AGILE IS VALUE-DRIVEN & ADAPTIVE. Plan-driven (predictive): requirements are the fixed constraint; cost and schedule are estimated. Value-driven (adaptive, Agile): cost and schedule are the fixed constraints; features are estimated.
  10. TOP-LINE, RELEASE BURN-UP. Base measures: total number of story points; total number of sprints planned; story points planned for each sprint; story points completed in each sprint.
  11. VELOCITY. Velocity is a relative measure of progress. It can be measured by the features delivered in an iteration; it expresses how much Product Backlog the team can complete in a given amount of time. Features are usually initial stories, and sometimes a set of features mixed with some non-feature work.
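The velocity measure above can be sketched in a few lines. This is a minimal illustration, not part of the deck; the sprint numbers are hypothetical.

```python
# Minimal sketch: velocity as story points accepted per sprint.
def velocity(points_per_sprint):
    """Average story points accepted per sprint so far."""
    return sum(points_per_sprint) / len(points_per_sprint)

accepted = [9, 10, 10, 10]  # illustrative: points accepted in sprints 1-4
print(velocity(accepted))   # prints 9.75
```

Teams typically use a rolling average like this to forecast how many sprints the remaining backlog will take.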
  12. BURN DOWN. The burn-down chart shows the estimated number of hours required to complete the tasks of the Sprint. It resembles an earned-value chart if you count delivered functionality (accepted work) over time. It shows both the status and the rate of progress ("velocity") in a way that is clear and easy to discuss.
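The data behind a burn-down chart is just a series of remaining-hours readings; the rate of progress the slide mentions falls out of it directly. A minimal sketch with hypothetical numbers:

```python
# Sketch of the data behind a sprint burn-down chart (illustrative numbers).
# remaining_hours[i] = estimated task-hours left at the start of day i.
remaining_hours = [80, 72, 66, 55, 40, 31, 20, 12, 5, 0]  # ten-day sprint

# Average burn rate: hours of work retired per day. Comparing it against
# the ideal line (start total / sprint length) shows if the team is on track.
burn_rate = (remaining_hours[0] - remaining_hours[-1]) / (len(remaining_hours) - 1)
print(burn_rate)
```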
  13. BURN UP. The burn-up chart shows the amount of accepted work (work that has been completed, tested, and has met its acceptance criteria), and it also shows the scope: how much work is in the project as a whole.
  14. SCHEDULE & COST METRICS. Base measures: budget allocated for the project; total number of story points; total number of sprints planned; story points planned for each sprint; story points completed in each sprint; release variance (plan vs. actual). Metrics derived from these: Actual Percent Complete (APC) = completed story points / total story points. Expected Percent Complete (EPC) = number of completed iterations / number of planned iterations. Planned Value (PV) = EPC x Budget. Actual Cost (AC) = actual cost in $, or soft cost in hours spent. Earned Value (EV) = APC x Budget. Schedule Performance Index (SPI) = EV / PV; greater than 1 is good (ahead of schedule). Cost Performance Index (CPI) = EV / AC; greater than 1 is good (under budget). Cost Variance (CV) = EV - AC; greater than 0 is good (under budget). Schedule Variance (SV) = EV - PV; greater than 0 is good (ahead of schedule). Value realization = velocity.
  15. VALUE REALIZATION (VELOCITY). In the given example: Budget = $100, total SP = 120, total sprints = 12. After the 4th sprint, with 9 of 10 story points accepted in the first sprint and 10 of 10 in each of the second, third, and fourth: APC = 39/120 = 0.325 (32.5%). EPC = 4/12 = 0.33 (33.3%). PV = 0.33 x 100 = 33. EV = 0.325 x 100 = 32.5. Assume AC = $40 (or 400 hrs, where 10 hrs = $1). SPI = 32.5/33 = 0.98. CPI = 32.5/40 = 0.81.
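The worked example above is easy to verify mechanically. This sketch recomputes the slide's earned-value figures from its own inputs; note the slide rounds EPC to 0.33 before computing PV, so its SPI (0.98) differs slightly from the unrounded value (about 0.975).

```python
# Recompute the slide's earned-value example from its base measures.
budget = 100.0                        # $
total_story_points = 120
planned_sprints = 12
accepted_story_points = 9 + 10 + 10 + 10   # sprints 1-4
completed_sprints = 4
actual_cost = 40.0                    # $ (400 hrs at 10 hrs per $)

apc = accepted_story_points / total_story_points   # Actual Percent Complete = 0.325
epc = completed_sprints / planned_sprints          # Expected Percent Complete
pv = epc * budget                                  # Planned Value
ev = apc * budget                                  # Earned Value = 32.5
spi = ev / pv          # ~0.975 unrounded; slide's 0.98 uses EPC rounded to 0.33
cpi = ev / actual_cost # 0.8125, i.e. under 1: spending more than value earned
print(apc, spi, cpi)
```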
  16. DEFECTS. Defect Removal Efficiency (DRE) is a base measure that we can tailor for Scrum. DRE = E / (E + D), where E = number of defects found before delivery of the software and D = number of defects found after delivery. In Scrum terms: E = defects found before delivery in any iteration (during sprint execution) and D = defects found after delivery (in production). Ideal DRE = 1; a DRE below 1 calls for root-cause analysis (RCA).
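The DRE formula above translates directly into code. A minimal sketch with hypothetical defect counts:

```python
# Defect Removal Efficiency as defined on the slide: DRE = E / (E + D).
def dre(found_before_delivery, found_after_delivery):
    """1.0 means every defect was caught before release."""
    e, d = found_before_delivery, found_after_delivery
    return e / (e + d)

# Illustrative: 18 defects caught in-sprint, 2 escaped to production.
print(dre(18, 2))   # prints 0.9
print(dre(10, 0))   # prints 1.0, the ideal case
```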
  17. TECHNICAL DEBT. Quality can best be viewed through the code itself. Reference: http://nemo.sonarsource.org. Copyright: http://sonarsource.org
  18. A FEW MORE BASIC QUALITY METRICS: technical debt; test cases and bugs; complexity (cyclomatic complexity); violations; classes, methods, duplication, comments, etc.
  19. QUALITY METRICS. Reference: http://nemo.sonarsource.org. Copyright: http://sonarsource.org
  20. REFERENCES: http://www.mountaingoatsoftware.com, http://www.agilemodeling.com, http://jamesshore.com/Agile-Book/assess_your_agility.html, http://java.net/projects/hudson/, http://www.sonarsource.org/, http://docs.codehaus.org/display/SONAR/Metric+definitions, https://wiki.rallydev.com, http://www.infoq.com/
  21. cdis.in
