
Agile Metrics V6

Agile Metrics

  • Slide 11: The best metrics (the only good ones) can be related directly to corporate goals. Process engineering teaches you to start with goals analysis and work down the process tree to figure out which metrics are important and which are not. Recording lots of metrics does not mean that you're recording good metrics.
  • Very interesting approach. I think it could be integrated with backlog prioritization techniques to help decide the best subset of features to include in each release on the roadmap.

Agile Metrics V6

  1. Agile Metrics (Dave Nicolette)
  2. Definition: Agile Software Development. An approach to software development based on the values and principles expressed in the Agile Manifesto. http://www.agilemanifesto.org Copyright © 2007-2009 Dave Nicolette
  3. Definition: Metrics. A metric is a standard for measuring or evaluating something. A measure is a quantity, a proportion, or a qualitative comparison of some kind. Quantity: "There are 25 open defect reports on the application as of today." Proportion: "This week there are 10 percent fewer open defect reports than last week." Qualitative comparison: "The new version of the software is easier to use than the old version."
  4. Three Kinds of Metrics
     - Informational: tells us what's going on
     - Diagnostic: identifies areas for improvement
     - Motivational: influences behavior
     One metric may function in multiple categories. Example: delivering high value to customers (informational) can increase team morale (motivational). Beware of unintended side-effects. Example: rewarding people for fixing bugs may result in an increase in bugs, as people create opportunities to earn the rewards.
  5. Metrics as Indicators. Leading indicator: suggests future trends or events. Lagging indicator: provides information about outcomes.
  6. Einstein's Wisdom
  7. Agile Rule of Thumb About Metrics: Measure outcomes, not activity.
  8. A Minimalist Philosophy of Metrics: Measure everything necessary and nothing more.
  9. For each stakeholder: all the information they need to make decisions, and no more; information at the level of detail they can use; information at the scope they care about (team, project, program, line of business, enterprise, industry); information pertaining to the time frame they care about (day, iteration, release, project, strategic milestone).
  10. Stakeholders: team member, Product Owner, ScrumMaster, project manager, user, executive, auditor, process improvement researcher, production support.
  11. Factors Influencing Your Choice of Metrics
     - General style of the agile process
     - Type of work being done
     - How the work is decomposed and planned
     - The team's self-organization choices
     - Characteristics of the organization
     - The team's continuous improvement goals
  12. Styles of Agile Software Development. Iterative: based on time-boxed iterations of fixed duration. Continuous flow: based on principles derived from Lean Manufacturing.
  13. Broad Categories of Business SW Development. New dev or major enhancement: a project with a defined "end" to build a planned scope of work. Ongoing maintenance and support: a variable rate of incoming work requests, with no defined "end" or planned scope.
  14. How Agile Teams Plan Their Work (Short Term), Variation 1:
     - Short iterations (e.g., 1 week)
     - Low-overhead process
     - High maturity in agile thinking
     - Unit of work for execution is the User Story
     - User Stories are not decomposed into Tasks
     - User Stories are small and of consistent size
     - No sizing or estimation of fine-grained work items
     - Team commits to completing selected User Stories
     - No daily burn tracking
  15. How Agile Teams Plan Their Work (Short Term), Variation 2:
     - Short to medium-length iterations (e.g., 1-2 weeks)
     - Reasonably low-overhead process
     - Moderately high maturity in agile thinking
     - Unit of work for execution is the User Story
     - User Stories are not decomposed into Tasks
     - User Stories are small and of consistent size
     - Team sizes User Stories relative to each other
     - Team commits to completing a given number of Story Points
     - No daily burn tracking
  16. How Agile Teams Plan Their Work (Short Term), Variation 3:
     - Short to medium-length iterations (e.g., 1-2 weeks)
     - Moderately low-overhead process
     - Average maturity in agile thinking
     - Unit of work for execution is the Task
     - User Stories are decomposed into Tasks
     - Variations in story size may affect planning
     - Team sizes User Stories relative to each other
     - Team estimates Tasks in terms of ideal time
     - Team agrees to try to complete a given number of ideal hours or days of work
     - Daily burn tracking and re-estimation of Tasks
  17. How Agile Teams Plan Their Work (Short Term), Variation 4:
     - Iteration length up to 6 weeks
     - Iterative process with some elements of agile work
     - Low maturity in agile thinking
     - Unit of work for execution is the Task
     - User Stories are decomposed into Tasks
     - Story points (if used) are pegged to ideal time
     - Team estimates Tasks in terms of ideal time
     - Team uses a "load factor" to guess at the amount of non-ideal time
     - Team agrees to work a given number of hours or days
     - Daily burn tracking and re-estimation of Tasks
  18. The Team's Self-Organization Choices
     - Generalizing specialists (peer model)
     - Chief Programmer / technical lead model
     - Specialists with internal hand-offs
  19. Characteristics of the Organization
     - Fully supportive lean or "agile" organization
     - Organization embraces "agile," but some areas operate in a traditional way
     - "Agile" is an experiment or skunkworks operation with a low level of organizational buy-in
  20. Organizational Differences
     - Culture. Traditional: risk aversion, blame-shifting, competition, zero-sum thinking, fear of failure. Agile/lean: risk management, trust, transparency, collaboration, failure as a learning opportunity.
     - Structure. Traditional: administrative separation between application developers and their customers. Agile/lean: application developers work for the lines of business they serve; central IT is for central functions.
     - Management philosophy. Traditional: command-and-control, Theory X, crack the whip. Agile/lean: self-organizing teams, Theory Y, enable and support people.
     - Teams. Traditional: temporary assignment, multiple assignment, functional silos. Agile/lean: stable teams, dedicated teams, cross-functional teams.
     - Financial management. Traditional: cost-center mentality, Cost Accounting. Agile/lean: profit-center mentality, Throughput Accounting.
  21. Team's Self-Improvement Goals. Choose metrics that track the team's progress toward self-improvement goals; discontinue these metrics when the goals have been achieved.
  22. Two Agile Principles That Guide the Choice of Metrics: "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software." and "Working software is the primary measure of progress."
  23. Running Tested Features (graphic from Ron Jeffries)
  24. Running Tested Features (chart)
  25. Running Tested Features
     - Process style: time-boxed iterations; continuous flow
     - Nature of the work: ongoing support; delivery of defined scope
     - Frequency: as each feature is delivered
  26. Running Tested Features
     - Principle: "Working software is the primary measure of progress."
     - Informational: direct measure of delivered results.
     - Diagnostic: if RTF is flat or declining over time, a problem is indicated.
     - Motivational: team members naturally want to see RTF increase.
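The RTF count on slide 26 can be sketched in a few lines of Python. The feature names and status flags below are hypothetical, and "running and tested" is simplified to two booleans (integrated into the shared build, and all acceptance tests passing):

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    integrated: bool      # checked into the shared, running build
    tests_passing: bool   # all acceptance tests green

def running_tested_features(features):
    """RTF = count of features that are in the build and passing all their tests."""
    return sum(1 for f in features if f.integrated and f.tests_passing)

features = [
    Feature("login", True, True),
    Feature("search", True, False),   # integrated, but tests failing: does not count
    Feature("export", False, False),  # not yet integrated: does not count
]
print(running_tested_features(features))  # -> 1
```

Plotted against time, this count rising steadily is the healthy pattern the slide describes; a flat or declining line flags a problem.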
  27. Forms of "Business" Value: revenue, cost savings, market share, customer relations, reputation.
  28. Tracking Hard Financial Value. Profit = Income - Costs. With incremental delivery to production: Baseline: calculate the profitability of the system/process being replaced or enhanced. Per release: calculate the change in profitability.
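A minimal sketch of the baseline-vs-release calculation; all figures below are hypothetical:

```python
def profitability(income, costs):
    # Profit = Income - Costs, per the slide
    return income - costs

# Baseline: profitability of the system/process being replaced or enhanced.
baseline = profitability(income=100_000, costs=80_000)   # 20,000 per period

# After release 1: the delivered increment reduces operating costs.
release_1 = profitability(income=100_000, costs=72_000)  # 28,000 per period

# Hard financial value delivered by the release is the change in profitability.
value_delivered = release_1 - baseline
print(value_delivered)  # -> 8000
```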
  29. Tracking Hard Financial Value (chart)
  30. Hard Financial Value
     - Process style: time-boxed iterations; continuous flow
     - Nature of the work: ongoing support; delivery of defined scope
     - Frequency: as process performance is observed in production operation
  31. Hard Financial Value
     - Principle: "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."
     - Informational: direct measure of financial value delivered.
     - Diagnostic: a downward trend or projection can be used to inform business decisions about continuing or modifying the project.
     - Motivational: team members like to deliver value because it makes them feel they are contributing to the success of the organization. Stakeholders are motivated to pay attention to the business value of incremental releases.
  32. Tracking Projected Value: when incremental delivery is not to production...
  33. Earned Business Value (from Dan Rawsthorne, Calculating Earned Business Value for an Agile Project, 2006)
  34. Earned Business Value
     BV(bucket) = BV(parent) × wt(bucket) / (wt(bucket) + Σ wt(sibling) over all siblings)
     This yields a percentage value for each item delivered to the customer, representing the relative business value of the item as defined by the customer.
  35. Earned Business Value Example. Applying the formula to "Update Cust Info" in the feature decomposition, starting from 100% at the root, we have (3/4) × (1/1) × (10/40) × (10/20) ≈ 9.4%. This means that when the team delivers the item named "Update Cust Info," they will have delivered 9.4% of the business value of the project, according to the customer's own definition of the relative value of each item.
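A minimal sketch of the calculation, using the weights from the slide's example. The shape of the path (each level's bucket weight and the combined weight of the bucket plus its siblings) is assumed from the decomposition shown:

```python
def ebv_share(path_weights):
    """Share of total business value for one item, per Rawsthorne's formula:
    at each level, multiply by wt(bucket) / (wt(bucket) + sum of sibling weights).
    The root of the decomposition represents 100% of the value."""
    share = 1.0
    for wt_bucket, wt_level_total in path_weights:  # total = bucket + siblings
        share *= wt_bucket / wt_level_total
    return share

# Path from the root down to "Update Cust Info": 3 of 4, 1 of 1, 10 of 40, 10 of 20
share = ebv_share([(3, 4), (1, 1), (10, 40), (10, 20)])
print(round(share * 100, 1))  # -> 9.4 (percent)
```

The exact product is 9.375%, which the slide rounds to "9.4% (approx)".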
  36. Earned Business Value: When To Use It. Yes: the scope of the project is well known up front and it is possible to develop a fairly comprehensive decomposition of features before development begins. No: there is a high level of uncertainty about scope, and the expectation is that the scope will emerge as the team makes progress and stakeholders learn more about the problem and the solution. EBV breaks down in the latter case because, as new scope is added, the percentage of business value already delivered decreases. This makes it appear as if the project is taking business value away from the customer.
  37. Earned Business Value by Points (early in project): Feature Group A (600 points), Feature Group B (300), Feature Group C (100).
  38. Earned Business Value by Points (part-way through): Feature Group A (600) is decomposed into Feature A-1 (400) and Feature A-2 (200); Feature A-1 into Story 1 (50), Story 2 (35), etc.
  39. Earned Business Value Charts: based on percentages; based on points. See the Agile Metrics spreadsheet, "EBV Charts" sheet, for examples.
  40. Earned Business Value
     - Process style: time-boxed iterations; continuous flow
     - Nature of the work: ongoing support; delivery of defined scope
     - Frequency: as each feature is delivered
  41. Earned Business Value
     - Principle: "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."
     - Informational: direct measure of customer-defined value delivered.
     - Diagnostic: the trend should be an S curve; otherwise, problems in prioritization or valuation are indicated.
     - Motivational: team members like to deliver value because it makes them feel they are contributing to the success of the organization. Stakeholders are motivated to pay attention to the business value of incremental releases.
  42. Velocity is an empirical observation of the team's capacity to complete work per iteration, and not an estimate or a target to aim for.
  43. Velocity is based on the team's own sizing of work items, and not based on estimated or actual time, nor dictated or imposed by anyone other than team members.
  44. Velocity is comparable across iterations for a given team on a given project, and not comparable across teams or across projects.
  45. Unit of Measure for Velocity, by how the team plans:
     - Commitment to stories: story
     - Relative sizing (points): points
     - Estimation (ideal hours): ideal hours
  46. What Counts Toward Velocity? Only completed work counts toward velocity.
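The "only completed work counts" rule can be sketched as follows; the story data is hypothetical:

```python
def velocity(stories):
    """Sum the sizes of stories fully completed in the iteration.
    Partially done work contributes nothing, per slide 46."""
    return sum(points for points, completed in stories if completed)

# (story points, completed?) for one iteration
iteration = [(5, True), (3, True), (8, False), (2, True)]
print(velocity(iteration))  # -> 10
```

Note that the 8-point story in progress contributes zero, not a fraction; it will count in full in the iteration where it is finished.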
  47. Velocity (chart)
  48. Velocity
     - Process style: time-boxed iterations; continuous flow
     - Nature of the work: ongoing support; delivery of defined scope
     - Frequency: at the end of each iteration
  49. Velocity
     - Principle: "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."
     - Informational: empirical observation of the team's capacity for work; useful for projecting the likely completion date of a given amount of scope and for estimating the amount of scope that can be delivered by a given date.
     - Diagnostic: patterns in velocity trends indicate various problems; provides a baseline for continuous improvement efforts.
     - Motivational: team members take pride in achieving a high velocity and keeping it stable.
  50. Putting Velocity to Work: Burn Charts. Burn-down chart: how much work remains to be completed? Burn-up chart: how much work has been completed? Combined burn chart: how much work has been completed and how much work remains?
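A burn-down projection follows directly from velocity: divide remaining scope by the average observed velocity. A sketch with hypothetical numbers; this is a projection from past observations, not a commitment:

```python
import math

def iterations_remaining(total_scope, completed, recent_velocities):
    """Project how many more iterations the remaining work will take,
    using the average of recently observed velocities."""
    remaining = total_scope - completed
    avg_velocity = sum(recent_velocities) / len(recent_velocities)
    # Round up: a partial iteration still takes a whole iteration.
    return math.ceil(remaining / avg_velocity)

# 120 points of total scope, 45 done, last three velocities 14, 16, 15:
print(iterations_remaining(120, 45, [14, 16, 15]))  # -> 5
```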
  51. Burndown Chart: Line Style (chart)
  52. Burndown Chart: Bar Style (chart)
  53. Burnup and Burndown Chart (chart)
  54. Burn Chart
     - Process style: time-boxed iterations; continuous flow
     - Nature of the work: ongoing support; delivery of defined scope
     - Frequency: at the end of each iteration when time-boxed iterations are used; at fixed time intervals (e.g., monthly) when continuous flow is used
  55. Burn Charts
     - Principle: "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."
     - Informational: direct measure of work remaining; projected completion dates; impact of scope changes on schedule.
     - Diagnostic: indicates whether scope changes or team performance is the cause of schedule variance.
     - Motivational: team members are motivated by seeing clearly when they are likely to finish the project and by seeing the amount of work remaining steadily reduced.
  56. Little's Law
     LT = WIP / ACR, so WIP = LT × ACR and ACR = WIP / LT
     - Lead Time (LT) is the time required to deliver a given amount of work.
     - WIP is work in process: items started but not completed.
     - ACR is the average completion rate (units per time period).
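Applying Little's Law with hypothetical numbers:

```python
def lead_time(wip, acr):
    """Little's Law: LT = WIP / ACR."""
    return wip / acr

# 20 items in process and an average completion rate of 5 items per week
# means a new item entering the queue takes about 4 weeks to come out:
print(lead_time(wip=20, acr=5))  # -> 4.0 (weeks)
```

The practical lever follows from the formula: with completion rate held constant, reducing WIP reduces lead time proportionally.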
  57. Cumulative Flow Diagram (chart)
  58. Cumulative Flow Diagram (chart, annotated with lead time and WIP inventory)
  59. Cumulative Flow Diagram
     - Process style: time-boxed iterations (if release cadence is decoupled from development cadence); continuous flow
     - Nature of the work: ongoing support; delivery of defined scope
     - Frequency: at the end of each iteration when time-boxed iterations are used; at fixed time intervals (e.g., monthly) when continuous flow is used
  60. Cumulative Flow Diagram
     - Principle: "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software."
     - Informational: visualization of flow; empirical observation of lead time and WIP queue depth.
     - Diagnostic: exposes capacity constraints and not-immediately-available constraints.
     - Motivational: team members take pride in seeing the workflow in a visual form.
  61. More Agile Principles That Guide the Choice of Metrics: "Continuous attention to technical excellence and good design enhances agility." "Simplicity – the art of maximizing the amount of work not done – is essential." "At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly."
  62. Static Code Analysis Example (screenshot)
  63. Static Code Analysis Example (screenshot; callouts: cyclomatic complexity, not covered by tests, warns of large methods)
  64. Automated Inference of TDD Practices (screenshot)
  65. Earned Value Management (EVM). Myth: EVM doesn't apply to agile projects because it requires a detailed WBS at the outset.
  66. Earned Value Formula: EV = Σ PV(completed), summed over completed work from the start of the project to the current date.
  67. Adapting Earned Value Management (EVM) to Agile Projects
     Predictive planning (traditional):
     - Detailed work breakdown structure at the outset
     - Method of quantifying "done" for each item in the WBS
     - Definition of the value of each item in the WBS
     - Track planned (BCWS) and actual costs (ACWP)
     - EV is the budgeted cost of work performed (BCWP)
     Adaptive planning (agile):
     - Scope defined at a high level at the outset (features)
     - Definition of "done" for each feature in scope
     - Definition of the value of each feature in scope
     - Track planned (budget) and actual costs (spend)
     - EV is the budgeted cost of features delivered
  68. Budgeted Cost of Work Scheduled (BCWS) on Agile Projects
     Iterative process (or non-iterative process with equal-length releases):
     - One-time cost allocation per iteration = sum of one-time costs / number of iterations (or releases)
     - Total on-going costs = on-going costs per iteration × number of iterations
     - BCWS = sum of one-time costs + total on-going costs
     - Cost per iteration = one-time cost allocation per iteration + on-going costs per iteration
     Non-iterative process with a variable release schedule:
     - One-time cost allocation per time interval = sum of one-time costs / number of time intervals (a chosen interval, e.g., a week)
     - Total on-going costs = on-going costs per time interval × number of time intervals
     - BCWS = sum of one-time costs + total on-going costs
     - Cost per time interval = one-time cost allocation per time interval + on-going costs per time interval
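The iterative-process arithmetic above can be sketched as follows; the cost figures are hypothetical:

```python
def bcws_iterative(one_time_costs, ongoing_per_iteration, iterations):
    """BCWS for an iterative process: spread one-time costs evenly across
    iterations and accrue on-going costs each iteration."""
    one_time_total = sum(one_time_costs)
    one_time_per_iteration = one_time_total / iterations
    cost_per_iteration = one_time_per_iteration + ongoing_per_iteration
    bcws = one_time_total + ongoing_per_iteration * iterations
    return cost_per_iteration, bcws

per_iteration, bcws = bcws_iterative(
    one_time_costs=[30_000, 10_000],  # e.g., hardware, licenses (hypothetical)
    ongoing_per_iteration=20_000,     # e.g., team salaries per iteration
    iterations=10,
)
print(per_iteration, bcws)  # -> 24000.0 240000
```

The non-iterative variant is the same calculation with a chosen time interval (e.g., a week) substituted for the iteration.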
  69. EV Examples: iterative process; non-iterative process. See the Agile Metrics spreadsheet, "EV Iterative" and "EV Non-iterative" sheets, for examples.
  70. When EVM Is Applicable. Yes: the level of effort per task is well understood (example: a corporate intranet CRUD webapp based on existing standards). No: the project involves a high degree of uncertainty and will involve prototyping, spiking, research, and/or experimentation (example: the company's first business application using an unfamiliar programming language), or work items are added to the work queue in an unpredictable fashion (example: a production support group that addresses bug reports as they are received).
  71. Throughput Accounting. Throughput (T): the rate at which a system produces goal units (money). With S = net sales and TVC = totally variable cost, T = S - TVC. Investment (I): the money tied up in the system. Operating Expense (OE): the cost of generating goal units.
  72. Throughput Accounting
     - Net Profit (NP) is throughput less operating expense: NP = T - OE
     - Return on Investment: ROI = NP / I
     - TA Productivity: TAP = T / OE
     - Investment Turns: IT = T / I
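The derived measures on slides 71-72 can be computed together; the figures below are hypothetical:

```python
def throughput_metrics(sales, tvc, operating_expense, investment):
    """Throughput Accounting measures per the slide definitions:
    T = S - TVC; NP = T - OE; ROI = NP / I; TAP = T / OE; IT = T / I."""
    t = sales - tvc
    net_profit = t - operating_expense
    return {
        "T": t,                          # throughput
        "NP": net_profit,                # net profit
        "ROI": net_profit / investment,  # return on investment
        "TAP": t / operating_expense,    # TA productivity
        "IT": t / investment,            # investment turns
    }

m = throughput_metrics(sales=500_000, tvc=100_000,
                       operating_expense=300_000, investment=200_000)
print(m["T"], m["NP"], m["ROI"])  # -> 400000 100000 0.5
```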
  73. Throughput Accounting and Iterative Agile Methods
     - There are no "sales," and therefore no "sales price," in internal IT projects; use the project budget as the sales price.
     - Investment is the total cost of preparing the Master Story List or Product Backlog (the list of all the features to be developed). It may include all up-front analysis costs, all up-front requirements elaboration costs, and all project planning, release planning, and iteration planning costs.
     - Operating Expense includes all costs for the iteration except investment.
  74. Throughput Accounting: Investment for a Release
     I = I(release) + Σ I(n) for n = 0 to the number of iterations
  75. Throughput Accounting: Operating Expense for a Release
     OE(release) = OE(iteration) × number of iterations
  76. Throughput Accounting Example. See the Agile Metrics spreadsheet, "TA Iterative" sheet, for an example. NP (net profit) isn't really profit; it tells you whether you're doing better than your budget. I (inventory) is the cost of requirements, analysis, writing acceptance tests, and writing user stories. OE (operating expense) is the cost of building the solution. If you can drive these down, then T (throughput) and NP (net profit) will go up.
  77. Reliability of Promises. A reliable promise: "I deliver as promised, or I tell you I can't deliver as soon as I know it."
  78. Niko-Niko Calendar Symbols (legend: positive, neutral, negative)
  79. Niko-Niko Calendar Patterns (chart)
  80. Niko-Niko Calendar Patterns (chart)
  81. Niko-Niko Calendar Example (chart)
  82. Story Cycle Time (Iterative): the number of iterations it takes to complete a story.
  83. Cycle Time (Lean): the average time between deliveries of completed work items.
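Average cycle time in the lean sense can be computed from delivery timestamps; the dates below are hypothetical:

```python
from datetime import date

def average_cycle_time(delivery_dates):
    """Average number of days between consecutive deliveries of completed items."""
    ordered = sorted(delivery_dates)
    gaps = [(later - earlier).days for earlier, later in zip(ordered, ordered[1:])]
    return sum(gaps) / len(gaps)

deliveries = [date(2009, 3, 2), date(2009, 3, 5), date(2009, 3, 9), date(2009, 3, 11)]
print(average_cycle_time(deliveries))  # -> 3.0 (days)
```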
  84. Problematic Measures. Not relevant to agile methods:
     - Gantt chart
     - Percent complete
     - Time per team member per task
     - Actual time vs. estimated time
  85. Using Trends to Spot Problems
  86. Sample Scorecard: Value Delivery, Risks
  87. Sample Scorecard: Delivery Effectiveness. Story cycle time: 2.
  88. Sample Scorecard: Software Quality
     - Customer satisfaction; non-functional requirements
     - Testing metrics: coverage, tests passing, least-tested components
     - Static code analysis metrics: cyclomatic complexity, structural complexity, cyclic dependencies
     - Observational/calculated: defect density
  89. Sample Scorecard: Continuous Improvement
     - Build frequency
     - Escaped defects
     - Use of TDD
     - Big-bang refactorings
     - Pairing time vs. solo time
     - Overtime
     - Issues from retrospectives
  90. Agile Balanced Metrics (Forrester)
     - Operational Excellence: project management, productivity, organizational effectiveness, quality
     - User Orientation: user satisfaction, responsiveness to needs, service level performance, IT partnership
     - Business Value: business value of projects, alignment with strategy, synergies across business units
     - Future Orientation: development capability improvement, use of emerging processes and methodologies, skills for future needs
  91. Agile Project Scorecard (Ross Pettit)
  92. Sample Agile Dashboard (VersionOne)
  93. Sample Agile Dashboard (Serena)
  94. Thanks for your time! Contact information: Dave Nicolette. Email: [email_address]. Blogs: http://www.davenicolette.net/agile and http://www.davenicolette.net/taosoft. Workshops: http://davenicolette.wikispaces.com
