Metrics Sirisha

1. Software Metrics - Overview
   Blackboard, by Sirisha N
2. Objectives
   - Understand metrics
   - Apply metrics to drive testing projects
   - Identify data sources and means of capturing metrics
3. Agenda
   - Introduction
     - Why Measure
     - Purpose
     - Metrics in ISO, CMM & CMMI
     - Basic Definitions
     - Institutionalize Metrics Program
     - What to Measure
     - Data Collection Strategy
   - Operational Definitions
   - Metrics-based Project Management
4. Purpose
   - To plan projects (estimation based on past data)
   - To control the project's process
     - Taking corrective and preventive action in a timely manner
     - Monitoring the project goals set by the client/organization
   - To provide feedback on project performance
   - To improve the organization's processes, tools and methods
   - To identify training needs based on project performance
5. Metrics in ISO, CMM & CMMI
   - ISO 9001:2000
     - Section 8: Measurement, Analysis & Improvement
   - CMM
     - Level 4 KPAs: Quantitative Process Management, Software Quality Management
   - CMMI
     - Level 2 PA: Measurement and Analysis
     - Level 4 PAs: Quantitative Project Management, Organizational Process Performance
6. Basic Definitions
   - Metric
     - "A quantitative measure of the degree to which a system, component or process possesses a given attribute" (IEEE)
     - e.g., number of defects per KLOC/FP, number of test cases authored per hour, defects per person-hour
   - Measurement
     - The act of determining a measure of something
   - Measure
     - A single quantitative attribute of an entity; the basic building block of measurement
     - e.g., in "100 LOC", 100 is the measure and LOC is the unit of measure
7. Institutionalize Metrics Program (CMMI L2 - PA 5: Measurement and Analysis)
   - SG 1 Align Measurement and Analysis Activities
     - SP 1.1 Establish Measurement Objectives
     - SP 1.2 Specify Measures
     - SP 1.3 Specify Data Collection and Storage Procedures
     - SP 1.4 Specify Analysis Procedures
   - SG 2 Provide Measurement Results
     - SP 2.1 Collect Measurement Data
     - SP 2.2 Analyze Measurement Data
     - SP 2.3 Store Data and Results
     - SP 2.4 Communicate Results
   - Measurement cycle (flowchart on slide): clarify business goals; prioritize issues; select and define measures; collect, verify and store data; analyze process behavior; if the process is not stable, remove assignable causes; if it is stable but not capable, change the process; continually improve, looping back whenever new issues, new measures or new goals arise
8. What to Measure?
   - Process Metrics: metrics used to control the processes in a software system (e.g., productivity, efficiency)
   - Product Metrics: metrics used to control the work products of the software life cycle (not within the scope of EQA)
   - Project Metrics: metrics used to control the project life cycle (e.g., effort variation, schedule variation)
   - Quality Metrics: metrics used to control the quality of a product or service (e.g., CSI, % test cases modified)
9. ...What to Measure? Software Test Metrics
   - Product Metrics: 1. Size Variation  2. Defect Density  3. Code Coverage  4. MTBF
   - Process Metrics: 1. TCA Productivity  2. TCR Productivity  3. TCE Productivity  4. Test Case Challenged Percentage
   - Project Metrics: 1. Effort Variation  2. Schedule Variation  3. Schedule Compliance  4. Staff Utilization
   - Quality Metrics: 1. Adhoc Bug %  2. Challenged Bug %  3. Rejected Bug %  4. Customer Satisfaction Index
10. Data Collection Strategy
   - Source artifacts feed two central sheets: Project Data Collection EQA 2.0 D1.xls and Project WBS EQA 1.0 D1.xls; guidelines and templates govern the collection
   - WBS (.xls): Planned Effort, Actual Effort, Planned Start/Finish dates, Interim Start/Finish dates, Actual Start/Finish dates
   - Time Sheet (.xls): Resource Name, Project Name, Build Name, Planned Tasks, Unplanned Tasks, Time Spent
   - Estimation Sheet (.xlt) / Estimation Methodology: Estimated Size, Estimated Effort, Estimated Resource Count
   - DTS (.xls), covering testing defects and customer-identified defects (CID): Defect ID, Description, Source/Location, Identified date/by, Defect Type/Class, Detected in Phase, Injected in Phase, Defect Severity, Defect Status
   - RL (.xls), covering review errors and defects: Error/Defect ID, Description, Source/Location, Identified date/by, Error/Defect Status
   - Test Report (.xls): Test Case ID, Executed by, Execution date, Test Procedure, Expected Results, Actual Results, Execution Status, Defect Description
   - CID Report (.xls): Defect Description, Identified by, Identified date, Defect Type, Defect Priority
   - Derived metrics: 1. Effort Variation  2. Schedule Variation  3. TCA Productivity  4. TCR Productivity  5. TCE Productivity  6. Challenged TC %  7. Adhoc Bug %  8. Challenged Bug %  9. Rejected Bug %
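As a rough illustration of how the collected data items above could be structured, here is a minimal sketch using Python dataclasses. The field names follow the slide; the class names and types are assumptions, not the actual EQA sheet layouts.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DefectRecord:
    """One row of the DTS defect data described above (hypothetical layout)."""
    defect_id: str
    description: str
    source_location: str
    identified_by: str
    identified_date: date
    defect_type: str          # Defect Type / Class
    detected_in_phase: str
    injected_in_phase: str
    severity: str             # e.g. Critical / High / Medium / Low
    status: str

@dataclass
class TestExecutionRecord:
    """One row of the Test Report data described above (hypothetical layout)."""
    test_case_id: str
    executed_by: str
    execution_date: date
    test_procedure: str
    expected_results: str
    actual_results: str
    execution_status: str     # e.g. Pass / Fail / Blocked
    defect_description: str = ""
```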
11. Operational Definitions
   - For each metric, define:
     - Objective
     - Definition
     - Formula used
     - Unit of Measure
     - Key Note
   - For each metric, also specify:
     - Data Input
     - Data Source
     - Responsibility
     - Frequency
   - The full operational-definition template covers three areas:
     - A. Metrics Operational Definitions: Metric, UOM, Attribute/Entity, Definition, Life Cycle, Base Measure, Measurement Method
     - B. Decision Criteria: Data Pattern, Reporting Format, Data Extraction Cycle, Reporting Cycle, Distribution, Availability
     - C. Data Collection Procedure: Data Item, Data Type, Database, Record, Data Elements/Fields, Who Collects the Data, Data Collection Rules & Procedures
12. Effort Variation
   - Objective
     - To improve estimation and productivity
   - Definition
     - Effort variation is the % deviation of the actual effort spent on a project/phase/activity from the estimated effort
   - Formula used
     - Effort Variation = ((Actual Effort - Estimated Effort) / Estimated Effort) x 100%
   - Unit of Measure
     - %
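A minimal sketch of the formula above, assuming both effort figures are in person-hours; the function name and example numbers are illustrative, not part of the deck.

```python
def effort_variation(actual_effort: float, estimated_effort: float) -> float:
    """% deviation of actual effort from estimated effort."""
    return (actual_effort - estimated_effort) / estimated_effort * 100.0

# Example: 230 PH spent against an estimate of 200 PH -> +15.0 %
print(effort_variation(actual_effort=230, estimated_effort=200))
```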
13. Effort Variation
   - Key Note
     - If the figure is negative, less effort was spent on the project than estimated
     - If the figure is positive, more effort was spent than estimated
     - Can be used as a multiplication factor on the estimated effort to arrive at a more realistic figure
   - Data Input
     - Activity code / Activity name
     - Actual Effort (PH)
     - Estimated Effort (PH)
     - Estimation methodology
   - Other Inputs
     - Effort type (Requirement Analysis / Test Design / Test Execution)
     - Product / Build / Module / Phase
14. Effort Variation
   - Effort can be derived from size if the productivity factor is known:
     Effort (PH) = Size (# TC) / Productivity (# TCA, TCR or TCE per hour)
   - Data Collection Sheet for Effort Variation - Phase Wise
     Artifact: Project WBS EQA 1.0 D1.xls
     Columns: Activity Code | Product | Build | Modules | Phase | Activity | Planned Effort (person hrs) | Actual Effort (person hrs) | % Variation
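A small sketch of the derivation above, together with the "multiplication factor" idea from the key note on slide 13; the productivity and size figures are illustrative only.

```python
def effort_from_size(size_tc: int, productivity_tc_per_hour: float) -> float:
    """Effort (PH) derived from size when the productivity factor is known."""
    return size_tc / productivity_tc_per_hour

# 120 test cases to author at 4 TCA/hr -> 30 PH
base_effort = effort_from_size(120, 4.0)

# Applying a past effort variation of +15 % as a correction factor
realistic_effort = base_effort * (1 + 15 / 100)
print(base_effort, realistic_effort)   # 30.0 34.5
```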
15. Effort Variation
   - Responsibility
     - PL/TL (EQA) will provide the estimated effort data
     - Actual effort data per activity will be provided by each staff member through the Time Sheet
   - Frequency
     - Estimated effort is sent to SEPG at the time of estimation and effort distribution (including re-estimates)
     - Actual effort data is sent to SEPG on a weekly basis (Time Sheet)
   - Data Source
     - Estimated effort: Proposal.doc, Contract.doc, Test Effort Estimation Sheet.doc
     - Actual effort: Project Data Collection.xls, Project WBS EQA.xls, Time Sheet.xls
16. Schedule Variation
   - Objective
     - To identify the current status of the project and check whether it can meet its schedule deadline
   - Definition
     - Schedule variation is the % deviation of the actual duration from the planned duration of a project, phase or activity
   - Formula used
     - Schedule Variation = ((Actual Finish - Planned Finish) / ((Planned Finish - Planned Start) + 1)) x 100%
   - Unit of Measure
     - %
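A minimal sketch of the schedule-variation formula, assuming calendar dates; the +1 in the denominator counts the planned duration inclusively of both end dates. The example dates are illustrative.

```python
from datetime import date

def schedule_variation(planned_start: date, planned_finish: date,
                       actual_finish: date) -> float:
    """% deviation of the actual finish from the planned finish, relative to planned duration."""
    planned_duration = (planned_finish - planned_start).days + 1  # inclusive of both ends
    return (actual_finish - planned_finish).days / planned_duration * 100.0

# Planned 1-10 March (10 calendar days), actually finished 13 March -> +30.0 %
print(schedule_variation(date(2009, 3, 1), date(2009, 3, 10), date(2009, 3, 13)))
```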
17. Schedule Variation
   - Key Note
     - If the figure is negative, the schedule is ahead of the target duration
     - If the figure is positive, the schedule has crossed the target duration
     - Can be used as a multiplication factor to estimate a realistic schedule for each milestone
   - Data Input
     - Activity code / Activity name
     - Planned start & finish dates
     - Planned Duration (PD)
     - Actual start & finish dates
     - Actual Duration (AD)
     - % complete
   - Other Inputs
     - Task type (Technical/Project, Planned/Unplanned)
     - Product / Build / Module / Phase
18. Schedule Variation
   - Data Collection Sheet for Schedule Variation - Phase Wise
     Artifact: Project WBS EQA 1.0 D1.xls
     Columns: Act. Code | Product | Build | Modules | Phase | Activity | Plan Duration (cal. days) | Plan Start Date | Plan Finish Date | Actual Duration (cal. days) | Actual Start Date | Actual Finish Date | % Variation | % Complete
19. Schedule Variation
   - Responsibility
     - PL/TL (EQA) will provide the planned schedule data
   - Frequency
     - Schedule data is reported to SEPG on WBS completion and whenever the project is re-scheduled during the PLC phases
     - Final schedule data is sent at project closure
   - Data Source
     - Planned duration: WBS Sheet.mpp, Test Plan.doc
     - Actual duration: Project Data Collection.xls, Project WBS EQA.xls
20. Productivity
   - Objective
     - To find out the productivity of a project
   - Definition
     - The size of the task completed per hour of effort
   - Formula used
     - (TCA/TCR/TCE) Productivity = Actual Size (# TC) / Effort (PH)
   - Unit of Measure
     - # TCA/PH, # TCR/PH, # TCE/PH
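A minimal sketch of the productivity formula applied to test case authoring, review and execution; the counts and hours are illustrative, not taken from a real project.

```python
def productivity(size_tc: int, effort_ph: float) -> float:
    """Test case productivity: number of test cases per person-hour."""
    return size_tc / effort_ph

# Illustrative figures for one week of one resource
tca = productivity(size_tc=60, effort_ph=20)   # 3.0 TCA/PH
tcr = productivity(size_tc=90, effort_ph=10)   # 9.0 TCR/PH
tce = productivity(size_tc=80, effort_ph=16)   # 5.0 TCE/PH
print(tca, tcr, tce)
```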
21. Productivity
   - Key Note
     - The number of test steps shall also be taken into account
     - For test scripts, Lines of Script (LOS) shall be considered
     - For back-end scripting, the LOS generated by the tool shall be counted
   - Data Input
     - # TC Authored, # TC Reviewed, # TC Executed
     - TCA Effort spent, TCR Effort spent, TCE Effort spent
   - Other Inputs
     - Date
     - Resource ID
     - Product / Build / Module
22. Productivity
   - Data Collection Sheet for Test Case Productivity
     Artifact: Project Data Collection EQA 2.0 D1.xls
     Columns: Date | Resource ID | Product | Build | Module | # TC Authored | TC Authoring Effort | # TC Reviewed | TC Reviewing Effort | # TC Executed | TC Execution Effort
23. Productivity
   - Responsibility
     - PL/TL will provide the size and effort data (design/documentation/manual testing)
     - The Size Capturing Tool will provide size data for coding/automated testing
   - Frequency
     - Productivity data is sent to SEPG on a weekly basis (every Friday)
     - Final size and effort details are reported at every phase milestone
   - Data Source
     - Effort data: Project Data Collection EQA 2.0 D1.xls
     - Size data: Project Data Collection EQA 2.0 D1.xls, Size Capturing Tools
24. Adhoc/Challenged/Rejected Bug %
   - Objective
     - To effectively identify and report bugs early in the product
   - Definition
     - The % of Adhoc/Challenged/Rejected bugs compared to the total number of bugs identified in the product
   - Formula used
     - Adhoc/Chall./Rej. Bug % = (# of Adhoc/Chall./Rej. bugs / Total # of bugs found) x 100%
   - Unit of Measure
     - %
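A minimal sketch of the bug-percentage formula, applied to each bug category in turn; the counts are illustrative only.

```python
def bug_percentage(category_count: int, total_bugs: int) -> float:
    """% of bugs in a category (adhoc / challenged / rejected) out of all bugs found."""
    return category_count / total_bugs * 100.0

total = 250                        # total bugs posted in the build (illustrative)
print(bug_percentage(25, total))   # adhoc bugs      -> 10.0 %
print(bug_percentage(10, total))   # challenged bugs ->  4.0 %
print(bug_percentage(5, total))    # rejected bugs   ->  2.0 %
```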
25. Adhoc/Challenged/Rejected Bug %
   - Key Note
     - Challenged bugs are bugs challenged by the client and later accepted
     - Adhoc bugs are bugs identified during adhoc/exploratory testing
   - Data Input
     - Bugs by testing type
     - Total bugs posted
     - # Enhancement bugs
     - # Challenged bugs
     - # Redundant bugs
     - # Invalid bugs
   - Other Inputs
     - Date
     - Resource ID
     - Product / Build / Module
     - Severity (Critical/High/Medium/Low)
26. Adhoc/Challenged/Rejected Bug %
   - Data Collection Sheet for Bug Details
     Artifact: Project Data Collection EQA 2.0 D1.xls
     Columns: Date | Resource ID | Product | Build | Module | Bugs by Testing Type | # Bugs Posted | # Enhancements | # Challenged Bugs | # Redundant Bugs | # Invalid Bugs
27. Adhoc/Challenged/Rejected Bug %
   - Responsibility
     - PL/TL will provide the review bug data
   - Frequency
     - Bug data is sent to SEPG on a weekly basis (every Friday)
     - Final review defect details are reported at project closure
   - Data Source
     - Bug data: Project Data Collection EQA 2.0 D1.xls
28. Process Capability Baseline: Is the Process Stable/Capable?
   - Variation brings inconsistency into a process
   - Variations are due to either chance causes or assignable causes
   - 80% of variation is caused by 20% of the causes
   - Eliminating assignable-cause variation yields a stable process
   - However, a stable process may not be capable
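The slide asks whether the process is stable. One common way to check this (not prescribed by the deck) is an individuals/moving-range (XmR) control chart, where points beyond roughly 2.66 times the average moving range from the centre line signal assignable causes. A minimal sketch, assuming a hypothetical series of weekly effort-variation values:

```python
def xmr_limits(values: list[float]) -> tuple[float, float, float]:
    """Centre line and natural process limits for an individuals (XmR) chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

# Illustrative weekly effort-variation data (%)
data = [10, 11, 10, 12, 11, 40, 10, 11, 12, 10]
centre, lcl, ucl = xmr_limits(data)
out_of_control = [x for x in data if x < lcl or x > ucl]
print(f"centre={centre:.1f}, limits=({lcl:.1f}, {ucl:.1f}), assignable-cause points={out_of_control}")
```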
29. Metrics based Project Mgmt.
30. Metrics based Project Mgmt.
31. Metrics based Project Mgmt.
32. Thank You - Questions?
