Software Metrics

Slide notes

• Process = the software development process (requirements, design, code, test, implementation). Product = code, requirements specification, production software, design documentation, risk assessment. Process metrics give insight into the dynamics of a given software process, enabling project management to evaluate the efficiency of that process. Project metrics provide software project management with a means to measure the risk, progress, and quality of the project. Product metrics provide software developers with a means to measure defects, errors, flaws, and design strategies.
• Error rates and counts; defect rates and counts.
• Earned-value and project-tracking measures (see the sketch after these notes): Budgeted Cost of Work Scheduled (BCWS); Budgeted Cost of Work Performed (BCWP); Actual Cost of Work Performed (ACWP); Budget at Completion: BAC = Σ(BCWS); Cost Variance: CV = BCWP - ACWP; Cost Performance Index: CPI = BCWP / ACWP; Estimate at Completion: EAC = BAC / CPI; Estimate to Completion: ETC = EAC - ACWP. Also: effort/time per SE task; distribution of effort per SE task; schedule vs. actual milestones; errors uncovered per review hour; percentage slippage per time period. Risk: risk type (technical, personnel, vendor, ...), probability, impact, overall risk, risk management strategy. Effort expended (force), schedule expended (distance), productivity rates (pages/hour, delivered documents, SLOC/day).
• Size (function points, SLOC, modules, subsystems, pages, documents); error density (errors/KSLOC); defect density (defects/KSLOC or defects/FP); quality (correctness, maintainability, reliability, efficiency, integrity, usability, flexibility, testability, portability, reusability, availability, complexity, understandability, modifiability).
• Independent of programming language; does not penalize code that limits its LOC by efficient means.
• Each Fi has a value from 0 to 5.
• In the mid-70s Halstead developed his Software Science. His metrics are important more as background knowledge than as a real measure. Notice that they have nothing to do with the representational theory of measurement. The real point is "What is an operand?" "What is an operator?" ... the situation is not really clear nowadays with Java. He also defined approximations, so that these computations could be performed before the actual program is written.
• Frustrations → goals. Project manager: our estimated project schedules always end up being way off. Functional manager: if we have another late delivery... Engineer: too much overtime; not enough time to do things right. Test manager: no time left to test the product before it is scheduled to ship. Resulting goal: improve the schedule estimation process.
• Turning questions into lower-level goals. High-level goal: ship only defect-free software. Questions: Is the software adequately tested? How many defects are still undetected? Are all known defects corrected? Lower-level goal: ensure all known defects are corrected before shipment.
• Select definitions from the literature that match your organizational goals; use them as a basis for creating your own definitions; apply them consistently; include them in an appendix to each metrics report.
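The earned-value formulas in the budget note above are simple enough to verify in a few lines. A minimal sketch, with the task breakdown and dollar amounts invented for illustration:

```python
# Earned-value calculations from the slide notes above.
# All figures are invented for the example.
bcws = [10_000, 12_000, 8_000]  # budgeted cost of work scheduled, per task
bcwp = 24_000                   # budgeted cost of work performed (earned value)
acwp = 30_000                   # actual cost of work performed

bac = sum(bcws)     # budget at completion: BAC = sum(BCWS)
cv = bcwp - acwp    # cost variance: CV = BCWP - ACWP
cpi = bcwp / acwp   # cost performance index: CPI = BCWP / ACWP
eac = bac / cpi     # estimate at completion: EAC = BAC / CPI
etc = eac - acwp    # estimate to completion: ETC = EAC - ACWP

print(f"BAC={bac}  CV={cv}  CPI={cpi:.2f}  EAC={eac:,.0f}  ETC={etc:,.0f}")
# -> BAC=30000  CV=-6000  CPI=0.80  EAC=37,500  ETC=7,500
```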
Transcript of "Software Metrics"

1. SEA Side: Software Engineering Annotations. Annotation 11: Software Metrics. A one-hour presentation to inform you of new techniques and practices in software development. Professor Sara Stoecklin, Director of Software Engineering - Panama City, Florida State University - Computer Science. [email_address] [email_address] 850-522-2091, 850-522-2023 Ext. 182
2. Express in Numbers: Metrics. Measurement provides a mechanism for objective evaluation.
3. Software Crisis
• According to American Programmer, 31.1% of computer software projects are canceled before they are completed;
• 52.7% overrun their initial cost estimates by 189%;
• 94% of project start-ups are restarts of previously failed projects.
The solution? A systematic approach to software development and measurement.
4. Software Metrics
• The term refers to a broad range of quantitative measurements for computer software that enable us to:
  - improve the software process continuously
  - assist in quality control and productivity
  - assess the quality of technical products
  - assist in tactical decision-making
5. Measure, Metrics, Indicators
• Measure: provides a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or process.
• Metric: relates the individual measures in some way.
• Indicator: a combination of metrics that provides insight into the software process, project, or product itself.
6. What Should Be Measured?
[Diagram: measurement is applied to both the process and the product; the process yields process metrics and project metrics, the product yields product metrics. What do we use as a basis? Size? Function?]
7. Metrics of Process Improvement
• Focus on a manageable, repeatable process
• Use of statistical SQA on the process
• Defect removal efficiency
8. Statistical Software Process Improvement
• All errors and defects are categorized by origin.
• The cost to correct each error and defect is recorded.
• The number of errors and defects in each category is counted and ranked in descending order.
• The overall cost in each category is computed.
• The resultant data are analyzed and the "culprit" category is uncovered.
• Plans are developed to eliminate the errors.
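The ranking step of this procedure is essentially a Pareto analysis. A minimal sketch, assuming defect records arrive as (origin category, correction cost) pairs; the categories and costs are invented:

```python
from collections import Counter, defaultdict

# Invented defect records: (origin_category, correction_cost) pairs.
defects = [("incomplete spec", 420), ("logic error", 150),
           ("incomplete spec", 380), ("interface error", 200),
           ("logic error", 90),      ("incomplete spec", 510)]

counts = Counter(origin for origin, _ in defects)  # defects per category
costs = defaultdict(int)                           # total cost per category
for origin, cost in defects:
    costs[origin] += cost

# Rank categories in descending order of count, then flag the
# "culprit" (highest total cost) category.
for origin, n in counts.most_common():
    print(f"{origin:18} count={n}  total_cost={costs[origin]}")
print("culprit category:", max(costs, key=costs.get))
```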
9. Causes and Origin of Defects
10. Metrics of Project Management
• Budget
• Schedule/resource management
• Risk management
• Project goals met or exceeded
• Customer satisfaction
11. Metrics of the Software Product
• Focus on deliverable quality
• Analysis products
• Design product complexity: algorithmic, architectural, data flow
• Code products
• Production system
12. How Is Quality Measured?
• Analysis metrics
  - Function-based metrics: function points (Albrecht), feature points (C. Jones)
  - Bang metric (DeMarco): functional primitives, data elements, objects, relationships, states, transitions, external manual primitives, input data elements, output data elements, persistent data elements, data tokens, relationship connections
13. Source Lines of Code (SLOC)
• Measures the number of physical lines of active code.
• In general, the higher the SLOC in a module, the less understandable and maintainable the module is.
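A minimal sketch of a physical-SLOC counter, assuming "active code" means non-blank lines that are not full-line comments; only "//"-style line comments are recognized here, whereas real counting tools also handle block comments and continuation lines:

```python
def count_sloc(path: str) -> int:
    """Count non-blank, non-comment physical lines in a source file."""
    sloc = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            stripped = line.strip()
            # Skip blank lines and C/C++-style full-line comments.
            if stripped and not stripped.startswith("//"):
                sloc += 1
    return sloc

print(count_sloc("module.c"))  # hypothetical input file
```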
14. Function-Oriented Metric: Function Points
• Function points are a measure of "how big" the program is, independent of its actual physical size.
• It is a weighted count of several features of the program.
• Detractors claim FPs make no sense with respect to the representational theory of measurement.
• There are firms and institutions that take them very seriously.
15. Analyzing the Information Domain
Assuming all inputs have the same weight, all outputs the same weight, and so on, the count reduces to a weighted sum over the five information-domain types. Complete formula for the Unadjusted Function Points:
UFP = Σ (count_i × weight_i), summed over external inputs, external outputs, external inquiries, internal logical files, and external interface files.
16. Taking Complexity into Account
Formula: FP = UFP × (0.65 + 0.01 × Σ Fi), where F1..F14 are the complexity adjustment factors (each Fi has a value from 0 to 5; see the slide notes).
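Putting slides 15 and 16 together, a minimal sketch using the standard average-complexity weights for the five information-domain types; the counts and Fi ratings are invented:

```python
# Unadjusted function points: weighted count over the five
# information-domain types (standard average-complexity weights).
weights = {"inputs": 4, "outputs": 5, "inquiries": 4,
           "internal_files": 10, "external_interfaces": 7}
counts = {"inputs": 12, "outputs": 8, "inquiries": 5,
          "internal_files": 3, "external_interfaces": 2}  # invented example

ufp = sum(counts[k] * weights[k] for k in weights)

# 14 complexity adjustment factors, each rated 0..5 (invented ratings).
fi = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 3, 2, 4, 3]
fp = ufp * (0.65 + 0.01 * sum(fi))

print(f"UFP={ufp}  FP={fp:.1f}")  # -> UFP=152  FP=167.2
```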
17. Typical Function-Oriented Metrics
• errors per FP
• defects per FP
• $ per FP
• pages of documentation per FP
• FP per person-month
18. LOC vs. FP
• The relationship between lines of code and function points depends on the programming language used to implement the software and on the quality of the design.
• Empirical studies show an approximate relationship between LOC and FP.
19. LOC/FP (average)
• Assembly language: 320
• C: 128
• COBOL, FORTRAN: 106
• C++: 64
• Visual Basic: 32
• Smalltalk: 22
• SQL: 12
• Graphical languages (icons): 4
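The table supports a rough "backfiring" conversion between the two size measures; a minimal sketch using the averages above, with an invented 100-FP system:

```python
# Rough LOC estimate from a function-point count, using the table's
# average LOC-per-FP ratios ("backfiring").
loc_per_fp = {"Assembly": 320, "C": 128, "COBOL/FORTRAN": 106, "C++": 64,
              "Visual Basic": 32, "Smalltalk": 22, "SQL": 12}

fp = 100  # invented example size
for lang, ratio in loc_per_fp.items():
    print(f"{lang:15} ~{fp * ratio:,} LOC")
```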
20. How Is Quality Measured?
• Design metrics
  - Structural complexity: fan-in, fan-out, morphology
  - System complexity
  - Data complexity
  - Component metrics: size, modularity, localization, encapsulation, information hiding, inheritance, abstraction, complexity, coupling, cohesion, polymorphism
• Implementation metrics
  - Size, complexity, efficiency, etc.
21. Comment Percentage (CP)
• The number of commented lines of code divided by the number of non-blank lines of code.
• Usually 20% indicates adequate commenting for C or Fortran code.
• The higher the CP value, the more maintainable the module is.
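A minimal sketch of the CP calculation, assuming "#"-style full-line comments stand in for whatever comment convention the measured language uses:

```python
def comment_percentage(lines: list[str]) -> float:
    """CP = commented lines / non-blank lines, as a percentage."""
    non_blank = [ln for ln in lines if ln.strip()]
    commented = [ln for ln in non_blank if ln.lstrip().startswith("#")]
    return 100.0 * len(commented) / len(non_blank) if non_blank else 0.0

source = ["# compute total", "total = 0", "", "# add items",
          "for x in items:", "    total += x"]
print(f"{comment_percentage(source):.0f}%")  # -> 40%: 2 of 5 non-blank lines
```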
22. Size-Oriented Metric: Fan In and Fan Out
• The fan in of a module is the amount of information that "enters" the module.
• The fan out of a module is the amount of information that "exits" the module.
• We assume all pieces of information have the same size.
• Fan in and fan out can be computed for functions, modules, objects, and also non-code components.
• Goal: low fan out, for ease of maintenance.
23. Size-Oriented Metric: Halstead's Software Science
• Primitive measures:
  - n1 = number of distinct operators
  - n2 = number of distinct operands
  - N1 = total number of operator occurrences
  - N2 = total number of operand occurrences
• Used to derive:
  - maintenance effort of software
  - testing time required for software
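A minimal sketch of the primitive measures, plus the standard Halstead derivations (vocabulary, length, volume, difficulty, effort) that the maintenance and testing estimates build on. The token streams are invented; as the slide notes observe, deciding what counts as an operator or operand is the hard part, and a real tool would classify tokens by parsing the source:

```python
import math

# Invented token streams for one small module.
operators = ["=", "+", "=", "*", "if", "==", "return"]
operands = ["x", "y", "x", "2", "y", "x", "0", "y"]

n1, n2 = len(set(operators)), len(set(operands))  # distinct operators/operands
N1, N2 = len(operators), len(operands)            # total occurrences

vocabulary = n1 + n2
length = N1 + N2
volume = length * math.log2(vocabulary)   # Halstead volume
difficulty = (n1 / 2) * (N2 / n2)         # standard Halstead derivation
effort = difficulty * volume              # proxy for development effort

print(f"n1={n1} n2={n2} N1={N1} N2={N2} V={volume:.1f} E={effort:.1f}")
```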
24. Flow Graph
Example: if (a) { X(); } else { Y(); }. Node a is the predicate node, branching to X and Y and rejoining below.
• V(G) = E - N + 2
• where E = number of edges and N = number of nodes
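A minimal sketch computing V(G) from an edge list; the graph encoded here is the slide's if/else example (predicate a, branches X and Y rejoining at a final node), giving V(G) = 4 - 4 + 2 = 2:

```python
# Cyclomatic complexity V(G) = E - N + 2 from an edge list.
edges = [("a", "X"), ("a", "Y"), ("X", "end"), ("Y", "end")]

nodes = {n for edge in edges for n in edge}
v_of_g = len(edges) - len(nodes) + 2
print(v_of_g)  # -> 2: one predicate node, two linearly independent paths
```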
25. McCabe's Metric
• The smaller the V(G), the simpler the module.
• Modules with V(G) greater than 10 become hard to manage.
• A high cyclomatic complexity indicates that the code may be of low quality and difficult to test and maintain.
26. Chidamber and Kemerer Metrics
• Weighted methods per class (WMC)
• Depth of inheritance tree (DIT)
• Number of children (NOC)
• Coupling between object classes (CBO)
• Response for class (RFC)
• Lack of cohesion metric (LCOM)
27. Weighted Methods per Class (WMC)
WMC = Σ c_i over the methods M_i of the class, where c_i is the complexity of method M_i.
• Often, only public methods are considered.
• Complexity may be the McCabe complexity of the method.
• Smaller values are better.
• Perhaps the average complexity per method is a better metric?
The number and complexity of the methods involved is a direct predictor of how much time and effort is required to develop and maintain the class.
28. Depth of Inheritance Tree (DIT)
• For the system under examination, consider the hierarchy of classes.
• DIT is the length of the maximum path from the node to the root of the tree.
• Relates to the scope of the properties: how many ancestor classes can potentially affect a class.
• Smaller values are better.
29. Number of Children (NOC)
• For any class in the inheritance tree, NOC is the number of immediate children of the class (the number of direct subclasses).
• How would you interpret this number?
• A moderate value indicates scope for reuse; high values may indicate an inappropriate abstraction in the design.
30. Coupling Between Object Classes (CBO)
• For a class C, the CBO metric is the number of other classes to which the class is coupled.
• A class X is coupled to class C if X operates on (affects) C, or C operates on X.
• Excessive coupling indicates weakness of class encapsulation and may inhibit reuse.
• High coupling also indicates that more faults may be introduced due to inter-class activities.
31. Response for Class (RFC)
• RFC is the size of the class's response set: its own methods plus, for each method M_i, the methods called in response to a message that invokes M_i (the fully nested set of calls).
• Smaller numbers are better; larger numbers indicate increased complexity and debugging difficulties.
If a large number of methods can be invoked in response to a message, the testing and debugging of the class becomes more complicated.
32. Lack of Cohesion Metric (LCOM)
• Based on the number of methods in a class that reference a specific instance variable.
• A measure of the "tightness" of the code.
• If a method references many instance variables, then it is more complex and less cohesive.
• The larger the number of similar methods in a class, the more cohesive the class is.
• Cohesiveness of methods within a class is desirable, since it promotes encapsulation.
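One common formulation, the original Chidamber-Kemerer LCOM, compares method pairs that share no instance variables (p) with pairs that share at least one (q). A minimal sketch, with an invented class layout:

```python
from itertools import combinations

# Instance variables referenced by each method of one class (invented).
method_vars = {
    "deposit":  {"balance"},
    "withdraw": {"balance"},
    "owner":    {"name", "address"},
}

p = q = 0  # p: pairs sharing no variables, q: pairs sharing at least one
for m1, m2 in combinations(method_vars.values(), 2):
    if m1 & m2:
        q += 1
    else:
        p += 1

lcom = max(p - q, 0)  # CK define LCOM as p - q, floored at zero
print(lcom)  # -> 1: deposit/withdraw cohere, owner stands apart
```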
33. Testing Metrics
• Metrics that predict the likely number of tests required during the various testing phases.
• Metrics that focus on test coverage for a given component.
34. Views on SE Measurement
35. Views on SE Measurement
36. Views on SE Measurement
37. 12 Steps to Useful Software Metrics
• Step 1 - Identify Metrics Customers
• Step 2 - Target Goals
• Step 3 - Ask Questions
• Step 4 - Select Metrics
• Step 5 - Standardize Definitions
• Step 6 - Choose a Measurement Model
• Step 7 - Establish Counting Criteria
• Step 8 - Decide on Decision Criteria
• Step 9 - Define Reporting Mechanisms
• Step 10 - Determine Additional Qualifiers
• Step 11 - Collect Data
• Step 12 - Consider Human Factors
38. Step 1 - Identify Metrics Customers
• Who needs the information?
• Who's going to use the metrics?
If a metric does not have a customer, do not use it.
39. Step 2 - Target Goals
• Organizational goals
  - Be the low-cost provider
  - Meet projected revenue targets
• Project goals
  - Deliver the product by June 1st
  - Finish the project within budget
• Task goals (entry & exit criteria)
  - Effectively inspect software module ABC
  - Obtain 100% statement coverage during testing
40. Step 3 - Ask Questions
Goal: Maintain a high level of customer satisfaction.
• What is our current level of customer satisfaction?
• What attributes of our products and services are most important to our customers?
• How do we compare with our competition?
41. Step 4 - Select Metrics
• Select metrics that provide information to help answer the questions.
• Be practical, realistic, pragmatic.
• Consider the current engineering environment.
• Start with the possible.
Metrics don't solve problems; people solve problems. Metrics provide information so people can make better decisions.
42. Selecting Metrics
Goal: Ensure all known defects are corrected before shipment.
43. Metrics Objective Statement Template
Template: To [understand | evaluate | control | predict] the [attribute] of the [entity] in order to [goal(s)].
Example - Metric: % defects corrected. To evaluate the % of defects found & corrected during testing in order to ensure all known defects are corrected before shipment.
44. Step 5 - Standardize Definitions
[Diagram contrasting "Developer Open" and "User Open" definitions of a defect]
45. Step 6 - Choose a Measurement Model
• Models for code inspection metrics.
• Primitive measurements:
  - Lines of Code Inspected = loc
  - Hours Spent Preparing = prep_hrs
  - Hours Spent Inspecting = in_hrs
  - Discovered Defects = defects
• Other measurements:
  - Preparation Rate = loc / prep_hrs
  - Inspection Rate = loc / in_hrs
  - Defect Detection Rate = defects / (prep_hrs + in_hrs)
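A minimal sketch of this measurement model, with invented values for the primitives:

```python
# Derived code-inspection measurements from the primitives above.
loc, prep_hrs, in_hrs, defects = 450, 3.0, 2.5, 12  # invented primitives

preparation_rate = loc / prep_hrs                     # LOC prepared per hour
inspection_rate = loc / in_hrs                        # LOC inspected per hour
defect_detection_rate = defects / (prep_hrs + in_hrs)  # defects per hour

print(f"prep={preparation_rate:.0f} loc/h  "
      f"inspect={inspection_rate:.0f} loc/h  "
      f"detect={defect_detection_rate:.1f} defects/h")
```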
46. Step 7 - Establish Counting Criteria
• Lines of code: variations in counting; no industry-accepted standard.
• SEI guideline: check sheets for criteria.
• Advice: use a tool.
47. Counting Criteria - Effort
• What is a software project?
• When does it start / stop?
• What activities does it include?
• Who works on it?
48. Step 8 - Decide on Decision Criteria
• Establish baselines
• Current value
  - Problem report backlog
  - Defect-prone modules
• Statistical analysis (mean & distribution)
  - Defect density
  - Fix response time
  - Cycle time
  - Variance from budget (e.g., cost, schedule)
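A minimal sketch of the statistical-analysis baseline, assuming historical defect densities per release are available; the numbers and the two-sigma criterion are invented for illustration:

```python
import statistics

# Historical defect densities (defects/KSLOC) from past releases (invented).
densities = [4.2, 3.8, 5.1, 4.6, 3.9, 4.4]

mean = statistics.mean(densities)
stdev = statistics.stdev(densities)

# Simple decision criterion: flag a new release whose density falls
# outside mean +/- 2 standard deviations of the baseline.
new_release = 5.9
if abs(new_release - mean) > 2 * stdev:
    print("outside baseline: investigate")
else:
    print("within baseline")
```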
49. Step 9 - Define Reporting Mechanisms
50. Step 10 - Determine Additional Qualifiers
• A good metric is a generic metric.
• Additional qualifiers:
  - provide demographic information
  - allow detailed analysis at multiple levels
  - define additional data requirements
51. Additional Qualifier Example
Metric: software defect arrival rate
• Release / product / product line
• Module / program / subsystem
• Reporting customer / customer group
• Root cause
• Phase found / phase introduced
• Severity
52. Step 11 - Collect Data
• What data to collect?
  - Metric primitives
  - Additional qualifiers
• Who should collect the data? The data owner:
  - has direct access to the source of the data
  - is responsible for generating the data
  - is more likely to detect anomalies
  - eliminates double data entry
53. Examples of Data Ownership
54. Step 12 - Consider Human Factors
• The people side of the metrics equation:
  - how measures affect people
  - how people affect measures
"Don't underestimate the intelligence of your engineers. For any one metric you can come up with, they will find at least two ways to beat it." [unknown]
55. Don't
• Measure individuals
• Use metrics as a "stick"
• Ignore the data
• Use only one metric (cost, quality, schedule)
56. Do
• Select metrics based on goals: goals lead to questions, questions lead to metrics [Basili-88]
• Focus on processes, products & services
• Provide feedback to the data providers
• Obtain "buy-in"
57. References
• Chidamber, S. R. & Kemerer, C. F., "A Metrics Suite for Object Oriented Design", IEEE Transactions on Software Engineering, Vol. 20, No. 6, June 1994.
• Hitz, M. and Montazeri, B., "Chidamber and Kemerer's Metrics Suite: A Measurement Theory Perspective", IEEE Transactions on Software Engineering, Vol. 22, No. 4, April 1996.
• Lacovara, R. C. and Stark, G. E., "A Short Guide to Complexity Analysis, Interpretation and Application", May 17, 1994. http://members.aol.com/GEShome/complexity/Comp.html
• Tang, M., Kao, M., and Chen, M., "An Empirical Study on Object-Oriented Metrics", Proceedings of the Sixth International Symposium on Software Metrics (METRICS '99), IEEE, 1999 (ISBN 0-7695-0403-5).
• Tegarden, D., Sheetz, S., Monarchi, D., "Effectiveness of Traditional Software Metrics for Object-Oriented Systems", Proceedings: 25th Hawaii International Conference on System Sciences, January 1992, pp. 359-368.
• "Principal Components of Orthogonal Object-Oriented Metrics", http://satc.gsfc.nasa.gov/support/OSMASAS_SEP01/Principal_Components_of_Orthogonal_Object_Oriented_Metrics.htm
• Behforooz & Hudson, Software Engineering Fundamentals, Oxford Press, 1996. Chapter 18: Software Quality and Quality Assurance.
• Pressman, R., Software Engineering: A Practitioner's Approach, McGraw-Hill, 1997.
• IEEE Standard for a Software Quality Metrics Methodology (IEEE 1061).
• Henderson-Sellers, B., Object-Oriented Metrics, Prentice-Hall, 1996.