
The Dark Side of Code Metrics



  1. The Dark Side of Code Metrics
  2. Why do we care so much?
     • Trying to quantify work completed
     • Trying to predict future work
  3. Lines of Code (LoC)
     • Quantifying outstanding or completed work
     • Which is better: fewer lines of code or more?
  4. Lines of Code (LoC)
  5. Lines of Code (LoC)
  6. Lines of Code (LoC)
  7. Lines of Code (LoC)
     • 1000s of ways to write the same thing
     • Context is king
     • An estimating crutch for management
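The "1000s of ways to write the same thing" point on slide 7 can be illustrated with a small, hypothetical sketch: two functions with identical behaviour but very different line counts, so a naive LoC metric values one several times more than the other.

```python
def sum_evens_short(numbers):
    # One line of logic: sum the even numbers.
    return sum(n for n in numbers if n % 2 == 0)

def sum_evens_long(numbers):
    # Identical behaviour, roughly 3x the lines of code.
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total = total + n
    return total
```

Under a LoC metric the second implementation looks like more "completed work", yet neither version is obviously higher quality -- which is exactly why context is king.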
  8. Amount of Comments
     • Sometimes measured as a LoC:LoComments ratio
     • Driven by the belief that comments increase the understandability of code
  9. Comments
  10. Comments
  11. Comments
  12. Comments
  13. Comments
  14. Amount of Comments
      • A crutch for lazy coding
      • Misuse of existing systems
      • Creates a maintenance burden
      • Guaranteed to be incorrect
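Slide 14's "guaranteed to be incorrect" claim is easy to see in a made-up example: a comment written for an earlier version of the code survives a later change, so a comment-counting metric still looks healthy while the comment actively misleads. (The VAT rates here are illustrative, not from the slides.)

```python
def price_with_tax(net):
    # Adds 17.5% VAT to the net price.
    # Stale comment: the rate below was later changed to 20%, but nobody
    # updated this line -- the comment count is unchanged, the meaning is wrong.
    return round(net * 1.20, 2)
```

A LoC:LoComments ratio cannot distinguish this comment from a useful one; only a reader who already understands the code can.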
  15. Code Coverage
      • Executing code via automated tests increases the quality of the code
      • The percentage of lines of code executed by automated tests
  16. Code Coverage
      1. Codebase has 85% coverage
      2. Developer refactors some code and commits changes
      3. Codebase has 82% coverage; build fails because the threshold is 85%
      What went wrong?
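One way slide 16's failure can happen, shown as a hypothetical worked example: the refactoring deleted well-tested duplicated code, so the percentage fell even though the codebase got better. The line counts below are invented to match the slide's numbers.

```python
def coverage(covered_lines, total_lines):
    """Percentage of lines executed by automated tests."""
    return 100.0 * covered_lines / total_lines

# Before: 850 of 1,000 lines covered -- passes the 85% gate.
before = coverage(850, 1000)

# Refactoring deletes 150 duplicated lines, all of which were covered.
# The codebase is smaller and cleaner, but the ratio drops below the gate.
after = coverage(850 - 150, 1000 - 150)   # ~82.4%
```

An arbitrary threshold punishes exactly the kind of change it was supposed to encourage.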
  17. Code Coverage
      • Codebase has 100% coverage
      • Developer makes daily releases to fix daily defect reports
      What’s the problem?
  18. Code Coverage
      • Class has 40 lines of code and 0% coverage
      What’s the problem?
  19. Code Coverage
      • Class has 5 lines of code and 0% coverage
  20. Code Coverage
      • Context is king
      • Who decides the arbitrary “acceptable” percentage?
      • Who decides what code needs to be tested?
  21. Maintainability Index
      MI = MAX(0, (171 - 5.2 * log(Halstead Volume) - 0.23 * (Cyclomatic Complexity) - 16.2 * log(Lines of Code)) * 100 / 171)
      • A higher number means more maintainable code
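Slide 21's formula can be sketched directly. This assumes the natural logarithm, as in the Visual Studio variant of the rescaled 0-100 Maintainability Index; the slide just says "log", so treat the base as an assumption.

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, lines_of_code):
    """Rescaled 0-100 Maintainability Index, per the slide's formula."""
    raw = (171
           - 5.2 * math.log(halstead_volume)        # assumed natural log
           - 0.23 * cyclomatic_complexity
           - 16.2 * math.log(lines_of_code))
    # Clamp at 0 and rescale 0-171 onto 0-100, higher = more maintainable.
    return max(0.0, raw * 100.0 / 171.0)
```

Note how the formula behaves: with volume, complexity, and LoC all at their minimum the score is 100, and growing any input only pushes it down -- which is why, without a peer to compare against, the absolute number says little.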
  22. Maintainability Index
      • What does it tell us though?
      • Comparison of two codebases?
        – Only if they have very similar LoC and CC values
      • Comparison of two modules?
      • How hard it is to understand the code?
  23. Maintainability Index
      • With no peer comparison it is an arbitrary number
      • What is an application’s peer though?
  24. Generated Code
      • Should metrics be applied?
      • What does the metric mean?
      • How easily can you exclude/separate it?
  25. The Lie
      • Metrics offer empirical evidence about a codebase
      • By themselves, metrics provide meaning
  26. The Truth
      • Metrics are nothing more than statistics
      • Can be made to represent anything
      • The numbers, by themselves, represent nothing
      • Interpretation is in the eye of the beholder
  27. The Light Side
      • The output numbers mean nothing by themselves
      • Look at interesting comparisons
        – Time based
        – Module based
        – Developer based
      • Look for deltas rather than actuals
  28. Deltas Over Time
      • How did <metric> change
        – since the last run?
        – since the last release?
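The delta comparison on slide 28 amounts to diffing two metric snapshots. A minimal sketch, with made-up metric names and values:

```python
# Snapshots of the same metrics at the last release and now.
# The deltas, not the absolute numbers, are what prompt questions.
last_release = {"coverage_pct": 85.0, "loc": 12000, "avg_complexity": 4.1}
current      = {"coverage_pct": 81.0, "loc": 15500, "avg_complexity": 6.8}

deltas = {name: current[name] - last_release[name] for name in current}
# Coverage fell 4 points while LoC and complexity both grew -- worth asking why,
# even though 81% coverage on its own might look perfectly acceptable.
```

The same diff run against the previous build, rather than the previous release, answers slide 28's other question.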
  29. Related Metrics
      • Code Coverage vs Defect Reports
      • Defect Reports vs Code Churn
      • Look at the deltas over a time period
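Slide 29's pairing of related metrics can be sketched by lining up per-period deltas side by side; all figures below are invented for illustration.

```python
# Coverage percentage and defect-report counts sampled once per sprint.
coverage_by_sprint = [85.0, 84.0, 80.0, 78.0]
defects_by_sprint  = [3, 4, 9, 12]

# Pair the sprint-over-sprint deltas of the two metrics.
paired_deltas = [
    (coverage_by_sprint[i] - coverage_by_sprint[i - 1],
     defects_by_sprint[i] - defects_by_sprint[i - 1])
    for i in range(1, len(coverage_by_sprint))
]
# -> [(-1.0, 1), (-4.0, 5), (-2.0, 3)]
```

In this fabricated series every coverage drop coincides with a defect spike; neither number alone carries that signal, but the paired deltas do.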
  30. Creating valuable meaning requires hard work