2010 01 lecture SIG UM MFES 2 - Patterns metrics quality

1,275 views

Published on

Published in: Education
0 Comments
0 Likes
Statistics
Notes
  • Be the first to comment

  • Be the first to like this

No Downloads
Views
Total views
1,275
On SlideShare
0
From Embeds
0
Number of Embeds
2
Actions
Shares
0
Downloads
51
Comments
0
Likes
0
Embeds 0
No embeds

No notes for slide

2010 01 lecture SIG UM MFES 2 - Patterns metrics quality

  1. Structure of the lecture
     (Overview diagram: Analysis divides into Static Analysis, covering metrics and patterns, and Dynamic Analysis, covering models and testing.)
     Software Analysis and Testing, MFES Universidade do Minho, by Joost Visser, Software Improvement Group © 2010.
  2. PATTERNS
  3. Patterns
     • Coding style and coding standards, e.g. layout, identifiers, method length, …
     • Secure coding guidelines, e.g. SQL injection, stack-trace visibility
     • Bug patterns, e.g. null pointer dereferencing, bounds checking
     • Code smells, e.g. “god class”, “greedy class”, …
  4. Patterns: style and standards
     Checking coding style and coding standards:
     • Layout rules (boring)
     • Identifier conventions
     • Length of methods
     • Depth of conditionals
     Aim:
     • Consistency across different developers
     • Ensure maintainability
     Tools:
     • E.g. Checkstyle, PMD, …
     • Integrated into the IDE and into the nightly build
     • Can be customized
  5. Patterns: secure coding
     Checking secure coding guidelines:
     • SQL injection attacks
     • Storing and sending passwords
     • Stack-trace leaking
     • Cross-site scripting
     Aim:
     • Ensure security
     • Security = confidentiality + integrity + availability
     Tools:
     • E.g. Fortify, Coverity
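To make the SQL injection item concrete, here is a minimal sketch in Python using the standard sqlite3 module; the table, data, and attack string are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: string concatenation lets the input rewrite the query.
unsafe = "SELECT role FROM users WHERE name = '" + user_input + "'"
leaked = conn.execute(unsafe).fetchall()  # matches every row

# Safe: a parameterized query treats the input as data, never as SQL.
safe = conn.execute("SELECT role FROM users WHERE name = ?",
                    (user_input,)).fetchall()

print(len(leaked), len(safe))  # 1 0
```

Static security checkers such as the tools named above look for exactly this kind of tainted string concatenation flowing into a query.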
  6. Patterns: bugs
     Detecting bug patterns:
     • Null dereferencing
     • Lack of array bounds checking
     • Buffer overflows
     Aim:
     • Correctness
     • Compensate for weak type checks
     Tools:
     • E.g. FindBugs
     • Especially relevant for C and C++
  7. Patterns: exercises
     Run PMD / Checkstyle / FindBugs:
     • E.g. on a project of your own
     • E.g. on some (easy-to-compile) open source project
     Inspect the results:
     • False or true positives?
  8. Structure of the lecture
     (Overview diagram: Analysis divides into Static Analysis, covering metrics and patterns, and Dynamic Analysis, covering models and testing.)
  9. METRICS & QUALITY
  10. Software analysis: what?
      Quality: performance, defects, adaptability, reliability, complexity, size, usability, correctness, security.
  11. The Bermuda triangle of software quality
      (Diagram: standards cluster around three corners, with the Product corner left empty.)
      • Process (organizational): CMMI (SCAMPI), SAS 70, ISO 17799, ISO 27001, BS 7799, TickIT, ISO 9001:2000, ISO 20000, COBIT, security standards
      • People (individual) / Project: PMI, TMap, ISTQB, MCP (Microsoft), Siebel (Oracle), RUP (IBM), J2EE (IBM), ITIL, Prince2, Six Sigma, DSDM
      • Product: no standards listed on this slide
  12. Software quality: process
      Capability Maturity Model® Integration (CMMI®)
      • “… is a process improvement approach that provides organizations with the essential elements of effective processes …” (SEI)
      • CMMI for Development (CMMI-DEV), Version 1.2, August 2006
      • Consists of 22 process areas with capability or maturity levels
      • Created and maintained by a team consisting of members from industry, government, and the Software Engineering Institute (SEI)
      • http://www.sei.cmu.edu/cmmi
      The Standard CMMI Appraisal Method for Process Improvement (SCAMPI)
      • “… is the official SEI method to provide benchmark-quality ratings relative to CMMI models.”
  13. Software quality: process
      http://sas.sei.cmu.edu/pars/
  14. Software quality: process
      Maturity levels:
      • L1: Initial
      • L2: Managed
      • L3: Defined
      • L4: Quantitatively Managed
      • L5: Optimizing
      Process areas (22): Causal Analysis and Resolution; Configuration Management; Decision Analysis and Resolution; Integrated Project Management; Measurement and Analysis; Organizational Innovation and Deployment; Organizational Process Definition; Organizational Process Focus; Organizational Process Performance; Organizational Training; Product Integration; Project Monitoring and Control; Project Planning; Process and Product Quality Assurance; Quantitative Project Management; Requirements Development; Requirements Management; Risk Management; Supplier Agreement Management; Technical Solution; Validation; Verification
      CMMI browser: http://www.cmmi.de
  15. The Bermuda triangle of software quality (revisited)
      (Diagram: the same triangle as before, now with product standards at the Product corner: ISO 9126 and ISO 14598.)
  16. But …
      What is software quality? What are the technical and functional aspects of quality? How can technical and functional quality be measured?
  17. Software product quality standards
      ISO/IEC 9126 (Software engineering: Product quality):
      1. Quality model
      2. External metrics
      3. Internal metrics
      4. Quality-in-use metrics
      ISO/IEC 14598 (Information technology: Software product evaluation):
      1. General overview
      2. Planning and management
      3. Process for developers
      4. Process for acquirers
      5. Process for evaluators
      6. Documentation of evaluation modules
  18. ISO/IEC 9126, Part 1: quality perspectives
      • Internal quality: measured on the software product in the build phase (metrics in 9126, Part 3)
      • External quality: measured on the software product in the test phase (metrics in 9126, Part 2)
      • Quality in use: the effect of the software product in the deploy phase (metrics in 9126, Part 4)
  19. ISO/IEC 9126, Part 1: product quality model (internal and external)
      Six characteristics, each with sub-characteristics:
      • Functionality: suitability, accuracy, interoperability, security
      • Reliability: maturity, fault tolerance, recoverability
      • Usability: understandability, learnability, operability, attractiveness
      • Efficiency: time behavior, resource utilisation
      • Maintainability: analysability, changeability, stability, testability
      • Portability: adaptability, installability, co-existence, replaceability
  20. ISO 9126, Part 1: maintainability (= evolvability)
      Maintainability =
      • Analysability: how easy is it to understand where and how to modify?
      • Changeability: how easy is it to perform the modification?
      • Stability: how easy is it to keep the system coherent when modifying?
      • Testability: how easy is it to test after modification?
      (Maintenance cycle: analyze → change → stabilize → test.)
  21. ISO 9126, Part 1: reliability
      Reliability =
      • Maturity: how much has been done to prevent failures?
      • Fault tolerance: when a failure occurs, is it fatal?
      • Recoverability: when a fatal failure occurs, how much effort is needed to restart?
      (Degree of failure: prevent → tolerate → recover.)
  22. ISO/IEC 9126, Part 1: product quality model (quality in use)
      Quality in use: effectiveness, productivity, safety, satisfaction.
  23. ISO 9126, Parts 2 and 3: metrics
      External metrics, e.g.:
      • Changeability: “change implementation elapsed time”, the time between diagnosis and correction
      • Testability: “re-test efficiency”, the time between correction and conclusion of testing
      Internal metrics, e.g.:
      • Analysability: “activity recording”, the ratio between the actual and the required number of logged data items
      • Changeability: “change impact”, the number of modifications and the problems introduced by them
      Critique:
      • These are not pure product measures, but measures of the product in its environment
      • They measure after the fact
      • There is no clear distinction between functional and technical quality
  24. The issue
      • Companies innovate and change
      • Software systems need to adapt at the same pace as the business changes
      • Software systems that do not adapt lose their value
      • The technical quality of software systems is a key element
      (Diagram: clients ↔ business ↔ IT.)
  25. Functional vs technical quality
      (Diagram: functional quality against technical quality; high technical quality means low cost & risk, low technical quality means high cost & risk.)
      Software with high technical quality can evolve with low cost and risk to keep meeting functional and non-functional requirements.
  26. ISO/IEC 9126, Part 1: product quality model (technical quality)
      (The same characteristic tree as on slide 19, with maintainability and its sub-characteristics analysability, changeability, stability, and testability singled out as the technical-quality focus.)
  27. So …
      What is software quality? What are the functional and technical aspects of quality? How can technical quality be measured?
  28. A challenge: use source code metrics to measure technical quality?
      Plenty of metrics are defined in the literature:
      • LOC, cyclomatic complexity, fan-in/fan-out, coupling, cohesion, …
      • Halstead, Chidamber-Kemerer, Shepperd, …
      Plenty of tools are available:
      • Variations on Lint, PMD, FindBugs, …
      • Coverity, FxCop, Fortify, QA-C, Understand, …
      • Integrated into IDEs
      But: do they measure the technical quality of a system?
  29. Source code metrics: lines of code (LOC)
      Easy! Or is it?
      • SLOC = source lines of code
      • Physical (≈ newlines) vs logical (≈ statements)
      • What about blank lines, comment lines, lines containing only “}”?
      • Generated versus manually written code
      • To measure effort / productivity, LOC must be interpreted per programming language
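The counting decisions above can be illustrated with a small sketch; the skip rules chosen here (blank lines, // comments, brace-only lines) are one possible convention, not a standard definition:

```python
def physical_sloc(source: str) -> int:
    """Count physical source lines, skipping blanks, // comment lines,
    and lines consisting of a lone closing brace."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("//") and stripped != "}":
            count += 1
    return count

java_snippet = """
// compute absolute value
int abs(int x) {
    if (x < 0) { return -x; }
    return x;
}
"""
print(physical_sloc(java_snippet))  # 3
```

Under a different convention (count the brace line, or count logical statements) the same snippet yields a different number, which is exactly the slide's point.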
  30. Source code metrics: Function Point Analysis (FPA)
      • A.J. Albrecht, IBM, 1979
      • An objective measure of functional size
      • Counted manually (IFPUG, Nesma, Cocomo), with large error margins
      • “Backfiring”: per language, function points are correlated with LOC (SPR, QSM)
      • Problematic, but popular for estimation
  31. Source code metrics: cyclomatic complexity
      • T. McCabe, IEEE Transactions on Software Engineering, 1976
      • Accepted in the software community
      • The number of independent, non-circular paths through a method
      • Intuitively: the number of decisions made in a method
      • 1 + the number of if statements (and while, for, …)
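The "1 + decisions" rule can be sketched by counting decision keywords; a real tool walks the parse tree, so this regex count is only a rough approximation:

```python
import re

# Decision points: branching keywords plus short-circuit operators.
DECISIONS = re.compile(r"\b(if|while|for|case|catch)\b|&&|\|\|")

def cyclomatic_complexity(method_body: str) -> int:
    """McCabe's metric approximated as 1 + number of decision points."""
    return 1 + len(DECISIONS.findall(method_body))

body = """
while (i < n) {
    if (a[i] > max) { max = a[i]; }
    if (a[i] < min) { min = a[i]; }
}
"""
print(cyclomatic_complexity(body))  # 4: one while + two ifs + 1
```

The keyword list is an assumption; whether to count `&&`/`||` or `case` labels differs between tools, which is one reason reported complexity values differ.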
  32. Code duplication: definition
      Code duplication measurement by example: lines 1–7 of one fragment (def, ghi, jkl, mno, pqr, stu, vwx) reappear as lines 35–41 of another, so the number of duplicated lines is 14 (7 lines in each fragment).
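The measurement in the example can be sketched as exact matching of consecutive lines; the minimum block length of 6 lines is an assumption here (the slide itself fixes no threshold), and real tools also normalize whitespace and identifiers:

```python
MIN_BLOCK = 6  # assumed minimum clone length, in lines

def duplicated_lines(a: list[str], b: list[str]) -> int:
    """Count lines in a and b that belong to an identical run of at
    least MIN_BLOCK consecutive lines shared by both fragments."""
    dup_a, dup_b = set(), set()
    for i in range(len(a)):
        for j in range(len(b)):
            k = 0  # length of the identical run starting at (i, j)
            while i + k < len(a) and j + k < len(b) and a[i + k] == b[j + k]:
                k += 1
            if k >= MIN_BLOCK:
                dup_a.update(range(i, i + k))
                dup_b.update(range(j, j + k))
    return len(dup_a) + len(dup_b)

frag1 = ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx", "yz"]
frag2 = ["xxxxx", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwx", "xxxxxx"]
print(duplicated_lines(frag1, frag2))  # 14: a 7-line block, counted on both sides
```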
  33. Code duplication
      (Diagram: duplicated blocks shared among fragments A, B, C, and D.)
  34. Source code metrics: coupling
      • Efferent coupling (Ce): how many classes do I depend on?
      • Afferent coupling (Ca): how many classes depend on me?
      • Instability = Ce / (Ca + Ce) ∈ [0, 1], the ratio of efferent to total coupling
      • 0 = very stable = hard to change
      • 1 = very unstable = easy to change
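The three measures can be computed from a class dependency graph; the class names below are invented for illustration:

```python
# deps maps each class to the set of classes it depends on (hypothetical).
deps = {
    "OrderService": {"Repository", "Mailer"},
    "Repository": set(),
    "Mailer": set(),
}

def instability(cls: str) -> float:
    """Instability = Ce / (Ca + Ce), defined as 0.0 for isolated classes."""
    ce = len(deps[cls])                               # efferent: whom I depend on
    ca = sum(cls in used for used in deps.values())   # afferent: who depends on me
    return ce / (ca + ce) if ca + ce else 0.0

print(instability("OrderService"), instability("Repository"))  # 1.0 0.0
```

OrderService only depends on others (instability 1: easy to change), while Repository is only depended upon (instability 0: changing it ripples outward).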
  35. Software metrics crisis: how does measurement data lead to information?
      There is a plethora of software metrics:
      • Ample definitions in the literature
      • Ample tools that calculate them
      But measurement yields data, not information:
      • How to aggregate individual measurement values?
      • How to map aggregated values onto quality attributes?
      • How to set thresholds?
      • How to act on the results?
      The SIG quality model handles these issues in a pragmatic way.
  36. The statistical nature of software metrics: averaging is fundamentally flawed
      The average is a measure of central tendency, suitable for “symmetric” distributions such as the normal distribution. But software metric values are typically not distributed symmetrically.
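A small numeric illustration of the flaw: two systems with identical average complexity but very different tails (the numbers are invented):

```python
# Per-method complexity values for two hypothetical systems.
sys_a = [2, 2, 3, 3, 3, 4, 4, 4, 5, 10]   # mildly skewed
sys_b = [1, 1, 1, 1, 1, 1, 1, 1, 1, 31]   # one extreme outlier

def mean(xs):
    return sum(xs) / len(xs)

print(mean(sys_a), mean(sys_b))  # 4.0 4.0: identical averages
print(max(sys_a), max(sys_b))    # 10 31: very different risk profiles
```

The average hides exactly the tail where the risk sits, which motivates the percentile- and profile-based aggregation on the following slides.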
  37. The statistical nature of software metrics: emphasize the area of risk
      Exploit the asymmetry:
      • High-risk code is in the right tail of the distribution
      • Weigh measurements by LOC
  38. The statistical nature of software metrics: go where the variation is
      Observed across systems:
      • Systems are similar in the low percentiles; they differ in the higher percentiles
      • Interesting differences occur mostly above the 70th percentile
  39. The statistical nature of software metrics: go where the variation is
      The same pattern holds for most source code metrics.
  40. SIG Quality Model: quality profiles
      1. Measure source code metrics per method / file / module
      2. Summarize the distribution of measurement values in “quality profiles”: sum the lines of code per risk category
      Risk categories for cyclomatic complexity:
      • 1–10: low
      • 11–20: moderate
      • 21–50: high
      • > 50: very high
      Example profile (percentage of LOC per category): low 70 %, moderate 12 %, high 13 %, very high 5 %.
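The two steps above can be sketched directly from the slide's thresholds; the (complexity, LOC) unit data below is invented so that the resulting profile reproduces the example row:

```python
# Risk thresholds for cyclomatic complexity, as on the slide.
BINS = [(10, "low"), (20, "moderate"), (50, "high")]

def risk(cc: int) -> str:
    for bound, label in BINS:
        if cc <= bound:
            return label
    return "very high"

# Hypothetical (complexity, LOC) pairs, one per method.
units = [(3, 700), (14, 120), (35, 130), (80, 50)]

def profile(units):
    """Sum LOC per risk category, then normalize to percentages."""
    totals = {"low": 0, "moderate": 0, "high": 0, "very high": 0}
    for cc, loc in units:
        totals[risk(cc)] += loc
    grand = sum(totals.values())
    return {cat: round(100 * loc / grand) for cat, loc in totals.items()}

print(profile(units))  # {'low': 70, 'moderate': 12, 'high': 13, 'very high': 5}
```

Because the profile keeps the whole distribution instead of collapsing it to a mean, two systems with the same average complexity can still show clearly different risk profiles.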
  41. Quality profiles: comparing systems
      Aggregation by averaging is fundamentally flawed; comparing quality profiles instead makes the differences between systems visible.
  42. Quality profiles, in general
      Input:
      • type Input metric = Map item (metric, LOC)
      Risk groups:
      • type Risk = Low | Moderate | High | VeryHigh
      • risk :: metric → Risk
      Output:
      • type ProfileAbs = Map Risk LOC
      • type Profile = Map Risk Percentage
      Aggregation:
      • profile :: Input metric → Profile
  43. SIG Quality Model: how do measurements lead to ratings?
      A practical model for measuring maintainability (Heitlager, Kuipers, Visser, QUATIC 2007, IEEE Press):
      a. Aggregate measurements into “quality profiles”
      b. Map measurements and quality profiles to ratings for system properties
      c. Map ratings for system properties to ratings for ISO/IEC 9126 quality characteristics
      d. Map those to an overall rating of technical quality
      (Pipeline: measurements → quality profiles → property ratings → quality ratings → overall rating, each rating expressed on a five-point star scale.)
  44. Software product certification by SIG and TÜViT
      1. The system producer submits the source code plus a high-level description
      2. The evaluation body performs the evaluation and delivers an evaluation report
      3. The certification body confirms the evaluation report and issues a certificate; the certification client receives the certificate and obtains the right to use the quality mark
      (The system producer and the certification client can be the same organization.)
  45. Evaluation results
      Evaluation report:
      • Defines the scope of the evaluation
      • Summarizes the measurement results
      • Provides ratings (properties, quality characteristics, and overall)
      • May provide hints for the producer on how to improve the ratings
      Certificate:
      • States conformance to the SIG/TÜViT evaluation criteria
      • Confers the right to use the quality mark “TÜViT Trusted Product Maintainability”
  46. Further reading
      • A Practical Model for Measuring Maintainability. I. Heitlager, T. Kuipers, J. Visser. QUATIC 2007.
      • Certification of Technical Quality of Software. J.P. Correia, J. Visser. OpenCert 2008.
      • Mapping System Properties to ISO/IEC 9126 Maintainability Characteristics. J.P. Correia, Y. Kanellopoulos, J. Visser. SQM 2009.
  47. Software Risk Assessment service
      Assignment, e.g.:
      • “Can we scale from 100 to 100,000 customers?”
      • “Should we accept delay and cost overrun, or cancel the project?”
      Analysis:
      • Source code: understanding (reverse engineering) + evaluation (quality)
      • Interviews: technical + strategic
      Reporting:
      • Quality judgment using star ratings
      • Risk analysis putting the quality findings in a business perspective
      • Recommendations to mitigate the risks
  48. Software Risk Assessment
      (Diagram: automated analysis of the source code against a benchmark yields facts; documentation and interviews yield further “facts”; interpretation, reconciliation, and evaluation of all of these lead to the report and presentation.)
  49. Software Risk Assessment: example, stagnation before go-live
      Assessed: technology risks, rebuild value, and quality.
      (Diagram: internal architecture with GUI templates, a core, a rule engine, and several databases; rebuild values per component range from 5 to 170 man-years.)
      Results:
      • Insurmountable stability issues, untestable, excessive maintenance burden
      • Now: reduce technical complexity, partially automate deployment
      • Start planning a replacement
  50. Software Monitoring service
      Quality roadmap:
      • “Complexity from 2 to 4 stars by the 3rd month” (in a maintenance project)
      • “The final product shall be 4 stars” (in a development project)
      Dashboard:
      • Regular analysis of the source code, typically once per week
      • Shown on a dashboard with overviews and drill-down possibilities
      Consultancy:
      • Regular reports (presentation and/or written)
      • Guard quality agreements, meet quality targets
      • Identify risks and opportunities
  51. Software Monitor: dashboard
      (Screenshot of the monitoring dashboard.)
  52. Software Monitor: example, vendor management and roadmap
      (Charts: duplication and complexity trends over time.)
      From a client testimonial:
      • “Technical quality: as it improves, adding functionality is made easier”
      • “As quality was increasing, productivity was going up”
  53. What should you remember (so far) from this lecture?
      Testing:
      • Automated unit testing!
      Patterns:
      • Run tools!
      Quality and metrics:
      • Technical quality matters in the long run
      • A few simple metrics are sufficient, if aggregated in well-chosen, meaningful ways
      • The simultaneous use of distinct metrics allows zooming in on root causes