Better testing for C# software through source code analysis

You are probably already using source code analysis for your C# software to ensure code quality. Want to go further? You can use source code analysis to test the software more efficiently, through risk-based testing and improved regression testing, and so deliver the software faster while reducing testing costs.

Better testing for C# software through source code analysis

  1. 1. DEMO Application SHARPDEVELOP Audit Report 2011-01-01 This document is a sample audit report produced automatically from the results of the analysis of the application on the Kalistick platform. It does not include any specific comments on the results. Its purpose is to serve as a model for building custom reports; it illustrates the ability of the platform to render a clear and comprehensible view of the quality of an application. This document is confidential and is the property of Kalistick. It should not be circulated or modified without permission. Kalistick 13 av Albert Einstein F-69100 Villeurbanne +33 (0) 486 68 89 42 contact@kalistick.com www.kalistick.com
  2. 2. Code audit of SharpDevelop application 2011-01-01 1 Executive Summary The Quality Cockpit uses static analysis techniques: it does not execute the application, but analyzes the elements that compose it (code, test results, architecture ...). The results are correlated, aggregated and compared within the project context to identify risks related to quality. This report presents the results. Variation compared to the objective This chart compares the current status of the project to the objectives set for each quality factor. The goal, set at the initialization of the audit, represents the importance of each quality factor. It is intended to define the rules to follow during development and the accepted tolerance. Rate of overall non-compliance This gauge shows the overall level of quality of the application compared to its objective. It displays the percentage of the application (source code) regarded as non-compliant. According to the adopted configuration, a rate higher than 15% indicates the need for further analysis. Origin of non-compliances This graph identifies the technical origin of detected non-compliances, and the main areas of improvement. Depending on the elements submitted for analysis, some quality domains may not be evaluated. Confidential – This document is the property of Kalistick 2/59
  3. 3. Code audit of SharpDevelop application 2011-01-01 Report Organization This report presents the concepts of the Quality Cockpit, the goal and the associated technical requirements, before proceeding with the summary results and the detailed results for each technical area.
  1 Executive Summary (page 2)
  2 Introduction (page 4)
    2.1 The Quality Cockpit (page 4)
    2.2 The analytical approach (page 4)
  3 Quality objective (page 7)
    3.1 The quality profile (page 7)
    3.2 The technical requirements (page 7)
  4 Summary of results (page 10)
    4.1 Project status (page 10)
    4.2 Benchmarking (page 13)
    4.3 Modeling the application (page 17)
  5 Detailed results (page 20)
    5.1 Detail by quality factors (page 20)
    5.2 Implementation (page 21)
    5.3 Structure (page 26)
    5.4 Test (page 35)
    5.5 Architecture (page 42)
    5.6 Duplication (page 43)
    5.7 Documentation (page 44)
  6 Action Plan (page 47)
  7 Glossary (page 49)
  8 Annex (page 51)
    8.1 Cyclomatic complexity (page 51)
    8.2 The coupling (page 53)
    8.3 TRI and TEI (page 54)
    8.4 Technical Requirements (page 56)
  Confidential – This document is the property of Kalistick 3/59
  4. 4. Code audit of SharpDevelop application 2011-01-01 2 Introduction 2.1 The Quality Cockpit This audit is based on an industrialized process of code analysis. This industrialization ensures reliable results that are easily comparable with the results of other audits. The analysis process is based on the "Quality Cockpit" platform, available through a SaaS1 model (https://cockpit.kalistick.com). This platform has the advantage of providing a unique knowledge base that centralizes the results of the statistical analysis of millions of lines of code, enriched continuously with new analyses. It allows performing comparative analyses with other similar projects. 2.2 The analytical approach The analysis focuses on the code of the application (source code and binary code), for Java (JEE) or C# (.Net) technologies. It is a static analysis (without runtime execution), supplemented by correlation with information from development tools already implemented for the project: version control system, unit testing frameworks, code coverage tools. The results are given through an analytical approach based around three main dimensions:  The quality factors, which determine the nature of the impact of the detected non-compliances on the quality of the application  The quality domains, which specify the technical origin of non-compliances  The severity levels, which position the non-compliances on a severity scale to characterize their priority 1 Software as a Service: application accessible remotely via Internet (using a standard browser) Confidential – This document is the property of Kalistick 4/59
  5. 5. Code audit of SharpDevelop application 2011-01-01 2.2.1 The quality factors The quality factors standardize a set of quality attributes which the application should satisfy according to ISO 91262,3:  Maintainability. Ability of software to be easily repaired, depending on the effort required to locate, identify and correct errors.  Reliability. Ability of software to function properly and deliver the expected service in normal operation.  Changeability. Ability of software to evolve, depending on the effort required to add, delete, and modify the functions of a system in operation.  Security. Ability of software to operate within the constraints of integrity, confidentiality and traceability requirements.  Transferability. Ability to perform maintenance and evolution of software by a new team separate from the one which developed the original software.  Efficiency. Relationship between the level of software performance and the amount of resources required to operate in nominal conditions. 2.2.2 The quality domains The quality domains determine the nature of problems according to their technical origin. There are six of them:  Implementation. The problems inherent in coding: misuse of language, potential bugs, code hard to understand ... These problems can affect one or more of the six quality factors.  Structure. Problems related to the code organization: methods too long, too complex, with too many dependencies ... These issues impact the maintainability and changeability of the application.  Test. Describes how the application is tested based on the results of unit tests (failure rate, execution time ...) but also on the nature of the code covered by the test execution. The objective is to ensure that the tests cover the critical parts of the application. 2 ISO/IEC 9126-1:2001 Software engineering — Product quality — Part 1: Quality model : http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=22749 3 The analysis focuses on a subset of ISO 9126 in order to focus on dimensions that can be controlled automatically. Confidential – This document is the property of Kalistick 5/59
  6. 6. Code audit of SharpDevelop application 2011-01-01  Architecture. Problems with the software architecture of the application. The platform allows the definition of an architectural model to modularize the application into layers or components and define communication constraints between them. The analysis identifies in the code all the calls which do not satisfy these constraints, to detect the maintainability, changeability and security risk levels.  Documentation. Problems related to lack of documentation in the code. This area primarily impacts the transferability of code.  Duplication. Identification of all significant copy-pastes in the application. They impact reliability, maintainability, transferability and changeability. 2.2.3 Severity levels The severity levels are intended to characterize the priority of correction of non-compliances. This priority depends on the severity of the impact of the non-compliance, but also on the effort required for correction: some moderately critical problems might be marked with a high level of severity because of the triviality of their resolution. To simplify interpretation, the severity levels are expressed using a four-level scale. The first is an error, the others are warnings, from most to least severe:  Forbidden  Highly inadvisable  Inadvisable  To be avoided Compared to the Forbidden level, the other levels of severity are managed with a tolerance threshold, which increases as the severity decreases. Confidential – This document is the property of Kalistick 6/59
  7. 7. Code audit of SharpDevelop application 2011-01-01 3 Quality objective One of the distinctive features of "Quality Cockpit" is to perform the analysis according to the real quality needs of the project, in order to avoid unnecessary effort and to ensure greater relevance of the identified quality risks. These requirements are formalized by defining the "quality profile" of the application, which characterizes the quality levels expected on each of the six main quality factors. This profile is then translated into "technical requirements", which are technical rules to be followed by the developers. 3.1 The quality profile For this audit, the profile is established as follows: See the Quality Cockpit 3.2 The technical requirements Based on the above quality profile, technical requirements have been selected from the “Quality Cockpit” knowledge base. These technical requirements cover the six quality domains (implementation, structure, testing, architecture, documentation, duplication) and are configured according to the quality profile (thresholds, levels of severity ...). The objective is a calibration of requirements that ensures the highest return on investment. Confidential – This document is the property of Kalistick 7/59
  8. 8. Code audit of SharpDevelop application 2011-01-01 Here are the details of these technical requirements (domain, rule, explanation, goal and possible thresholds):
  Implementation - All implementation rules: According to your profile, between 150 and 200 rules were selected. They are exhaustively presented in the appendix of the report (8.4.1 Implementation rules). Objective: avoid bad practices and apply best practices related to the technology used.
  Structure - Size of methods: Number of statements. This measure is different from the number of lines of code: it does not include comment lines or blank lines but only lines with at least one statement. Objective: avoid processing blocks that are difficult to understand. The threshold for the project is: Number of lines: 100.
  Structure - Complexity of methods: Cyclomatic complexity of a method. It measures the complexity of the control flow of a method by counting the number of independent paths covering all possible cases. The higher the number, the harder the code is to maintain and test. Objective: avoid processing blocks that are difficult to understand, not testable, and which tend to have a significant rate of failure. The threshold for the project is: Cyclomatic complexity: 20.
  Structure - Complexity and coupling of methods: Identifies methods difficult to understand, test and maintain because of moderate complexity (cyclomatic complexity) and numerous references to other types (efferent coupling). Objective: avoid processing blocks that are difficult to understand and not testable. The thresholds for the project are: Cyclomatic complexity: 15; Efferent coupling: 20.
  Confidential – This document is the property of Kalistick 8/59
  9. 9. Code audit of SharpDevelop application 2011-01-01 Details of the technical requirements (continued):
  Test - Test coverage of methods: Rate of code coverage for a method. This metric is standardized by our platform based on raw measures of code coverage when they are provided in the project archive. This rule requires a minimum level of testing (code coverage) for each method of the application according to the TRI (TestRelevancyIndex); the TRI of each method assesses the risk that it contains bugs. Its calculation takes into account the business risks defined for the application. Objective: focus the test strategy and the test effort towards sensitive areas of the application and check them. These sensitive areas are evaluated according to their propensity to contain bugs and according to the business risks defined for the application. Details of the thresholds are provided in the annex to the report (8.4.2 Code coverage).
  Architecture - Rules defined specifically through the architecture model: See the architecture model defined for the application to check the architecture constraints. Objective: ensure that developments follow the expected architecture model and do not introduce inconsistencies which could become security holes, maintenance issues or evolution issues. Note: violations of architecture are not taken into account in the calculation of non-compliance.
  Documentation - Header documentation of methods: Identifies methods of moderate complexity which have no documentation header. The methods considered are those whose cyclomatic complexity and number of statements exceed the thresholds defined specifically for the project. Objective: ensure that documentation is available in key processing blocks to facilitate any changes in the development team (transferability). The thresholds for the project are: Cyclomatic complexity: 10; Number of lines: 50.
  Duplication - Detection of duplications: Duplicated blocks are invalid beyond 20 statements. Objective: detect identical blocks of code in several places in the application, which often causes inconsistencies when making changes, and which is a factor of increased testing and development costs.
  Confidential – This document is the property of Kalistick 9/59
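As an illustration of the documentation rule above, the following minimal sketch shows the kind of XML documentation header expected on a method whose complexity and size exceed the project thresholds. The class, method and parameter names are invented for the example and do not come from SharpDevelop.

```csharp
// Illustrative only: a hypothetical method with the XML documentation header that
// the "Header documentation of methods" rule expects on complex methods.
namespace Demo.Documentation
{
    public static class InvoiceCalculator
    {
        /// <summary>
        /// Computes the total amount of an invoice, applying the discount rate
        /// and the VAT rate in force.
        /// </summary>
        /// <param name="amounts">Line amounts, excluding tax.</param>
        /// <param name="discountRate">Discount applied to the subtotal (0.0 to 1.0).</param>
        /// <param name="vatRate">VAT rate applied after the discount (e.g. 0.2 for 20%).</param>
        /// <returns>The total amount including tax.</returns>
        public static decimal ComputeTotal(decimal[] amounts, decimal discountRate, decimal vatRate)
        {
            decimal subtotal = 0m;
            foreach (decimal amount in amounts)
            {
                subtotal += amount;
            }
            decimal discounted = subtotal * (1m - discountRate);
            return discounted * (1m + vatRate);
        }
    }
}
```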
  10. 10. Code audit of SharpDevelop application 2011-01-01 4 Summary of results This chapter summarizes the status of the project using global indicators. These indicators measure the intrinsic quality of the project, but also compare its situation to other projects using the “Quality Cockpit” knowledge base. 4.1 Project status The following indicators are related to the intrinsic situation of the project. 4.1.1 Rate of overall non-compliance The rate of non-compliance measures the percentage of application code considered as non-compliant. See the Quality Cockpit Specifically, this represents the ratio between the number of statements in non-compliant classes and the total number of statements. A class is considered as non-compliant if at least one of the following conditions is true: - A forbidden non-compliance is detected in the class - A set of highly inadvisable, inadvisable, or to be avoided non-compliances is detected in the class, beyond a certain threshold. This calculation depends on the severity of each non-compliance and on the quality profile that adjusts the threshold of tolerance. Confidential – This document is the property of Kalistick 10/59
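To make the definition concrete, here is a minimal sketch of the arithmetic, using the overall statement count from the volumetry table (48,877) and a hypothetical count of statements in non-compliant classes chosen so that the result matches the 86.92% overall rate reported below; the real calculation is performed by the platform.

```csharp
using System;

// Illustrative arithmetic only: 48877 is the project statement count from the
// volumetry table; the numerator is a hypothetical figure implied by the
// reported overall rate of 86.92%.
int totalStatements = 48877;
int statementsInNonCompliantClasses = 42484;

double nonComplianceRate = 100.0 * statementsInNonCompliantClasses / totalStatements;
Console.WriteLine($"Overall non-compliance rate: {nonComplianceRate:F2}%"); // ~86.92%
```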
  11. 11. Code audit of SharpDevelop application 2011-01-01 4.1.2 Deviation from target This chart summarizes the difference between the target as represented by the quality profile and the current status of the project. This difference is shown for each quality factor: See the Quality Cockpit The level of non-compliance is calculated for each quality factor, and then weighted by the level of requirements set for the related quality factor.
  Quality theme: Classes / Significant non-compliances / % application
  Changeability: 429 / 1794 / 84%
  Efficiency: 159 / 283 / 42%
  Maintainability: 54 / 339 / 18%
  Reliability: 425 / 1925 / 84%
  Security: 0 / 0 / 0%
  Transferability: 51 / 180 / 25%
  [Total]: 480 / 2286 / 86.92%
  Detailed results specify for each quality factor: the number of non-compliant classes, the number of violations for selected rules, and the percentage of application code involved in non-compliant classes. Confidential – This document is the property of Kalistick 11/59
  12. 12. Code audit of SharpDevelop application 2011-01-01 4.1.3 Origin of non-compliances The following chart shows the distribution of non-compliances according to their technical origin: See the Quality Cockpit This chart compares each domain according to the impact on the quality of the application of the rules associated with it. The impact is measured from the number of statements in non-compliant classes. 4.1.4 Volumetry The following table specifies the volume of the analyzed application:
  Metric: Value / Trend
  Line count: 70895 / +0.14%
  Statement count: 48877 / +0.15%
  Method count: 7568 / +0.36%
  Class count: 975 / +0.21%
  Package count: 48 / =
  See the Quality Cockpit A "line" corresponds to a physical line of a source file. It may be a blank line or a comment line. A "statement" is a primary unit of code; it can be written on multiple lines, and a single line may contain multiple statements. For simplicity, a statement is delimited by a semicolon (;) or a left brace ({). Confidential – This document is the property of Kalistick 12/59
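The distinction between lines and statements can be illustrated with a small, hypothetical C# method: counting one statement per semicolon or opening brace, as described above, the method body below contributes five statements even though it spans about ten physical lines.

```csharp
// Illustration (hypothetical code) of the "line" vs "statement" distinction.
public static class VolumetryExample   // the '{' of the class adds one statement of its own
{
    public static int Sum(int[] values)
    {                                  // statement 1: the method body opens with '{'
        int total = 0;                 // statement 2
        foreach (int v in values)
        {                              // statement 3: the loop body opens with '{'
            total += v;                // statement 4
        }
        return total;                  // statement 5
    }
}
```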
  13. 13. Code audit of SharpDevelop application 2011-01-01 4.2 Benchmarking The “Quality Cockpit” knowledge base allows a comparative analysis of the project with other projects reviewed on the platform. The objective is to measure its level of quality compared to an overall average. This benchmarking is proposed in relation to two categories of projects:  The “Intra-Cockpit” projects: projects analyzed continuously on the platform, therefore with a quality level above average (a priori)  The “Extra-Cockpit” projects: projects reviewed from time to time on the platform in audit mode, so with a highly heterogeneous quality. Note: since each project has its own specific quality profile, benchmarking does not take into account the project configuration, but instead uses raw measures. 4.2.1 Comparison on implementation issues The chart below shows the status of the project implementation compared to the Extra-Cockpit projects, i.e. those analyzed on a one-off basis on the platform. For each level of severity, the quality of the project is positioned relative to others: See the Quality Cockpit Confidential – This document is the property of Kalistick 13/59
  14. 14. Code audit of SharpDevelop application 2011-01-01 The project is positioned relative to other projects according to the rate of violations for each rule. The distribution is based on the quartile method; three groups are distinguished: "Better", the 25% best projects; "On the average", the 50% average projects; "Worse", the 25% worst projects. This information is then synthesized by level of severity. The implementation rules compared are not necessarily the same from one quality profile to another, but the rules are compared here according to the severity level set for each project. The following graph provides the same analysis, but this time with the Intra-Cockpit projects, analyzed continuously on the platform, so with a level of quality normally above average since detected violations are more likely to have been corrected: See the Quality Cockpit A dominant red color indicates that the other projects tend to correct the violations detected on this project. Confidential – This document is the property of Kalistick 14/59
  15. 15. Code audit of SharpDevelop application 2011-01-01 4.2.2 Mapping the structure The following chart compares the size of the methods of the current project with those of other projects, "Intra-Cockpit" and "Extra-Cockpit", comparing the ratio of the application (as a percentage of statements) which is located in processing blocks (methods) with a high number of statements: See the Quality Cockpit A significant proportion of the application in the right area is an indicator of greater maintenance and evolution costs. NB: The application analyzed is indicated by the term "Release".Confidential – This document is the property of Kalistick 15/59
  16. 16. Code audit of SharpDevelop application 2011-01-01 A similar comparison is provided for the cyclomatic complexity4 of methods, comparing the proportion of the application (as a percentage of statements) that is located within complex methods: See the Quality Cockpit A significant proportion of the application in the right area shows not only greater maintenance and evolution costs, but also problems of reliability because this code is difficult to test. 4.2.3 Comparison of main metrics The following table compares the project with other projects, "Intra-Cockpit" and "Extra-cockpit", on the main metrics related to the structure of the code. Recommended interval values are provided for information purposes. Metric Project Extra-Cockpit Intra-Cockpit Recommended interval Classes per package 20.31 10.48 10.9 6 - 26 Methods per class 7.76 7.78 7.58 4 - 10 Statements per method 6.46 12.76 10.85 7 - 13 Cyclomatic complexity per statement 0.31 0.22 0.15 0.16 - 0.24 See the Quality Cockpit 4 Cyclomatic complexity measures the complexity of the code, and thus its ability to test it, cf.http://classes.cecs.ucf.edu/eel6883/berrios/notes/Paper%204%20(Complexity%20Measure).pdfConfidential – This document is the property of Kalistick 16/59
  17. 17. Code audit of SharpDevelop application 2011-01-01 4.3 Modeling the application To facilitate the understanding of the analysis results, the application is modeled in two ways: a functional perspective to better identify the business features of the application and link them to the source code, and a technical perspective to verify the technical architecture of the application. These models are built using the modeling wizard available in the Cockpit. You can modify these models on the Functional modelization and Technical Architecture pages (depending on your user rights). 4.3.1 Functional model The functional model represents the business view of the application, which may be understood by all project members. See the Quality Cockpit The functional model is composed of modules, each one representing a business feature or a group of functionalities. These modules have been identified from a lexical corpus generated from the application code, which allows isolating the business vocabulary of the application. Confidential – This document is the property of Kalistick 17/59
  18. 18. Code audit of SharpDevelop application 2011-01-01 4.3.2 Technical model The technical model represents the technical architecture of the application code. The idea is to define a target architecture model, which identifies the layers and / or technical components within the application, and sets constraints to allow or prohibit communications between each of these elements. The aim is threefold:  Homogenize the behavior of an application. For example, to ensure that the logging traces are written through a specific API, that data accesses pass through a dedicated layer, that some third- party library is only used by specific components ...  Ensure tightness of some components to facilitate their development and limit unintended consequences, but also make them shareable with other applications. Dependency cycles are for instance forbidden.  Avoid security flaws for example by ensuring that calls to data layer always pass through a business layer in charge of validation controls. Results of the architecture analysis are provided in chapter 5.5 Architecture.Confidential – This document is the property of Kalistick 18/59
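As a purely illustrative sketch of the kind of constraint such a model expresses, the fragment below shows a UI-layer call that bypasses the business layer (which a model like the one described would typically forbid) next to the allowed path through the business layer. All namespaces and class names are invented; they are not part of SharpDevelop.

```csharp
// Invented three-layer example: UI -> Business -> Data is allowed, UI -> Data is not.
namespace Demo.Data
{
    public class CustomerRepository
    {
        public void Insert(string name) { /* database access would go here */ }
    }
}

namespace Demo.Business
{
    public class CustomerService
    {
        private readonly Demo.Data.CustomerRepository _repository = new Demo.Data.CustomerRepository();

        public void Register(string name)
        {
            // The business layer performs the validation controls before touching data.
            if (string.IsNullOrEmpty(name))
                throw new System.ArgumentException("Name is required", nameof(name));
            _repository.Insert(name);
        }
    }
}

namespace Demo.Ui
{
    public class CustomerScreen
    {
        // Forbidden by the model: the UI layer calls the data layer directly, bypassing validation.
        public void SaveDirectly(Demo.Data.CustomerRepository repository, string name) => repository.Insert(name);

        // Allowed by the model: the UI layer goes through the business layer.
        public void Save(Demo.Business.CustomerService service, string name) => service.Register(name);
    }
}
```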
  19. 19. Code audit of SharpDevelop application 2011-01-01 See the Quality Cockpit Green arrows formalize allowed communications between modules, while red arrows formalize forbidden communications.Confidential – This document is the property of Kalistick 19/59
  20. 20. Code audit of SharpDevelop application 2011-01-01 5 Detailed results This chapter details the results by focusing, for each quality domain, on the non-compliant elements. 5.1 Detail by quality factors The histogram below details the non-compliance rate for each quality factor, also displaying the number of non-compliant classes. As a reminder, the rate of non-compliance is based on the number of statements defined in non-compliant classes compared to the total number of statements in the project. These rates of non-compliance directly depend on the quality profile and on the level of requirements that have been selected: See the Quality Cockpit Since the same class may be non-compliant on several factors, the total does not necessarily correspond to the sum of the factors. Confidential – This document is the property of Kalistick 20/59
  21. 21. Code audit of SharpDevelop application 2011-01-01 5.2 Implementation The Implementation domain covers the rules related to coding techniques. Unlike other domains, these rules are often specific to the characteristics of a language (Java / C#). They identify, for example:  Potential bugs: uninitialized variables, concurrency issues, recursive calls ...  Optimizations in terms of memory or CPU  Security vulnerabilities  Obsolete code  Code deviating from recommended standards  ... Implementation rules are the most numerous of the technical requirements. They are called "practices". 5.2.1 Breakdown by severity The objective of this indicator is to identify the severity of the practices that led to the invalidation of the classes. Here, severity is divided into two levels: forbidden practices (Forbidden severity level) and inadvisable practices (Highly inadvisable, Inadvisable and To be avoided severity levels). The following pie chart compares the number of non-compliant classes in implementation, according to the practices that participated in this invalidation:  When a class only violates forbidden practices, it is in the group “Forbidden practices”  When a class only violates inadvisable practices, it is in the group “Inadvisable practices”  Otherwise, the class violates practices of both categories and is in the group “Inadvisable and forbidden practices” See the Quality Cockpit Confidential – This document is the property of Kalistick 21/59
  22. 22. Code audit of SharpDevelop application 2011-01-01 The effort of correction related to forbidden practices is generally less important compared to lower severities: a single violation is sufficient to cause a forbidden non-compliance when several inadvisable practices are needed to cause non-compliance, depending on tolerance thresholds. The table below completes the previous graph by introducing the concept of “Significant non-compliance”. A significant violation is a violation whose correction can fix fully or partially the non-compliance of a class. Indeed, due to tolerance thresholds associated with levels of severity, the correction of some violations has no impact on the non-compliance of the class. Severity Significant non- New non- Corrected non- Other non- compliances compliances compliances compliances Forbidden 382 5 0 0 Highly inadvisable 176 1 0 55 Inadvisable 81 5 2 336 To be avoided 202 1 1 340 The columns "New non-compliance" and "Corrected non-compliances" are only relevant if the audit follows a previous audit.Confidential – This document is the property of Kalistick 22/59
  23. 23. Code audit of SharpDevelop application 2011-01-01 5.2.2 Practices to fix in priority The two following tables provide a list of forbidden practices and highly inadvisable practices detected in the application. These are generally the rules to correct first. These tables provide for each practice the number of new non-compliances (if a previous audit has been done), the total number of non-compliances for this practice, the number of non-compliant classes where this practice has been detected and the percentage of statements of these classes compared to the overall number of statement in the project. These figures help to set up an action plan based on the impact associated with each practice. 5.2.2.1 Forbidden practices Practice New Non- NC % compliances classes application AvoidRedundantCasts 1 124 83 28.55% ImplementIDisposableForTypesWithDisposableFields 0 103 64 13.32% DontHardcodeLocaleSpecificStrings 2 81 56 13.62% UseConstInsteadOfReadOnlyWhenPossible_ 0 33 10 4.29% UseIsNullOrEmptyToCheckEmptyStrings 0 12 9 3.9% OverrideEqualsWithOperatorOnValueTypes 0 11 11 3.52% PropertyNamesMustNotMatchGetMethods 0 6 5 1.46% InstantiateExceptionsWithArguments 0 5 4 1.79% DontImplementWriteOnlyProperty 0 3 3 1% DefineMessageForObsoleteAttribute 2 2 2 1% DontUseInadvisableTypes 0 1 1 1% DontRaiseExceptionInUnexpectedMethod_ 0 1 1 1% See the Quality CockpitConfidential – This document is the property of Kalistick 23/59
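For readers unfamiliar with these rule names, the sketch below illustrates the kind of code that two of the listed practices typically flag, together with the preferred form. The exact semantics of each rule are defined by the Quality Cockpit knowledge base, so this is only an assumption of their intent, and the code is invented.

```csharp
using System.Collections.Generic;

// Illustrative sketch of two of the forbidden practices listed above.
public static class PracticeExamples
{
    // UseIsNullOrEmptyToCheckEmptyStrings: flagged form vs. recommended form.
    public static bool IsMissingFlagged(string value) => value == null || value == "";
    public static bool IsMissingPreferred(string value) => string.IsNullOrEmpty(value);

    // AvoidRedundantCasts: the cast to List<int> is redundant, items already has that type.
    public static int FirstFlagged(List<int> items) => ((List<int>)items)[0];
    public static int FirstPreferred(List<int> items) => items[0];
}
```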
  24. 24. Code audit of SharpDevelop application 2011-01-01 5.2.2.2 Practice highly inadvisable Practice New Non- NC classes % compliances application NeverMakeCtorCallOverridableMethod 0 185 48 10.15% DontUseNonConstantStaticVisibleFields 1 26 11 2.94% OverrideMethodsInIComparableImplementations 0 9 6 1.75% DefineAttributeForISerializableTypes 0 7 5 2.77% DontNestGenericInMemberSignatures_ 0 3 2 1.91% DontIgnoreMethodsReturnValue 0 1 1 1% See the Quality Cockpit 5.2.3 Classes to fix in priority on the implementation issues The two following tables provide an additional view about the impact of implementation issues in listing the main classes involved in forbidden practices or highly inadvisable practices. For each class is associated the number of existing violations (forbidden or highly inadvisable practices), the number of new violations (if a previous audit has been done), and the compliance status of the class.Confidential – This document is the property of Kalistick 24/59
  25. 25. Code audit of SharpDevelop application 2011-01-01 5.2.3.1 Classes with forbidden practices Class NC New Non- Instructions compliances ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefa Yes 0 10 647 ctoryResolver ICSharpCode.SharpDevelop.Dom.VBNet.VBExpressionFinder Yes 0 10 319 ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpressionF Yes 0 10 600 inder ICSharpCode.SharpDevelop.DefaultEditor.Gui.Editor.SharpD Yes 0 9 255 evelopTextAreaControl ICSharpCode.SharpDevelop.Gui.DefaultWorkbench Yes 0 8 387 ICSharpCode.SharpDevelop.Project.ConfigurationGuiHelper Yes 1 8 0 ICSharpCode.SharpDevelop.Widgets.TreeGrid.DynamicListIt Yes 0 8 211 em ICSharpCode.SharpDevelop.ParserService Yes 0 7 489 ICSharpCode.SharpDevelop.Gui.SdiWorkbenchLayout Yes 0 7 360 ICSharpCode.SharpDevelop.Dom.DomPersistence Yes 0 6 567 ICSharpCode.Core.MenuService Yes 0 6 78 ICSharpCode.SharpDevelop.Dom.DefaultProjectContent Yes 0 5 554 ICSharpCode.SharpDevelop.Gui.XmlForms.XmlLoader Yes 0 5 200 ICSharpCode.SharpDevelop.Refactoring.RefactoringService Yes 0 5 312 ICSharpCode.SharpDevelop.Project.ProjectService Yes 0 5 355 ICSharpCode.SharpDevelop.Project.MSBuildEngine Yes 0 5 337 ICSharpCode.SharpDevelop.Gui.FontSelectionPanelHelper Yes 0 5 101 ICSharpCode.SharpDevelop.Project.Commands.AddExistingI Yes 0 4 168 temsToProject ICSharpCode.SharpDevelop.Debugging.DebuggerService Yes 0 4 288 See the Quality CockpitConfidential – This document is the property of Kalistick 25/59
  26. 26. Code audit of SharpDevelop application 2011-01-01 5.3 Structure The Structure domain targets rules related to the code structure, for example:  The size of methods  The cyclomatic complexity of methods  Coupling, or the dependencies of methods towards other classes The objective is to ensure that the code is structured in such a way that it can be easily maintained, tested and evolved. These rules are "metrics". They measure values (e.g. a number of statements) and are conditioned by thresholds (e.g. 100 statements / method). Only metrics on which developers are able to act are presented here. They apply to all methods. Confidential – This document is the property of Kalistick 26/59
  27. 27. Code audit of SharpDevelop application 2011-01-01 5.3.1 Typology of structural problems This histogram shows, for each rule of the structure domain, the number of non-compliances (i.e. non-compliant methods) and the percentage of related statements compared to the total number of statements in the application: See the Quality Cockpit The percentage of statements shown is interesting since there are often only a few methods concentrating a large part of the application code. When some rules have been configured to be excluded from the analysis, they are displayed in this graph but without any results. One method may be affected by several rules; therefore, the total does not correspond to the sum of the numbers. The following table completes this view by introducing the number of new violations and the number of violations corrected in the case where a previous audit was conducted:
  Anomaly: Significant non-compliances / New non-compliances / Corrected non-compliances / NC rate
  Cyclomatic complexity higher than 20: 41 / 1 / 0 / 5%
  Confidential – This document is the property of Kalistick 27/59
  28. 28. Code audit of SharpDevelop application 2011-01-01 See the Quality Cockpit 5.3.2 Mapping methods by size The histogram below shows a mapping of methods according to their size. The size is expressed as a number of statements, in order to be independent of code writing style conventions. The last interval identifies the methods with a number of statements which exceeds the threshold. These methods are considered non-compliant because they are generally difficult to maintain and extend, and also show a high propensity to reveal bugs because they are difficult to test. The percentage of statements is provided because the largest methods usually concentrate a significant part of the application: See the Quality Cockpit The following table details the main non-compliant methods identified in the last interval of the previous graph: Method Instructions Lines Complexity New violation 5.3.3 Mapping methods by complexity The histogram below shows a mapping of methods according to their cyclomatic complexity (see 8.1 Cyclomatic complexity). Confidential – This document is the property of Kalistick 28/59
  29. 29. Code audit of SharpDevelop application 2011-01-01 Cyclomatic complexity is a measure aiming to characterize the complexity of a block of code, by identifying all possible execution paths. This concept has been standardized by McCabe5, but several calculation methods exist. The one used here is the most popular and the simplest: it counts the number of branching operators (if, for, while, ? ...) and conditions (??, && ...). The last interval identifies methods whose complexity exceeds the threshold. These methods are considered non-compliant for the same reasons as the long methods: they are generally difficult to maintain and extend, and also show a high propensity to reveal bugs. The percentage of statements and the percentage of complexity are provided because the most complex methods generally concentrate a significant part of the application. See the Quality Cockpit The following table details the main non-compliant methods identified in the last interval of the previous graph: 5 1976, IEEE Transactions on Software Engineering: 308–320. http://classes.cecs.ucf.edu/eel6883/berrios/notes/Paper%204%20(Complexity%20Measure).pdf. Confidential – This document is the property of Kalistick 29/59
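The simple counting scheme described above can be illustrated on a small, invented method: each branching operator or condition adds one to a base complexity of 1.

```csharp
// Hypothetical method annotated with the simple counting scheme described above.
public static class ComplexityExample
{
    public static string Classify(int? age, bool isMember)
    {
        if (age == null)                       // +1 (if)
        {
            return "unknown";
        }
        if (age < 18 && !isMember)             // +1 (if), +1 (&&)
        {
            return "minor";
        }
        for (int i = 0; i < 3; i++)            // +1 (for)
        {
            // retry loop, details omitted
        }
        return isMember ? "member" : "adult";  // +1 (?:)
    }
}
// Base 1 + 5 decision points = cyclomatic complexity 6, well under the project threshold of 20.
```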
  30. 30. Code audit of SharpDevelop application 2011-01-01 Method Instructions Lines Complexity New violation ICSharpCode.SharpDevelop.Dom.ReflectionLayer.Refle 21 36 21 New ctionClass.InitMembers ( System.Type) ICSharpCode.SharpDevelop.Dom.MemberLookupHelper 53 77 83 .ConversionExists ( ICSharpCode.SharpDevelop.Dom.IReturnType, ICSharpCode.SharpDevelop.Dom.IReturnType) ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpre 57 78 47 ssionFinder.SearchBracketForward ( System.String, System.Int32, System.Char, System.Char) ICSharpCode.SharpDevelop.Dom.VBNet.VBNetAmbienc 81 128 44 e.Convert ( ICSharpCode.SharpDevelop.Dom.IClass) ICSharpCode.SharpDevelop.Dom.CSharp.CSharpAmbie 77 118 41 nce.Convert ( ICSharpCode.SharpDevelop.Dom.IClass) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.N 71 100 34 RefactoryResolver.ResolveInternal ( ICSharpCode.NRefactory.Ast.Expression, ICSharpCode.SharpDevelop.Dom .ExpressionContext) ICSharpCode.SharpDevelop.Widgets.SideBar.SideBarC 81 97 32 ontrol.ProcessCmdKey ( ref System.Windows.Forms.Message, System.Windows.Forms.Keys) ICSharpCode.SharpDevelop.Dom.MemberLookupHelper 20 22 31 .GetBetterPrimitiveConversion ( ICSharpCode.SharpDevelop.Dom.IReturnType, ICSharpCode.SharpDevelop.Dom.IReturnType) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.T 53 73 31 ypeVisitor.CreateReturnType ( ICSharpCode.NRefactory.Ast.TypeReference, ICSharpCode.SharpDevelop.Dom.IClass, ICSharpCode.SharpDevelop.Dom.IMember, System.Int32, System.Int32, ICSharpCode.SharpDevelop.Dom.IProjectContent, System.Boolean) ICSharpCode.SharpDevelop.Dom.CecilReader.CecilClas 57 83 30 s.InitMembers ( Mono.Cecil.TypeDefinition) ICSharpCode.SharpDevelop.DefaultEditor.Gui.Editor.Me 42 59 29 thodInsightDataProvider.SetupDataProvider ( System.String, ICSharpCode.TextEditor.Document.IDocument, ICSharpCode.SharpDevelop.Dom.ExpressionResult, System.Int32, System.Int32) ICSharpCode.SharpDevelop.Refactoring.RefactoringSer 55 89 29 vice.AddReferences ( System.Collections.Generic.List<ICSharpCode.SharpDev elop.Refactoring.Reference>, ICSharpCode.SharpDevelop.Dom.IClass, ICSharpCode.SharpDevelop.Dom.IMember, System.Boolean, System.String, System.String) ICSharpCode.SharpDevelop.Project.DirectoryNode.Initial 81 121 29 ize ( )Confidential – This document is the property of Kalistick 30/59
  31. 31. Code audit of SharpDevelop application 2011-01-01 ICSharpCode.SharpDevelop.Project.MSBuildBasedProjec 92 140 28 t.SetPropertyInternal ( System.String, System.String, System.String, System.String, ICSharpCode.SharpDevelop.Project.PropertyStorageLoca tions, System.Boolean) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NR 53 81 28 efactoryResolver.ResolveIdentifierInternal ( System.String) ICSharpCode.SharpDevelop.Commands.ToolMenuBuilde 54 74 26 r.ToolEvt ( System.Object, System.EventArgs) ICSharpCode.SharpDevelop.Dom.MemberLookupHelper. 32 51 26 GetBetterFunctionMember ( ICSharpCode.SharpDevelop.Dom.IReturnType[], ICSharpCode.SharpDevelop.Dom.IMethodOrProperty, ICSharpCode.SharpDevelop.Dom.IReturnType[], System.Boolean, ICSharpCode.SharpDevelop.Dom.IMethodOrProperty, ICSharpCode.SharpDevelop.Dom.IReturnType[], System.Boolean) ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpressi 51 68 26 onFinder.FindFullExpression ( System.String, System.Int32) ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpressi 58 76 25 onFinder.ReadNextToken ( ) 5.3.4 Mapping methods by their complexity and efferent coupling This rule is intended to identify methods whose code has many dependencies to other classes. The concept of “efferent coupling” refers to those outgoing dependencies. The principle is that a method with a strong efferent coupling is difficult to understand, maintain and test. First because it requires knowledge of the different types it depends on, then because the risk of destabilization is higher because of these dependencies. This rule is crossed with the cyclomatic complexity to ignore some trivial methods, such as initialization methods of graphical interfaces that make calls to many classes of widgets without presenting any real complexity. This rule considers that a method is non-compliant if it exceeds a threshold of efferent coupling and threshold of cyclomatic complexity.Confidential – This document is the property of Kalistick 31/59
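The following invented method illustrates efferent coupling as described above: it references several distinct types, each of which a maintainer must understand and each of which can destabilize the method when it changes.

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Text;

// Invented example: one method depending on several other types
// (StringBuilder, DateTime, CultureInfo, List<T>, StreamWriter, ...).
public static class CouplingExample
{
    public static void ExportReport(List<string> lines, string path)
    {
        var buffer = new StringBuilder();                                        // coupling to StringBuilder
        buffer.AppendLine(DateTime.Now.ToString(CultureInfo.InvariantCulture));  // DateTime, CultureInfo
        foreach (string line in lines)                                           // coupling to List<string>
        {
            buffer.AppendLine(line);
        }
        using (var writer = new StreamWriter(path))                              // coupling to StreamWriter
        {
            writer.Write(buffer.ToString());
        }
    }
}
```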
  32. 32. Code audit of SharpDevelop application 2011-01-01 The chart below shows a mapping of methods according to their complexity and their efferent coupling. Each dot represents one or more methods with the same values of complexity and coupling. They are divided into four zones according to their status in relation to both thresholds:  The area on the lower left (green dots) contains compliant methods, below both thresholds  The area on the lower right (gray dots) contains compliant methods; they have reached the complexity threshold, but remain below the coupling threshold  The area in the upper left (gray dots) contains compliant methods; they have reached the coupling threshold, but remain below the complexity threshold  The area in the upper right (red dots) contains non-compliant methods, above both thresholds See the Quality Cockpit The intensity of the color of the dots depends on the number of methods that share the same values of complexity and coupling: the darker the dot, the more methods are involved. Confidential – This document is the property of Kalistick 32/59
  33. 33. Code audit of SharpDevelop application 2011-01-01 The histogram below provides an additional view of this mapping and precise figures for the four zones in terms of percentage of methods and statements of the application. The last bars indicate the area of non- compliance: See the Quality Cockpit The following table details the main non-compliant methods:Confidential – This document is the property of Kalistick 33/59
  34. 34. Code audit of SharpDevelop application 2011-01-01 Method Efferent Complexity New Coupling violation ICSharpCode.SharpDevelop.Refactoring.RefactoringMenuBuilder.Buil 45 22 dSubmenu ( ICSharpCode.Core.Codon, System.Object) ICSharpCode.SharpDevelop.Project.Commands.AddExistingItemsToP 39 22 roject.Run ( ) ICSharpCode.SharpDevelop.Dom.CecilReader.CecilClass.InitMember 36 30 s ( Mono.Cecil.TypeDefinition) ICSharpCode.SharpDevelop.Gui.NewFileDialog.OpenEvent ( 35 17 System.Object, System.EventArgs) ICSharpCode.SharpDevelop.Commands.ToolMenuBuilder.ToolEvt ( 32 26 System.Object, System.EventArgs) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefactoryRes 31 34 olver.ResolveInternal ( ICSharpCode.NRefactory.Ast.Expression, ICSharpCode.SharpDevelop.Dom.ExpressionContext) ICSharpCode.SharpDevelop.DefaultEditor.Gui.Editor.MethodInsightDa 30 29 taProvider.SetupDataProvider ( System.String, ICSharpCode.TextEditor.Document.IDocument, ICSharpCode.SharpDevelop.Dom.ExpressionResult, System.Int32, System.Int32) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefactoryResol 30 18 New ver.CtrlSpace ( System.Int32, System.Int32, System.String, System.String, ICSharpCode.SharpDevelop.Dom.ExpressionContext) ICSharpCode.SharpDevelop.DefaultEditor.Commands.ClassBookmark 29 19 MenuBuilder.BuildSubmenu ( ICSharpCode.Core.Codon, System.Object) ICSharpCode.SharpDevelop.Project.MSBuildEngine.BuildRun.ParseSol 29 19 ution ( Microsoft.Build.BuildEngine.Project) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefactoryResol 28 28 ver.ResolveIdentifierInternal ( System.String) ICSharpCode.SharpDevelop.Dom.CecilReader.CreateType ( 28 21 ICSharpCode.SharpDevelop.Dom.IProjectContent, ICSharpCode.SharpDevelop.Dom.IDecoration, Mono.Cecil.TypeReference) ICSharpCode.SharpDevelop.Project.ProjectService.LoadProject ( 27 15 System.String) ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.TypeVisitor.Crea 26 31 teReturnType ( ICSharpCode.NRefactory.Ast.TypeReference, ICSharpCode.SharpDevelop.Dom.IClass, ICSharpCode.SharpDevelop.Dom.IMember, System.Int32, System.Int32, ICSharpCode.SharpDevelop.Dom.IProjectContent, System.Boolean) ICSharpCode.SharpDevelop.Project.DirectoryNode.Initialize ( ) 26 29 ICSharpCode.SharpDevelop.Widgets.TreeGrid.DynamicList.OnPaint ( 26 21 System.Windows.Forms.PaintEventArgs) ICSharpCode.SharpDevelop.Project.Dialogs.NewProjectDialog.OpenEv 26 20 ent ( System.Object, System.EventArgs)Confidential – This document is the property of Kalistick 34/59
  35. 35. Code audit of SharpDevelop application 2011-01-01 5.4 Test The Test domain provides rules to ensure that the application is sufficiently tested, quantitatively but also qualitatively, i.e. tests should target risk areas. 5.4.1 Issues It is important to situate the problems inherent in managing tests in order to understand the results of the analysis for this area. 5.4.1.1 Unit testing and code coverage The results of this domain depend on the testing process applied to the project: if an automated unit testing process and/or code coverage is implemented on the project, then the analysis uses the results of these processes. As a reminder, we must distinguish unit testing and code coverage:  A unit test is an automated test, which usually focuses on a single method in the source code. But since this method generally has dependencies on other methods or classes, a unit test can test a more or less important part of the application (the larger this part, the less relevant the test)  Code coverage measures the amount of code executed by tests, by identifying each element actually executed at runtime (statements, conditional branches, methods ...). These tests can be unit tests (automated) or integration / functional tests (manual or automated). Code coverage is worth combining with unit tests because it is the only way to measure the code actually tested. However, many projects still do not check code coverage, which does not allow verifying the quality of testing in this type of analysis. The indicators presented next address both cases; they are useful for projects with unit tests and/or code coverage but also for other projects. 5.4.1.2 Relevance of code coverage Code coverage provides figures indicating the proportion of code executed after the tests, for example 68% of the statements of a method are covered or 57% of the project statements... Confidential – This document is the property of Kalistick 35/59
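As a minimal illustration of the distinction made above, here is a unit test sketch. NUnit is assumed as the framework (the report only states that results from unit testing frameworks and coverage tools are reused), and the method under test is invented. Running such tests under a coverage tool is what produces the raw coverage figures the platform correlates.

```csharp
using NUnit.Framework;

// Hypothetical method under test.
public static class PriceCalculator
{
    public static decimal ApplyDiscount(decimal price, decimal rate) => price * (1m - rate);
}

[TestFixture]
public class PriceCalculatorTests
{
    [Test]
    public void ApplyDiscount_ReducesThePrice()
    {
        // This automated test exercises a single method; a coverage tool run alongside
        // it records which statements were actually executed.
        Assert.AreEqual(80m, PriceCalculator.ApplyDiscount(100m, 0.2m));
    }
}
```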
5.4.1.2 Relevance of code coverage

Code coverage provides figures indicating the proportion of code executed by the tests, for example 68% of the statements of a method, or 57% of the statements of the project. The problem is that these figures do not take into account how relevant it is to test the code concerned. For example, covering 70% of the application looks like a good figure, but the covered code may be trivial and of no real interest for the tests (e.g. accessors or generated code), whereas the critical code may be located in the remaining 30%. The analysis performed here evaluates, for each method, the relevance of testing it; this is used to calibrate the code coverage requirements and to set appropriate thresholds, so that the testing effort is better targeted at the risk areas.

5.4.2 TestRelevancyIndex (TRI) and TestEffortIndex (TEI) metrics

To refine the analysis of tests, two new metrics were designed by the Centre of Excellence in Information and Communication Technologies (CETIC), based on research conducted during the past 20 years and on the "Quality Cockpit" knowledge base [6].

The TestRelevancyIndex (TRI) measures the relevancy of testing a method according to its technical risk and its business risk. The technical risk assesses the probability of finding a defect; it is based on different metrics such as cyclomatic complexity, number of variables, number of parameters, efferent coupling, cumulative number of non-compliances... The business risk associates a risk factor with business features that should be tested in priority (higher risk) or, on the contrary, that do not need to be tested (minor risk). It must be determined at the initialization of the audit to be taken into account in the TRI calculation. The objective is to guide the testing effort towards the important features.

The TRI is used to classify the methods on a scale of testing priority, and thus to distinguish the methods that are truly relevant to test from the trivial and irrelevant ones. For each level of the scale, a specific code coverage threshold to achieve can be set. This allows a high threshold for critical methods and a low threshold for low-priority methods.

The TestEffortIndex (TEI) complements the TRI by measuring the level of effort required to test a method. Like the TRI, it is based on a set of unit metrics characterizing the method. It helps refine the selection of the code to be tested by balancing the effort against the testing relevance.

The details of the calculation of these two indexes are provided in annex (8.2 The coupling).

[6] CETIC, Kalistick. Statistically Calibrated Indexes for Unit Test Relevancy and Unit Test Writing Effort, 2010.
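The exact TRI formula and its statistical calibration are not published in this report; the sketch below only illustrates the principle of combining unit metrics of a method with a business risk factor into a single index, and of mapping that index onto a priority scale. The weights, field names and cut-off values are assumptions chosen for the illustration.

    // Illustrative only: not the calibrated CETIC/Kalistick formula.
    public class MethodMetrics
    {
        public string Name;
        public int CyclomaticComplexity;
        public int VariableCount;
        public int ParameterCount;
        public int EfferentCoupling;
        public int NonCompliances;
        public double BusinessRiskFactor;   // set at audit initialization, e.g. 0.5 (minor) to 2.0 (critical feature)
    }

    public static class TestRelevancy
    {
        // A naive weighted sum standing in for the calibrated technical-risk score.
        public static double ComputeTri(MethodMetrics m)
        {
            double technicalRisk =
                1.5 * m.CyclomaticComplexity +
                0.5 * m.VariableCount +
                0.5 * m.ParameterCount +
                1.0 * m.EfferentCoupling +
                2.0 * m.NonCompliances;
            return technicalRisk * m.BusinessRiskFactor;
        }

        // Map the raw index onto the four-level testing-priority scale of section 5.4.3
        // (methods below the lowest cut-off would fall into the "None" level).
        public static string Priority(double tri)
        {
            if (tri >= 60) return "Critical";
            if (tri >= 40) return "High";
            if (tri >= 20) return "Medium";
            return "Low";
        }
    }

In practice both the combination of metrics and the thresholds are calibrated statistically, as the title of the cited CETIC/Kalistick study indicates.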
5.4.3 Mapping methods by testing priority

The histogram below shows a mapping of the methods according to their testing priority, using a four-level scale based on the TRI of the methods (each level corresponding to a range of TRI). This mapping uses code coverage information only if it was supplied for the analysis. For each priority level are indicated:

- the average coverage rate (0 if coverage information was not provided);
- the number of methods not covered (no coverage);
- the number of methods insufficiently covered (coverage rate below the target rate set for this priority level);
- the number of methods sufficiently covered (coverage rate greater than or equal to the target rate set for this priority level).

The table below shows these figures for each priority level, adding a fifth level for the methods without test priority:

Test priority | Covered | Uncovered | Insufficiently covered
Critical | 0 | 1373 | 0
High | 0 | 515 | 0
Medium | 0 | 10 | 0
Low | 0 | 14 | 0
None | 0 | 5656 | 0
[Total] | 0 | 7568 | 0

See the Quality Cockpit

5.4.4 Coverage of the application by the tests

The graph below, called a "TreeMap", shows the code coverage of the application against the test objectives. It helps to identify the parts of the application that are not sufficiently tested with regard to the identified risks. It gathers the classes of the project into technical subsets and characterizes them along two dimensions:

- size, which depends on the number of statements;
- color, which represents the deviation from the test objective set for the classes: red indicates that the current coverage is far from the objective, whereas green indicates that the objective is reached.

See the Quality Cockpit

A class can be green even if it is little or not tested at all: for example, classes with a low probability of technical defects or without business risk. Conversely, a class that is already tested can be reported as insufficient (red/brown) if its objective is very demanding. An effective strategy to improve coverage is to focus on large classes that are close to their objective.
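As a sketch of how each TreeMap cell could be derived: the class's statement count drives the cell size, and the gap between its coverage objective and its measured coverage drives the color. The report only states that red means far from the objective and green means reached; the intermediate color bands and the field names below are assumptions.

    public class ClassCoverage
    {
        public string Name;
        public int Statements;            // drives the cell size
        public double CoveragePercent;    // measured statement coverage, 0..100
        public double ObjectivePercent;   // target derived from the TRI of the class's methods
    }

    public static class CoverageTreeMap
    {
        public static double CellWeight(ClassCoverage c)
        {
            return c.Statements;          // larger classes occupy a larger area
        }

        public static string CellColor(ClassCoverage c)
        {
            double gap = c.ObjectivePercent - c.CoveragePercent;
            if (gap <= 0) return "green";      // objective reached (possibly with no tests if the objective is 0)
            if (gap <= 10) return "yellow";    // close to the goal: a good candidate for quick improvement
            if (gap <= 30) return "orange";
            return "red";                      // far from the goal
        }
    }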
5.4.5 Most important classes to test (Top Risks)

The following chart allows quickly identifying the most relevant classes to test, the "Top Risks". It is a representation known as a "cloud", which displays the classes along two dimensions:

- the size of the class name depends on the relevancy of testing the class (the TRI cumulated over all methods of the class);
- the color represents the deviation from the coverage objective set for the class, just as in the previous TreeMap.

See the Quality Cockpit

This representation identifies the critical elements; however, to take into account the effort of writing the tests, the following representation should be used to select the items to address.
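A minimal sketch of the aggregation behind the cloud sizing, assuming that a class's weight is simply the sum of the TRI values of its methods and that the class name is everything before the last dot of a method's full name (a simplification of the real naming scheme):

    using System.Collections.Generic;
    using System.Linq;

    public static class TopRisks
    {
        static string DeclaringClass(string methodFullName)
        {
            int dot = methodFullName.LastIndexOf('.');
            return dot < 0 ? methodFullName : methodFullName.Substring(0, dot);
        }

        // methodTri: method full name -> raw TRI value.
        // Returns classes ordered from the most relevant to test downwards.
        public static IEnumerable<KeyValuePair<string, double>> Rank(IDictionary<string, double> methodTri)
        {
            return methodTri
                .GroupBy(kv => DeclaringClass(kv.Key))
                .Select(g => new KeyValuePair<string, double>(g.Key, g.Sum(kv => kv.Value)))
                .OrderByDescending(kv => kv.Value);
        }
    }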
5.4.6 Most important classes to test that require the least effort (Quick Wins)

The "Quick Wins" view complements the "Top Risks" by taking into account the effort required to test each class (TEI):

- the size of the class name depends on the interest of testing the class (TRI), weighted by the required effort (TEI accumulated over all methods): a class with a high TRI and a high TEI (therefore difficult to test) appears smaller than a class with an average TRI but a low TEI;
- the color represents the deviation from the coverage objective set for the class, just as in the TreeMap and Top Risks representations.

See the Quality Cockpit
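The exact weighting between TRI and TEI is not given in the report; the sketch below uses a simple ratio as a stand-in, which already produces the behaviour described above (high relevance combined with high effort ranks below moderate relevance combined with low effort).

    public class ClassTestIndexes
    {
        public string Name;
        public double CumulatedTri;   // testing relevance summed over all methods
        public double CumulatedTei;   // testing effort summed over all methods
    }

    public static class QuickWins
    {
        public static double Score(ClassTestIndexes c)
        {
            // The +1 guards against division by zero for trivial classes with no measured effort.
            return c.CumulatedTri / (1.0 + c.CumulatedTei);
        }
    }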
5.4.7 Methods to test in priority

The following table details the main methods to test first. Each method is associated with its current coverage rate, the raw value of its TRI and its level of TEI:
Method | Coverage | Relevancy (TRI) | Priority | Effort | New violation
ICSharpCode.SharpDevelop.Refactoring.RefactoringMenuBuilder.BuildSubmenu(ICSharpCode.Core.Codon, System.Object) | 0% | 37.00 | Critical | High |
ICSharpCode.SharpDevelop.Project.Solution.SetupSolution(ICSharpCode.SharpDevelop.Project.Solution, System.String) | 0% | 37.00 | Critical | Very high |
ICSharpCode.SharpDevelop.Project.MSBuildBasedProject.SetPropertyInternal(System.String, System.String, System.String, System.String, ICSharpCode.SharpDevelop.Project.PropertyStorageLocations, System.Boolean) | 0% | 37.00 | Critical | Very high |
ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefactoryResolver.ResolveIdentifierInternal(System.String) | 0% | 37.00 | Critical | Very high |
ICSharpCode.SharpDevelop.Commands.SharpDevelopStringTagProvider.Convert(System.String) | 0% | 36.00 | Critical | High |
ICSharpCode.Core.AddInTree.Load(System.Collections.Generic.List<System.String>, System.Collections.Generic.List<System.String>) | 0% | 36.00 | Critical | High |
ICSharpCode.SharpDevelop.Project.Commands.AddExistingItemsToProject.Run() | 0% | 36.00 | Critical | High |
ICSharpCode.SharpDevelop.Dom.CecilReader.CreateType(ICSharpCode.SharpDevelop.Dom.IProjectContent, ICSharpCode.SharpDevelop.Dom.IDecoration, Mono.Cecil.TypeReference) | 0% | 35.00 | Critical | High |
ICSharpCode.SharpDevelop.Commands.ToolMenuBuilder.ToolEvt(System.Object, System.EventArgs) | 0% | 35.00 | Critical | High |
ICSharpCode.SharpDevelop.DefaultEditor.Gui.Editor.MethodInsightDataProvider.SetupDataProvider(System.String, ICSharpCode.TextEditor.Document.IDocument, ICSharpCode.SharpDevelop.Dom.ExpressionResult, System.Int32, System.Int32) | 0% | 35.00 | Critical | Very high |
ICSharpCode.SharpDevelop.Refactoring.RefactoringService.AddReferences(System.Collections.Generic.List<ICSharpCode.SharpDevelop.Refactoring.Reference>, ICSharpCode.SharpDevelop.Dom.IClass, ICSharpCode.SharpDevelop.Dom.IMember, System.Boolean, System.String, System.String) | 0% | 35.00 | Critical | High |
ICSharpCode.SharpDevelop.Dom.ReflectionLayer.ReflectionReturnType.Create(ICSharpCode.SharpDevelop.Dom.IProjectContent, ICSharpCode.SharpDevelop.Dom.IDecoration, System.Type, System.Boolean) | 0% | 35.00 | Critical | High |
ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefactoryResolver.ResolveInternal(ICSharpCode.NRefactory.Ast.Expression, ICSharpCode.SharpDevelop.Dom.ExpressionContext) | 0% | 35.00 | Critical | Very high |
ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.TypeVisitor.CreateReturnType(ICSharpCode.NRefactory.Ast.TypeReference, ICSharpCode.SharpDevelop.Dom.IClass, ICSharpCode.SharpDevelop.Dom.IMember, System.Int32, System.Int32, ICSharpCode.SharpDevelop.Dom.IProjectContent, System.Boolean) | 0% | 35.00 | Critical | Very high |
ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpressionFinder.SearchBracketForward(System.String, System.Int32, System.Char, System.Char) | 0% | 35.00 | Critical | High |
ICSharpCode.SharpDevelop.DefaultEditor.Gui.Editor.AbstractCodeCompletionDataProvider.CreateItem(System.Object, ICSharpCode.SharpDevelop.Dom.ExpressionContext) | 0% | 35.00 | Critical | High |
ICSharpCode.SharpDevelop.Project.MSBuildEngine.BuildRun.ParseSolution(Microsoft.Build.BuildEngine.Project) | 0% | 34.00 | Critical | Normal |
ICSharpCode.SharpDevelop.Project.ProjectService.LoadProject(System.String) | 0% | 34.00 | Critical | High |
ICSharpCode.SharpDevelop.Project.ProjectBrowserControl.FindDeepestOpenNodeForPath(System.String) | 0% | 34.00 | Critical | High |

See the Quality Cockpit

5.5 Architecture

The Architecture domain monitors the compliance of the code with a software architecture model. The target architecture model has been presented in chapter 4.3.2 Technical model. The following diagram shows the results of the architecture analysis, comparing this target model with the current application code. Currently, architecture non-compliances are not taken into account in the calculation of the non-compliance rate of the application.
See the Quality Cockpit

Non-compliances related to communication constraints between two elements are represented by arrows: the starting point is the calling element, the destination the called one. Orange arrows indicate direct communication between a top layer and a non-adjacent lower layer (sometimes acceptable). Black arrows indicate communications that are totally prohibited.
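A minimal sketch of such a layer-rule check, assuming a simple ordered list of layers (the layer names below are placeholders, not the actual model of chapter 4.3.2): a downward call that skips a layer is reported as a warning, and a call that goes upwards is reported as prohibited.

    using System.Collections.Generic;

    public enum Severity { None, SkippedLayer, Forbidden }

    public static class ArchitectureRules
    {
        // Index 0 is the top layer; a call is expected to target the same or the next layer down.
        static readonly List<string> Layers = new List<string> { "Gui", "Services", "Dom", "Core" };

        public static Severity Check(string callerLayer, string calleeLayer)
        {
            int from = Layers.IndexOf(callerLayer);
            int to = Layers.IndexOf(calleeLayer);
            if (from < 0 || to < 0) return Severity.None;    // element outside the model
            if (to < from) return Severity.Forbidden;        // upward call: prohibited (black arrow)
            if (to > from + 1) return Severity.SkippedLayer; // non-adjacent downward call (orange arrow)
            return Severity.None;
        }
    }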
5.6 Duplication

The Duplication domain covers the "copy-and-paste" identified in the application. To avoid many false positives in this domain, a threshold is defined to ignore blocks with few statements. Duplications should be avoided for several reasons: maintenance and changeability issues, testing costs, lack of reliability...

5.6.1 Mapping of duplication

The chart below shows a mapping of the duplications within the application. It does not take into account duplications involving a number of statements below the threshold, because they are numerous and mostly irrelevant (e.g. duplication of accessors between different classes sharing similar properties). Duplicates are categorized by ranges of duplicated statements. For each range are presented:

- the number of distinct duplicated blocks (each duplicated at least once);
- the maximum number of duplications of the same block.

See the Quality Cockpit

5.6.2 Duplications to fix in priority

The following table lists the main duplications to fix in priority. Each block is identified by a unique identifier, and each duplication is located in the source code. If a previous audit was completed, a flag indicates whether the duplication is new or not.

Duplication number | Duplicated blocks size | Class involved | Lines | New violation

See the Quality Cockpit
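To illustrate why the statement threshold matters, here is a naive, purely illustrative sketch of window-based duplicate detection: every window of minStatements consecutive statements is keyed by its text, and keys that occur more than once are reported. Real copy-and-paste detectors normalize identifiers and literals and use rolling hashes; this version only shows that short windows (accessors, trivial initializations) would flood the results if no minimum size were applied.

    using System.Collections.Generic;

    public static class DuplicationFinder
    {
        // statements: the statements of the analysed code, in order.
        // Returns, for each duplicated window, the list of its starting positions.
        public static Dictionary<string, List<int>> FindBlocks(IList<string> statements, int minStatements)
        {
            var occurrences = new Dictionary<string, List<int>>();
            for (int start = 0; start + minStatements <= statements.Count; start++)
            {
                // Key built from the raw window content; a rolling hash would be used in practice.
                string key = string.Join("\n",
                    new List<string>(statements).GetRange(start, minStatements).ToArray());
                List<int> positions;
                if (!occurrences.TryGetValue(key, out positions))
                {
                    positions = new List<int>();
                    occurrences[key] = positions;
                }
                positions.Add(start);
            }

            // Keep only the blocks that appear at least twice.
            var duplicates = new Dictionary<string, List<int>>();
            foreach (var pair in occurrences)
                if (pair.Value.Count > 1)
                    duplicates[pair.Key] = pair.Value;
            return duplicates;
        }
    }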
5.7 Documentation

The Documentation domain controls the level of technical documentation of the code. Only the standard comment header of the methods is verified: Javadoc for Java, XmlDoc for C#. Inline comments (in the method bodies) are not evaluated, because of the difficulty of verifying their relevance (they are often commented-out code or generated comments). In addition, the header documentation is verified only for methods considered sufficiently long and complex, because the effort of documenting trivial methods is rarely justified. For this purpose, a threshold on the cyclomatic complexity and a threshold on the number of statements are defined to filter the methods to check.

5.7.1 Mapping documentation issues

The chart below shows the status of the header documentation for all methods with a complexity greater than the threshold. The methods are grouped by ranges of size (number of statements). For each range are given the number of methods with a header documentation and the number of methods without one. The red area in the last range corresponds to the methods that are not documented and are therefore non-compliant.
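A minimal sketch of this rule, assuming placeholder threshold values and a pre-computed flag indicating whether an XML documentation header precedes the method; the report does not state how the two thresholds are combined, so the conjunction below is an assumption.

    public class MethodDocInfo
    {
        public string Name;
        public int Statements;
        public int CyclomaticComplexity;
        public bool HasXmlDocHeader;   // true when a /// <summary> block precedes the method
    }

    public static class DocumentationRule
    {
        // Placeholder values: the audit defines its own calibrated thresholds.
        const int ComplexityThreshold = 10;
        const int StatementThreshold = 30;

        public static bool RequiresHeader(MethodDocInfo m)
        {
            return m.CyclomaticComplexity > ComplexityThreshold
                && m.Statements > StatementThreshold;
        }

        public static bool IsNonCompliant(MethodDocInfo m)
        {
            return RequiresHeader(m) && !m.HasXmlDocHeader;
        }
    }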
5.7.2 Methods to document in priority

The following table lists the main methods to document in priority:

Method | Statements | Complexity | New violation
ICSharpCode.SharpDevelop.Project.MSBuildBasedProject.SetPropertyInternal | 92 | 28 |
ICSharpCode.SharpDevelop.Project.DirectoryNode.Initialize | 81 | 29 |
ICSharpCode.SharpDevelop.Dom.VBNet.VBNetAmbience.Convert | 81 | 44 |
ICSharpCode.SharpDevelop.Widgets.SideBar.SideBarControl.ProcessCmdKey | 81 | 32 |
ICSharpCode.SharpDevelop.Commands.SharpDevelopStringTagProvider.Convert | 81 | 25 |
ICSharpCode.SharpDevelop.Dom.CSharp.CSharpAmbience.Convert | 77 | 41 |
ICSharpCode.SharpDevelop.Refactoring.RefactoringMenuBuilder.BuildSubmenu | 74 | 22 |
ICSharpCode.SharpDevelop.Project.Solution.Save | 74 | 11 |
ICSharpCode.SharpDevelop.Dom.NRefactoryResolver.NRefactoryResolver.ResolveInternal | 71 | 34 |
ICSharpCode.SharpDevelop.Widgets.TreeGrid.DynamicList.OnPaint | 71 | 21 |
ICSharpCode.SharpDevelop.DefaultEditor.XmlFormattingStrategy.TryIndent | 70 | 24 |
ICSharpCode.SharpDevelop.Project.Commands.AddExistingItemsToProject.Run | 70 | 22 |
ICSharpCode.SharpDevelop.Project.Solution.SetupSolution | 64 | 20 |
ICSharpCode.Core.AddInTree.Load | 62 | 21 |
ICSharpCode.SharpDevelop.DefaultEditor.Commands.ClassMemberMenuBuilder.BuildSubmenu | 60 | 23 |
ICSharpCode.SharpDevelop.Dom.Refactoring.NRefactoryRefactoringProvider.GetFullCodeRangeForType | 58 | 24 |
ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpressionFinder.ReadNextToken | 58 | 25 |
ICSharpCode.SharpDevelop.Dom.CecilReader.CecilClass.InitMembers | 57 | 30 |
ICSharpCode.SharpDevelop.Dom.CSharp.CSharpExpressionFinder.SearchBracketForward | 57 | 47 |

See the Quality Cockpit
