Code Coverage in Theory and in Practice from the DO-178B Perspective


  • Statement coverage requires only one test case (condition evaluates to true) to satisfy this code. Decision/condition coverage requires two test cases (condition evaluates to true and to false) to satisfy this code.
  • Decision/condition coverage requires only two test cases: one to execute the if branch, one to execute the else branch. MC/DC requires four test cases: “A OR B” must be made both true and false, while simultaneously making the entire statement true and false.
  • White-box testing is a structural technique based on the structure of the test item. This structure is evaluated and measurements are taken from it. These assess characteristics such as: the control flow of the item under test; data flows into, out of, and within the item under test; and which parts of the source have been covered by a particular set of tests. In order to take these measurements, it is necessary to enhance the code with additional statements that record key events during execution. This is called instrumentation. As instrumentation may influence the behaviour of the test object, all tests should be repeated without coverage measurement to exclude the possibility of side effects.
  • Code coverage as acceptance criterion: a statement about expected coverage in relation to the maximal possible coverage in terms of a certain coverage measurement. E.g. 90% statement coverage means: 90% of all statements in the source code are covered by the tests executed. Empirical studies have shown that an incredibly difficult to achieve coverage rate must be reached to improve error detection noticeably (90-95% or higher). This implies that selectively using coverage measurement to visualize areas of the code that are not covered by tests is superior to acceptance criteria based on coverage measurement. Code coverage as an acceptance criterion may motivate developers to concentrate on developing test code to increase coverage (e.g. testing getters and setters) instead of finding errors in the code. It makes sense to concentrate on critical and complex parts of the code, where most bugs are expected to be. But those parts are often also hard to test, and code coverage is increased only to a negligible degree by such tests.
  • A good code coverage measurement tool is not only capable of measuring code coverage and checking whether some acceptance criteria in terms of code coverage are met. It should also provide means to visualize the achieved coverage in order to help developers identify source code in need of additional tests. A good example is the reasonably priced Clover, a coverage measurement and visualization tool for Java. It integrates nicely with the Eclipse IDE and will be used in the labs for this module.
  • Method coverage doesn’t really help a lot. There is a need for more elaborate code coverage models.
  • While statement coverage is certainly a weak coverage measurement, it can make sense to use it if the code itself is not complex and does not contain many branches.
  • Condition coverage reports the true or false outcome of each boolean sub-expression, separated by logical-and and logical-or if they occur. Condition coverage measures the sub-expressions independently of each other. Multiple condition coverage reports whether every possible combination of boolean sub-expressions occurs. Condition/Decision Coverage is a hybrid measure composed by the union of condition coverage and decision coverage.
  • Following the red path, the warning seems justified, but the red path is infeasible. The given error is from FlexeLint; Orion reports no errors.
  • Method coverage focuses on called methods. Statement coverage focuses on executed statements. Branch coverage focuses on decisions based on the results of boolean expressions (true, false). Condition coverage focuses on each condition of a boolean expression. Path coverage focuses on each possible unique path through a method. Test coverage focuses on finding and eliminating errors in the code.

    1. Code Coverage in Theory and in Practice from the DO-178B Perspective
       • Daniel Liezrowice – Engineering Software Lab (The Israeli Center for Static Code Analysis)
       • Presentation is supported by Parasoft, makers of automated DO-178B certified analysis tools and a Wind River certified partner
    2. Testing of code coverage is an essential part of safety-critical software certification.
    3. A Little Bit of Baselining
       • Why certify?
       • What is DO-178B?
       • DO-178B criticality levels
       • Effect of criticality levels on testing
    4. Why Certify?
       • Software flying on commercial aircraft, and sometimes military aircraft, must adhere to DO-178B.
       • Meant to ensure software safety, provide high software quality, and high reliability.
    5. What is DO-178B?
       • Developed by the international commercial aviation community.
       • Technically an established set of guidelines for developing safety-critical software.
       • In reality, it is a set of requirements for safety-critical software development.
       • Part of DO-178B identifies “levels” of software criticality and what is required to certify each of the different levels.
    6. Software Criticality Levels
       • DO-178B categorizes five levels of software criticality.
       • Levels A through E, with Level A being the highest level of criticality.
       • Application of DO-178B to Level E systems is not required.
    7. Software Criticality Levels
       • Level A: anomalous software behavior would cause or contribute to a catastrophic failure condition, resulting in a total loss of life.
       • Level B: anomalous software behavior would cause or contribute to a hazardous/severe-major failure condition, resulting in some loss of life.
    8. Software Criticality Levels
       • Level C: anomalous behavior would cause or contribute to a major failure condition, resulting in serious injuries.
       • Level D: anomalous behavior would cause or contribute to a minor failure condition, resulting in minor injuries.
       • Level E: anomalous behavior would not affect aircraft operation or pilot workload, resulting in no impact on passenger or aircraft safety.
    9. Testing Criticality Levels
       • Level D
         • Must have some verification process in place, but no structural code testing is required.
       • Level C
         • Must perform everything required by Level D.
         • Additionally, must perform “statement coverage”.
         • Statement coverage: every statement of source code must be executed by a formal test case.
    10. Testing Criticality Levels
       • Level B
         • Must perform everything required by Level C.
         • Additionally, must perform “decision/condition” coverage.
         • Decision/condition coverage: every code branch must be executed by a formal test case.
    11. Testing Criticality Levels
       • Example:

         int* p = NULL;
         if (condition)
             p = &variable;
         *p = 123;
    12. Testing Criticality Levels
       • Level A
         • Must perform everything required by Level B.
         • Additionally, must perform “modified condition/decision coverage” (MC/DC).
         • MC/DC: every condition within each decision statement needs to be independently verified for its effect on the statement.
    13. Testing Criticality Levels
       • Example:

         if ((A || B) && C)
             // Do something
         else
             // Do something else
    14. Sub-summary
       • Avionics software must adhere to the DO-178B guidelines before it can fly.
       • The DO-178B guidelines specify five levels of software criticality.
       • The level of testing required varies by the software criticality level.
       • The structural coverage testing required by DO-178B is complex and is a primary cost driver on avionics software projects.
    15. Code Coverage Agenda
       • Introduction to structural test case investigation techniques
       • Why do we measure code coverage?
       • Method coverage
       • Statement coverage
       • Branch coverage
       • Condition coverage
       • Path coverage
       • Limitations of structural techniques and coverage measurement
    16. Code Coverage Classified as White-Box Testing
       • The basis for designing white-box tests is the source code of the software under test.
       • Program source must be available.
       • Basic idea: try to execute all parts of the source at least once.
    17. Working with Code Coverage
       • Code coverage analysis is the process of:
         • Writing test cases and executing them
         • Finding areas of code not covered by a set of test cases
         • Creating additional test cases to increase coverage
         • Determining a quantitative measure of code coverage
       • Code coverage is an indirect measure of quality.
       • A code coverage analyser tool supports and automates this process.
       • Code coverage is not fool-proof:
         • There is still the possibility of errors, even with 100% code coverage.
    18. Making Good Use of Code Coverage
       • Improve tests using code coverage:
         • Design functional unit tests using black-box test design techniques based on the interface of the component.
         • Measure code coverage and visualize the result to identify missing tests.
         • Add functional unit tests using white-box test design techniques to reach the expected/target code coverage.
       • Code coverage trend reports:
         • Code coverage trend reports are an important indicator of the health of your project.
       • Beware of code coverage as an acceptance criterion:
         • Coverage above 90% is incredibly difficult to achieve.
         • If a certain level of code coverage is requested, developers tend to concentrate on easy unit tests, missing the hard and important ones.
         • It may make sense to define hard coverage criteria for more complex components, as a result of a risk analysis, or if regulations exist (e.g. FDA for medical devices).
    19. Code Coverage Tools
       • Visualization of code not covered in the code editor
       • Visualization of coverage rate achieved
       • Warnings generated by the tool to point to code not covered
       • Coverage rate achieved broken down by packages and classes
    20. Method Coverage (Function Coverage)
       • Reports whether a method (function) was invoked while testing the application.
         • Identifies untested classes/methods.
         • Is the weakest of all code coverage models.
         • May help identify where to focus testing at the beginning of a testing phase.
         • No meaningful statement about the quality of test coverage is possible.
    21. Statement Coverage (Line Coverage)
       • Reports whether each executable statement was executed.
         • Identifies statements that are never called (dead code).
         • High statement coverage is still far from good test coverage.
           • Lots of bugs may still be undetected!
       (Diagram: a single test in which the condition is true covers all statements)
    22. Statement Coverage (Line Coverage)
       • The main disadvantage of statement coverage is that it is insensitive to some control structures.

         int* p = NULL;
         if (condition)
             p = &variable;
         *p = 123;

       • Without a test case that causes condition to evaluate false, statement coverage rates this code fully covered.
       • In fact, if condition ever evaluates false, this code fails.
    23. Branch Coverage (Decision Coverage)
       • Reports whether boolean expressions evaluate to both true AND false.
         • Ignores branches within boolean expressions which occur due to short-circuit operators.
         • High branch coverage shows that some substantial unit testing was done.
         • May still miss errors in complex (compound) boolean expressions because of its focus on the two possible outcomes (true or false).
       (Diagram: both the true and the false branch must be taken)
    24. Branch Coverage (Decision Coverage)
       • This metric has the advantage of simplicity without the problems of statement coverage.
       • A disadvantage is that this metric ignores branches within boolean expressions which occur due to short-circuit operators. For example:

         if (condition1 && (condition2 || function1()))
             statement1;
         else
             statement2;

       • This metric could consider the control structure completely exercised without a call to function1.
       • The test expression is true when condition1 is true and condition2 is true, and the test expression is false when condition1 is false.
       • In this instance, the short-circuit operators preclude a call to function1.
    25. Condition Coverage
       • Reports the true or false outcome of each boolean sub-expression.
         • Similar to branch coverage but has better sensitivity to the control flow.
         • But (simple) condition coverage does not guarantee full branch coverage.
       • Variations:
         • (Simple) condition coverage
         • Multiple condition coverage
         • ...
       (Diagram: the sub-expression combinations true/true, true/false, false/true, false/false)
    26. Condition Coverage
       • Condition coverage reports the true or false outcome of each condition.
       • A condition is an operand of a logical operator that does not contain logical operators.
       • Condition coverage measures the conditions independently of each other.
       • This metric is similar to decision coverage but has better sensitivity to the control flow.
       • However, full condition coverage does not guarantee full decision coverage.
       • For example, consider the following C++/Java fragment:

         bool f(bool e) { return false; }
         bool a[2] = { false, false };

         if (f(a && b)) ...
         if (a[int(a && b)]) ...
         if ((a && b) ? false : false) ...

       • All three of the if-statements above branch false regardless of the values of a and b. However, if you exercise this code with a and b having all possible combinations of values, condition coverage reports full coverage.
    27. Path Coverage (Predicate Coverage)
       • Reports whether each of the possible paths in each method has been followed.
         • A path is a unique sequence of branches from the method entry to the exit.
         • 100% path coverage may not be achievable because:
           • Many paths are impossible to exercise (dynamically) due to relationships of data.
           • The number of paths is exponential in the number of branches (in some cases we can even reach the “halting problem”).
           • Considers only a limited number of looping possibilities.
       • Variations of path coverage:
         • Data flow coverage (by static code analysis)
    28. False Positive Problem: False Errors, Solved by Data Flow Analysis
       • False error: reported by the analyzer but not in fact a latent error in the program.

         1 int f(int x) {
         2   int y;
         3   if (x > 0) y = x;
         4   if (x > 3) x = y;
         5   return x;
         6 }

       • Warning: variable 'y' (line 2) may not have been initialized.
       (Diagram: control-flow graph; the warning path requires both x ≤ 0 and x > 3, which is infeasible, so the warning is a false positive.)
    29. Code Coverage: Brief Summary
       • Method coverage
       • Statement coverage
       • Branch coverage
       • Condition coverage
       • Path coverage
    30. Condition/Decision Coverage
       • Condition/decision coverage is a hybrid metric composed of the union of condition coverage and decision coverage.
    31. Modified Condition/Decision Coverage (DO-178B Level A)
       • Every point of entry and exit in the program has been invoked at least once, every condition in a decision has taken all possible outcomes at least once, every decision in the program has taken all possible outcomes at least once, and each condition in a decision has been shown to independently affect that decision's outcome.
       • A condition is shown to independently affect a decision's outcome by varying just that condition while holding fixed all other possible conditions.
       • This metric is specified for safety-critical aviation software by RTCA/DO-178B.
    32. How Is It Done? - Instrumentation

         void foo()
         {
           found = false;
           for (i = 0; (i < 100) && (!found); i++)
           {
             if (i == 50) break;
             if (i == 20) found = true;
             if (i == 30) found = true;
           }
           printf("foo\n");
         }
    33. How Is It Done? - Automatic Instrumentation
       • Instrumentation for statement coverage:

         char inst[5];
         void foo()
         {
           found = false;
           for (i = 0; (i < 100) && (!found); i++)
           {
             if (i == 50) { inst[0] = 1; break; }
             if (i == 20) { inst[1] = 1; found = true; }
             if (i == 30) { inst[2] = 1; found = true; }
             inst[3] = 1;
           }
           printf("foo\n");
           inst[4] = 1;
         }
    34. How Is It Done? - Instrumentation
       • Inserting the full instrumentation code for condition coverage in this example produces the following code (full code coverage instrumentation at condition level):

         char inst[15];
         void foo()
         {
           found = false;
           for (i = 0;
                ((i < 100) ? inst[0] = 1 : (inst[1] = 1, 0)) &&
                ((!found)  ? inst[2] = 1 : (inst[3] = 1, 0));
                i++)
           {
             if ((i == 50) ? inst[4] = 1 : (inst[5] = 1, 0))   { inst[6] = 1;  break; }
             if ((i == 20) ? inst[7] = 1 : (inst[8] = 1, 0))   { inst[9] = 1;  found = true; }
             if ((i == 30) ? inst[10] = 1 : (inst[11] = 1, 0)) { inst[12] = 1; found = true; }
             inst[13] = 1;
           }
           printf("foo\n");
           inst[14] = 1;
         }
    35. References
       • Cornett, Steve. “Code Coverage Analysis”. <>
       • Enea Teksci. “DO-178B Training”. March 2007.
       • High Rely, Reliable Embedded Solutions. “FAA Compliance Questions and Answers”. <>
       • Nilsen, Kelvin. “Certification Requirements for Safety-Critical Software”. <>
    36. Parasoft Corporate Overview
       • Founded in 1987, privately held
       • CEO Dr. Adam Kolawa, from Caltech
       • Headquarters in Monrovia, CA
       • 18 locations and 360+ employees worldwide
       • 10,000+ customers worldwide; 85% of Fortune 100 companies
       • Technical innovator
         • 27 US patents for software technology
       • 26 million LOC vs. 130 developers
    37.
       • We are now moving to the demo, but before…
       • Questions?
    38. Thank You
       • Daniel Liezrowice – Engineering Software Lab (The Israeli Center for Static Code Analysis & Dynamic Testing)
       • [email_address]
       • 09-8855803