- 1. Lesson 09: Software Verification, Validation and Testing
  Includes: Software Testing Techniques; Intro to Testing.
  Slides include materials adapted from Pressman, Software Engineering: A Practitioner's Approach, Fifth Edition, McGraw-Hill, 2000 (attribution applies throughout this deck).
- 2. Software Testing: Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user.
- 3. We Design Test Cases to...
  - have a high likelihood of finding errors
  - exercise the internal logic of software components
  - exercise the inputs and outputs to uncover errors in program function, behavior, and performance
  The goal is to find the maximum number of errors with the minimum amount of effort and time!
- 4. Testing is a "Destructive" Activity
  - Design and execute test cases to "break" or "demolish" the software.
  - You must change your mindset during this activity.
  The objective is to find errors; therefore, errors found are good, not bad. Tell that to a manager!
- 5. Testing Objectives
  - Execute a program with the intent of finding an error.
  - A good test case has a high probability of finding an as-yet undiscovered error.
  - A successful test case finds an as-yet undiscovered error.
  Successful testing uncovers errors.
- 6. Testing Demonstrates...
  - Software functions work as specified.
  - Behavioral and performance requirements appear to be met.
  - The data collected is an indicator of reliability and quality.
  TESTING CANNOT SHOW THE ABSENCE OF ERRORS AND DEFECTS. Testing can only show that errors and defects are present.
- 7. Basic Principles of Testing
  - All testing should be traceable to requirements.
  - Plan testing long before testing begins. Plan and design tests during design, before any code has been generated.
  - Pareto principle: 80% of errors are in 20% of components.
  - Start small and progress to large: first test individual components (unit test), then clusters of integrated components (integration test), then the whole system.
  - Exhaustive testing is not possible, but we can ensure that all conditions have been exercised.
  - Not all testing should be done by the developer; an independent third party is needed.
- 8. Testability
  Testability refers to how easily a product can be tested. Design software with testability in mind:
  - Operability: it operates cleanly
  - Observability: the results of each test case are readily observed
  - Controllability: the degree to which testing can be automated and optimized
  - Decomposability: control the scope of testing
  - Simplicity: reduce complex architecture and logic to simplify tests
  - Stability: few changes are requested during testing
  - Understandability: of the design and documents
- 9. What Testing Shows
  - Errors
  - Requirements conformance
  - Performance
  - An indication of quality
- 10. Who Tests the Software?
  - Developer: understands the system, but will test "gently" and is driven by "delivery."
  - Independent tester: must learn about the system, but will attempt to break it and is driven by quality.
- 11. Software Testing
  - Black Box Testing Methods
  - White Box Testing Methods
  - Strategies for Testing
- 12. Black Box Testing
  - Based on specified function, i.e., on the requirements.
  - Tests are conducted at the software interface.
  - Demonstrates that the software functions are operational, input is properly accepted, output is correctly produced, and the integrity of external information is maintained.
  - Uses the SRS as the basis for constructing tests.
  - Usually performed by an independent group.
- 13. White-Box Testing: Our goal is to ensure that all statements and conditions have been executed at least once.
- 14. White Box Testing -- I
  - Based on the internal workings of a product; requires close examination of the software.
  - Logical paths are tested by providing test cases that exercise specific sets of conditions and/or loops.
  - Check the status of the program by comparing actual results to expected results at selected points in the software.
  Exhaustive path testing is impossible.
- 15. Exhaustive Testing: For a program containing a loop executed up to 20 times, there are about 10^14 possible paths! If we execute one test per millisecond, it would take 3,170 years to test this program!!
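  As a quick sanity check (not part of the original deck), the arithmetic behind the slide's claim can be reproduced in a few lines of Python:

```python
# Back-of-envelope check of the slide's figure: 10^14 paths at
# one test per millisecond.
paths = 10 ** 14                        # approximate number of distinct paths
tests_per_second = 1_000                # one test per millisecond
seconds = paths / tests_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"about {years:,.0f} years")      # about 3,171 years, matching the slide
```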
- 16. White Box Testing -- II
  - Logic errors and incorrect assumptions usually occur in special-case processing.
  - Our assumptions about the flow of control and data may lead to errors that are only uncovered during path testing.
  - We make typing errors; some are uncovered by the compiler (syntax, type checking), BUT others are only uncovered by testing. A typo may be on an obscure path.
  Black box testing can miss these types of errors.
- 17. Selective Testing: rather than testing every path, exercise a selected path through the same loop structure as slide 15 (loop executed up to 20 times).
- 18. Software Testing Techniques: Testing Analysis
- 19. Test Case Design: Uncover errors in a complete manner with a minimum of effort and time!
- 20. Basis Path Testing -- I
  - A white box testing technique (McCabe).
  - Use this technique to derive a logical measure of complexity.
  - Use it as a guide for defining a "basis set" of execution paths.
  - Test cases derived to execute the basis set are guaranteed to execute every statement at least once during testing.
- 21. Cyclomatic Complexity
  - A quantitative measure of the logical complexity of a program.
  - Used in conjunction with basis set testing, it defines the number of independent paths in the basis set.
  - It provides an upper bound on the number of tests needed to ensure all statements have been executed at least once.
  - See http://www.mccabe.com/pdf/nist235r.pdf for a more detailed paper on McCabe's cyclomatic complexity.
- 22. Basis Path Testing -- II: First, we compute the cyclomatic complexity, either as the number of simple decisions + 1 (3 decisions + 1) or as the number of enclosed areas + 1 (3 areas + 1). In this case, V(G) = 4. (The slide's flow graph labels the decisions 1, 2, 3 and the enclosed areas A, B, C.)
- 23. Cyclomatic Complexity: A number of industry studies have indicated that the higher the V(G), the higher the probability of errors. (The slide plots modules against V(G); modules in the high-V(G) range are more error prone.)
- 24. Basis Path Testing -- III: Next, we derive the independent paths through the flow graph (nodes 1..8). Since V(G) = 4, there are four paths:
  - Path 1: 1, 2, 3, 6, 7, 8
  - Path 2: 1, 2, 3, 5, 7, 8
  - Path 3: 1, 2, 4, 7, 8
  - Path 4: 1, 2, 4, 7, 2, ..., 7, 8 (the ... implies insertion of path 1, 2, or 3 here)
  Finally, we derive test cases to exercise these paths.
- 25. Creating Flow Graphs
  - A circle (node) represents one or more statements.
  - Arrows (edges) represent flow of control and must terminate in a node.
  - A region is an area bounded by edges and nodes. The area outside the flow graph is also counted as a region.
- 26. Calculating Cyclomatic Complexity from a Flow Graph, in any of three ways (see the sketch below):
  - Count the number of regions.
  - V(G) = E - N + 2, where E = number of edges and N = number of nodes.
  - V(G) = P + 1, where P = number of predicate nodes (nodes with 2 or more edges leaving them).
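  As an illustration (a minimal sketch, not from the deck), the last two formulas can be computed from an edge-list representation of a flow graph. The edge list below is reconstructed from the basis paths listed on slide 24, so treat it as an assumption:

```python
# Minimal sketch: compute cyclomatic complexity V(G) from a flow
# graph given as a list of (source, target) edges.
from collections import defaultdict

def cyclomatic_complexity(edges):
    nodes = {n for edge in edges for n in edge}
    v_en = len(edges) - len(nodes) + 2          # V(G) = E - N + 2
    out_degree = defaultdict(int)
    for src, _ in edges:
        out_degree[src] += 1
    # Predicate nodes: 2 or more edges leave the node.
    predicates = sum(1 for d in out_degree.values() if d >= 2)
    v_p = predicates + 1                        # V(G) = P + 1
    assert v_en == v_p   # both formulas agree on a proper flow graph
    return v_en

# Edge list inferred from the basis paths on slide 24 (nodes 1..8):
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (3, 6), (4, 7),
         (5, 7), (6, 7), (7, 2), (7, 8)]
print(cyclomatic_complexity(edges))   # 4, i.e., four basis paths
```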
- 27. Basis Path Testing Notes
  - You don't need a flow chart or graph, but the picture helps when you trace program paths.
  - Count each simple logical test as 1; compound tests count as 2 or more (depending on the number of tests).
  - Basis path testing should be applied to critical modules.
  - Some development environments will automate the calculation of V(G).
- 28. Deriving Test Cases
  - Using the design or code as a foundation, draw the corresponding flow graph.
  - Determine the cyclomatic complexity.
  - Identify the basis set of linearly independent paths.
  - Prepare test cases that will force execution of each path in the basis set.
  - Exercise: create a flow graph from the example.
- 29. Graph Matrices
  - Software tools exist that use a graph matrix to derive the flow graph and determine the set of basis paths.
  - A graph matrix is a square matrix whose size equals the number of nodes in the flow graph.
  - Each node is identified by a number and each edge by a letter.
  - Link weights can be added to capture other, more interesting properties (e.g., processing time, memory required).
- 30. Control Structure Testing
  - Basis path testing is not enough; we must broaden testing coverage and improve the quality of testing:
    - Condition Testing
    - Data Flow Testing
    - Loop Testing
- 31. Condition Testing -- I
  - Exercises the logical conditions in a program module.
  - A condition is a Boolean variable or a relational expression; compound conditions combine one or more conditions.
  - Detects errors in the conditions AND in the rest of the program: if a test set is effective for conditions, it is likely also effective for other errors.
- 32. Condition Testing -- II
  - Branch testing: test each True and False branch at least once.
  - Domain testing: 3 or 4 tests for a relational expression. Test for greater than, equal to, and less than, plus a test that makes the difference between the two values as small as possible (see the sketch below).
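  A hypothetical example of domain-testing values for the relational expression `a > b` (the function name and the epsilon-based "smallest difference" case are illustrative choices, not from the deck):

```python
# Illustrative domain tests for "a > b": exercise greater-than,
# equal, less-than, and a case where the difference between the
# two values is as small as floating point allows.
import sys

def domain_test_cases(b):
    eps = sys.float_info.epsilon
    return [
        (b + 1.0, b),          # a > b
        (b, b),                # a == b
        (b - 1.0, b),          # a < b
        (b * (1.0 + eps), b),  # a barely greater than b
    ]

for a, b in domain_test_cases(10.0):
    print(a, b, a > b)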
- 33. Data Flow Testing
  - Selects test paths according to the locations of definitions and uses of variables in the program.
  - Cannot be used for a large system, but can be targeted at suspect areas of the software.
  - Useful for selecting test paths containing nested if and loop statements.
- 34. Loop Testing
  - A white box technique that focuses on the validity of loop constructs.
  - Four different types of loops:
    - Simple loops
    - Nested loops
    - Concatenated loops
    - Unstructured loops (these should be redesigned to reflect structured constructs)
- 35. Loop Testing (diagrams of the four loop types: simple, nested, concatenated, unstructured)
- 36. Loop Testing: Simple Loops. Minimum conditions for simple loops, where n is the maximum number of allowable passes (a sketch generating these pass counts follows):
  1. Skip the loop entirely.
  2. Only one pass through the loop.
  3. Two passes through the loop.
  4. m passes through the loop, where m < n.
  5. (n-1), n, and (n+1) passes through the loop.
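  A minimal sketch (not from the deck) that turns the five conditions above into a list of pass counts a test harness could iterate over; `n` and `m` are the slide's parameters:

```python
# Sketch: the pass counts a simple loop should be exercised with,
# where n is the maximum number of allowable passes and m is some
# intermediate count with 2 < m < n.
def simple_loop_pass_counts(n, m):
    assert 2 < m < n
    return [0, 1, 2, m, n - 1, n, n + 1]

# e.g., a loop allowed at most 20 passes, with a typical count of 10:
print(simple_loop_pass_counts(20, 10))   # [0, 1, 2, 10, 19, 20, 21]
```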
- 37. Loop Testing: Nested and Concatenated Loops
  - Nested loops:
    1. Start at the innermost loop. Set all outer loops to their minimum values.
    2. Test the min+1, typical, max-1, and max values for the innermost loop while holding the outer loops at their minimum values.
    3. Move out one loop and set it up as in step 2, holding all other loops at typical values, until the outermost loop has been tested.
  - Concatenated loops: if the loops are independent of each other, treat them as simple loops; otherwise treat them as nested loops.
- 38. Black-Box Testing, also called behavioral testing (diagram: tests are derived from the requirements, driving the software with inputs and events and checking its outputs).
- 39. Black Box Testing
  - Does not replace white box testing; it is a complementary approach.
  - Focuses on the functional requirements of the software.
  - Tries to find the following types of errors:
    - incorrect or missing functions
    - interface errors
    - errors in data structures or database access
    - behavior or performance errors
    - initialization or termination errors
- 40. Black Box Testing
  - Done during the later stages of testing.
  - Tests are designed to answer the following questions:
    - How is functional validity tested?
    - How are system behavior and performance tested?
    - What classes of input will make good test cases?
    - Is the system sensitive to certain input values?
    - How are the boundaries of a data class isolated?
    - What data rates and data volume can the system tolerate?
    - What effect will specific combinations of data have on the system?
- 41. Equivalence Partitioning
  - A black box method that divides the input domain of a program into classes of data from which test cases can be derived.
  - Strive to design test cases that uncover whole classes of errors, reducing the total number of test cases that must be developed and run (e.g., incorrect processing of all character data).
- 42. Equivalence Partitioning (diagram: the input domain -- user queries, mouse picks, output formats, prompts, data -- partitioned to uncover classes of errors)
- 43. Sample Equivalence Classes
  - Valid data:
    - user-supplied commands
    - responses to system prompts
    - filenames
    - computational data (physical parameters, bounding values, initiation values)
    - output data formatting
    - responses to error messages
    - graphical data (e.g., mouse picks)
  - Invalid data:
    - data outside the bounds of the program
    - physically impossible data
    - a proper value supplied in the wrong place
- 44. Equivalence Class Definition Guidelines
  - Input condition specifies a range: one valid and two invalid classes are defined.
  - Input condition requires a specific value: one valid and two invalid classes are defined.
  - Input condition specifies a member of a set: one valid and one invalid class are defined.
  - Input condition is Boolean: one valid and one invalid class are defined.
  - E.g., a prefix is a 3-digit number not beginning with 0 or 1 (input condition: range, specified value > 200); input condition: value, 4-digit length. (See the sketch below.)
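  A small sketch of the prefix example above (the function name and class labels are illustrative, not from the deck); the range condition yields one valid and two invalid equivalence classes:

```python
# Sketch: equivalence classes for a prefix that must be a 3-digit
# number not beginning with 0 or 1, i.e., a value in 200..999.
def prefix_class(value):
    if 200 <= value <= 999:
        return "valid: in range 200..999"
    if value < 200:
        return "invalid: below the range"
    return "invalid: above the range"

# One representative input per equivalence class:
for v in (555, 123, 1234):
    print(v, "->", prefix_class(v))
```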
- 45. Boundary Value Analysis
  - More errors occur at the boundaries of the input domain.
  - BVA leads to the selection of test cases that exercise the boundaries.
  - Guidelines (see the sketch below):
    - Input in range a..b: select a, b, and values just above and just below a and b.
    - Input specifies a number of values: select the min and max, and values just above and below the min and max.
    - Apply the same guidelines to output conditions.
    - Boundaries on data structures (e.g., an array with 100 entries): test at the boundary.
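  A minimal sketch (not from the deck) generating boundary-value candidates for the range guideline; `step` stands in for "just above/just below" and would depend on the input's granularity:

```python
# Sketch: boundary-value candidates for an input that must lie in
# the range a..b: the bounds plus values just outside and just inside.
def boundary_values(a, b, step=1):
    return [a - step, a, a + step, b - step, b, b + step]

print(boundary_values(1, 100))   # [0, 1, 2, 99, 100, 101]
```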
- 46. Software Testing Strategies
- 47. Testing Goals - Review
  - The goal is to discover as many errors as possible with minimum effort and time.
  - Testing is a destructive activity, yet the people who constructed the software are now asked to test it.
    - They have a vested interest in showing the software is error-free, meets requirements, and will meet budget and schedule.
    - This works against thorough testing.
  - Therefore, should the developer do no testing? Should all testing be done independently, with testers getting involved only when the developers are finished with construction?
- 48. Testing Strategies...
  - In the past, the only defense against programming errors was careful design and the intelligence of the programmer.
  - Now we have modern design techniques and formal technical reviews to reduce the number of initial errors in the code.
  - In Chapter 17 we discussed how to design effective test cases; now we discuss the strategy we use to execute them.
  - The strategy is developed by the project manager, software engineers, and testing specialists. It may also be mandated by the customer.
- 49. Why is Testing Important?
  - Testing often accounts for more effort than any other software engineering activity.
  - If done haphazardly, we waste time and effort, and errors sneak through.
  - Therefore we need a systematic approach to testing software.
  - The work product is a Test Specification (Test Plan).
- 50. What is a Test Plan?
  - A road map describing the steps to be conducted.
  - Specifies when the steps are planned and then undertaken.
  - States how much effort, time, and resources will be required.
  - Must incorporate test planning, test case design, test execution, and data collection and evaluation.
  It should be flexible enough for customized testing, but rigid enough for planning and management tracking.
- 51. Strategic Issues
  - Specify requirements in a quantifiable manner so that each requirement can be tested.
  - State testing objectives explicitly.
  - Understand potential users and develop usage profiles.
  - Develop the testing plan quickly, in increments.
  - Build robust software with built-in error checking.
  - Use effective formal technical reviews (FTRs) to find errors early and save time and money.
  - Conduct FTRs on the tests and the test strategy themselves.
  - Develop continuous improvement: collect metrics.
- 52. Testing Strategy: unit test (component level), then integration test (integrate components), then validation test (requirements level), then system test (system elements tested as a whole).
- 53. Verification and Validation
  - Verification: ensure the software correctly implements a specified function. "Are we building the product right?"
  - Validation: ensure the software is traceable to the requirements. "Are we building the right product?"
  - An Independent Test Group (ITG) performs V&V and works closely with the developer to fix errors as they are found.
  - The ITG is involved from the beginning of the project through to the finish.
  - The ITG reports to an organization separate from the software development group.
- 54. Comparison of Testing Types: eliminate duplication of testing between different groups to save time and money.
- 55. Unit Testing (diagram: test cases are applied to the module under test, exercising its interface, local data structures, boundary conditions, independent paths (basis paths), and error handling paths)
- 56. Unit Test Environment (diagram: a driver feeds test cases to the module under test, whose subordinate modules are replaced by stubs; the same checks as slide 55 apply, and results are collected). Testing is simplified if the unit has only one function (high cohesion), since fewer test cases are needed.
- 57. Drivers and Stubs
  - Driver: a main program that accepts test case data, passes that data to the module, and prints the relevant results.
  - Stub: replaces modules that are subordinate to the unit under test; it uses the subordinate module's interface, may do minimal data manipulation, prints verification of entry, and returns control to the module undergoing testing.
  - Writing drivers and stubs is overhead.
  - Sometimes you can't adequately unit test with such simple overhead software; then you must wait until integration (where drivers and stubs may also be used). A minimal sketch follows.
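  A minimal sketch of a driver and a stub (all names are hypothetical, and passing the stub in as a parameter is purely for illustration; in practice the stub would stand in for a real subordinate module):

```python
# Sketch: a unit-test driver and a stub. The module under test
# normally calls fetch_rate() from a subordinate module; the stub
# supplies canned data and verifies it was entered.
def fetch_rate_stub(currency):
    print(f"stub: fetch_rate({currency!r}) called")   # verification of entry
    return 1.25                                       # minimal canned data

def convert(amount, currency, fetch_rate):            # module under test
    return amount * fetch_rate(currency)

def driver():
    # Driver: feeds test-case data to the module and prints results.
    for amount, expected in [(0, 0.0), (10, 12.5)]:
        result = convert(amount, "EUR", fetch_rate_stub)
        status = "PASS" if result == expected else "FAIL"
        print(f"convert({amount}) = {result}, expected {expected}: {status}")

driver()
```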
- 58. Types of Computation Errors
  - Misunderstood or incorrect arithmetic precedence
  - Mixed-mode operations
  - Incorrect initialization
  - Precision inaccuracy
  - Incorrect symbolic representation of an expression
- 59. Types of Control Flow Errors
  - Comparison of different data types
  - Incorrect logical operators or precedence
  - Expectation of equality when precision errors make it unlikely (see the snippet below)
  - Incorrect comparison of variables
  - Improper or nonexistent loop termination
  - Failure to exit
  - Improperly modified loop variables
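  A concrete instance of the precision pitfall in the list above:

```python
# Expecting exact equality where floating-point precision makes it
# unlikely: 0.1 and 0.2 are not exactly representable in binary.
print(0.1 + 0.2 == 0.3)                 # False

import math
print(math.isclose(0.1 + 0.2, 0.3))     # True: compare with a tolerance
```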
- 60. Error Handling Evaluation
  - Error conditions must be anticipated, and error handling must reroute or cleanly terminate processing ("antibugging").
  - Typical antibugging errors:
    - The error description is unintelligible.
    - The error noted doesn't match the error encountered.
    - The error condition causes system intervention.
    - Exception-condition processing is incorrect.
    - The error description doesn't provide enough information.
  Make sure error handling is tested!
- 61. Integration Testing Strategies
  - Options:
    - the "big bang" approach, OR
    - an incremental construction strategy:
      - top down
      - bottom up
      - sandwich
- 62. What is Integration Testing?
  - Take unit-tested components and build a program structure by joining the components, while testing to find errors associated with the interfaces between components.
  - Data can be lost across an interface, one module can have an inadvertent adverse effect on another, etc.
  - The program is constructed and tested in small increments: errors are easier to isolate, interfaces are more likely to be tested completely, and a systematic test approach is applied.
  - The software gains maturity as modules are integrated.
- 63. Top Down Integration (diagram: module A atop modules B through G)
  - The top module is tested with stubs for its subordinates (B, F, G in the diagram).
  - Stubs are replaced one at a time with real components, "depth first."
  - As new modules are integrated, some subset of tests is re-run (regression).
  What would be replaced next?
- 64. Bottom-Up Integration (diagram: modules A through G, with low-level clusters)
  - Worker modules are grouped into builds (clusters) that perform a specific subfunction and are integrated.
  - Drivers are removed and builds are combined, moving upward one at a time, "depth first."
- 65. Top-Down vs. Bottom-Up Integration
  - Top down:
    - Stubs replace low-level modules, which normally supply data.
    - This may therefore delay some testing (not good).
    - Simulating the actual module in the stub is high overhead.
    - Verifies major control early.
  - Bottom up:
    - First integrates the low-level modules that supply data.
    - The program doesn't exist until the last module is integrated.
    - Easier test case design.
    - Doesn't need stubs, but does need drivers.
- 66. Sandwich Testing (diagram: modules A through G): top modules are tested with stubs, while worker modules are grouped into builds (clusters) and integrated.
- 67. Critical Modules
  - Identify critical modules, target them for early testing, and focus regression testing on them. Critical modules:
    - address several software requirements
    - have a high level of control (sit high in the software structure)
    - are complex or error prone (high V(G))
    - have definite performance requirements
- 68. High Order Testing
  - Validation test: the Test Plan outlines the classes of tests to be performed; the Test Procedures contain the specific test cases.
    - After each test case runs, it either passes or produces a deviation, which is recorded as a Software Trouble Report (STR).
    - The resolution of STRs is monitored.
  - Alpha and beta testing: alpha at the developer's site, beta at the customer's site.
  - System test: tests to verify that the system elements have been properly integrated and perform the required functions.
    - This was performed as a System Level Acceptance Test (SLAT) at IBM.
- 69. Debugging: A Diagnostic Process
- 70. Debugging Process
  - Debugging is a consequence of testing.
  - The debugging effort combines the time required to diagnose the symptom and determine the cause of the error AND the time required to correct the error and conduct regression tests.
  - A regression test is a selective re-running of tests to assure that nothing has been broken when a fix or modification was implemented.
- 71. Symptoms and Causes
  - The symptom and the cause may be geographically remote from each other.
  - The symptom may disappear when another error is fixed.
  - The symptom may be caused by a non-error (e.g., round-off).
  - The symptom may be caused by human error, a compiler error, or faulty assumptions.
  - The symptom may be caused by timing problems.
  - The conditions may be hard to duplicate (real-time applications).
  - The symptom may be intermittent, as in embedded systems.
  - The symptom may be due to causes distributed across a number of tasks on different processors.
- 72. Consequences of Bugs: damage ranges from mild, annoying, and disturbing through serious, extreme, catastrophic, and infectious. Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.
- 73. Debugging Techniques
  - Brute force / testing
  - Backtracking
  - Cause elimination
- 74. Brute Force Debugging
  - "Let the computer find the error": memory dumps, run-time traces, WRITE statements all over the program.
  - The most common and least efficient method for isolating the cause of an error.
  - Wastes effort and time.
  - Think first!
- 75. Backtracking Debugging
  - Begin at the site where the symptom was uncovered and trace the source code backward manually until the cause is found.
  - As the number of lines of code increases, the number of backward paths becomes unmanageably large.
  - A fairly common debugging approach, successful for small programs.
- 76. Cause Elimination Debugging
  - Data related to the error occurrence is organized to isolate potential causes.
  - A "cause hypothesis" is devised, and the data are used to prove or disprove the hypothesis.
  - Alternatively, a list of all possible causes is developed and tests are run to eliminate each one.
- 77. Debugging: Final Thoughts
  - Think about the symptom you are seeing.
  - Use tools such as a dynamic debugger to gain more insight into the bug.
  - Get help from somebody else if you are stuck. Just talking to another person can help you see the cause of the bug.
  - Every time you touch existing code, you run the risk of injecting errors. Therefore ALWAYS run regression tests on all fixes.
  - Ask the following questions: Is the bug also in another part of the program? How can we prevent this kind of bug in the first place?
- 78. Webliography: Check the webliography for some interesting cases of software bugs, such as the Therac radiation bug, and for other information about testing.
