CS48717-1 /30 Illinois Institute of Technology


  1. 1. Illinois Institute of Technology <ul><li>CS487 </li></ul><ul><li>Software Engineering </li></ul><ul><li>Testing - Part II </li></ul><ul><li>Software Testing Strategies </li></ul><ul><li>Mr. David A. Lash </li></ul>© Carl J. Mueller, 2000
  2. 2. Why a Software Testing Strategy? <ul><li>Some test strategy issues: </li></ul><ul><ul><li>Should we develop a formal test plan? </li></ul></ul><ul><ul><li>Should we test the entire program as a whole or in pieces? </li></ul></ul><ul><ul><li>When should the customer be included? </li></ul></ul><ul><li>Testing often accounts for more effort than development </li></ul><ul><li>The strategy is usually defined in an overall testing strategy and procedure document (often by an independent test group) </li></ul>
  3. 3. Elements Of A Software Test Strategy <ul><li>Unit Testing - tests the implemented source code of a component. Uses White Box techniques. </li></ul><ul><li>Integration Testing - focuses on the design, testing, and integration of components. Black Box and limited White Box. </li></ul><ul><li>Validation Testing - examines the requirements and validates the software against them. Black Box testing. </li></ul><ul><li>System Testing - tests the software and the other system elements as a whole (other hardware, people, databases, processes). </li></ul>
  4. 4. Unit Testing <ul><li>Unit Testing - module test </li></ul><ul><ul><li>Interfaces are tested to ensure information flows into and out of the program correctly. </li></ul></ul><ul><ul><li>Data structures are tested. </li></ul></ul><ul><ul><li>Independent paths (basis paths) are exercised. </li></ul></ul><ul><ul><li>Error-handling paths - a mistake to miss these. </li></ul></ul><ul><li>Use drivers (accept test data and pass it to the module) and stubs (replace the modules the unit calls) </li></ul>
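A minimal sketch of a unit test with a driver and stubs, using Python's `unittest`. The module under test (`gross_price`) and its tax-lookup collaborator are invented for illustration; they are not from the lecture.

```python
import unittest

# Hypothetical module under test: it depends on a collaborating
# tax-lookup module, passed in as a callable.
def gross_price(net, tax_lookup):
    """Adds tax obtained from a collaborating module."""
    return round(net * (1 + tax_lookup(net)), 2)

class GrossPriceDriver(unittest.TestCase):
    """The test case acts as the 'driver': it supplies test data to the
    module and checks the results. Each stub below stands in for the
    real tax-lookup module."""

    def test_interface_and_basis_path(self):
        stub = lambda net: 0.10                 # stub: canned 10% tax
        self.assertEqual(gross_price(100.0, stub), 110.0)

    def test_error_handling_path(self):
        def failing_stub(net):                  # stub simulating a failure
            raise ValueError("tax lookup unavailable")
        with self.assertRaises(ValueError):
            gross_price(100.0, failing_stub)

if __name__ == "__main__":
    unittest.main(exit=False)   # run the driver without exiting the process
```

Note how the error-handling path gets its own test, per the slide's warning that these paths are a mistake to miss.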
  5. 5. Integration <ul><li>A systematic technique to uncover problems while putting the modules together. </li></ul><ul><li>Composed of two activities: </li></ul><ul><ul><li>Building </li></ul></ul><ul><ul><ul><li>Compile and link all of the modules together </li></ul></ul></ul><ul><ul><li>Testing </li></ul></ul><ul><ul><ul><li>Verify that interfaces function as specified </li></ul></ul></ul><ul><li>Avoid the “big bang” approach, a.k.a. non-incremental integration </li></ul><ul><ul><li>It can yield chaotic results. </li></ul></ul>
  6. 6. Implementation Process
  7. 7. <ul><li>Top-down (depth-first or breadth-first) </li></ul><ul><ul><li>Test higher-level modules first </li></ul></ul><ul><ul><li>Requires stubs (which must do more than just return) </li></ul></ul><ul><li>Bottom-up </li></ul><ul><ul><li>Test lower-level modules first </li></ul></ul><ul><ul><li>Requires drivers (which require logic) </li></ul></ul><ul><li>Sandwich </li></ul><ul><ul><li>Test the highest-level modules together with the lowest-level ones, meeting in the middle </li></ul></ul><ul><ul><li>Uses both stubs and drivers </li></ul></ul>Approaches to Integration Testing
  8. 8. Example System
  9. 9. Top Down (depth) Example <ul><li>Depth First </li></ul><ul><ul><li>Integrate M1; use stubs for M2, M3, and M4. Test all M1 interfaces. </li></ul></ul><ul><ul><li>Next, test M2 and use operational stubs for M3, M4, M5, M6. Test all M2 interfaces. </li></ul></ul><ul><ul><li>Repeat for M5, then M8, M6, and M3... </li></ul></ul>
  10. 10. Top Down (depth) Example <ul><li>Integrate M1; use stubs for M2, M3, and M4. Test all M1 interfaces. </li></ul><ul><li>Next, test M2 and use operational stubs for M3, M4, M5, M6. Test all M2 interfaces. </li></ul><ul><li>Replace stub M5 with module M5. Use operational stubs for M3, M4, M6, M8. Test all the interfaces in M5. </li></ul><ul><li>Repeat for M8, M6, and M3... </li></ul>
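The depth-first swap-in can be sketched in a few lines. Here M1 and M5 are plain functions standing in for the slide's modules; their bodies (and the parameter used to swap the stub) are invented for illustration.

```python
# Hypothetical stand-ins for the slide's modules.
def m5_stub(x):
    return 42                    # canned answer in place of M5

def m5(x):
    return x * 2                 # the "real" module, integrated later

def m1(x, lower=m5_stub):
    # M1 reaches its lower-level module through 'lower', so the stub
    # can be replaced by the real M5 without editing M1 itself.
    return lower(x) + 1

# Step 1: test M1's interface against the stub.
assert m1(3) == 43
# Step 2: replace the stub with the real M5 and retest the interface.
assert m1(3, lower=m5) == 7
```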
  11. 11. Top Down (breadth) Example <ul><li>Breadth-First Order </li></ul><ul><ul><li>M1, M2, M3, M4, </li></ul></ul><ul><ul><li>The next level would be M5, M6, ... </li></ul></ul><ul><li>Top-down testing can have problems when low-level modules are needed early (delay full tests, develop better stubs, or go bottom-up) </li></ul>
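The breadth-first order is just a level-by-level walk of the module hierarchy. The tree below is an assumption reconstructed from the depth-first order on the earlier slides (M1 over M2/M3/M4, M5 and M6 under M2, M8 under M5, M7 under M3), not a figure taken from the deck.

```python
from collections import deque

# Assumed module hierarchy for the example system.
tree = {"M1": ["M2", "M3", "M4"],
        "M2": ["M5", "M6"], "M3": ["M7"], "M4": [],
        "M5": ["M8"], "M6": [], "M7": [], "M8": []}

def breadth_first_order(root):
    """Return the integration order: each level before the next."""
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node)
        queue.extend(tree[node])
    return order

print(breadth_first_order("M1"))
# ['M1', 'M2', 'M3', 'M4', 'M5', 'M6', 'M7', 'M8']
```

This matches the slide: M1 through M4 first, then M5, M6, and so on.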
  12. 12. Bottom Up <ul><li>Integrate the lower-level items first, using drivers: </li></ul><ul><ul><li>Using an M8 driver, test module M8 </li></ul></ul><ul><ul><li>Using an M7 driver, test module M7 </li></ul></ul><ul><ul><li>Using an M6 driver, test module M6 </li></ul></ul><ul><ul><li>Using an M5 driver, test module M5 </li></ul></ul>
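A driver, unlike a stub, is a small throwaway program with real logic: it prepares inputs, calls the module under test, and checks the results. A sketch for one leaf module, where `m8` and its mean-of-inputs body are invented stand-ins:

```python
# Hypothetical leaf module under test.
def m8(values):
    return sum(values) / len(values)   # invented body: mean of inputs

def m8_driver():
    # The driver holds the test logic: prepared cases, call, check.
    cases = [([2, 4, 6], 4.0), ([5], 5.0)]
    for inputs, expected in cases:
        assert m8(inputs) == expected
    return "all M8 driver cases passed"

print(m8_driver())
```

Once M8 passes, its driver is discarded and M8 becomes available to the M5 driver's tests, and so on up the tree.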
  13. 13. Bottom Up Example II <ul><li>In practice, the lower-level items are clustered together into sub-functions (a.k.a. builds). </li></ul>
  14. 14. Sandwich Example <ul><li>First Round </li></ul><ul><ul><li>Top-down </li></ul></ul><ul><ul><ul><li>Use operational stubs for M2, M3, M4 </li></ul></ul></ul><ul><ul><ul><li>Test all the interfaces in M1 </li></ul></ul></ul><ul><ul><li>Bottom-up </li></ul></ul><ul><ul><ul><li>Using an M8 driver, test module M8 </li></ul></ul></ul><ul><ul><ul><li>Using an M7 driver, test module M7 </li></ul></ul></ul><ul><ul><ul><li>Using an M6 driver, test module M6 </li></ul></ul></ul>
  15. 15. Regression Testing <ul><li>The re-execution of some subset of tests to ensure that changes have not propagated errors. </li></ul><ul><li>Conducted whenever previously tested software is delivered again: </li></ul><ul><ul><li>Continued testing after a repair </li></ul></ul><ul><ul><li>A new version </li></ul></ul><ul><li>Partial testing of old features when testing a maintenance release. </li></ul><ul><li>Might include tests of features/functions that were unchanged during the current release. </li></ul><ul><li>Capture/playback tools enable re-execution of previously run tests. </li></ul>
  16. 16. Verification <ul><li>(1) The process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase </li></ul><ul><li>(2) Formal proof of program correctness </li></ul><ul><li>(3) The act of reviewing, inspecting, testing, checking, auditing, or otherwise establishing and documenting whether or not items, processes, services and documents conform to specified requirements </li></ul>
  17. 17. Functional Verification <ul><li>Verification of the Functional Design Specification </li></ul><ul><li>Should not be done by the developers; normally done by the user or a testing group </li></ul><ul><li>Can be combined with integration testing </li></ul>
  18. 18. Validation <ul><li>Are we building the right product? </li></ul><ul><li>Does it meet the requirements? </li></ul>
  19. 19. Validation Phase
  20. 20. Validation <ul><li>The process of evaluating software at the end of the software development process to ensure compliance with the software requirements. </li></ul><ul><li>Test plans define the classes of tests to be completed </li></ul>
  21. 21. Validation Activities
  22. 22. Validation Process
  23. 23. System Tests
  24. 24. System Tests
  25. 25. System Tests
  26. 26. System Tests
  27. 27. Rules for High-Level Tests <ul><li>1. Tests are run on the operational production hardware and software. </li></ul><ul><li>2. Tests must stress the system significantly. </li></ul><ul><li>3. All interfaced systems must be in operation throughout the test. </li></ul><ul><li>4. The test duration should cover a full cycle. </li></ul><ul><li>5. Tests should exercise the system over a full range of inputs, both legal and illegal. </li></ul><ul><li>6. All major functions and interfaces should be exercised. </li></ul><ul><li>7. If the entire test cannot be completed in one run, the new run should begin with a complete restart. </li></ul>
  28. 28. Acceptance Testing <ul><li>Depends on the type of product and the situation. </li></ul><ul><li>When working with a non-technical user, the following are recommended: </li></ul><ul><ul><li>First, train the user on the system with a number of test cases. </li></ul></ul><ul><ul><li>Use a comparison test. </li></ul></ul><ul><ul><ul><li>Re-processing </li></ul></ul></ul><ul><ul><ul><li>Parallel to the existing system </li></ul></ul></ul>
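A comparison test run in parallel with the existing system amounts to feeding both systems the same inputs and diffing the outputs. Both "systems" below are invented one-line stand-ins for illustration.

```python
# Hypothetical old and new systems computing an order total.
def old_system(order):
    return order["qty"] * order["price"]

def new_system(order):
    return order["price"] * order["qty"]    # the replacement under test

def parallel_run(orders):
    """Process each order through both systems; return any mismatches."""
    return [o for o in orders if old_system(o) != new_system(o)]

orders = [{"qty": 2, "price": 9.99}, {"qty": 5, "price": 1.50}]
print(parallel_run(orders))   # an empty list means the outputs agree
```

The same structure covers the "re-processing" variant: replay historical inputs already processed by the old system and compare against its recorded outputs.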
  29. 29. Acceptance Testing <ul><li>Product Development </li></ul><ul><ul><li>Alpha testing </li></ul></ul><ul><ul><ul><li>Performed by selected end-users, usually inside the development environment. </li></ul></ul></ul><ul><ul><li>Beta testing </li></ul></ul><ul><ul><ul><li>Performed by a subset of actual customers outside the company, before the software is made generally available. </li></ul></ul></ul><ul><ul><li>Field Trial </li></ul></ul><ul><ul><ul><li>A large number of users, but not a general release. </li></ul></ul></ul>
  30. 30. Acceptance Testing <ul><li>Work for Hire </li></ul><ul><ul><li>Training Program </li></ul></ul><ul><ul><ul><li>Evaluate based on user comments and the users' general reaction to the system. </li></ul></ul></ul><ul><ul><ul><li>Usability test </li></ul></ul></ul><ul><ul><li>Parallel Operation </li></ul></ul><ul><ul><ul><li>Comparison testing </li></ul></ul></ul><ul><ul><ul><li>The new software is used after the old system's process. </li></ul></ul></ul><ul><ul><li>Controlled deployment </li></ul></ul><ul><ul><ul><li>A representative monitors the software at the customer site. </li></ul></ul></ul>