Testing


    1. 1. SOFTWARE TESTING SONALI CHAUHAN SYBSC IT UDIT Software Engineering: A Practitioner's Approach - R.S. Pressman & Associates, Inc.
    2. 2. INTRODUCTION <ul><li>It is a critical element of s/w quality assurance. </li></ul><ul><li>S/w must be tested to uncover as many errors as possible before delivery to the customer. </li></ul><ul><li>This is where s/w testing techniques come into the picture. </li></ul>
    3. 3. <ul><li>Techniques provide systematic guidance for designing tests that: </li></ul><ul><ul><li>Exercise the internal logic of software components </li></ul></ul><ul><ul><li>Exercise the input and output domains of the program to uncover errors in program function, behavior and performance </li></ul></ul><ul><li>S/w is tested from 2 perspectives: </li></ul><ul><ul><li>The internal logic of software components is exercised using “ WHITEBOX ” test case techniques </li></ul></ul><ul><ul><li>S/w requirements are exercised using “ BLACKBOX ” test case techniques </li></ul></ul>CONT…
    4. 4. <ul><li>In both cases, the intent is to find the maximum number of errors with the minimum amount of effort and time. </li></ul>CONT…
    5. 5. WHO TESTS THE SOFTWARE??? <ul><li>Developer: understands the system, but will test “gently” and is driven by “delivery” </li></ul><ul><li>Independent tester: must learn about the system, will attempt to break it, and is driven by quality </li></ul>
    6. 6. TESTABILITY FEATURES <ul><li>OPERABILITY </li></ul><ul><li>OBSERVABILITY </li></ul><ul><li>CONTROLLABILITY </li></ul><ul><li>DECOMPOSABILITY </li></ul><ul><li>SIMPLICITY </li></ul><ul><li>STABILITY </li></ul><ul><li>UNDERSTANDABILITY </li></ul>
    7. 7. TESTING OBJECTIVES <ul><li>Testing is a process of executing a program with the intent of finding an error. </li></ul><ul><li>A good test case is one that has a high probability of finding an as-yet-undiscovered error. </li></ul><ul><li>A successful test is one that uncovers an as-yet-undiscovered error. </li></ul><ul><li>Our objective is to design tests that systematically uncover different classes of errors, and to do so in a minimum amount of time and effort. </li></ul><ul><li>If testing is conducted successfully, it will uncover errors in the s/w. </li></ul>
    8. 8. PRINCIPLES OF TESTING <ul><li>Testability </li></ul><ul><ul><li>All tests should be traceable to customer requirements. </li></ul></ul><ul><li>Planning </li></ul><ul><ul><li>Tests should be planned long before testing begins. </li></ul></ul><ul><li>Execution </li></ul><ul><ul><li>Testing should begin “in the small” and progress toward testing “in the large”. </li></ul></ul>
    9. 9. <ul><li>Pareto </li></ul><ul><ul><li>The Pareto principle implies that 80% of all errors uncovered during testing will likely be traceable to 20% of all program components. </li></ul></ul><ul><li>Exhaustive testing </li></ul><ul><ul><li>Exhaustive testing is not possible. </li></ul></ul><ul><li>Moderation </li></ul><ul><ul><li>To be most effective, testing should be conducted by an independent third party. </li></ul></ul>CONT…
    10. 10. TEST CASE DESIGN <ul><li>Designing tests is difficult </li></ul><ul><li>Recall: tests should find the maximum number of errors with minimum effort and time </li></ul>
    11. 11. <ul><li>Whitebox testing: </li></ul><ul><ul><li>Knowing the internal workings of the product, tests can be conducted to ensure that “all gears mesh”, i.e., internal operations are adequately exercised. </li></ul></ul><ul><li>Blackbox testing </li></ul><ul><ul><li>Knowing the specified functions that a product has been designed to perform, tests can be conducted that demonstrate each function </li></ul></ul>CONT…
    12. 12. WHITEBOX TESTING <ul><li>Close examination of procedural details. </li></ul><ul><li>Goal is to ensure that all conditions and statements have been executed at least once, and all loops exercised. </li></ul><ul><li>It is also known as ‘‘ GLASS BOX TESTING ’’ </li></ul>
    13. 13. <ul><li>Software engineers can derive test cases that: </li></ul><ul><ul><li>Guarantee that all independent paths within a module have been exercised at least once </li></ul></ul><ul><ul><li>Exercise all logical decisions on their true and false sides </li></ul></ul><ul><ul><li>Execute all loops at their boundaries and within their operational bounds </li></ul></ul><ul><ul><li>Exercise internal data structures to ensure their validity </li></ul></ul>CONT…
    14. 14. WHY WHITEBOX TESTING? <ul><li>logic errors and incorrect assumptions are inversely proportional to a path's execution probability. </li></ul><ul><li>we often believe that a logical path is not likely to be executed;  in fact, it may be executed on a regular basis. </li></ul><ul><li>Typographical errors are random;  it's likely that untested paths will contain some </li></ul>
    15. 15. I. BASIS PATH TESTING <ul><li>Basis path testing is a whitebox testing technique </li></ul><ul><li>It enables the test case designer to derive a logical complexity measure of a procedural design </li></ul><ul><li>It involves the following: </li></ul><ul><ul><li>Flow graph notation </li></ul></ul><ul><ul><li>Cyclomatic complexity </li></ul></ul><ul><ul><li>Deriving test cases </li></ul></ul>CONT…
    16. 16. 1.Flow Graph Notation <ul><li>It depicts the logical control flow of a program </li></ul>
    17. 17. 2.CYCLOMATIC COMPLEXITY <ul><li>The number of regions of the flow graph corresponds to the cyclomatic complexity </li></ul><ul><li>Cyclomatic complexity V(G) = E - N + 2 </li></ul><ul><ul><li>E = number of flow graph edges </li></ul></ul><ul><ul><li>N = number of flow graph nodes </li></ul></ul><ul><li>V(G) = P + 1 </li></ul><ul><ul><li>P = number of predicate nodes </li></ul></ul>
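The two formulas above can be sketched in code. The following is a minimal illustration (not from the slides; the function name and edge-list representation are my own): given a flow graph as a list of directed edges, it computes V(G) = E - N + 2.

```python
# Illustrative sketch: cyclomatic complexity V(G) = E - N + 2
# for a flow graph given as a list of directed edges (src, dst).

def cyclomatic_complexity(edges):
    """E = number of edges, N = number of distinct nodes."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# Flow graph of a single if/else:
# 1 -> 2 (true branch), 1 -> 3 (false branch), 2 -> 4, 3 -> 4
edges = [(1, 2), (1, 3), (2, 4), (3, 4)]
print(cyclomatic_complexity(edges))  # E=4, N=4, so V(G) = 4 - 4 + 2 = 2
```

The same graph has one predicate node (node 1), so V(G) = P + 1 = 2 agrees, as expected.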
    18. 18. 3.DERIVING TEST CASES <ul><li>Draw the flow graph </li></ul><ul><li>Determine the cyclomatic complexity </li></ul><ul><li>Determine a basis set of linearly independent paths </li></ul><ul><li>Prepare test cases that will force execution of each path in the basis set. </li></ul>
    19. 19. II. GRAPH MATRICES <ul><li>A graph matrix is a square matrix whose size (i.e., number of rows and columns) is equal to the number of nodes on a flow graph </li></ul><ul><li>Each row and column corresponds to an identified node, and matrix entries correspond to connections (an edge) between nodes. </li></ul><ul><li>By adding a link weight to each matrix entry, the graph matrix can become a powerful tool for evaluating program control structure during testing </li></ul>
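A small sketch of the graph matrix described above (illustrative only; the function name and the use of link weight 1 are my own choices): each entry (i, j) holds the link weight of the edge from node i to node j, here simply 1 when a connection exists.

```python
# Illustrative graph matrix: a square matrix sized by the number of nodes,
# with entry (src, dst) = 1 when an edge connects the two nodes.

def graph_matrix(num_nodes, edges):
    matrix = [[0] * num_nodes for _ in range(num_nodes)]
    for src, dst in edges:
        matrix[src][dst] = 1  # link weight 1 = "a connection exists"
    return matrix

# With link weight 1, any row with two or more 1s is a predicate node
# (a node with more than one outgoing edge), so V(G) = P + 1 falls out.
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
m = graph_matrix(4, edges)
predicates = sum(1 for row in m if sum(row) > 1)
print(predicates + 1)  # one predicate node (node 0), so V(G) = 2
```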
    20. 20. III. CONTROL STRUCTURE TESTING <ul><li>Condition testing — a test case design method that exercises the logical conditions contained in a program module </li></ul><ul><li>Data flow testing — selects test paths of a program according to the locations of definitions and uses of variables in the program </li></ul>
    21. 21. IV. LOOP TESTING Nested Loops Concatenated Loops Unstructured Loops Simple loop
    22. 22. LOOP TESTING:SIMPLE LOOPS <ul><li>Minimum conditions—Simple Loops </li></ul><ul><ul><li>skip the loop entirely </li></ul></ul><ul><ul><li>only one pass through the loop </li></ul></ul><ul><ul><li>two passes through the loop </li></ul></ul><ul><ul><li>m passes through the loop m < n </li></ul></ul><ul><ul><li>(n-1), n, and (n+1) passes through the loop </li></ul></ul><ul><ul><li>where n is the maximum number of allowable passes </li></ul></ul>
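The schedule of passes above can be generated mechanically. A minimal sketch (the function name and the constraint check are my own, not from the slides): given the maximum allowable passes n and a typical count m with m < n, produce the distinct pass counts to test.

```python
# Illustrative sketch of the simple-loop test schedule:
# 0, 1, 2, m (typical, m < n), n-1, n, n+1 passes through the loop.

def simple_loop_pass_counts(n, m):
    assert 2 < m < n - 1, "m should be a typical count strictly between 2 and n-1"
    counts = [0, 1, 2, m, n - 1, n, n + 1]
    return sorted(set(counts))  # dedupe in case m coincides with another count

print(simple_loop_pass_counts(10, 5))  # [0, 1, 2, 5, 9, 10, 11]
```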
    23. 23. LOOP TESTING:NESTED LOOPS <ul><li>Nested Loop: </li></ul><ul><ul><li>Start at the innermost loop. Set all outer loops to their minimum iteration parameter values. </li></ul></ul><ul><ul><li>Test the min+1, typical, max-1 and max for the innermost loop, while holding the outer loops at their minimum values. </li></ul></ul><ul><ul><li>Move out one loop and set it up as in step 2, holding all other loops at typical values. Continue this step until the outermost loop has been tested . </li></ul></ul>
    24. 24. <ul><li>Concatenated Loops </li></ul><ul><ul><li>If the loops are independent of one another </li></ul></ul><ul><ul><ul><li>then treat each as a simple loop </li></ul></ul></ul><ul><ul><li> else* treat as nested loops </li></ul></ul><ul><li>endif* </li></ul>
    25. 25. BLACK BOX TESTING <ul><li>Also called Behavioral Testing </li></ul><ul><li>Focuses on the functional requirements of the S/W </li></ul><ul><li>It is complementary to whitebox testing and uncovers a different class of errors </li></ul>
    26. 26. <ul><li>Black-box testing attempts to find errors in the following categories: </li></ul><ul><ul><li>Incorrect or missing functions </li></ul></ul><ul><ul><li>Interface errors </li></ul></ul><ul><ul><li>Errors in data structures or external database access </li></ul></ul><ul><ul><li>Behavior or performance errors </li></ul></ul><ul><ul><li>Initialization and termination errors </li></ul></ul>CONT…
    27. 27. <ul><li>Black-box tests are designed to answer: </li></ul><ul><ul><li>How is functional validity tested? </li></ul></ul><ul><ul><li>How are system behavior and performance tested? </li></ul></ul><ul><ul><li>What classes of input will make good test cases? </li></ul></ul><ul><ul><li>Is the system particularly sensitive to certain input values? </li></ul></ul><ul><ul><li>How are the boundaries of a data class isolated? </li></ul></ul><ul><ul><li>What data rates and data volume can the system tolerate? </li></ul></ul><ul><ul><li>What effect will specific combinations of data have on system operation? </li></ul></ul>CONT…
    28. 28. GRAPH BASED TESTING METHODS <ul><li>To understand the objects that are modeled in software and the relationships that connect them </li></ul><ul><li>Software testing begins by creating a graph of objects and the relations among them to uncover errors. </li></ul><ul><li>A graph is a collection of nodes and links. </li></ul>CONT…
    29. 29. CONT…
    30. 30. EQUIVALENCE PARTITIONING <ul><li>Focuses on defining test cases from input conditions </li></ul><ul><ul><li>If an input condition specifies a range, 1 valid and 2 invalid equivalence classes are defined </li></ul></ul><ul><ul><li>If an input condition specifies a value, 1 valid and 2 invalid equivalence classes are defined </li></ul></ul><ul><ul><li>If an input condition specifies a member of a set, 1 valid and 1 invalid equivalence class are defined </li></ul></ul><ul><ul><li>If an input condition is Boolean, 1 valid and 1 invalid class are defined </li></ul></ul>
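The range rule above can be made concrete. A minimal sketch (the "age between 18 and 60" condition, the function name, and the midpoint choice of representative are hypothetical examples, not from the slides): a range condition yields one valid class and two invalid classes, and one representative value is drawn from each.

```python
# Illustrative equivalence partitioning for a range condition lo..hi:
# 1 valid class (inside the range) and 2 invalid classes (below, above),
# with one representative test value per class.

def representative_cases(lo, hi):
    return {
        "valid (inside range)": (lo + hi) // 2,
        "invalid (below range)": lo - 1,
        "invalid (above range)": hi + 1,
    }

# e.g., an input condition "age must be between 18 and 60"
print(representative_cases(18, 60))
```

Any single value from a class is assumed to stand for the whole class, which is what makes the technique economical.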
    31. 31. BOUNDARY VALUE ANALYSIS <ul><li>It selects test cases at the edges of a class </li></ul><ul><li>It focuses on the input as well as the output domain </li></ul><ul><li>Guidelines for BVA are: </li></ul>
    32. 32. <ul><li>If an input condition specifies a range bounded by values a and b, test cases should be designed with values a and b, and just above and just below a and b. </li></ul><ul><li>If an input condition specifies a number of values, test cases should exercise the minimum and maximum numbers, and values just above and below them </li></ul><ul><li>Apply guidelines 1 and 2 to the output domain </li></ul><ul><li>If internal program data structures have prescribed boundaries, design test cases to exercise the data structure at its boundaries </li></ul>CONT…
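Guideline 1 can be sketched directly (illustrative only; the function name and the `step` parameter are my own): for a range bounded by a and b, generate the boundary values and their immediate neighbors.

```python
# Illustrative boundary value analysis for a range a..b:
# test at each boundary and just above/below it.

def bva_cases(a, b, step=1):
    return [a - step, a, a + step, b - step, b, b + step]

# e.g., an input condition "quantity must be between 1 and 100"
print(bva_cases(1, 100))  # [0, 1, 2, 99, 100, 101]
```

Note that the two cases just outside the range (0 and 101 here) double as the invalid equivalence classes from the previous slide, so the techniques are often applied together.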
    33. 33. COMPARISON TESTING <ul><li>Used only in situations in which the reliability of software is absolutely critical (e.g., human-rated systems) </li></ul><ul><ul><li>Separate software engineering teams develop independent versions of an application using the same specification </li></ul></ul><ul><ul><li>Each version can be tested with the same test data to ensure that all provide identical output </li></ul></ul><ul><ul><li>Then all versions are executed in parallel with real-time comparison of results to ensure consistency </li></ul></ul>
    34. 34. ORTHOGONAL ARRAY TESTING <ul><li>Used when the number of input parameters is small and the values that each of the parameters may take are clearly bounded </li></ul>
    35. 35. TESTING STRATEGY <ul><li>We begin by ‘ testing-in-the-small ’ and move toward ‘ testing-in-the-large ’ </li></ul><ul><li>For conventional software </li></ul><ul><ul><li>The module (component) is our initial focus </li></ul></ul><ul><ul><li>Integration of modules follows </li></ul></ul><ul><li>For OO software </li></ul><ul><ul><li>our focus when “testing in the small” changes from an individual module (the conventional view) to an OO class that encompasses attributes and operations and implies communication and collaboration </li></ul></ul>
    36. 36. TESTING STRATEGY unit test integration test validation test system test CONT…
    37. 37. VALIDATION AND VERIFICATION <ul><li>Software testing is one element of a broader domain known as verification and validation (V&V). </li></ul><ul><li>Verification refers to the set of activities that ensure the software correctly implements a specific function. </li></ul><ul><li>Validation refers to a different set of activities that ensure the software that has been built is traceable to customer requirements. </li></ul>
    38. 38. UNIT TESTING module to be tested test cases results software engineer
    39. 39. UNIT TESTING <ul><li>Unit testing concentrates verification on the smallest unit of the program. </li></ul><ul><li>Control paths are tested to uncover errors within the boundary of the module. </li></ul><ul><li>It is whitebox oriented. </li></ul>CONT…
    40. 40. UNIT TESTING CONT… … … MODULE … … Interface Local Data Structure Boundary Condition Independent Paths Error Handling Paths TEST CASES
    41. 41. UNIT TESTING <ul><li>The module interface is tested to ensure that information properly flows into and out of the program unit being tested. </li></ul><ul><li>Local data structures are tested to ensure that data stored temporarily maintains its integrity during all steps in an algorithm’s execution. </li></ul><ul><li>Boundary conditions are tested to ensure that the module operates properly at boundaries established to limit or restrict processing. </li></ul><ul><li>All independent paths through the control structure are exercised to ensure that all statements in the module have been executed at least once. </li></ul><ul><li>Finally, all error-handling paths are tested. </li></ul>CONT…
    42. 42. UNIT TESTING-Environment CONT… Driver Modules Stubs Stubs Result Interface Local Data Structure Boundary Condition Independent Paths Error Handling Paths TEST CASES
    43. 43. UNIT TESTING-Procedure <ul><li>Once source code has been produced, reviewed, and verified for correct syntax, unit test case design can begin. </li></ul><ul><li>As a module is not a stand-alone program, driver and/or stub software must be produced for each unit test. </li></ul><ul><li>A driver is nothing more than a “main program” that accepts test case data, passes such data to the component (to be tested), and prints the relevant results. </li></ul><ul><li>A stub is a “dummy subprogram” that uses the subordinate module’s interface, may do minimal data manipulation, prints verification of entry, and returns control to the module undergoing testing. Stubs replace modules subordinate to the one being tested. </li></ul>CONT…
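The driver/stub arrangement above can be sketched as follows. Everything here is hypothetical: `compute_discount` stands in for the module under test, `fetch_price_stub` replaces its subordinate pricing module, and `driver` plays the "main program" role.

```python
# Hypothetical unit-test environment: a driver feeds test cases to the
# module under test, and a stub replaces its subordinate module.

def fetch_price_stub(item_id):
    """Stub: 'dummy subprogram' replacing the real pricing module."""
    print(f"stub entered with item_id={item_id}")  # verification of entry
    return 100.0                                   # canned, minimal data

def compute_discount(item_id, rate, fetch_price):
    """Module under test: applies a discount rate to a fetched price."""
    return fetch_price(item_id) * (1 - rate)

def driver():
    """Driver: accepts test case data, invokes the module, prints results."""
    for item_id, rate, expected in [(1, 0.1, 90.0), (2, 0.0, 100.0)]:
        result = compute_discount(item_id, rate, fetch_price_stub)
        print(f"case {item_id}: got {result}, expected {expected}")

driver()
```

In practice, passing the subordinate as a parameter (or patching it) is what lets the stub be swapped in without changing the module's source.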
    44. 44. INTEGRATION TESTING <ul><li>Once all the modules have been unit tested, integration testing is performed. </li></ul><ul><li>The next problem is interfacing. </li></ul><ul><li>Data may be lost across an interface, one module may have an inadvertent adverse impact on another, and a function may not perform correctly when combined with others. </li></ul><ul><li>Integration testing is a systematic technique. </li></ul><ul><li>It produces tests to identify errors associated with interfacing. </li></ul>
    45. 45. INTEGRATION TESTING <ul><li>The “big bang” approach: </li></ul><ul><li>Testing an entire program as a whole creates problems. </li></ul><ul><li>An incremental construction strategy is preferred. </li></ul>CONT…
    46. 46. INTEGRATION TESTING- Top Down Testing <ul><li>Top-down integration is an incremental approach. </li></ul><ul><li>Modules are integrated by moving downward through the control hierarchy, starting with the main control module. </li></ul><ul><li>Modules subordinate to the main control module are incorporated into the structure in either a depth-first or breadth-first manner. </li></ul>CONT…
    47. 47. INTEGRATION TESTING- Top Down Testing <ul><li>For instance, in depth-first integration, selecting the left-hand path, modules M1, M2 and M5 would be integrated first. </li></ul><ul><li>Next, M8 or M6 would be integrated. </li></ul><ul><li>Then the central and right-hand control paths are built. </li></ul><ul><li>Breadth-first integration incorporates all modules directly subordinate at each level, moving across the structure horizontally. </li></ul><ul><li>From the figure, modules M2, M3 and M4 would be integrated first. </li></ul><ul><li>The next control level, M5, M6, etc., follows. </li></ul>CONT…
    48. 48. INTEGRATION TESTING- Top Down Testing CONT… M1 M7 M8 M9 M6 M5 M3 M4 M2 The top module is tested with stubs; stubs are replaced one at a time, “depth first”; as new modules are integrated, some subset of tests is re-run
    49. 49. INTEGRATION TESTING- Top Down Testing <ul><li>The integration process is performed in a series of five steps: </li></ul><ul><li>The main control module is used as a test driver and stubs are substituted for all modules directly subordinate to the main control module. </li></ul><ul><li>Depending on the integration approach chosen, subordinate stubs are replaced one at a time with actual modules. </li></ul><ul><li>Tests are conducted as each module is integrated. </li></ul><ul><li>On completion of each set of tests, another stub is replaced with the real module. </li></ul><ul><li>Regression testing may be performed to ensure that new errors have not been introduced. </li></ul>CONT…
    50. 50. INTEGRATION TESTING- Top Down Testing <ul><li>Problem with top-down: </li></ul><ul><li>Processing at low levels in the hierarchy is required to adequately test upper levels </li></ul><ul><li>Solutions </li></ul><ul><li>Delay tests until stubs are replaced with actual modules </li></ul><ul><ul><li>Difficult to determine the cause of errors </li></ul></ul><ul><li>Develop stubs that perform limited functions to simulate the actual module </li></ul><ul><ul><li>Overhead, as stubs become more complex. </li></ul></ul><ul><li>Integrate the software from the bottom of the hierarchy upward </li></ul>CONT…
    51. 51. INTEGRATION TESTING- Bottom Up Testing <ul><li>Begins testing with the modules at the lowest level (atomic modules). </li></ul><ul><li>As modules are integrated bottom up, processing required for modules subordinate to a given level is always available, and the need for stubs is eliminated. </li></ul>CONT…
    52. 52. INTEGRATION TESTING- Bottom Up Testing <ul><li>A bottom-up integration strategy may be implemented with the following steps: </li></ul><ul><li>Low-level modules are combined into clusters that perform a particular software subfunction. </li></ul><ul><li>A driver is written to coordinate test case input and output. </li></ul><ul><li>The cluster is tested. </li></ul><ul><li>Drivers are removed and clusters are combined, moving upward in the program structure. </li></ul>CONT…
    53. 53. INTEGRATION TESTING- Bottom Up Testing CONT…
    54. 54. REGRESSION TESTING <ul><li>Any time changes are made at any level, all previous testing must be considered invalid </li></ul><ul><ul><li>can do regression testing at unit, integration, and system level </li></ul></ul><ul><li>this means tests must be re-run to ensure the software still passes </li></ul><ul><ul><li>re-running previous tests is regression testing </li></ul></ul><ul><li>particularly problematic for user interface software </li></ul>
    55. 55. SMOKE TESTING <ul><li>A common approach for creating “daily builds” for product software. </li></ul><ul><li>Smoke testing steps: </li></ul>
    56. 56. SMOKE TESTING <ul><li>Software components that have been translated into code are integrated into a “build.” </li></ul><ul><ul><li>A build includes all data files, libraries, reusable modules, and engineered components that are required to implement one or more product functions. </li></ul></ul><ul><li>A series of tests is designed to expose errors that will keep the build from properly performing its function. </li></ul><ul><ul><li>The intent should be to uncover “show stopper” errors that have the highest likelihood of throwing the software project behind schedule. </li></ul></ul><ul><li>The build is integrated with other builds and the entire product (in its current form) is smoke tested daily. </li></ul><ul><ul><li>The integration approach may be top down or bottom up. </li></ul></ul>
    57. 57. VALIDATION TESTING <ul><li>Once the software is completely assembled as a package and interfacing errors have been identified and corrected, a final series of software tests, validation testing, begins. </li></ul><ul><li>Validation can be defined in many ways, but a simple definition is that validation succeeds when the software functions in a manner that can be reasonably expected by the customer. </li></ul>
    58. 58. VALIDATION TESTING <ul><li>Validation Test Criteria </li></ul><ul><li>Configuration review </li></ul><ul><li>Alpha and Beta testing </li></ul>CONT…
    59. 59. SYSTEM TESTING <ul><li>System testing is a series of different tests whose primary purpose is to fully exercise the computer-based system </li></ul><ul><li>Although each test has a different purpose, all work to verify that all system elements have been properly integrated and perform allocated functions. </li></ul><ul><li>Below we consider various system tests for computer-based systems. </li></ul>
    60. 60. SYSTEM TESTING <ul><li>Recovery Testing </li></ul><ul><ul><li>Recovery testing is a system test that forces the software to fail in a variety of ways and verifies that recovery is properly performed. </li></ul></ul><ul><li>Security Testing </li></ul><ul><ul><li>Security testing attempts to verify that protection mechanisms built into a system will protect it from improper penetration. </li></ul></ul><ul><li>Stress Testing </li></ul><ul><ul><li>Stress testing executes a system in a manner that demands resources in abnormal quantity, frequency, or volume </li></ul></ul>CONT…
    61. 61. <ul><li>Example of Stress testing </li></ul><ul><li>most systems have requirements for performance under load, e.g., </li></ul><ul><ul><li>100 hits per second </li></ul></ul><ul><ul><li>500 ambulances dispatched per day </li></ul></ul><ul><ul><li>all data processed using 70% processor capacity while operating in flight mode with all sensors live </li></ul></ul><ul><li>systems with load requirements must be tested under load </li></ul><ul><ul><li>simulated load scenarios must be designed and supported </li></ul></ul><ul><ul><li>frequently requires significant test code and equipment to adequately support </li></ul></ul>
    62. 62. PERFORMANCE TESTING <ul><li>Tests the runtime performance of the s/w within the context of an integrated system. </li></ul>
    63. 63. Thank You Any Doubt, question, query????
