Coward-p421.ppt

Transcript

  • 1. A Review of Software Testing – P. David Coward. Reprinted: Information and Software Technology, Vol. 30, No. 3, April 1988; Software Engineering: The Development Process, Vol. 1, Chapter 7. Presented by: Andrew Diemer, Software Engineering II – EEL 6883
  • 2. Aim of paper
    • Testing cannot guarantee that software meets its functional requirements
    • Introduces software testing techniques
  • 3. Needs
    • Software is expected to be correct
      • What does this mean?
        • It usually means the program matches its specification
    • Problem with the specification
      • The specification itself could be wrong
  • 4. Needs
    • If the specification is wrong, then correctness is instead measured by whether the software meets the user's requirements
  • 5. Needs
    • Testing
      • Why test?
        • Earlier tests may not have been adequate
    • Assess how well the software performs its tasks
  • 6. Terminology
    • Verification vs. validation
    • Verification
      • Ensures correctness from phase to phase of the software life cycle process
      • Formal proofs of correctness
  • 7. Terminology
    • Validation
      • Checks software against requirements
        • Executes software with test data
    • Author uses testing and checking instead of verification and validation
  • 8. Categories of Testing
    • Two categories of testing:
      • Functional
      • Non-functional
  • 9. Functional Testing
    • Functional
      • Checks whether the program produces the correct output
        • It is normally used when testing a modified or new program
  • 10. Functional Testing
    • Regression Testing
      • Tests performed following modification
      • Checks that functions which should be unchanged have not in fact changed
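The idea above can be sketched in a few lines of Python. The discount function and its test cases are assumptions invented for the sketch: the old cases are simply re-run against the modified version to confirm that behaviour which should be unchanged has not drifted.

```python
def discount(price, rate):
    """Original behaviour: apply a percentage discount."""
    return price * (1 - rate)

def discount_v2(price, rate):
    """Modified version: adds input clamping; behaviour for valid
    inputs must not change."""
    rate = min(max(rate, 0.0), 1.0)     # new code introduced by maintenance
    return price * (1 - rate)

# Regression suite: re-run the old cases against the new version.
regression_cases = [
    ((100.0, 0.1), 90.0),
    ((50.0, 0.0), 50.0),
    ((80.0, 0.5), 40.0),
]

for (price, rate), expected in regression_cases:
    assert abs(discount(price, rate) - expected) < 1e-9      # old version
    assert abs(discount_v2(price, rate) - expected) < 1e-9   # must still hold
```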
  • 11. Non-functional Requirements
    • Style
    • Documentation standards
    • Response times
    • Legal obligations
  • 12. Situation testing
    • Two situations testing can fall under:
      • Testing which finds faults in the software
      • Testing which does NOT find faults in the software
  • 13. Situation testing
    • Finding faults
      • Destructive process
        • more probing
    • Not finding faults
      • miss inherent faults
        • too gentle
  • 14. Questions
    • How much testing is needed?
    • Confidence in testing?
    • Ignore faults?
      • Which ones are important?
    • Are there more faults?
    • What is the purpose of this testing?
  • 15. Testing Strategies
    • Functional vs. structural testing
    • Static vs. dynamic analysis
  • 16. Strategy starting points
    • Specification
      • It makes known the required functions
      • Assess whether these functions are provided
      • Functional testing
  • 17. Strategy starting points
    • Software
      • Tests the structure of the system
      • Structural testing
      • Can reveal functions that are included in the system but NOT required
      • Example: accessing a database when the user has not asked for it
  • 18. Functional testing
    • Identify the functions which the software is expected to perform
    • Create test data that will check whether these functions are performed by the software
    • Does NOT matter how the program performs these functions
  • 19. Functional testing
    • Rules may be applied to uncover the functions
    • Some functional testing methods use formal documentation that describes the fault classes associated with each part of the design and with the design features themselves
  • 20. Functional testing
    • The particular properties of each function should be isolated
    • Fault classes are associated with each property
    • Black box approach
    • Testers have an understanding of what the output should be
  • 21. Functional testing
    • Oracle
      • An expert on what the outcome of a program will be for a particular test
    • When might the oracle approach not work?
      • Simulation testing
        • Only provides a “range of values”
  • 22. Structural testing
    • Testing is based on the detailed design rather than the functions required by the program
  • 23. Structural testing
    • Two approaches for this testing
      • First and most common is to execute the program with test cases
      • Second is symbolic execution
        • Functions of the program are compared to the required functions for congruency
  • 24. Structural testing
    • May require single path or percentage testing
    • Research has been conducted to find out what the minimum amount of testing would be to ensure a degree of reliability
  • 25. Structural testing
    • Measure of reliability
      • All statements should be executed at least once
      • All branches should be executed at least once
      • All linear code sequence and jumps in the program should be executed at least once
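The coverage measures above can be illustrated with a hand-instrumented toy function; the function and branch labels are assumptions for the sketch (real tools such as coverage.py automate the bookkeeping).

```python
# Branch coverage tracked by hand for a tiny function.
executed_branches = set()

def classify(x):
    """Toy function with two branches."""
    if x < 0:
        executed_branches.add("negative")       # branch 1
        return "negative"
    executed_branches.add("non-negative")       # branch 2
    return "non-negative"

# A single test case executes only one of the two branches:
classify(5)
assert executed_branches == {"non-negative"}

# A second case is needed before every branch has executed at least once:
classify(-3)
assert executed_branches == {"negative", "non-negative"}
```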
  • 26. Structural testing
    • Measure of reliability (cont.)
      • Best approach would be the exhaustive approach in which every path is tested
  • 27. Structural testing
    • Problems with the exhaustive approach
      • Enormous number of paths
      • Multiple conditions combine into many path combinations
      • Infeasible paths
        • Contradictory predicates at conditional statements
  • 28. Structural testing
    • Path issues
      • There is a separate path for a loop executing zero times, once, or multiple times
      • Control loops determine the number of paths
  • 29. Structural testing
    • Path issues
      • Known as the “level-i” path or island code
      • Island code
        • A series of lines of code, following a program termination, that is not the destination of a transfer of control from anywhere else in the program
  • 30. Structural testing
    • Path issues
      • When does island code occur?
        • When failing to delete redundant code after maintenance
  • 31. Static analysis
    • Does NOT involve executing the software with data; instead, mathematical constraints on the input and output data sets are applied to software components
    • Examples of static analysis would be program proving and symbolic execution
  • 32. Static analysis
    • Symbolic execution
      • Use symbolic values for variables instead of numeric or string values
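A minimal sketch of the idea, with an assumed toy program (`y = x + 1; if y > 10: z = y * 2 else: z = 0`): the input carries a symbol rather than a value, assignments build expressions, and each branch contributes a predicate to the path condition.

```python
def symbolic_paths():
    """Symbolically execute:  y = x + 1; if y > 10: z = y * 2 else: z = 0"""
    x = "x"                      # the input is a symbol, not a value
    y = f"({x} + 1)"             # assignment builds an expression
    paths = []
    # Each branch adds its predicate to the path condition and yields
    # one output expression per path.
    paths.append({"condition": f"{y} > 10", "z": f"({y} * 2)"})
    paths.append({"condition": f"not ({y} > 10)", "z": "0"})
    return paths

paths = symbolic_paths()
assert paths[0]["z"] == "((x + 1) * 2)"
assert paths[1]["condition"] == "not ((x + 1) > 10)"
```

Real symbolic executors manipulate proper expression trees and use constraint solvers; strings are used here only to keep the sketch self-contained.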
  • 33. Dynamic analysis
    • Relies on program statements that call analysis routines
    • These routines record how frequently each element of the program executes
  • 34. Dynamic analysis
    • Acts as a bridge between functional and structural testing
    • Test cases are monitored dynamically, then examined structurally to see what idle code previous tests have left unexecuted
    • Shows functions the program should perform
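The instrumentation idea can be sketched as follows; the `probe` routine and the `absolute` function are assumptions invented for the example.

```python
from collections import Counter

frequency = Counter()

def probe(label):
    """Analysis routine: records execution frequency of a program element."""
    frequency[label] += 1

def absolute(x):
    probe("entry")
    if x < 0:
        probe("negate-branch")
        x = -x
    probe("exit")
    return x

for value in [3, -1, -7, 0]:
    absolute(value)

# The counters show which elements the tests exercised and which stayed idle.
assert frequency["entry"] == 4
assert frequency["negate-branch"] == 2
```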
  • 35. Classification of Techniques
    • There are three classifications:
    • Static – Structural
      • Symbolic execution
      • Partition analysis
      • Program proving
      • Anomaly analysis
  • 36. Classification of Techniques
    • Dynamic - Functional
      • Domain testing
      • Random testing
      • Adaptive perturbation
      • Cause-effect graphing
  • 37. Classification of Techniques
    • Dynamic - Structural
      • Domain and computational testing
      • Automatic test data generation
      • Mutation analysis
  • 39. Symbolic execution
    • Non-traditional approach
      • Traditional execution requires that a selection of paths through the program be exercised by a set of test cases
    • Produces a set of expressions, one per output variable
  • 40. Symbolic execution
    • Usually a program executes using input data values with the output resulting in a set of actual values
    • Use of flow-graphs
      • Each branch contains decision points (directed graph)
      • Branch predicates are produced
  • 41. Symbolic execution
    • Use of top down approach
    • During the top down traversal, the input variable is given a symbol in place of an actual value
    • A problem is in the iterations
    • As mentioned before, a loop may execute zero times, once, or multiple times
  • 42. Partition analysis
    • Uses symbolic execution to find sub domains of the input domain
    • Path conditions are used to find them
    • Input values that cannot be allocated to any subdomain imply a fault
  • 43. Partition analysis
    • Specifications need to be written at a higher level
  • 44. Program proving
    • Mathematical assertions are placed at the beginning and end of each procedure
    • Similar to symbolic execution
    • Neither of them execute actual data and both examine source code
  • 45. Program proving
    • Tries to construct a proof that covers all possible executions, including loop iterations
    • Program proving steps:
      • Construct a program
      • Examine the program and insert mathematical assertions at the beginning and end of procedures
  • 46. Program proving
    • Program proving steps (cont):
      • Determine whether the code between each pair of start and end assertions will achieve the end assertion given the start assertion
      • If the code reaches the end assertion then the block has been proven
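The proving steps above can be sketched with executable assertions bracketing a procedure; the division routine is an assumed example. (Assertions check individual runs, whereas a proof must cover all inputs.)

```python
def integer_divide(a, b):
    assert a >= 0 and b > 0                 # start assertion (precondition)
    q, r = 0, a
    while r >= b:                           # for *total* correctness the loop
        q, r = q + 1, r - b                 # must also terminate
    assert a == q * b + r and 0 <= r < b    # end assertion (postcondition)
    return q, r

assert integer_divide(17, 5) == (3, 2)
assert integer_divide(4, 7) == (0, 4)
```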
  • 47. Program proving
    • DeMillo says that proofs can only be stated as acceptable and not correct
    • His acceptance is determined by a gathering of people who cannot find fault with the proof
  • 48. Program proving
    • The larger the audience, the more confidence in the software
    • Total correctness additionally requires that loops terminate
  • 49. Anomaly analysis
    • Two levels of anomalies:
      • those caught by the compiler (language syntax violations)
      • constructs that are legal in the language but nevertheless suspect
  • 50. Anomaly analysis
    • Some anomalies are:
      • Unreachable code
      • Array bound violations
      • Incorrectly initialized variables
      • Unused labels and variables
      • Jumps into and out of loops
  • 51. Anomaly analysis
    • Produce flow-graphs
    • Determine infeasible paths
    • Some use data-flow analysis
      • Where the input values turn into intermediate, which then turn into output values
  • 52. Anomaly analysis
    • Some data-flow anomalies:
      • Assigning values to variables that are never used
      • Using variables that have not previously been assigned a value
      • Re-assigning a variable without making use of its previous value
      • Each indicates a possible fault
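The anomalies listed above can be made concrete in a few lines; the function is an assumed toy example.

```python
def anomalous(x):
    a = x * 2        # anomaly: 'a' is assigned but never used
    c = 1            # anomaly: 'c' is re-assigned below without
    c = x + 1        #          its previous value ever being used
    # (using a variable before assigning it would be a third anomaly,
    #  and in Python raises NameError at run time)
    return c

# Each anomaly indicates a *possible* fault -- the function still runs.
assert anomalous(4) == 5
```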
  • 53. Domain testing
    • Based upon informal classifications of the requirements
    • Test cases are executed and their results compared against expected results to determine whether faults have been detected
  • 54. Random testing
    • Produces data without reference to code or specification
    • Random number generation is used
    • Main problem is there is no complete coverage guarantee
  • 55. Random testing
    • The key is to examine small subsets of the input domain
    • Followed carefully, this can give a high rate of branch coverage
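A small sketch of random testing against an oracle, with branch coverage tracked on an assumed toy program: the data is generated without reference to code or specification, and the missing-coverage caveat shows up naturally.

```python
import random

covered = set()

def sign(x):
    """Toy program under test."""
    if x < 0:
        covered.add("neg")
        return -1
    elif x > 0:
        covered.add("pos")
        return 1
    covered.add("zero")
    return 0

random.seed(0)                            # fixed seed for reproducibility
for _ in range(1000):
    x = random.randint(-10, 10)           # data generated at random
    assert sign(x) == (x > 0) - (x < 0)   # oracle check

# Frequent branches get covered, but coverage is never guaranteed:
# a rare branch (e.g. x == 0 over a huge range) may be missed.
assert {"neg", "pos"} <= covered
```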
  • 56. Adaptive perturbation testing
    • Concerned with assessing the effectiveness of sets of test cases
    • Existing cases are used to generate further, more effective test cases
  • 57. Adaptive perturbation testing
    • Optimization routines find the best value to replace a discarded test value so that the number of violated assertions is maximized
    • The process is repeated until the number of violated assertions reaches its limit
  • 58. Cause-effect graphing
    • Its power comes from testing combinations of inputs joined by logic (AND, OR, NOT, etc.)
  • 59. Cause-effect graphing
    • Five steps:
      • Divide into workable pieces
      • Identify cause and effect
      • Build a graph linking the causes and effects (the semantics)
  • 60. Cause-effect graphing
    • Five steps (cont):
      • Annotate to show impossible combinations and effects
      • Convert the graph into a limited-entry decision table
    • It helps identify small test cases
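The end product of the five steps can be sketched as a small decision table; the login-form causes and effects here are hypothetical, invented for the example.

```python
# Hypothetical causes for a login form: (valid_user, valid_password).
decision_table = {
    (True,  True):  "grant access",
    (True,  False): "reject: bad password",
    (False, True):  "reject: unknown user",
    (False, False): "reject: unknown user",
}

def login_effect(valid_user, valid_password):
    """Implementation under test (assumed for the sketch)."""
    if not valid_user:
        return "reject: unknown user"
    if not valid_password:
        return "reject: bad password"
    return "grant access"

# Each table column becomes one small test case.
for causes, effect in decision_table.items():
    assert login_effect(*causes) == effect
```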
  • 61. Domain and computational testing
    • Based upon selecting test cases
    • Assignment statements are used for deciding computational paths
  • 62. Domain and computational testing
    • Paths considered:
      • Path computation
        • Set of algebraic expressions, one for each output variable, in terms of input variables and constants
      • Path condition
        • The conjunction of the constraints (branch predicates) along a path
  • 63. Domain and computational testing
    • Paths considered (cont):
      • Path domain
        • Set of input values that satisfy the path condition
      • Empty path domain
        • The path is infeasible and cannot execute
    • A path that produces erroneous results exhibits a computation error
  • 64. Automatic test data generation
    • Coverage metrics drive the generation of test data
    • Paths with contradictory predicates are infeasible
    • Needs detailed specification to achieve this testing
    • Formal specifications may provide fundamental help
  • 65. Mutation analysis
    • Concerns the quality of sets of test data
    • Uses mutants of the program to test the test data
    • The original and each mutant program are run on the same test data
  • 66. Mutation analysis
    • The two outputs are compared
    • If the mutant's output differs from the original's, the change has been detected and the mutant is of no further value (it is "killed")
    • If the two outputs are the same, the test data has failed to detect the change
  • 67. Mutation analysis
    • The mutant is then said to be live
    • Ratios are then taken (dead/alive)
    • A high ratio of live mutants indicates poor test data
    • If this happens then more tests need to be run until the ratio goes down
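The live/killed bookkeeping above can be sketched as follows; the original function, the operator-swap mutants, and the test data are all assumptions for the sketch.

```python
import operator

def original(x, y):
    return x + y

# Mutants: hypothetical single-operator changes to the original '+'.
mutants = [operator.sub, operator.mul, operator.truediv]

def live_mutants(test_data):
    """Return the mutants whose output matches the original on every case.

    A mutant is killed if any test case distinguishes it; otherwise it
    stays live, which counts against the test data."""
    live = []
    for m in mutants:
        if all(m(x, y) == original(x, y) for x, y in test_data):
            live.append(m)
    return live

weak_data = [(2, 2)]                  # 2 * 2 == 2 + 2, so one mutant survives
assert len(live_mutants(weak_data)) == 1

better_data = [(2, 2), (3, 1)]        # 3 * 1 != 3 + 1 kills the survivor
assert live_mutants(better_data) == []
```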
  • 68. Conclusion
    • I thought this paper was thorough
    • This paper gave good examples (compartmentalized)
    • I thought this paper was a little out of date