Program Testing +
Integration Testing
Zeeshan Rana
 Types of faults and how to classify them
 The purpose of testing
 Unit testing
 Integration testing strategies
 Test planning
 When to stop testing
 Wrong requirement: not what the customer
wants
 Missing requirement
 Requirement impossible to implement
 Faulty design
 Faulty code
 Improperly implemented design
 Objective of testing: discover faults
 A test is successful only when a fault is
discovered
◦ Fault identification is the process of determining
what fault caused the failure
◦ Fault correction is the process of making changes
to the system so that the faults are removed
 Purpose
 Input
 Expected Output
 Actual Output
 Sample Format:
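The four fields above can be captured directly in code. A minimal sketch (the `TestCase` class and `run` helper are illustrative, not from the slides):

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    purpose: str            # why this test exists
    input: int              # the value fed to the unit under test
    expected_output: int    # what the specification says should come back
    actual_output: int = None  # filled in when the test runs

def run(test, func):
    """Execute func on the test input, record the actual output,
    and report whether expected and actual agree."""
    test.actual_output = func(test.input)
    return test.actual_output == test.expected_output

# Example: one test case against Python's built-in abs()
tc = TestCase(purpose="abs of a negative number",
              input=-5, expected_output=5)
print(run(tc, abs))  # True
```

Recording the actual output alongside the expected output is what later makes fault identification possible: a mismatch is a failure, and the two values are the starting point for locating the fault.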
 Algorithmic fault
 Computation and precision fault
◦ a formula’s implementation is wrong
 Documentation fault
◦ Documentation doesn’t match what program does
 Capacity or boundary faults
◦ System’s performance not acceptable when certain limits
are reached
 Timing or coordination faults
 Performance faults
◦ System does not perform at the speed prescribed
 An algorithmic fault occurs when a
component’s algorithm or logic does not
produce proper output
◦ Branching too soon
◦ Branching too late
◦ Forgetting to initialize variable or set loop
invariants
◦ Comparing variables of inappropriate data types
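Two of the faults listed above, branching too soon and a missing initialization, can be shown in a hypothetical averaging routine (both versions below are illustrative):

```python
def average_buggy(values):
    total = 0
    # Algorithmic fault: the loop starts at index 1, so values[0] is skipped.
    for i in range(1, len(values)):
        total += values[i]
    return total / len(values)

def average_fixed(values):
    total = 0              # initialized before the loop, as it must be
    for v in values:       # every element participates
        total += v
    return total / len(values)

# The fault only surfaces on inputs where the skipped element matters:
print(average_buggy([10, 2, 4]))  # 2.0   (wrong: the 10 was skipped)
print(average_fixed([10, 2, 4]))  # ~5.33 (correct: 16 / 3)
```

Note that `average_buggy([0, 2, 4])` happens to return the right answer, which is exactly why test case selection matters: a fault can hide behind inputs that mask it.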
 Module testing, component testing, or unit
testing
 Integration testing
 System Testing
◦ Function testing
◦ Performance testing
 Acceptance testing
 Installation testing
 Egoless programming: programs are viewed
as components of a larger system, not as the
property of those who wrote them
 Independent test team
◦ avoid conflict
◦ improve objectivity
◦ allow testing and coding concurrently
 Closed box or black box: functionality of the
test objects
◦ Equivalence Class, Boundary Value Analysis,
Scenario-based, Decision Table based, State
Machine based…
 Clear box or white box: structure of the test
objects
◦ Control Flow
 Basis Path, Branch, Statement, Decision…
◦ Data Flow
 Du Path, All-uses Path
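As a small illustration of the black-box techniques named above, boundary value analysis places test cases at, just below, and just above each boundary of an equivalence class. The eligibility rule below is a made-up example, not from the slides:

```python
# Hypothetical unit under test: accepts ages in the range 18..65 inclusive.
def is_eligible(age):
    return 18 <= age <= 65

# Boundary value analysis: probe each boundary and its immediate neighbors.
boundary_cases = {
    17: False,  # just below the lower bound
    18: True,   # lower bound
    19: True,   # just above the lower bound
    64: True,   # just below the upper bound
    65: True,   # upper bound
    66: False,  # just above the upper bound
}

for age, expected in boundary_cases.items():
    assert is_eligible(age) == expected, f"fault revealed at age {age}"
print("all boundary cases pass")
```

Off-by-one faults (e.g. writing `<` instead of `<=`) are caught precisely by the cases at 18 and 65, which is why boundary values are tested explicitly rather than relying on one representative per equivalence class.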
 Black box: external behavior description
 State box: black box with state information
 White box: state box with a procedure
 Code walkthrough
 Code inspection
 Testing the unit for correct functionality
 Testing the unit for correct execution
 Determining test objectives
 Selecting test cases
 Executing test cases
 Statement testing
 Branch testing
 Path testing
 …
Cyclomatic Complexity = E – N + 2 = 9 – 9 + 2 = 2
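The flow graph behind the calculation above is not reproduced here, but any function whose graph has V(G) = 2 (one decision node) needs two basis paths, and therefore at least two test cases. A minimal illustrative example:

```python
# A function whose control-flow graph has cyclomatic complexity 2:
# one decision, hence two linearly independent paths.
def classify(x):
    if x < 0:                  # the single decision node
        return "negative"
    return "non-negative"

# Basis path testing: one test case per independent path.
assert classify(-1) == "negative"      # path through the true branch
assert classify(3) == "non-negative"   # path through the false branch
print("both basis paths exercised")
```

V(G) gives a lower bound on the number of test cases needed to cover every branch, which is why it is computed before selecting test cases for white-box testing.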
• Effectiveness of fault-discovery techniques

                      Requirements  Design     Code       Documentation
                      Faults        Faults     Faults     Faults
  Reviews             Fair          Excellent  Excellent  Good
  Prototypes          Good          Fair       Fair       Not applicable
  Testing             Poor          Poor       Good       Fair
  Correctness Proofs  Poor          Poor       Fair       Fair
 Big-bang
 Bottom-up
 Top-down
 Sandwich testing
 Component Driver: a routine that calls a
particular component and passes a test case
to it
 Stub: a special-purpose program to simulate
the activity of the missing component
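A driver and a stub can be sketched in a few lines. All names here (`compute_tax`, `fetch_rate_stub`) are hypothetical, chosen only to show the roles:

```python
def fetch_rate_stub(region):
    """Stub: simulates the missing rate-lookup component
    by returning a canned value instead of a real query."""
    return 0.10

def compute_tax(amount, region, fetch_rate):
    """Component under test; its dependency is injected so a
    stub can stand in until the real component is integrated."""
    return amount * fetch_rate(region)

def driver():
    """Driver: calls the component and passes it one test case."""
    result = compute_tax(200.0, "PK", fetch_rate_stub)
    assert result == 20.0
    return result

print(driver())  # 20.0
```

Top-down integration replaces missing lower-level components with stubs like `fetch_rate_stub`; bottom-up integration replaces missing upper-level callers with drivers like `driver`.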
 System viewed as a hierarchy of components
 All components integrated at once
 Locating faults?
 Only A is tested by itself
 Stubs of B, C and D are used at first level
 N-1 stubs required (N=Number of nodes)
 Locating faults?
 Drivers are used to call the child functions
 Drivers are relatively intelligent
 N-leaves drivers
 Locating faults?
 Viewed system as three layers
 Employ BU where
writing drivers is
not costly
 Employ TD where
stubs are easier to
write
 Locating faults?
• Adjacency Matrix
• NxN matrix that tells which components call
the other components
• Pairwise Integration
• Test each pair (i.e. each edge)
• E testing sessions
• Neighborhood based Integration
• Integrate each neighborhood
• The nodes at one edge distance from the
node to be integrated
• (N − number of sink nodes) sessions
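The session counts above can be checked on a small example. The call structure below (components A–E) is invented for illustration:

```python
# Adjacency matrix: matrix[i][j] == 1 means component i calls component j.
names = ["A", "B", "C", "D", "E"]
matrix = [
    [0, 1, 1, 0, 0],  # A calls B and C
    [0, 0, 0, 1, 0],  # B calls D
    [0, 0, 0, 1, 1],  # C calls D and E
    [0, 0, 0, 0, 0],  # D calls nothing (sink node)
    [0, 0, 0, 0, 0],  # E calls nothing (sink node)
]
n = len(names)

# Pairwise integration: one session per edge of the call graph.
edges = [(names[i], names[j])
         for i in range(n) for j in range(n) if matrix[i][j]]
print(len(edges), "pairwise sessions")  # 5

# Neighborhood of a node: the node plus all components at edge distance 1
# (its immediate callers and callees).
def neighborhood(k):
    callees = {names[j] for j in range(n) if matrix[k][j]}
    callers = {names[i] for i in range(n) if matrix[i][k]}
    return callers | callees | {names[k]}

# One neighborhood session per non-sink node: N - sinks = 5 - 2 = 3.
non_sinks = [k for k in range(n) if any(matrix[k])]
print(len(non_sinks), "neighborhood sessions")  # 3
```

With E = 5 edges, pairwise integration needs 5 sessions, while neighborhood integration needs only 3, which is the usual trade-off: fewer sessions, but faults are harder to localize within a larger neighborhood.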
 Establish test objectives
 Design and Write test cases
 Test test cases
 Execute tests
 Evaluate test results
 Test plan explains
◦ who does the testing
◦ why the tests are performed
◦ how tests are conducted
◦ when the tests are scheduled
 What the test objectives are
 How the test will be run
 What criteria will be used to determine when
the testing is complete
 Automated Testing Tools
 Testing Management Tools
 Bug Tracking/Configuration Management
Tools
 No time left
 No money left
 Statistical Criteria
◦ Number of defects found per week becomes
lower than a set threshold
 The Ariane-5’s flight control system was
tested in four ways
◦ equipment testing
◦ on-board computer software testing
◦ staged integration
◦ system validation tests
 The Ariane-5 developers relied on insufficient
reviews and test coverage
 It is important to understand the difference
between faults and failures
 The goal of testing is to find faults, not to
prove correctness
 UCF Slides
 Software Testing, A Craftsman’s Approach by
Jorgensen
 Software Testing Tools by Prasad
