Conquering Complex and Changing Systems
Conquering Complex and Changing Systems Presentation Transcript

  • 1. Chapter 9, Testing
  • 2. Terminology
    • Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
    • Failure: Any deviation of the observed behavior from the specified behavior.
    • Error: The system is in a state such that further processing by the system will lead to a failure.
    • Fault (Bug): The mechanical or algorithmic cause of an error.
    • There are many different types of errors and different ways to deal with them.
  • 3. What is this?
  • 4. Erroneous State (“Error”)
  • 5. How do we deal with Errors and Faults?
  • 6. Verification?
  • 7. Modular Redundancy?
  • 8. Declaring the Bug as a Feature?
  • 9. Patching?
  • 10. Testing?
  • 11. Examples of Faults and Errors
    • Faults in the requirements specification
      • Mismatch between what the clients need and what the developers offer
      • Mismatch between requirements and implementation
    • Algorithmic Faults
      • Missing initialization
      • Branching errors (too soon, too late)
      • Missing test for nil (see the sketch after this list)
    • Errors
      • Stress or overload errors
      • Capacity or boundary errors
      • Timing errors
      • Throughput or performance errors
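A minimal sketch (Python, not from the slides) of an algorithmic fault of the kind listed above: the function omits the test for nil (None in Python), so inputs that the specification allows drive the system into an erroneous state that surfaces as a failure. The names and the assumed specification are hypothetical.

```python
# Hypothetical example of an algorithmic fault: the missing test for nil.
def average(values):
    # Fault: no check for None or an empty list.
    return sum(values) / len(values)

# Failures observed when the fault is triggered:
#   average(None) -> TypeError          (observed behavior deviates from spec)
#   average([])   -> ZeroDivisionError

def average_checked(values):
    """Assumed specification: return 0.0 for missing or empty input."""
    if not values:  # the previously missing test for nil
        return 0.0
    return sum(values) / len(values)
```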
  • 12. Dealing with Errors
    • Avoid them:
      • Software quality assurance, best practices, ongoing reviews/inspections
    • Verification (mathematical):
      • Assumes hypothetical environment that does not match real environment
      • Proof might be buggy (omits important constraints; simply wrong)
    • Modular redundancy (see the sketch after this list):
      • Expensive
    • Declaring a bug to be a “feature”
      • Bad practice
    • Patching
      • Slows down performance
    • Testing (this lecture)
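To make the cost point concrete, here is a minimal sketch (Python, not from the slides) of modular redundancy: three independently written implementations of the same specification run side by side, and a majority vote masks a fault in any single module. The modules and the fault shown are hypothetical; the point is that every function is implemented and executed three times.

```python
from collections import Counter

def sqrt_a(x): return x ** 0.5   # module A
def sqrt_b(x): return x ** 0.5   # module B, assumed independently developed
def sqrt_c(x): return x * 0.5    # module C contains a fault

def redundant_sqrt(x):
    """Run the three redundant modules and accept the majority result."""
    results = [round(f(x), 9) for f in (sqrt_a, sqrt_b, sqrt_c)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: the redundant modules disagree")
    return value

print(redundant_sqrt(9.0))  # 3.0 despite the fault in module C
```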
  • 13. Some Observations
    • It is impossible to completely test any nontrivial module or any system
    • Testing can only show the presence of bugs, not their absence (Dijkstra)
  • 14. Testing takes creativity
    • Testing is not always viewed as a glamorous activity
    • To develop an effective test, one must have:
        • Detailed understanding of the system
        • Knowledge of the testing techniques
        • Skill to apply these techniques in an effective and efficient manner
    • Testing is done best by independent testers
      • We often develop a certain mental attitude that the program should work in a certain way when in fact it does not.
    • The programmer often uses a data set that makes the program work
    • A program often does not work when tried by somebody else.
      • Don't let this be the end-user.
  • 15. Testing Activities [diagram: subsystem code passes through unit tests, integration tests, and a functional test, all performed by the developer, producing tested subsystems, integrated subsystems, and finally a functioning system; the tests are guided by the system design document, the requirements analysis document, the requirements specification, and the user manual]
  • 16. Testing Activities ctd [diagram: the functioning system undergoes a performance test against the non-functional requirements specification (by the developer), an acceptance test against the client's understanding of the requirements, and an installation test in the user environment (by the client), yielding a validated system, an accepted system, a usable system, and the system in use]
  • 17. Component Testing
    • Unit Testing:
      • Individual subsystem
      • Carried out by developers
      • Goal: Confirm that the subsystem is correctly coded and carries out the intended functionality (a unit-test sketch follows this slide)
    • Integration Testing:
      • Groups of subsystems (collection of classes) and eventually the entire system
      • Carried out by developers
      • Goal: Test the interfaces among the subsystems
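A minimal unit-test sketch (Python's unittest module; the Stack class and its tests are hypothetical, not from the slides): the developer exercises one small unit of a subsystem in isolation to confirm that it is correctly coded and carries out its intended functionality.

```python
import unittest

class Stack:
    """The unit under test: a single class from one subsystem (hypothetical)."""
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    def test_push_then_pop_returns_last_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

    def test_pop_on_empty_stack_raises(self):
        self.assertRaises(IndexError, Stack().pop)

if __name__ == "__main__":
    unittest.main()
```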
  • 18. System Testing
    • System Testing:
      • The entire system
      • Carried out by developers
      • Goal: Determine if the system meets the requirements (functional and non-functional)
    • Acceptance Testing:
      • Evaluates the system delivered by developers
      • Carried out by the client. May involve executing typical transactions on site on a trial basis
      • Goal: Demonstrate that the system meets customer requirements and is ready to use
  • 19. The 4 Testing Steps
    • 1. Select what has to be measured
    • Code tested for correctness with respect to:
      • requirements
      • architecture
      • detailed design
    • 2. Decide how the testing is done
      • Code inspection
      • Proofs
      • Black-box testing, white-box testing
      • Select integration testing strategy (big bang, bottom up, top down, sandwich)
    • 3. Develop test cases
      • A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or to measure the attribute of interest
    • 4. Create the test oracle
      • An oracle contains the predicted results for a set of test cases (see the sketch after this slide)
        • I.e., expected output for each test
      • The test oracle has to be written down before the actual testing takes place
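One way to make steps 3 and 4 concrete (a Python sketch, not from the slides; the unit under test and its assumed specification are hypothetical): the test cases and their oracle are written down as a table of inputs and predicted results before the tests are run, and the run is then checked against that table.

```python
# Hypothetical unit under test.
def classify_triangle(a, b, c):
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Test oracle: predicted results, recorded before the actual testing takes place.
ORACLE = [
    ((3, 3, 3), "equilateral"),
    ((3, 3, 4), "isosceles"),
    ((3, 4, 5), "scalene"),
]

for inputs, expected in ORACLE:
    actual = classify_triangle(*inputs)
    status = "passed" if actual == expected else "failed"
    print(f"{inputs}: expected {expected!r}, got {actual!r} -> {status}")
```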
  • 20. System Testing
    • Functional Testing
    • Performance Testing
    • Acceptance Testing
    • Installation Testing
    • Impact of model quality on system testing:
      • The more explicit the requirements, the easier they are to test.
      • Quality of use cases determines the ease of functional testing
      • Quality of subsystem decomposition determines the ease of integration testing
      • Quality of detailed design determines the ease of unit testing
  • 21. Functional Testing
    • Black box testing
    • Goal: Test functionality of system
    • Test cases are designed from the requirements analysis document (better: user manual) and centered around requirements and key functions (use cases)
    • The system is treated as a black box.
    • User-oriented test cases have to be developed.
  • 22. Sample test case – typical content
    • Name
    • Purpose
    • Pre-conditions for running the test
    • Define the input
    • Define the expected output
    • Test status
      • To be run
      • Passed
      • Failed
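The same contents can be recorded uniformly in a small data structure; a sketch in Python (the class and field names follow the list above but are otherwise hypothetical):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TestCase:
    name: str
    purpose: str
    preconditions: str
    input_steps: List[str]        # the defined input, step by step
    expected_output: str
    status: str = "to be run"     # "to be run" | "passed" | "failed"
```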
  • 23. Sample test case – example
    • Name: Test Case 42 (add supplier)
    • Purpose. The purpose of this test case is to test the normal flow of events for the Add Supplier Use Case
    • Pre-conditions for running the test. The Dental Office System has been initialized.
    • Define the input.
    • 1. The user selects the Add Supplier menu option.
    • 2. The user enters the following data into the form:
      • Supplier name: Acme Floss
      • Supplier address: 2601 N. Floyd Rd
      • Supplier phone: 972 883 4216
      • Supplier fax: 972 883 2349
      • Supplier web: www.AcmeFloss.com
    • 3. The user selects the OK button
    • Define the expected output. The system displays a message indicating the supplier “Acme Floss” has been added.
    • Test status
      • To be run
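Written as data, Test Case 42 reuses the hypothetical TestCase structure sketched after the previous slide:

```python
# Assumes the hypothetical TestCase dataclass sketched after slide 22.
test_case_42 = TestCase(
    name="Test Case 42 (add supplier)",
    purpose="Test the normal flow of events for the Add Supplier use case",
    preconditions="The Dental Office System has been initialized",
    input_steps=[
        "Select the Add Supplier menu option",
        "Enter supplier name 'Acme Floss', address '2601 N. Floyd Rd', "
        "phone '972 883 4216', fax '972 883 2349', web 'www.AcmeFloss.com'",
        "Select the OK button",
    ],
    expected_output="The system displays a message indicating that the "
                    "supplier 'Acme Floss' has been added",
)
```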
  • 24. Acceptance Testing
    • Goal: Demonstrate system is ready for operational use
      • Choice of tests is made by client/sponsor
      • The acceptance test is performed by the client, not by the developer.
    • Requirements specification forms the contract between the customer and the developers
    • Customer uses the requirements to derive their own test cases…
  • 25. Test Team [diagram: the test team draws on a test analyst, a user, a programmer (often too familiar with the code), a professional tester, a configuration management specialist, and the system designer]
  • 26. Summary
    • Testing is still a black art, but many rules and heuristics are available
      • active research area in software engineering
    • Testing consists of component testing (unit testing, integration testing) and system testing
    • Design Patterns can be used for component-based testing
    • Testing has its own lifecycle