Conquering Complex and Changing Systems
    Presentation Transcript

    • Chapter 9, Testing
    • Terminology
      • Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
      • Failure: Any deviation of the observed behavior from the specified behavior.
      • Error: The system is in a state such that further processing by the system will lead to a failure.
      • Fault (Bug): The mechanical or algorithmic cause of an error.
      • There are many different types of errors and many different ways to deal with them.
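A minimal Java sketch of the terminology above; the Account class is invented for illustration, not taken from the chapter:

```java
// Hypothetical example: one fault, leading to an error state, surfacing as a failure.
public class Account {
    private Integer balance;            // Fault (bug): missing initialization

    public void deposit(int amount) {
        // Error: the object is in a state (balance == null) such that
        // further processing leads to a failure.
        balance = balance + amount;     // Failure: observable NullPointerException
    }

    public static void main(String[] args) {
        new Account().deposit(10);      // deviates from the specified behavior
    }
}
```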
    • What is this? An erroneous state (“error”) [image slides]
    • How do we deal with Errors and Faults?
      • Verification?
      • Modular Redundancy?
      • Declaring the Bug as a Feature?
      • Patching?
      • Testing?
    • Examples of Faults and Errors
      • Faults in the requirements specification
        • Mismatch between what the clients need and what the developers offer
        • Mismatch between requirements and implementation
      • Algorithmic Faults (sketched in code after this list)
        • Missing initialization
        • Branching errors (too soon, too late)
        • Missing test for nil
      • Errors
        • Stress or overload errors
        • Capacity or boundary errors
        • Timing errors
        • Throughput or performance errors
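Two of the algorithmic faults above, sketched in Java; both methods are invented for illustration:

```java
// Invented examples of algorithmic faults from the list above.
public class AlgorithmicFaults {

    // Missing test for nil: crashes when s is null.
    static int length(String s) {
        return s.length();              // fix: return s == null ? 0 : s.length();
    }

    // Branching too soon: the bulk discount is granted for any positive
    // quantity instead of only for bulk orders.
    static double price(int qty, double unitPrice) {
        if (qty > 0) {                  // fault: should be qty >= 100
            return qty * unitPrice * 0.9;
        }
        return qty * unitPrice;
    }
}
```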
    • Dealing with Errors
      • Avoid them:
        • software quality assurance, best practices, ongoing reviews/inspections
      • Verification (mathematical):
        • Assumes a hypothetical environment that does not match the real environment
        • The proof itself might be buggy (omit important constraints, or simply be wrong)
      • Modular redundancy:
        • Expensive (a voting sketch follows this list)
      • Declaring a bug to be a “feature”
        • Bad practice
      • Patching
        • Slows down performance
      • Testing (this lecture)
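A sketch of why modular redundancy is expensive: it needs several independently developed versions of the same component plus a voter. All names here are invented for illustration:

```java
import java.util.function.IntUnaryOperator;

public class TripleModularRedundancy {

    // Majority voter over three independently computed results.
    static int vote(int a, int b, int c) {
        if (a == b || a == c) return a;
        if (b == c) return b;
        throw new IllegalStateException("no majority among " + a + ", " + b + ", " + c);
    }

    public static void main(String[] args) {
        IntUnaryOperator v1 = x -> x * x;       // version 1
        IntUnaryOperator v2 = x -> x * x;       // version 2
        IntUnaryOperator v3 = x -> x * x + 1;   // version 3 contains a fault
        int x = 7;
        // The faulty version is outvoted; the cost is three implementations
        // and three executions per call.
        System.out.println(vote(v1.applyAsInt(x), v2.applyAsInt(x), v3.applyAsInt(x)));  // 49
    }
}
```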
    • Some Observations
      • It is impossible to completely test any nontrivial module or any system
      • Testing can only show the presence of bugs, not their absence (Dijkstra)
    • Testing takes creativity
      • Testing is not always viewed as a glamorous activity
      • To develop an effective test, one must have:
          • Detailed understanding of the system
          • Knowledge of the testing techniques
          • Skill to apply these techniques in an effective and efficient manner
      • Testing is done best by independent testers
        • We often develop a certain mental attitude that the program should behave in a certain way, when in fact it does not.
      • The programmer often uses a data set that makes the program work
      • A program often does not work when tried by somebody else.
        • Don't let this be the end-user.
    • Testing Activities [diagram]: Each piece of subsystem code goes through a unit test to become a tested subsystem; the tested subsystems go through an integration test, guided by the system design document, to become integrated subsystems; the integrated subsystems go through a functional test, guided by the requirements analysis document and the user manual, to become a functioning system. All of these tests are performed by the developer.
    • Testing Activities ctd [diagram]: The functioning system goes through a performance test against the (non-functional) requirements specification to become a validated system, then through an acceptance test against the client’s understanding of the requirements to become an accepted system, and finally through an installation test in the user environment to become a usable system in use. The performance test is run by the developer; the acceptance and installation tests are run by the client.
    • Component Testing
      • Unit Testing:
        • Individual subsystem
        • Carried out by developers
        • Goal: Confirm that the subsystem is correctly coded and carries out the intended functionality (a minimal test sketch follows this slide)
      • Integration Testing:
        • Groups of subsystems (collection of classes) and eventually the entire system
        • Carried out by developers
        • Goal: Test the interfaces among the subsystems
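A minimal JUnit 5 sketch of a unit test, using java.util.Stack as the subsystem under test; the test case is invented for illustration:

```java
import java.util.Stack;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

// Unit test: exercises a single subsystem in isolation.
class StackTest {
    @Test
    void pushThenPopReturnsTheSameElement() {
        Stack<Integer> stack = new Stack<>();
        stack.push(42);
        assertEquals(42, stack.pop());  // confirms the intended functionality
        assertTrue(stack.isEmpty());    // and that no state is left behind
    }
}
```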
    • System Testing
      • System Testing:
        • The entire system
        • Carried out by developers
        • Goal: Determine if the system meets the requirements (functional and non-functional)
      • Acceptance Testing:
        • Evaluates the system delivered by developers
        • Carried out by the client. May involve executing typical transactions on site on a trial basis
        • Goal: Demonstrate that the system meets customer requirements and is ready to use
    • The 4 Testing Steps
      • 1. Select what has to be measured
      • Code tested for correctness with respect to:
        • requirements
        • architecture
        • detailed design
      • 2. Decide how the testing is done
        • Code inspection
        • Proofs
        • Black-box testing, white-box testing
        • Select integration testing strategy (big bang, bottom up, top down, sandwich)
      • 3. Develop test cases
        • A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or to probe the attribute being measured
      • 4. Create the test oracle
        • An oracle consists of the predicted results for a set of test cases
          • I.e., expected output for each test
        • The test oracle has to be written down before the actual testing takes place
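A small sketch of steps 3 and 4; the function under test (sqrtFloor) and the chosen inputs are invented for illustration. The oracle maps each test input to its predicted result and is written down before the tests run:

```java
import java.util.Map;

public class OracleExample {

    // Step 4: the test oracle, i.e. the expected output for each test,
    // recorded before the actual testing takes place.
    static final Map<Integer, Integer> ORACLE = Map.of(
            0, 0,     // boundary case
            1, 1,
            15, 3,    // just below a perfect square
            16, 4);   // perfect square

    // The unit being exercised by the test cases.
    static int sqrtFloor(int n) {
        return (int) Math.floor(Math.sqrt(n));
    }

    public static void main(String[] args) {
        // Step 3: run each test case and compare against the oracle.
        ORACLE.forEach((input, expected) -> {
            int actual = sqrtFloor(input);
            System.out.printf("input=%d expected=%d actual=%d %s%n",
                    input, expected, actual, actual == expected ? "PASS" : "FAIL");
        });
    }
}
```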
    • System Testing
      • Functional Testing
      • Performance Testing
      • Acceptance Testing
      • Installation Testing
      • Impact of model quality on system testing:
        • The more explicit the requirements, the easier they are to test.
        • Quality of use cases determines the ease of functional testing
        • Quality of subsystem decomposition determines the ease of integration testing
        • Quality of detailed design determines the ease of unit testing
    • Functional Testing
      • Black box testing
      • Goal: Test functionality of system
      • Test cases are designed from the requirements analysis document (better: user manual) and centered around requirements and key functions (use cases)
      • The system is treated as a black box.
      • User-oriented test cases have to be developed.
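A black-box sketch in that spirit: the test cases below are derived purely from a specification (“a year is a leap year iff it is divisible by 4, except centuries, which must be divisible by 400”), with java.time.Year standing in as the system under test:

```java
import java.time.Year;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

// Functional (black-box) test: cases come from the specification,
// not from reading the implementation.
class LeapYearFunctionalTest {
    @Test
    void casesDerivedFromTheSpecification() {
        assertTrue(Year.isLeap(2024));   // divisible by 4
        assertFalse(Year.isLeap(1900));  // century not divisible by 400
        assertTrue(Year.isLeap(2000));   // century divisible by 400
        assertFalse(Year.isLeap(2023));  // ordinary year
    }
}
```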
    • Sample test case – typical content
      • Name
      • Purpose
      • Pre-conditions for running the test
      • Define the input
      • Define the expected output
      • Test status
        • To be run
        • Passed
        • Failed
    • Sample test case – example
      • Name: Test Case 42 (add supplier)
      • Purpose: To test the normal flow of events for the Add Supplier use case
      • Pre-conditions for running the test: The Dental Office System has been initialized.
      • Define the input:
      • 1. The user selects the Add Supplier menu option.
      • 2. The user enters the following data into the form:
        • Supplier name: Acme Floss
        • Supplier address: 2601 N. Floyd Rd
        • Supplier phone: 972 883 4216
        • Supplier fax: 972 883 2349
        • Supplier web: www.AcmeFloss.com
      • 3. The user selects the OK button
      • Define the expected output: The system displays a message indicating that the supplier “Acme Floss” has been added.
      • Test status
        • To be run
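Test Case 42 could also be automated. The DentalOfficeSystem class below is a minimal invented stub, since the slide describes a manual test of a system we do not have:

```java
import java.util.ArrayList;
import java.util.List;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

class AddSupplierTest {

    // Invented stand-in for the real Dental Office System.
    static class DentalOfficeSystem {
        private final List<String> suppliers = new ArrayList<>();
        private String lastMessage = "";

        void addSupplier(String name, String address, String phone,
                         String fax, String web) {
            suppliers.add(name);
            lastMessage = "Supplier \"" + name + "\" has been added.";
        }

        String confirmationMessage() { return lastMessage; }
    }

    @Test
    void addSupplierNormalFlow() {
        // Pre-condition: the system has been initialized.
        DentalOfficeSystem system = new DentalOfficeSystem();

        // Input from the test case definition.
        system.addSupplier("Acme Floss", "2601 N. Floyd Rd",
                "972 883 4216", "972 883 2349", "www.AcmeFloss.com");

        // Expected output: a message indicating the supplier has been added.
        assertEquals("Supplier \"Acme Floss\" has been added.",
                system.confirmationMessage());
    }
}
```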
    • Acceptance Testing
      • Goal: Demonstrate system is ready for operational use
        • Choice of tests is made by client/sponsor
        • The acceptance test is performed by the client, not by the developer.
      • Requirements specification forms the contract between the customer and the developers
      • Customer uses the requirements to derive their own test cases…
    • Test Team [diagram]: The test team is drawn from the user, a professional tester, the test analyst, the system designer, and a configuration management specialist; the programmer is deliberately excluded as too familiar with the code.
    • Summary
      • Testing is still a black art, but many rules and heuristics are available
        • an active research area in software engineering
      • Testing consists of component testing (unit testing, integration testing) and system testing
      • Design Patterns can be used for component-based testing
      • Testing has its own lifecycle