Introduction & Manual Testing

    Presentation Transcript

    • Introduction & Manual Testing
      • Software Development Life Cycle
      • Software Life Cycle Models
      • Waterfall Model
      • Prototype Model
      • Rapid Application Development (RAD) Model
      • Spiral or Iterative Model
      • Component Assembly Model
      • Testing Fundamentals
      • Testing Objectives
      • Testing Information Flow
      • Test Case Design
      • White Box Testing
      • Basis Path Testing
      • Flow Graph Notation
      • Cyclomatic Complexity
      • Deriving Test Cases
      • Graphic Metrics
      • Control Structure Testing
      • Conditions Testing
      • Dataflow Testing
      • Loop Testing
      • Black Box Testing
      • Equivalence Partitioning
      • Boundary Value Analysis
      • Comparison Testing
      • Verification and Validation
      • Different Kinds of tests to be considered
    • SDLC Model (or) Linear Sequential Model (or) Classic Life Cycle Model
      • System/Information Engineering and Modeling
      • Software Requirements Analysis
      • System Analysis and Design
      • Code Generation
      • Testing
      • Maintenance
    • Quality, Quality Assurance, and Quality Control: Quality is meeting the requirements expected of the software, consistently and predictably.
      • Quality Assurance
      • Concentrates on the process of producing the products.
      • Defect-prevention oriented.
      • Usually done throughout the life cycle.
      • This is usually a staff function.
      • Examples : Reviews and Audits
      • Quality Control
      • Concentrates on specific products.
      • Defect-detection and correction oriented.
      • Usually done after the product is built.
      • This is usually a line function.
      • Examples : Software testing at various levels.
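The quality control characteristics above (defect detection on a specific product, done after the product is built) can be sketched as a unit test. The function and its pricing rule are purely illustrative, not from the slides:

```python
# A minimal quality-control sketch: a test that checks a specific
# product (a hypothetical discount function) against its requirement.
# Defect-detection oriented: it runs after the code is written and
# flags any deviation from the expected result.

def apply_discount(price, percent):
    """Return price reduced by percent (both assumed non-negative)."""
    return round(price * (1 - percent / 100), 2)

assert apply_discount(100.0, 10) == 90.0   # requirement: 10% off 100 is 90
assert apply_discount(50.0, 0) == 50.0     # requirement: 0% leaves price unchanged
print("quality control checks passed")
```

Quality assurance, by contrast, would review the process that produced `apply_discount` (design reviews, coding-standard audits) rather than checking its outputs.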
    • Testing, Verification, And Validation
      • Testing is the phase that follows coding and precedes deployment.
      • Verification is the process of evaluating a system or component to determine whether the products of a given phase satisfy the conditions imposed at the start of that phase.
      • Validation is the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
    • Quality Assurance = Verification; Quality Control = Validation = Testing
    • Waterfall Model
      • A Waterfall Model is characterized by three attributes:
      • The project is divided into separate distinct phases.
      • Each phase communicates to the next through pre-specified outputs.
      • When an error is detected, it is traced back one phase at a time until it is resolved in some earlier phase.
    • Overall business requirements. Software requirements. Planning. High-level design. Low-level design. Coding. Testing.
    • Prototyping Model
      • A Prototyping model uses constant user interaction, early in the requirements gathering stage, to produce a prototype.
      • The prototype is used to derive the system requirements specification (SRS) and can be discarded after the SRS is built.
      • An appropriate life cycle model is chosen for building the actual product after the user accepts the SRS.
    • Rapid Application Development (RAD) Model
      • RAD is a linear sequential software development process that emphasizes an extremely short development cycle. It includes the following phases:
      • Business Modeling.
      • Data Modeling.
      • Process Modeling.
      • Application Generation.
      • Testing and Turnover.
    • Spiral or Iterative Model
      • Most life cycle models can be derived as special cases of this model. The Spiral uses a risk management approach to software development. Some advantages of this model are:
      • Defers elaboration of low risk software elements.
      • Incorporates prototyping as a risk reduction strategy.
      • Gives an early focus to reusable software.
      • Accommodates life-cycle evolution, growth, and requirement changes.
      • Incorporates software quality objectives into the product.
      • Focuses on early detection of errors and design flaws.
      • Uses identical approaches for development and maintenance.
    • Component Assembly Model
      • Object technologies provide the technical framework for a component-based process model for software engineering.
      • The object-oriented paradigm emphasizes the creation of classes that encapsulate both data and the algorithms used to manipulate that data.
      • If properly designed and implemented, object-oriented classes are reusable across different applications and computer-based system architectures.
      • Component Assembly Model leads to software reusability.
      • The integration/assembly of already existing software components accelerates the development process.
    • Testing Fundamentals
      • Testing Objectives
      • Testing is the process of executing a program with the intent of finding errors.
      • A good test is one that has a high probability of finding an as yet undiscovered error.
      • A successful test is one that uncovers an as yet undiscovered error.
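The objectives above can be made concrete with a small illustration (not from the slides; the leap-year function and its bug are invented for the example): a good test case deliberately targets inputs likely to expose an as-yet-undiscovered error.

```python
def is_leap(year):
    # Deliberately buggy sketch: ignores the century (100/400) rules
    # of the Gregorian calendar.
    return year % 4 == 0

# A test with 2024 alone would pass and reveal nothing new.
# A test with 1900 (divisible by 4, but NOT a leap year) is a
# "successful" test in the slide's sense: it uncovers the defect.
print(is_leap(2024))  # True  (correct)
print(is_leap(1900))  # True  (wrong: 1900 was not a leap year)
```

The second case has a high probability of finding the error precisely because it probes a rule the implementation might have overlooked.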
    • Test Information Flow: (diagram slide) the software configuration and test configuration feed into testing; test results are compared against expected results to identify errors, which drive debugging and corrections; error-rate data feeds a reliability model, whose predicted reliability supports evaluation.
    • Test Case Design
      • Can be difficult at the initial stage.
      • Can test if a component conforms to specification – Black Box testing.
      • Can test if a component conforms to design – White Box Testing.
      • Testing cannot prove correctness, as not all execution paths can be tested.
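The black-box idea above, testing conformance to a specification without looking at the implementation, can be sketched as follows. The cases are derived only from a stated spec ("return the absolute value"), using Python's built-in `abs` as the component under test:

```python
# Black-box sketch: test cases come from the specification alone,
# never from the implementation's internal structure.
# Spec (assumed for illustration): f(x) returns |x| for any integer x.

spec_cases = [
    (-3, 3),   # negative input must be negated
    (0, 0),    # zero maps to zero
    (7, 7),    # positive input passes through
]

for x, expected in spec_cases:
    assert abs(x) == expected, f"spec violated for input {x}"
print("black-box cases passed")
```

A white-box test of the same component would instead inspect its branches and derive cases from them, as the next slide describes.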
    • White Box Testing
      • Tests the control structures of a procedural design. Test cases can be derived to ensure that:
      • All independent paths are exercised at least once.
      • All logical decisions are exercised for both true and false paths.
      • All loops are executed at their boundaries and within operational bounds.
      • All internal data structures are exercised to ensure validity.
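The four criteria above can be illustrated on a small function (invented for this sketch, not from the slides) with one decision and one loop. The test cases are derived from the code's structure: both outcomes of the decision, and the loop at its boundaries (zero iterations, one iteration) and within operational bounds:

```python
# White-box sketch: structural test cases for a function with
# one loop and one logical decision.

def sum_positives(values):
    total = 0
    for v in values:        # loop: exercise 0, 1, and many iterations
        if v > 0:           # decision: exercise both True and False
            total += v
    return total

assert sum_positives([]) == 0           # loop boundary: zero iterations
assert sum_positives([5]) == 5          # one iteration, True branch
assert sum_positives([-2]) == 0         # one iteration, False branch
assert sum_positives([1, -1, 2]) == 3   # many iterations, both branches
print("all structural cases passed")
```

Together these cases exercise every independent path through the function, which is the goal basis path testing formalizes via cyclomatic complexity.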