SOFTWARE TESTING
By Sabeen Amjad
Outline
 Formal Definition of Software Testing
 What is a Bug?
 Test Case
 Software Testing Life Cycle
 Waterfall Model
 V Model
 Modified V Model
 Spiral Model
 Agile Model
Software Testing
 Testing is the process of executing a program with the intention of finding errors.
Formal Definition of Software Testing
 Software testing is a formal process carried out by a specialized testing team in
which a software unit, several integrated software units or an entire software
package are examined by running the programs on a computer. All the
associated tests are performed according to approved test procedures on
approved test cases.
Formal Definition of Software Testing
 Formal:
 Software test plans are part of the project’s development and quality plans, and are scheduled in advance
 The test plan is often signed off by both the developer and the customer
 Ad hoc examination by a colleague or regular checks by the programming team leader cannot be considered software tests
Formal Definition of Software Testing
 Specialized testing team:
 An independent team or external consultants who specialize in testing are
assigned to perform these tasks to
 Eliminate bias
 Guarantee effective testing
 Tests performed by the developers themselves will yield poor results
 Unit tests continue to be performed by developers in many organizations
Formal Definition of Software Testing
 Running the programs:
 Any quality assurance activity that does not involve running the software (for example, code inspection) cannot be considered a test.
 Approved test procedures:
 The testing process is performed according to a test plan and testing procedures
 These are approved SQA procedures adopted by the developing organization
 Approved test cases:
 The test cases to be examined are defined in full by the test plan.
 No omissions or additions are expected to occur during testing.
What is a Bug?
 Informally, it is “what happens when software fails”, whether the failure was
o Inconvenient
o Catastrophic
 Terms for software failure: fault, failure, defect, error, bug, anomaly, incident, variance, inconsistency, problem
What is a Bug?
 Formally, we say that a software bug occurs when one or more of the following
five rules is true: when the software
o doesn't do something that the product specification says it should do.
o does something that the product specification says it shouldn't do.
o does something that the product specification doesn't mention.
o doesn't do something that the product specification doesn't mention but
should.
o is difficult to understand, hard to use, slow, or will be viewed by the end user
as just plain not right.
Test Case
 Test Case
 A set of
 Input values
 Execution preconditions
 Expected results
 Execution post conditions
 Developed for a particular objective or test condition, such as
 To exercise a particular program path
 To verify compliance with a specific requirement.
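The four parts of a test case can be sketched in code. This is a minimal sketch; the `login()` function is a hypothetical stand-in for the system under test, not anything from the slides:

```python
# A minimal sketch of one test case, assuming a hypothetical login() function.
# Each part of the definition above maps to a line below.

def login(username, password):
    # Stand-in system under test (assumption, not a real API):
    # accepts exactly one known credential pair.
    return username == "alice" and password == "s3cret"

def test_login_with_valid_credentials():
    # Execution precondition: the known user exists (hard-coded here).
    username, password = "alice", "s3cret"   # input values
    result = login(username, password)       # exercise the program path
    assert result is True                    # expected result
    # Execution post-condition: a read-only check changes no state.

test_login_with_valid_credentials()
print("test passed")
```

Each comment marks which element of the definition (input values, precondition, expected result, post-condition) the line represents.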
Test Case Template
 Below are the standard fields of a sample test case template:
 Test case ID:
 Unique ID for each test case.
 Follow some convention to indicate types of test. E.g. ‘TC_UI_1’ indicating ‘user
interface test case #1’.
 Product / Ver./ Module:
 Mention product name.
 Mention name of main module or sub module.
 Mention version information of the product.
 Test case Version: (Optional)
 Mention the test case version number.
 Use Case Reference(s):
 Mention the use case reference for which the test case is written.
Test Case Template
 GUI Reference(s) :(Optional)
 Mention the GUI reference for which the test case is written.
 QA Test Engineer / Test Designed By:
 Name of the tester
 Test Designed Date:
 Date when test case is written.
 Test Executed By:
 Name of tester who executed this test.
 To be filled after test execution.
 Test Execution Date:
 Date when test case is executed.
Test Case Template
 Test Title/Name:
 Test case title, e.g. ‘Verify login page with valid username and password’.
 Test Case Summary/Description:
 Describe test objective.
 Pre-Requisite/Pre-condition:
 Any prerequisite that must be fulfilled before execution of this test case.
 List all pre-conditions in order to successfully execute this test case.
 Dependencies: (Optional)
 Mention any dependencies on other test cases or test requirements.
 Test Steps:
 List all test execution steps in detail.
 Write test steps in the order in which these should be executed.
 Make sure to provide as much detail as you can.
Test Case Template
 Test Data/Input Specification:
 The test data used as input for the test case.
 You can provide different data sets with exact values to be used as an input.
 Examples: If you’re testing Calculator, this may be as simple as 1+1.
 If you’re testing cellular telephone switching software, there could be
hundreds or thousands of input conditions.
 If you’re testing a file-based product, it would be the name of the file and a
description of its contents.
 Expected Result/ Output Specification:
 What should be the system output after test execution?
 Describe the expected result in detail including message/error that should be
displayed on screen.
 Examples: Did 1+1 equal 2?
 Were the thousands of output variables set correctly in the cell phone
software?
 Did all the contents of the file load as expected?
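The calculator example above can be written as data-driven checks. `add()` is a hypothetical stand-in; each tuple pairs an input specification with its expected result:

```python
# A sketch of data-driven test cases for a tiny calculator.
# add() is an assumed function, not from the slides.

def add(a, b):
    return a + b

# (input specification, expected result) pairs
cases = [
    ((1, 1), 2),     # the "1 + 1" example from the slide
    ((0, 0), 0),
    ((-3, 5), 2),
]

for (a, b), expected in cases:
    actual = add(a, b)
    status = "Pass" if actual == expected else "Fail"
    print(f"add({a}, {b}) = {actual}, expected {expected}: {status}")
```

The same table-of-cases shape scales from one input pair up to the thousands of conditions mentioned for switching software.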
Test Case Template
 Actual result:
 The actual test result should be filled in after test execution.
 Describe system behaviour after test execution.
 Status (Pass/Fail):
 If the actual result does not match the expected result, mark this test as failed; otherwise mark it as passed.
 Notes/Comments/Questions: To support above fields if there are some special
conditions which can’t be described in any of the above fields or there are
questions related to the expected or actual results, mention those here.
 Post-condition:
 What should be the state of the system after executing this test case?
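The template fields above can be sketched as a record type. The field names are assumptions drawn from the slide headings, not a standard from any test tool:

```python
# A sketch of the test case template as a record, with the slide's
# pass/fail rule applied when the actual result is recorded.
from dataclasses import dataclass

@dataclass
class TestCase:
    test_case_id: str        # e.g. 'TC_UI_1'
    title: str
    steps: list
    expected_result: str
    actual_result: str = ""  # filled in after execution
    status: str = "Not Run"  # becomes Pass or Fail

    def record_result(self, actual: str) -> None:
        self.actual_result = actual
        # Status rule from the slide: fail unless actual matches expected.
        self.status = "Pass" if actual == self.expected_result else "Fail"

tc = TestCase("TC_UI_1", "Verify login page",
              ["Open login page", "Enter valid credentials", "Submit"],
              expected_result="User is logged in")
tc.record_result("User is logged in")
print(tc.test_case_id, tc.status)
```

In practice such records live in a spreadsheet or test-management tool; the dataclass just makes the fields and the status rule concrete.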
Test Case Template
 Environmental needs: (Optional)
 Environmental needs that are necessary to run the test case include:
 Hardware
 Software
 Test tools
 Facilities
 Staff and so on.
 Special procedural requirements: (Optional)
 This section describes anything unusual that must be done to perform the
test.
 Example: Testing WordPad probably doesn’t need anything special, but
testing nuclear power plant software might.
Level of Detail For Test Case
 If you follow this level of documentation, you could be writing at least a page of
descriptive text for each test case you identify.
 Thousands of test cases could take thousands of pages of documentation.
 The project could be outdated by the time you finish writing.
 Many government projects and industries are required to document their test
cases to this level.
 In other cases, you can take some shortcuts.
 Taking a shortcut doesn’t mean dismissing or neglecting important
information.
Level of Detail For Test Case
 You can use the following test case format for a printer compatibility matrix
 All the other information that goes with a test case is most likely common to all
these cases and could be written once and attached to the table
Testing Life Cycle
[Diagram: Project Initiation → System Study → Test Plan → Design Test Cases → Test Environment Setup → Execute Test Cases (manual/automated) → Report Defects → Regression Test → Analysis → Summary Reports]
Testing Life Cycle
 Project Initiation
 All the necessary analysis is undertaken to allow the project to be planned.
 System Study
 To test, we need to know the product functionality (understanding the product)
 Test Plan
 It is a systematic approach to testing a system or software.
 Contains a detailed understanding of what the eventual testing workflow will
be or should be.
 Design Test Cases:
 Test case is the specific procedure of testing a particular requirement by
giving specific input to the system and defining the expected results.
 Executing (Manual):
 Usually test cases are written by person A and executed by person B.
Testing Life Cycle
 Report defects:
 Reporting defects in an issue logger (e.g., JIRA, HP QC)
 Issues will be fixed by the DEV team.
 Regression testing:
 Verify whether the new functionality or bug correction affected the previous
behaviour
 Analysis:
 The results of the testing are analyzed here.
 Test Summary Report:
 An important deliverable prepared after testing is completed.
 This document explains the details and activities of the testing performed for
the project to the respective stakeholders, such as senior management and
the client.
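The report-defects and regression-testing steps above can be sketched as rerunning the whole existing suite after a fix. The `discount()` function and its clamping fix are hypothetical, chosen only to illustrate the idea:

```python
# A sketch of regression testing: after fixing one defect, the entire
# existing suite is rerun to confirm earlier behaviour still holds.

def discount(price, percent):
    # Bug fix applied here: percent is now clamped to the 0-100 range.
    percent = max(0, min(100, percent))
    return price * (1 - percent / 100)

# The existing suite plus one new test for the fixed defect; every
# previously passing test is executed again, not just the new one.
regression_suite = [
    (lambda: discount(100, 10) == 90.0,  "10% off 100"),
    (lambda: discount(100, 0) == 100.0,  "no discount"),
    (lambda: discount(100, 150) == 0.0,  "new test for the fixed defect"),
]

for check, name in regression_suite:
    print(name, "->", "Pass" if check() else "Fail")
```

If the clamp had accidentally broken the ordinary cases, the first two tests would catch it — which is exactly what regression testing is for.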
Product Development Activity Represented as a Waterfall Model
[Diagram: Overall Business Requirements → Software Requirements → High Level Design → Low Level Design → Coding → Testing]
Phases of Testing for Different Development Phases
[Diagram: each development phase maps to a testing phase — Coding ↔ Unit Testing; Low Level Design ↔ Component Testing; High Level Design ↔ Integration Testing; Software Requirements ↔ System Testing; Overall Business Requirements ↔ Acceptance Testing]
The V Model
 Overall Business Requirements
 These requirements cover hardware, software, and operational requirements
 Software Requirements
 Next step is moving from overall requirements to software requirements.
 High Level Design
 The software system is envisioned as a set of subsystems that work together
 Low Level Design
 The high level design is translated into a more detailed, low level design. At
this stage, data structures, algorithm choices, table layouts, processing logic,
exception conditions, etc. are decided
 Coding
 Program code is written in appropriate languages
The V Model
 Unit Testing
 Coding produces several program units; each of these units has to be tested
independently before combining them into components. The testing of these
program units forms unit testing.
 Component Testing
 The components that are the outputs of low level design have to be tested
independently before being integrated. This type of testing is component
level testing.
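The distinction between unit and component testing can be sketched with two hypothetical functions — a unit tested in isolation first, then the component that combines units:

```python
# A sketch of unit vs. component testing, with assumed function names.
# word_count is one program unit; summarize is a component built from it.

def word_count(text):          # one program unit
    return len(text.split())

def summarize(text):           # a component combining units
    return f"{word_count(text)} words"

# Unit test: the unit is exercised in isolation, before combining.
assert word_count("to be or not to be") == 6

# Component test: the combined behaviour is checked afterwards.
assert summarize("hello world") == "2 words"
print("unit and component checks passed")
```

Testing the unit first means a failure in `summarize` points at the combination logic, not at `word_count`.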
The V Model
 Integration Testing
 High level design views the system as being made up of interoperating and
integrated subsystems. The individual subsystems should be integrated and
tested. This type of testing corresponds to integration testing.
 System Testing
 Before product deployment, the product is tested as a whole to make sure
that all the software requirements are satisfied. This testing of the entire
software system is system testing.
 Acceptance Testing
 For overall business requirements: eventually, whatever software is developed
should fit into and work in the overall context and be accepted by the end
user. This testing is acceptance testing.
The V Model
 Planning of testing for different development phases
 The planning phase is not shown as a separate entity since it is common to all
testing phases.
 It is still not possible to execute any of these tests until the product is actually
built.
 In other words, the step called "testing" is now broken down into different
sub-steps.
 It is still the case that all the testing execution related activities are done only
at the end of the life cycle.
The V Model
 Who should design the tests?
 Execution of the tests cannot be done until the product is built, but the design of
tests can be carried out much earlier.
 The skill sets required for designing each type of test belong to the people
who actually create the corresponding artifact.
 For example,
 Acceptance tests should be designed by those who formulate the overall
business requirements (the customers, where possible).
 Integration tests should be designed by those who know how the system is
broken into subsystems, i.e. those who perform the high level design.
 Again, the people doing development know the innards of the program code
and thus are best equipped to design the unit tests.
The V Model
 Benefits of early design
 We achieve more parallelism and reduce the end-of-cycle time taken for testing.
 By designing tests for each activity upfront, we are building in better upfront
validation, thus again reducing last-minute surprises.
 Tests are designed by people with appropriate skill sets.
V-Model
[Diagram: the left arm of the V (verification) pairs each development phase with a test design activity, and the right arm (validation) executes the corresponding tests — Overall Business Requirements ↔ Acceptance Test Design ↔ Acceptance Testing; Software Requirements ↔ System Test Design ↔ System Testing; High Level Design ↔ Integration Test Design ↔ Integration Testing; Low Level Design ↔ Component Test Design ↔ Component Testing; Coding ↔ Unit Test Design ↔ Unit Testing]
V-Model
 Advantages of the V-Model
 Testing activities like planning and test design happen well before coding.
 This saves a lot of time, hence higher chances of success than with the
waterfall model.
 Proactive defect tracking – defects are found at an early stage
 It avoids the downward flow of defects.
 Disadvantages of the V-Model
 It is very rigid and the least flexible.
 No early prototypes of the software are produced.
 If any changes happen midway, then the test documents along with the
requirement documents have to be updated.
Modified V-Model
 The V-Model makes an assumption:
 Even though the activity of test execution is split into execution of different
types of tests, execution cannot happen until the entire product is built.
 For a given product, the different units and components can be in different
stages of evolution
 For example, one unit may be in development and thus subject to unit testing,
whereas another unit may be ready for component testing
 The V-Model does not explicitly address this parallelism, which is commonly
found in product development
Modified V-Model
 In the modified V Model,
 Each unit, component, or module is given explicit exit criteria to pass on to
the subsequent stage
 The units, components, or modules that satisfy a given phase of testing
move to the next phase of testing where possible.
 They do not wait for all the other units, components, or modules before
moving from one phase of testing to another.
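The explicit exit criteria can be sketched as a small check that advances each module independently. The phase names follow the deck; the pass/total counts and module names are assumptions:

```python
# A sketch of the modified V-model's exit criteria: each module advances
# to its next test phase on its own, without waiting for the others.

PHASES = ["unit", "component", "integration", "system"]

def next_phase(module):
    # Exit criterion (assumed): all tests in the current phase pass.
    if module["passed"] == module["total"]:
        i = PHASES.index(module["phase"])
        if i + 1 < len(PHASES):
            module["phase"] = PHASES[i + 1]
    return module["phase"]

# One module can move on while another is still in unit testing.
login  = {"phase": "unit", "passed": 12, "total": 12}
report = {"phase": "unit", "passed": 7,  "total": 12}
print("login:",  next_phase(login))   # advances to component testing
print("report:", next_phase(report))  # stays in unit testing
```

Real exit criteria are usually richer (coverage thresholds, open-defect counts), but the per-module decision is the point the model adds over the plain V-model.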
Modified V-Model
The Spiral Model
 It represents a risk-driven approach, i.e., the assessment of risks determines the
next project phase.
 The spiral model combines aspects of the waterfall model, the iterative
enhancement model, and prototyping.
 Each spiral cycle starts with the identification of the objectives of the product part
being elaborated (e.g., performance or functionality), the different alternatives of
implementing the product part (e.g., different designs or reusing existing
components), and the constraints for each of the identified alternatives (e.g., cost
or schedule).
 The next step evaluates the identified alternatives and identifies and resolves
risks that come with the different alternatives.
 During the third step, the development approach that best fits the risks is chosen.
 Finally, the next phases are planned, and the complete cycle is reviewed by the
stakeholders.
The Spiral Model
 Advantages
 The third step accommodates features of other process models as needed. The
spiral model is therefore very flexible.
 The explicit consideration of risks avoids many of the difficulties of other process
models.
 Unattractive alternatives are identified and eliminated early.
 Challenges
 It relies heavily on the organization’s expertise with respect to risk assessment –
therefore, a bad risk assessment may lead to the selection of bad alternatives or
development approaches
The Spiral Model
Agile Testing
 A software testing practice that follows the principles of agile software
development is called agile testing.
 Agile is an iterative methodology
 Requirements evolve through collaboration between the customer and self-
organizing teams.
 Agile aligns development with customer needs.
Agile Testing
 Principles of Agile Testing
 Testing is not a phase:
 Agile teams test continuously, and continuous testing is the only way to
ensure continuous progress.
 Testing moves the project forward:
 In conventional methods, testing is considered a quality gate; agile testing
provides feedback on an ongoing basis so that the product meets the
business demand.
 Everyone Tests:
 In a conventional SDLC only the test team tests, while in agile everyone,
including developers and business analysts, tests the application.
Agile Testing
 Shortening feedback response time:
 In a conventional SDLC, the business team gets to know about the product
only during acceptance testing, while in agile they are involved in each and
every iteration.
 Continuous feedback shortens the feedback response time, and the cost
involved in fixing defects is also lower.
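The shortened feedback loop can be sketched as running the full test suite after every change, rather than once at the end. The feature names and checks below are hypothetical:

```python
# A sketch of continuous testing: the suite runs after every change, so a
# defect introduced in iteration N is reported in iteration N, not at the end.

def run_suite(features):
    # Each check validates one delivered feature (assumed criteria).
    return all(check(features) for check in [
        lambda f: "login" not in f or f["login"] == "works",
        lambda f: "search" not in f or f["search"] == "works",
    ])

features = {}
for iteration, (name, state) in enumerate(
        [("login", "works"), ("search", "broken"), ("search", "works")], 1):
    features[name] = state           # the change made this iteration
    ok = run_suite(features)         # immediate feedback, every iteration
    print(f"iteration {iteration}: suite {'passed' if ok else 'FAILED'}")
```

The broken search is flagged in the very iteration that introduced it — the fix is cheap because the change is still fresh.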

Editor's Notes

  • #9 Product Specification: “an agreement among the software development team. It defines the product they are creating, detailing what it will be, how it will act, what it will do, and what it won't do”