B. Computer Sci. (SE) (Hons.)
CSEB233: Fundamentals of Software Engineering
Software Verification and Validation
Objectives
- Discuss the fundamental concepts of software verification and validation
- Conduct software testing and determine when to stop
- Describe several types of testing: unit testing, integration testing, validation testing, and system testing
- Produce a standard for software test documentation
- Use a set of techniques for the creation of test cases that meet overall testing objectives and the testing strategies
Software Verification &
Validation
Fundamental Concepts
Verification & Validation (1)
- V&V must be applied at each framework activity in the software process
- Verification refers to the set of tasks that ensure that software correctly implements a specific function
- Validation refers to a different set of tasks that ensure that the software that has been built is traceable to customer requirements
- Boehm states this another way:
  - Verification: "Are we building the product right?"
  - Validation: "Are we building the right product?"
Verification & Validation (2)
- V&V have two principal objectives:
  - Discover defects in a system
  - Assess whether or not the system is useful and usable in an operational situation
- V&V should establish confidence that the software is fit for purpose
  - This does NOT mean completely free of defects
  - Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed
Verification & Validation (3)
- V&V activities include:
  - SQA activities:
    - Technical reviews
    - Quality and configuration audits
    - Performance monitoring
    - Simulation
    - Feasibility study
    - Documentation review
    - Database review
    - Algorithm analysis
  - Testing activities:
    - Development testing
    - Qualification testing
    - Acceptance testing
    - Installation testing
Software Verification &
Validation
Software Testing
Software Testing
- The process of exercising a program with the specific intent of finding errors prior to delivery to the end user
- Must be planned carefully to avoid wasting development time and resources, and conducted systematically
- (Figure: what testing shows)
Who Tests the Software? (1)
- Developer
  - Understands the system
  - but will test "gently"
  - Driven by "delivery"
- Independent Tester
  - Must learn about the system
  - Will attempt to break it
  - Driven by quality
Who Tests the Software? (2)
- Misconceptions:
  - The developer should do no testing at all
  - Software should be "tossed over the wall" to strangers who will test it mercilessly
  - Testers are not involved with the project until it is time for it to be tested
Who Tests the Software? (3)
- The developer and Independent Test Group (ITG) must work together throughout the software project to ensure that thorough tests will be conducted
  - An ITG does not have the "conflict of interest" that the software developer might experience
  - While testing is conducted, the developer must be available to correct errors that are uncovered
Testing Strategy (1)
- Identifies the steps to be undertaken; when these steps are undertaken; and how much effort, time, and resources are required
- Any testing strategy must incorporate:
  - Test planning
  - Test case design
  - Test execution
  - Resultant data collection and evaluation
- Should provide guidance for practitioners and a set of milestones for the manager
Testing Strategy (2)
- Characteristics of software testing strategies proposed in the literature:
  - To perform effective testing, you should conduct effective technical reviews
    - By doing this, many errors will be eliminated before testing commences
  - Testing begins at the component level and works "outward" toward the integration of the entire computer-based system
Testing Strategy (3)
ī­ Different

testing techniques are appropriate for different
software engineering approaches and at different
points in time.
ī­ Testing is conducted by the developer of the software
and (for large projects) an independent test group.
ī­ Testing and debugging are different activities, but
debugging must be accommodated in any testing
strategy.
Overall Software Testing Strategy
- May be viewed in the context of the spiral
- Begins by 'testing-in-the-small' and moves toward 'testing-in-the-large'
Overall Software Testing Strategy
- Unit Testing
  - focuses on each unit of the software (e.g., component, module, class) as implemented in source code
- Integration Testing
  - focuses on issues associated with verification and program construction as components begin interacting with one another
Overall Software Testing Strategy
- Validation Testing
  - provides assurance that the software meets all functional, behavioral, and performance requirements; the validation criteria are established during requirements analysis
- System Testing
  - verifies that all system elements mesh properly and that overall system function and performance are achieved
When to Stop Testing?
- Testing is potentially endless
  - We cannot test until all the defects are unearthed and removed, which is impossible
- At some point, we have to stop testing and ship the software
  - The question is: when?
- Realistically, testing is a trade-off between budget, time, and quality
- It is driven by profit models (Pan, 1999)
When to Stop Testing?
The pessimistic, and unfortunately most often used
approach is to stop testing whenever some, or any
of the allocated resources - time, budget, or test
cases - are exhausted
īą The optimistic stopping rule is to stop testing when
either reliability meets the requirement, or the
benefit from continuing testing cannot justify the
testing cost
īą
Software Verification &
Validation
Types of Test
Unit Testing
- Focuses on assessing:
  - internal processing logic and data structures within the boundaries of a component (module)
  - proper information flow across module interfaces
  - local data, to ensure that integrity is maintained
  - boundary conditions
  - basis (independent) paths
  - all error-handling paths
- If resources are too scarce for comprehensive unit testing, select critical or complex modules and unit test only these
Integration Testing
- After unit testing, individual modules are combined into a system
- A question commonly asked once all modules have been unit tested:
  - "If they work individually, why do you doubt that they'll work when we put them together?"
- The problem is "putting them together": interfacing
  - Data can be lost across an interface
  - Global data structures can present problems
  - Subfunctions, when combined, may not produce the desired function
Integration Testing
- Incremental integration testing strategies:
  - Bottom-up integration
  - Top-down integration
  - Regression testing
  - Smoke testing
Bottom-up Integration
- An approach where the lowest-level modules are tested first, then used to facilitate the testing of higher-level modules
  - The process is repeated until the module at the top of the hierarchy is tested
  - Top-level modules are the most important, yet are tested last
- Helpful only when all or most of the modules at the same development level are ready
Bottom-up Integration
The steps:
- Test D and E individually, using a dummy program: a "driver"
- Low-level components are combined into clusters that perform a specific software function
- The cluster is tested
- Test C such that it calls D/E; if an error occurs, we know that the problem is in C or in the interface between C and D/E
- Drivers are removed and clusters are combined, moving upward in the program structure
Top-down Integration
The steps:
1. The main/top module is used as a test driver, and stubs are substituted for the modules directly subordinate to it.
2. Subordinate stubs are replaced one at a time with real modules (following a depth-first or breadth-first approach).
3. Tests are conducted as each module is integrated.
4. On completion of each set of tests, another stub is replaced with a real module.
5. Regression testing may be used to ensure that new errors are not introduced.
6. The process continues from step 2 until the entire program structure is built.
Top-down Integration
Example steps:
- Test A individually (use stubs for the other modules)
- Depending on the integration approach selected, subordinate stubs are replaced one at a time with actual components
- In a 'depth-first' structure: test A such that it calls B (use stubs for the other modules)
  - If an error occurs, we know that the problem is in B or in the interface between A and B
- Replace stubs one at a time, 'depth-first', and re-run the tests
Regression Testing (1)
- Focuses on retesting after changes are made
  - Whenever software is corrected, some aspect of the software configuration is changed
    - e.g., the program, its documentation, or the data that support it
  - Regression testing helps to ensure that changes (due to testing or for other reasons) do not introduce unintended behavior or additional errors
Regression Testing (2)
- In traditional regression testing, we reuse the same tests
- In risk-oriented regression testing, we test the same areas as before, but we use different (increasingly complex) tests
- Regression testing may be conducted manually, by re-executing a subset of all test cases, or by using automated capture/playback tools
Smoke Testing (1)
- A common approach for creating "daily builds" for product software
- Software components that have been translated into code are integrated into a "build"
- A build includes all data files, libraries, reusable modules, and engineered components that are required to implement one or more product functions
- A series of tests is designed to expose errors that will keep the build from properly performing its function
Smoke Testing (2)
- The intent should be to uncover "show stopper" errors that have the highest likelihood of throwing the software project behind schedule
- The build is integrated with other builds, and the entire product (in its current form) is smoke tested daily
- The integration approach may be top-down or bottom-up
Validation Testing (1)
- Focuses on uncovering errors at the software requirements level
- The SRS might contain 'Validation Criteria' that form the basis for a validation-testing approach
Validation Testing (2)
- Validation-test criteria:
  - all functional requirements are satisfied
  - all behavioral characteristics are achieved
  - all content is accurate and properly presented
  - all performance requirements are attained
  - documentation is correct, and
  - usability and other requirements are met
Validation Testing (3)
- An important element of the validation process is a configuration review/audit
  - Ensures that all elements of the software configuration have been properly developed, are cataloged, and have the necessary detail to strengthen the support activities
Validation Testing (4)
- A series of acceptance tests is conducted to enable the customer to validate all requirements
  - To make sure the software works correctly for the intended user in his or her normal work environment
  - Alpha test: a version of the complete software is tested by the customer under the supervision of the developer, at the developer's site
  - Beta test: a version of the complete software is tested by the customer at his or her own site, without the developer being present
System Testing (1)
- A series of different tests to verify that system elements have been properly integrated and perform allocated functions
- Types of system tests:
  - Recovery testing
  - Security testing
  - Stress testing
  - Performance testing
  - Deployment testing
System Testing (2)
- Recovery Testing
  - forces the software to fail in a variety of ways and verifies that recovery is properly performed
- Security Testing
  - verifies that protection mechanisms built into a system will, in fact, protect it from improper penetration
- Stress Testing
  - executes a system in a manner that demands resources in abnormal quantity, frequency, or volume
System Testing (3)
- Performance Testing
  - tests the run-time performance of software within the context of an integrated system
- Deployment Testing
  - examines all installation procedures and specialized installation software that will be used by customers
  - examines all documentation that will be used to introduce the software to end users
Software Verification &
Validation
Software Test Documentation
Software Test Documentation (1)
- IEEE 829-2008, Standard for Software Test Documentation
  - an IEEE standard that specifies the form of a set of documents for use in eight defined stages of software testing
- The documents are:
  - Test Plan
  - Test Design Specification
  - Test Case Specification
  - Test Procedure Specification
  - Test Item Transmittal Report
  - Test Log
  - Test Incident Report
  - Test Summary Report
Software Test Documentation (2)
- Test Plan: a management planning document that shows:
  - How the testing will be done, including System Under Test (SUT) configurations
  - Who will do it
  - What will be tested
  - How long it will take (may vary, depending upon resource availability)
  - What the test coverage will be, i.e. what quality level is required
Software Test Documentation (3)
- Test Design Specification: details the test conditions and the expected results, as well as the test pass criteria
- Test Procedure Specification: details how to run each test, including any set-up preconditions and the steps that need to be followed
Software Test Documentation (4)
- Test Item Transmittal Report: reports on when tested software components have progressed from one stage of testing to the next
- Test Log: records which test cases were run, who ran them, in what order, and whether each test passed or failed
- Test Incident Report: details, for any test that failed, the actual versus expected result, and other information intended to throw light on why a test has failed
Software Test Documentation (5)
- Test Summary Report:
  - A management report providing any important information uncovered by the tests accomplished, including assessments of the quality of the testing effort, the quality of the software system under test, and statistics derived from Incident Reports
  - The report also records what testing was done and how long it took, in order to improve any future test planning
  - This final document is used to indicate whether the software system under test is fit for purpose, according to whether or not it has met the acceptance criteria defined by the project stakeholders
Software Verification &
Validation
Creating Test Cases
Test-case Design (1)
- Focuses on a set of techniques for the creation of test cases that meet overall testing objectives and the testing strategies
- These techniques provide systematic guidance for designing tests that:
  - Exercise the internal logic and interfaces of every software component/module
  - Exercise the input and output domains of the program to uncover errors in program function, behavior, and performance
Test-case Design (2)
For conventional applications, software is tested from two perspectives:
- 'White-box' testing
  - Focuses on the program control structure (internal program logic)
  - Test cases are derived to ensure that all statements in the program have been executed at least once during testing and that all logical conditions have been exercised
  - Performed early in the testing process
- 'Black-box' testing
  - Examines some fundamental aspect of a system with little regard for the internal logical structure of the software
  - Performed during later stages of testing
White-box Testing (1)
- Using the white-box testing method, you may derive test cases that:
  - Guarantee that all independent paths within a module have been exercised at least once
  - Exercise all logical decisions on their true and false sides
  - Execute all loops at their boundaries and within their operational bounds
  - Exercise internal data structures to ensure their validity
- Example method: basis path testing
White-box Testing (2)
- Basis path testing:
  - Test cases derived to exercise the basis set are guaranteed to execute every statement in the program at least once during testing
Deriving Test Cases (1)
- Steps to derive test cases by applying the basis path testing method:
  1. Using the design or code, draw a corresponding flow graph.
     - The flow graph depicts logical control flow using the notation illustrated on the next slide.
     - Refer to Figure 18.2 on page 486 for a comparison between a flowchart and a flow graph.
  2. Calculate the cyclomatic complexity V(G) of the flow graph.
  3. Determine a basis set of independent paths.
  4. Prepare test cases that will force execution of each path in the basis set.
Deriving Test Cases (2)
- Flow graph notation (figure): sequence, IF, WHILE, UNTIL, and CASE constructs
Drawing Flow Graph: Example

void foo (float y, float a[], int n)
{
    float z;
    float x = sin(y);
    if (x > 0.01)               // node 1
        z = tan(x);             // node 2
    else
        z = cos(x);             // node 3
    for (int i = 0; i < x; ++i) // nodes 4, 5
    {
        a[i] = a[i] * z;        // node 6
        cout << a[i];           // node 7
    }
}                               // node 8

(Figure: the corresponding flow graph, with predicate nodes 1 and 5 and regions R1-R3.)
Deriving Test Cases (3)
- The arrows on the flow graph, called edges or links, represent flow of control and are analogous to flowchart arrows
- Areas bounded by edges and nodes are called regions
  - When counting regions, we include the area outside the graph as a region
Deriving Test Cases: Example
Step 1: Draw a flow graph
Deriving Test Cases: Example
Step 2: Calculate the cyclomatic complexity, V(G)
- Cyclomatic complexity can be used to count the minimum number of independent paths
- A number of industry studies have indicated that the higher the V(G), the higher the probability of errors
- The SEI provides the following basic risk assessment based on the value of V(G):

  Cyclomatic complexity | Risk evaluation
  1 to 10               | A simple program, without very much risk
  11 to 20              | A more complex program, moderate risk
  21 to 50              | A complex, high-risk program
  > 50                  | An untestable program (very high risk)
Deriving Test Cases: Example
- Ways to calculate V(G):
  - V(G) = the number of regions of the flow graph
  - V(G) = E - N + 2 (where E is the number of edges and N the number of nodes)
  - V(G) = P + 1 (where P is the number of predicate nodes in the flow graph, i.e. nodes that contain a condition)
- Example:
  - V(G) = number of regions = 4
  - V(G) = E - N + 2 = 16 - 14 + 2 = 4
  - V(G) = P + 1 = 3 + 1 = 4
Deriving Test Cases: Example
Step 3: Determine a basis set of independent paths
- Path 1: 1, 2, 3, 4, 5, 6, 7, 8, 12
- Path 2: 1, 2, 3, 12
- Path 3: 1, 2, 3, 4, 5, 9, 10, 3, ...
- Path 4: 1, 2, 3, 4, 5, 9, 11, 3, ...
Step 4: Prepare test cases
- Test cases should be derived so that all of these paths are executed
- A dynamic program analyser may be used to check that paths have been executed
Summary (1)
- Software testing plays an extremely important role in V&V, but many other SQA activities are also necessary
- Testing must be planned carefully to avoid wasting development time and resources, and conducted systematically
- The developer and ITG must work together throughout the software project to ensure that thorough tests will be conducted
Summary (2)
- The software testing strategy is to begin by 'testing-in-the-small' and move toward 'testing-in-the-large'
- The IEEE 829-2008 standard specifies a set of documents for use in eight defined stages of software testing
- The 'white-box' and 'black-box' techniques provide systematic guidance for designing test cases
- We need to know when is the right time to stop testing
THE END
Copyright © 2013
Mohd. Sharifuddin Ahmad, PhD
College of Information Technology

18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 

Recently uploaded (20)

How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
PSYCHIATRIC History collection FORMAT.pptx
PSYCHIATRIC   History collection FORMAT.pptxPSYCHIATRIC   History collection FORMAT.pptx
PSYCHIATRIC History collection FORMAT.pptx
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website App
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 

08 fse verification

• 1. B. Computer Sci. (SE) (Hons.)
CSEB233: Fundamentals of Software Engineering
Software Verification and Validation
• 2. Objectives
- Discuss the fundamental concepts of software verification and validation
- Conduct software testing and determine when to stop
- Describe several types of testing: unit testing, integration testing, validation testing, and system testing
- Produce a standard for software test documentation
- Use a set of techniques for the creation of test cases that meet overall testing objectives and the testing strategies
• 4. Verification & Validation (1)
- V&V must be applied at each framework activity in the software process
- Verification refers to the set of tasks that ensure that software correctly implements a specific function
- Validation refers to a different set of tasks that ensure that the software that has been built is traceable to customer requirements
- Boehm states this another way:
  - Verification: "Are we building the product right?"
  - Validation: "Are we building the right product?"
• 5. Verification & Validation (2)
- V&V have two principal objectives:
  - Discover defects in a system
  - Assess whether or not the system is useful and usable in an operational situation
- V&V should establish confidence that the software is fit for purpose
  - This does NOT mean completely free of defects
  - Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed
• 6. Verification & Validation (3)
- V&V (SQA) activities include:
  - SQA activities: technical reviews; quality and configuration audits; performance monitoring; simulation; feasibility study; documentation review; database review; algorithm analysis
  - Testing activities: development testing; qualification testing; acceptance testing; installation testing
• 8. Software Testing
- The process of exercising a program with the specific intent of finding errors prior to delivery to the end user
- Must be planned carefully to avoid wasting development time and resources, and conducted systematically
- What does testing show?
• 9. Who Tests the Software? (1)
- Developer
  - Understands the system, but will test "gently"
  - Driven by "delivery"
- Independent Tester
  - Must learn about the system
  - Will attempt to break it
  - Driven by quality
• 10. Who Tests the Software? (2)
- Misconceptions:
  - The developer should do no testing at all
  - Software should be "tossed over the wall" to strangers who will test it mercilessly
  - Testers are not involved with the project until it is time for it to be tested
• 11. Who Tests the Software? (3)
- The developer and the Independent Test Group (ITG) must work together throughout the software project to ensure that thorough tests will be conducted
  - An ITG does not have the "conflict of interest" that the software developer might experience
  - While testing is conducted, the developer must be available to correct errors that are uncovered
• 12. Testing Strategy (1)
- Identifies the steps to be undertaken; when these steps are undertaken; and how much effort, time, and resources are required
- Any testing strategy must incorporate:
  - Test planning
  - Test-case design
  - Test execution
  - Resultant data collection and evaluation
- Should provide guidance for the practitioners and a set of milestones for the manager
• 13. Testing Strategy (2)
- Characteristics of software testing strategies proposed in the literature:
  - To perform effective testing, you should conduct effective technical reviews. By doing this, many errors will be eliminated before testing commences.
  - Testing begins at the component level and works "outward" toward the integration of the entire computer-based system
• 14. Testing Strategy (3)
- Different testing techniques are appropriate for different software engineering approaches and at different points in time.
- Testing is conducted by the developer of the software and (for large projects) an independent test group.
- Testing and debugging are different activities, but debugging must be accommodated in any testing strategy.
• 15. Overall Software Testing Strategy
- May be viewed in the context of the spiral
- Begins by 'testing-in-the-small' and moves toward 'testing-in-the-large'
• 16. Overall Software Testing Strategy
- Unit Testing
  - focuses on each unit of the software (e.g., component, module, class) as implemented in source code
- Integration Testing
  - focuses on issues associated with verification and program construction as components begin interacting with one another
• 17. Overall Software Testing Strategy
- Validation Testing
  - provides assurance that the software meets all functional, behavioral, and performance requirements, based on the validation criteria established during requirements analysis
- System Testing
  - verifies that all system elements mesh properly and that overall system function and performance has been achieved
• 18. When to Stop Testing?
- Testing is potentially endless
  - We cannot test until all the defects are unearthed and removed, which is impossible
- At some point, we have to stop testing and ship the software
  - The question is: when?
- Realistically, testing is a trade-off between budget, time, and quality
- It is driven by profit models (Pan, 1999)
• 19. When to Stop Testing?
- The pessimistic, and unfortunately most often used, approach is to stop testing whenever some or any of the allocated resources (time, budget, or test cases) are exhausted
- The optimistic stopping rule is to stop testing when either reliability meets the requirement, or the benefit from continuing testing cannot justify the testing cost
• 21. Unit Testing
- Focuses on assessing:
  - internal processing logic and data structures within the boundaries of a component (module)
  - proper information flow across module interfaces
  - local data, to ensure that integrity is maintained
  - boundary conditions
  - basis (independent) paths
  - all error-handling paths
- If resources are too scarce for comprehensive unit testing, select critical or complex modules and unit test only these
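As a minimal sketch of these ideas, the unit test below exercises boundary conditions and an error-handling path of a single function. The `clamp` function and its test class are hypothetical examples, not part of the lecture material:

```python
import unittest

def clamp(value, low, high):
    """Clamp value into the inclusive range [low, high]."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

class ClampUnitTest(unittest.TestCase):
    def test_boundary_conditions(self):
        # Exercise values at and just outside the range boundaries
        self.assertEqual(clamp(5, 0, 10), 5)
        self.assertEqual(clamp(0, 0, 10), 0)    # exactly at lower bound
        self.assertEqual(clamp(10, 0, 10), 10)  # exactly at upper bound
        self.assertEqual(clamp(-1, 0, 10), 0)   # just below the range
        self.assertEqual(clamp(11, 0, 10), 10)  # just above the range

    def test_error_handling_path(self):
        # The invalid-range branch must raise, not return a value
        with self.assertRaises(ValueError):
            clamp(5, 10, 0)

# Run the tests in-process rather than via unittest.main()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampUnitTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("tests passed:", result.wasSuccessful())
```

Note how each boundary (at, just below, just above) gets its own assertion; unit-level defects cluster at exactly these points.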
• 23. Integration Testing
- After unit testing, the individual modules are combined together into a system
- Question commonly asked once all modules have been unit tested:
  - "If they work individually, why do you doubt that they'll work when we put them together?"
- The problem is "putting them together": interfacing
  - Data can be lost across an interface
  - Global data structures can present problems
  - Subfunctions, when combined, may not produce the desired function
• 24. Integration Testing
- Incremental integration testing strategies:
  - Bottom-up integration
  - Top-down integration
  - Regression testing
  - Smoke testing
• 25. Bottom-up Integration
- An approach where the lowest-level modules are tested first, then used to facilitate the testing of higher-level modules
  - The process is repeated until the module at the top of the hierarchy is tested
  - Top-level modules are the most important, yet tested last
- Is helpful only when all or most of the modules of the same development level are ready
• 26. Bottom-up Integration
The steps:
- Test D and E individually, using a dummy program: a 'driver'
- Low-level components are combined into clusters that perform a specific software function
- The cluster is tested
- Test C such that it calls D/E; if an error occurs, we know that the problem is in C or in the interface between C and D/E
- Drivers are removed and clusters are combined, moving upward in the program structure
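The 'driver' idea in the steps above can be sketched as follows. This is an illustrative example only: `parse_record` and `format_record` stand in for two already-unit-tested low-level modules, and `driver` is the throwaway harness that exercises them as a cluster before the real higher-level module exists:

```python
# Hypothetical low-level modules (already unit tested individually).
def parse_record(line):
    """Split a comma-separated line into (name, score)."""
    name, score = line.split(",")
    return name.strip(), int(score)

def format_record(name, score):
    """Render a (name, score) pair for display."""
    return f"{name}: {score}"

# Bottom-up driver: a dummy program standing in for the not-yet-written
# higher-level module that will eventually call this cluster.
def driver():
    results = []
    for line in ["alice, 90", "bob, 75"]:
        name, score = parse_record(line)      # low-level module 1
        results.append(format_record(name, score))  # low-level module 2
    return results

print(driver())  # exercises the parse/format cluster as a unit
```

Once the real caller is written, the driver is discarded and the same tests are re-run against the real module, moving upward in the hierarchy.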
• 27. Top-down Integration
The steps:
1. The main/top module is used as a test driver, and stubs are substituted for the modules directly subordinate to it.
2. Subordinate stubs are replaced one at a time with real modules (following a depth-first or breadth-first approach).
3. Tests are conducted as each module is integrated.
4. On completion of each set of tests, another stub is replaced with a real module.
5. Regression testing may be used to ensure that new errors are not introduced.
6. The process continues from step 2 until the entire program structure is built.
• 28. Top-down Integration
Example steps, in a 'depth-first' structure:
- Test A individually (use stubs for the other modules)
- Test A such that it calls B (use stubs for the other modules); if an error occurs, we know that the problem is in B or in the interface between A and B
- Replace the stubs one at a time, 'depth-first', and re-run the tests
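A minimal sketch of the stub-replacement idea, assuming a top module `a` and a subordinate module `b` (both hypothetical). The subordinate is injected as a parameter so the test can swap the stub for the real module without changing `a`:

```python
# Top-down sketch: top-level module A calls subordinate module B.
# While the real B is not yet integrated, a stub stands in for it.

def b_stub(x):
    # Stub: returns a fixed, known answer so A's own logic can be tested.
    return 42

def b_real(x):
    # Real subordinate module, integrated later to replace the stub.
    return x * 2

def a(x, b=b_stub):
    # Top module A; its subordinate is injected so tests can swap stub/real.
    return b(x) + 1

print(a(10))            # A tested against the stub  -> 43
print(a(10, b=b_real))  # stub replaced with the real module -> 21
```

If `a(10)` is wrong, the fault is in A itself; if only `a(10, b=b_real)` is wrong, the fault is in B or in the A/B interface, which is exactly the localization the slide describes.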
• 29. Regression Testing (1)
- Focuses on retesting after changes are made
  - Whenever software is corrected, some aspect of the software configuration is changed, e.g., the program, its documentation, or the data that support it
  - Regression testing helps to ensure that changes (due to testing or for other reasons) do not introduce unintended behavior or additional errors
• 30. Regression Testing (2)
- In traditional regression testing, we reuse the same tests
- In risk-oriented regression testing, we test the same areas as before, but we use different (increasingly complex) tests
- Regression testing may be conducted manually, by re-executing a subset of all test cases, or by using automated capture/playback tools
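One common automated form of the traditional approach is to pin previously observed behavior in a test, so that any later change which alters it fails on the next run. A sketch, where `compute_discount` is a hypothetical function under maintenance:

```python
def compute_discount(total):
    # Hypothetical function under maintenance: 10% off orders of 100+.
    if total >= 100:
        return total * 0.10
    return 0.0

# Regression test: pins behavior observed before the latest change, so a
# later "fix" that accidentally alters the discount rule is caught when
# the same test is re-executed.
def test_discount_regression():
    assert compute_discount(100) == 10.0  # boundary case from a past bug
    assert compute_discount(99) == 0.0    # just below the threshold
    assert compute_discount(200) == 20.0

test_discount_regression()
print("regression suite passed")
```

The same tests are re-run unchanged after every correction; in the risk-oriented variant described above, they would instead be replaced with progressively harder tests of the same area.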
• 31. Smoke Testing (1)
- A common approach for creating "daily builds" for product software
- Software components that have been translated into code are integrated into a "build"
- A build includes all data files, libraries, reusable modules, and engineered components that are required to implement one or more product functions
- A series of tests is designed to expose errors that will keep the build from properly performing its function
• 32. Smoke Testing (2)
- The intent should be to uncover "show stopper" errors that have the highest likelihood of throwing the software project behind schedule
- The build is integrated with other builds, and the entire product (in its current form) is smoke tested daily
- The integration approach may be top-down or bottom-up
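A smoke test is deliberately shallow: it only checks that the daily build is alive enough to be worth deeper testing. The sketch below is purely illustrative; `start_app` stands in for launching the freshly built product:

```python
def start_app():
    # Stand-in for launching the product from today's build; a real smoke
    # test would start the actual application or service here.
    return {"status": "up"}

def smoke_test():
    app = start_app()
    # Show-stopper check: reject the build outright if it cannot even start.
    assert app["status"] == "up", "show stopper: build failed to start"
    return "smoke test passed"

print(smoke_test())
```

If this fails, the build is rejected the same day, before any scheduled functional testing is wasted on it.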
• 33. Validation Testing (1)
- Focuses on uncovering errors at the software requirements level
- The SRS might contain 'Validation Criteria' that form the basis for a validation-testing approach
• 34. Validation Testing (2)
- Validation-test criteria:
  - all functional requirements are satisfied
  - all behavioral characteristics are achieved
  - all content is accurate and properly presented
  - all performance requirements are attained
  - documentation is correct, and
  - usability and other requirements are met
• 35. Validation Testing (3)
- An important element of the validation process is a configuration review/audit
  - Ensures that all elements of the software configuration have been properly developed, are cataloged, and have the necessary detail to strengthen the support activities
• 36. Validation Testing (4)
- A series of acceptance tests is conducted to enable the customer to validate all requirements
  - To make sure the software works correctly for the intended user in his or her normal work environment
  - Alpha test: a version of the complete software is tested by the customer under the supervision of the developer, at the developer's site
  - Beta test: a version of the complete software is tested by the customer at his or her own site, without the developer being present
• 37. System Testing (1)
- A series of different tests to verify that system elements have been properly integrated and perform allocated functions
- Types of system tests:
  - Recovery testing
  - Security testing
  - Stress testing
  - Performance testing
  - Deployment testing
• 38. System Testing (2)
- Recovery Testing
  - forces the software to fail in a variety of ways and verifies that recovery is properly performed
- Security Testing
  - verifies that protection mechanisms built into a system will, in fact, protect it from improper penetration
- Stress Testing
  - executes a system in a manner that demands resources in abnormal quantity, frequency, or volume
• 39. System Testing (3)
- Performance Testing
  - tests the run-time performance of software within the context of an integrated system
- Deployment Testing
  - examines all installation procedures and specialized installation software that will be used by customers, and all documentation that will be used to introduce the software to end users
• 41. Software Test Documentation (1)
- IEEE 829-2008, Standard for Software Test Documentation
- An IEEE standard that specifies the form of a set of documents for use in eight defined stages of software testing
- The documents are:
  - Test Plan
  - Test Design Specification
  - Test Case Specification
  - Test Procedure Specification
  - Test Item Transmittal Report
  - Test Log
  - Test Incident Report
  - Test Summary Report
• 42. Software Test Documentation (2)
- Test Plan: a management planning document that shows:
  - How the testing will be done, including System Under Test (SUT) configurations
  - Who will do it
  - What will be tested
  - How long it will take (may vary, depending upon resource availability)
  - What the test coverage will be, i.e., what quality level is required
• 43. Software Test Documentation (3)
- Test Design Specification: details test conditions and the expected results, as well as test pass criteria
- Test Procedure Specification: details how to run each test, including any set-up preconditions and the steps that need to be followed
• 44. Software Test Documentation (4)
- Test Item Transmittal Report: reports on when tested software components have progressed from one stage of testing to the next
- Test Log: records which test cases were run, who ran them, in what order, and whether each test passed or failed
- Test Incident Report: details, for any test that failed, the actual versus expected result, and other information intended to throw light on why a test has failed
• 45. Software Test Documentation (5)
- Test Summary Report:
  - A management report providing any important information uncovered by the tests accomplished, including assessments of the quality of the testing effort, the quality of the software system under test, and statistics derived from Incident Reports
  - The report also records what testing was done and how long it took, in order to improve any future test planning
  - This final document is used to indicate whether the software system under test is fit for purpose, according to whether or not it has met the acceptance criteria defined by the project stakeholders
• 47. Test-case Design (1)
- Focuses on a set of techniques for the creation of test cases that meet overall testing objectives and the testing strategies
- These techniques provide systematic guidance for designing tests that:
  - Exercise the internal logic and interfaces of every software component/module
  - Exercise the input and output domains of the program to uncover errors in program function, behavior, and performance
• 48. Test-case Design (2)
- For a conventional application, software is tested from two perspectives:
- 'White-box' testing
  - Focuses on the program control structure (internal program logic)
  - Test cases are derived to ensure that all statements in the program have been executed at least once during testing and that all logical conditions have been exercised
  - Performed early in the testing process
- 'Black-box' testing
  - Examines some fundamental aspect of a system with little regard for the internal logical structure of the software
  - Performed during later stages of testing
• 49. White-box Testing (1)
- Using the white-box testing method, you may derive test cases that:
  - Guarantee that all independent paths within a module have been exercised at least once
  - Exercise all logical decisions on their true and false sides
  - Execute all loops at their boundaries and within their operational bounds
  - Exercise internal data structures to ensure their validity
- Example method: basis path testing
• 50. White-box Testing (2)
- Basis path testing:
  - Test cases derived to exercise the basis set are guaranteed to execute every statement in the program at least once during testing
• 51. Deriving Test Cases (1)
- Steps to derive test cases by applying the basis path testing method:
  - Using the design or code, draw a corresponding flow graph. The flow graph depicts logical control flow using the notation illustrated in the next slide. (Refer to Figure 18.2 on page 486 for a comparison between a flowchart and a flow graph.)
  - Calculate the cyclomatic complexity V(G) of the flow graph
  - Determine a basis set of independent paths
  - Prepare test cases that will force execution of each path in the basis set
• 52. Deriving Test Cases (2)
- Flow graph notation for the sequence, IF, WHILE, UNTIL, and CASE constructs (the slide shows the graph symbol for each)
• 53. Drawing a Flow Graph: Example

    void foo (float y, float *a, int n)
    {
        float x = sin(y);
        float z;
        if (x > 0.01)                 // node 1
            z = tan(x);               // node 2
        else
            z = cos(x);               // node 3
        for (int i = 0; i < x; ++i)   // node 5
        {
            a[i] = a[i] * z;          // node 6
            cout << a[i];             // node 7
        }
    }                                 // node 8

(The accompanying flow graph numbers these nodes, marks the predicate nodes, and identifies the regions R1-R3.)
• 54. Deriving Test Cases (3)
- The arrows on the flow graph, called edges or links, represent flow of control and are analogous to flowchart arrows
- Areas bounded by edges and nodes are called regions
  - When counting regions, we include the area outside the graph as a region
• 55. Deriving Test Cases: Example
Step 1: Draw a flow graph (the slide shows the resulting flow graph)
• 56. Deriving Test Cases: Example
Step 2: Calculate the cyclomatic complexity, V(G)
- Cyclomatic complexity can be used to count the minimum number of independent paths
- A number of industry studies have indicated that the higher V(G), the higher the probability of errors
- The SEI provides the following basic risk assessment based on the value of V(G):

    Cyclomatic Complexity | Risk Evaluation
    1 to 10               | A simple program, without very much risk
    11 to 20              | A more complex program, moderate risk
    21 to 50              | A complex, high-risk program
    > 50                  | An untestable program (very high risk)
• 57. Deriving Test Cases: Example
- Ways to calculate V(G):
  - V(G) = the number of regions of the flow graph
  - V(G) = E - N + 2 (where E is the number of edges and N the number of nodes)
  - V(G) = P + 1 (where P is the number of predicate nodes in the flow graph, i.e., nodes that contain a condition)
- Example:
  - V(G) = number of regions = 4
  - V(G) = E - N + 2 = 16 - 14 + 2 = 4
  - V(G) = P + 1 = 3 + 1 = 4
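The edge/node formula above is easy to mechanize. The sketch below computes V(G) = E - N + 2 from a flow graph given as an edge list; the four-edge graph used here is a small illustrative if/else graph, not the slide's 14-node example:

```python
def cyclomatic_complexity(edges):
    # V(G) = E - N + 2, with E edges and N distinct nodes in the flow graph.
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# A minimal if/else flow graph: node 1 branches to 2 or 3, both join at 4.
edges = [(1, 2), (1, 3), (2, 4), (3, 4)]
print(cyclomatic_complexity(edges))  # 4 - 4 + 2 = 2, i.e. two basis paths
```

The result matches the other two formulas for the same graph: one bounded region plus the outer region gives 2, and one predicate node gives P + 1 = 2.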
• 58. Deriving Test Cases: Example
Step 3: Determine a basis set of independent paths
- Path 1: 1, 2, 3, 4, 5, 6, 7, 8, 12
- Path 2: 1, 2, 3, 12
- Path 3: 1, 2, 3, 4, 5, 9, 10, 3, ...
- Path 4: 1, 2, 3, 4, 5, 9, 11, 3, ...
Step 4: Prepare test cases
- Test cases should be derived so that all of these paths are executed
- A dynamic program analyser may be used to check that paths have been executed
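To illustrate step 4 on a function small enough to see whole (a hypothetical example, not the slide's program): each basis path gets one test input chosen to force execution along that path.

```python
def classify(x):
    # V(G) = P + 1 = 3: two predicate nodes, hence three basis paths.
    if x < 0:        # predicate 1
        return "negative"
    if x == 0:       # predicate 2
        return "zero"
    return "positive"

# One test case per basis path, each input forcing a different route.
assert classify(-5) == "negative"  # path: predicate 1 true
assert classify(0) == "zero"       # path: predicate 1 false, predicate 2 true
assert classify(7) == "positive"   # path: both predicates false
print("all basis paths executed")
```

With these three cases, every statement in `classify` is executed at least once, which is exactly the guarantee basis path testing offers.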
• 59. Summary (1)
- Software testing plays an extremely important role in V&V, but many other SQA activities are also necessary
- Testing must be planned carefully to avoid wasting development time and resources, and conducted systematically
- The developer and the ITG must work together throughout the software project to ensure that thorough tests will be conducted
• 60. Summary (2)
- The software testing strategy begins by 'testing-in-the-small' and moves toward 'testing-in-the-large'
- The IEEE 829-2008 standard specifies a set of documents for use in eight defined stages of software testing
- The 'white-box' and 'black-box' techniques provide systematic guidance for designing test cases
- We need to know when is the right time to stop testing
• 61. THE END
Copyright © 2013 Mohd. Sharifuddin Ahmad, PhD
College of Information Technology