Application of Genetic Algorithm and Particle Swarm Optimization in Software ...
Automatic Test Data Generation Using Soft Computing Approaches
1. AN AUTOMATIC TEST DATA
GENERATION FOR DATA FLOW
COVERAGE USING SOFT
COMPUTING APPROACH
SANJAY SINGLA, PRITI SINGLA
AND H M RAI
Vol. 2, No. 2, April 2011
3. ABSTRACT
Testing is the most important quality
assurance measure for software.
It is a time-consuming and
laborious process.
Automatic test data generation would
be useful to reduce the cost and time.
4. The paper presents an automatic test
data generation technique that uses
particle swarm optimization (PSO) to
generate test data satisfying data
flow coverage criteria.
5. INTRODUCTION
Almost every service/system we use
today has an element of software in it.
To make a system reliable and
predictable, its software has to be
tested before delivery.
Generally, the goal of software testing
is to design a minimal set of test
cases that reveals as many faults
as possible.
6. Automated software testing can
significantly reduce the time and cost
of developing software.
7. DATA FLOW ANALYSIS
TECHNIQUE
This section uses the all-uses criterion
and the data flow analysis technique.
The example program determines the
middle value of three given integers
I, J, K.
10. Defs and c-uses are associated with
nodes.
p-uses are associated with edges.
Two sets dcu(i) and dpu(i,j) are
determined.
The def-clear paths are constructed
from the dcu and dpu sets.
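The example program the slides analyze can be sketched in code. The node numbers and def/use annotations in the comments below are illustrative, not the paper's actual numbering, but the program computes the middle value of three integers as described:

```python
def mid(i, j, k):
    """Return the middle value of three integers."""
    mid_val = k              # node 1: def of mid_val; c-use of k
    if j < k:                # edges out of node 1: p-uses of j, k
        if i < j:            # p-uses of i, j
            mid_val = j      # def of mid_val; c-use of j
        elif i < k:          # p-uses of i, k
            mid_val = i      # def of mid_val; c-use of i
    else:
        if i > j:            # p-uses of i, j
            mid_val = j      # def of mid_val; c-use of j
        elif i > k:          # p-uses of i, k
            mid_val = i      # def of mid_val; c-use of i
    return mid_val           # c-use of mid_val
```

Each assignment is a def at its node, each use in a computation is a c-use, and each use in a branch predicate is a p-use on the outgoing edges; the dcu and dpu sets pair each def with the uses it can reach along def-clear paths.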
13. ALGORITHMS
GENETIC ALGORITHMS
Inspired by Darwin’s theory of
evolution.
It generates useful solutions to
optimization and search problems.
Operates on strings called
chromosomes.
Each digit that makes up a
chromosome is called a gene.
Each chromosome has a fitness value
associated with it, which measures
how good a solution it represents.
15. The fitness function used is
mathematically expressed as:
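The GA described above can be sketched as follows. The fitness function here (counting 1-genes, the classic OneMax problem) is only a placeholder: the paper's actual coverage-based fitness is not reproduced here, and the population size, mutation rate, and generation count are illustrative.

```python
import random

GENES = 16  # chromosome length (illustrative)

def fitness(chrom):
    # Placeholder fitness: number of 1-genes (OneMax), not the paper's measure.
    return sum(chrom)

def select(pop):
    # Tournament selection of size 2: the fitter of two random chromosomes wins.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover at a random gene position.
    point = random.randrange(1, GENES)
    return p1[:point] + p2[point:]

def mutate(chrom, rate=0.05):
    # Flip each gene independently with a small probability.
    return [1 - g if random.random() < rate else g for g in chrom]

def evolve(pop_size=30, generations=50):
    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(select(pop), select(pop))) for _ in range(pop_size)]
    return max(pop, key=fitness)

best = evolve()
```

In a test-data-generation setting, a chromosome would encode a candidate test input and the fitness would reward covering additional def-use pairs.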
17. PARTICLE SWARM
OPTIMIZATION
Developed by Kennedy and Eberhart.
Simulates the behaviour of bird
flocking by which they find food
sources.
PSO algorithm works by having a
population (called a swarm)
of candidate solutions (called
particles).
These particles are moved around in
the search-space according to a few
simple formulae.
18. For each particle i = 1, ..., S do:
Initialize the particle's position with
a uniformly distributed random vector:
xi = [xi1, xi2, …, xin]
Initialize the particle's best known
position to its initial position: pi ← xi
If f(pi) < f(g), update the swarm's best
known position: g ← pi
Initialize the particle's velocity:
vi = [vi1, vi2, …, vin]
19. ◦ For each particle i = 1, ..., S do:
For each dimension d = 1, ..., n do:
Update the particle's velocity:
vi,d ← ω vi,d + cp rp (pi,d − xi,d) + cg rg (gd − xi,d)
where ω is the inertia weight, cp and cg are the acceleration
constants, and rp and rg are random values in the range [0,1]
Update the particle's position: xi ← xi + vi
If (f(xi) < f(pi)) do:
Update the particle's best known position: pi ← xi
If (f(pi) < f(g)) update the swarm's best known position: g ← pi
Now g holds the best found solution.
Stopping criterion: If the number of iterations
exceeds the maximum number of iteration or
accumulated coverage is 100% then stop.
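The pseudocode above can be sketched directly in Python. The objective below (the sphere function) is a stand-in for the paper's coverage-based fitness, which is not reproduced here, and the parameter values are illustrative:

```python
import random

def pso(f, dim, bounds, swarm_size=20, iterations=100,
        omega=0.7, cp=1.5, cg=1.5):
    """Minimize f over [lo, hi]^dim with the PSO update rule above."""
    lo, hi = bounds
    # Initialize positions and velocities with uniform random vectors.
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    v = [[random.uniform(-(hi - lo), hi - lo) for _ in range(dim)]
         for _ in range(swarm_size)]
    p = [xi[:] for xi in x]          # each particle's best known position
    g = min(p, key=f)[:]             # swarm's best known position
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                rp, rg = random.random(), random.random()
                # Velocity update: inertia + cognitive pull + social pull.
                v[i][d] = (omega * v[i][d]
                           + cp * rp * (p[i][d] - x[i][d])
                           + cg * rg * (g[d] - x[i][d]))
                x[i][d] += v[i][d]
            if f(x[i]) < f(p[i]):
                p[i] = x[i][:]
                if f(p[i]) < f(g):
                    g = p[i][:]
    return g  # best found solution

def sphere(xs):
    # Simple test objective with minimum 0 at the origin.
    return sum(c * c for c in xs)

best = pso(sphere, dim=3, bounds=(-5.0, 5.0))
```

For test data generation, f would instead score a candidate input by how close it comes to exercising an uncovered def-use path, and the loop would also stop once accumulated coverage reaches 100%.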
22. PSO is able to automatically
generate test data that successfully
cover the sample program under the
all-du-paths criterion.
It requires fewer generations to
achieve a given def-use coverage
percentage.