Introduction
 The idea behind soft computing is to model the cognitive behavior of the human
mind.
 Soft computing, introduced by Zadeh, is an innovative approach to constructing
computationally intelligent hybrid systems.
 Soft computing is an association of computing methodologies which
collectively provide a foundation for the conception and deployment of
intelligent systems[1].
 Soft computing is an approach for constructing systems which:
• are computationally intelligent and possess human-like expertise in a
particular domain;
• adapt to a changing environment and learn to do better.
Some Domains of Intelligence in Biological Systems (Computational
Perspective)
Evolution
Competition
Reproduction
Swarming
Communication
Learning
Soft Computing Models
 Components of SC include:
• Fuzzy Logic (FL)
• Artificial Neural Network (ANN)
• Evolutionary Computation (EC) - based on the origin of the species.
• Evolutionary Algorithm (EA): Copying ideas of Nature
• Swarm algorithm or Swarm Intelligence (SI) : group of species or
animals exhibiting collective behavior.
• Hybrid Systems
 Soft computing models employ different techniques like ANN, FL, EAs and SI in a
complementary rather than a competitive way.
 Integrated architectures like neuro-fuzzy, ANN-EA combinations and ANN-SI
techniques are some of the hybrid approaches used for performance improvement.
Biological Basis: Neural Networks & EC
 Complex adaptive biological structure of human brain facilitates performance of
complex tasks.
 The processing element in an ANN is generally considered to be very roughly
analogous to a biological neuron.
 Neural networks also have ties with genetics, the "branch of biology that deals with
the heredity and variation of organisms"[3].
 Chromosomes are structures in cell bodies that transmit genetic information;
individual patterns in an EA correspond to chromosomes in biological systems.
 The genotype completely specifies an organism; in EC, a structure specifies a
system.
 Swarm Intelligence is the newest member of EC, a family of nature-inspired
algorithms which mimic insects' problem-solving abilities[4].
Artificial Neural Network
 The basic concept of an artificial neural network (ANN) is derived from an analogy
with the biological nervous system of the human brain.
 An ANN is a massively parallel system composed of many neurons, where
synapses are actually variable weights, specifying the connections between
individual neurons[2].
 The neurons continuously evaluate their output by looking at their inputs,
calculating the weighted sum and comparing it to a threshold to decide if they
should fire.
 The learning algorithm presents the inputs and adjusts the weights to produce the
required output.
 ANNs are algorithms for optimization and learning based loosely on concepts
inspired by the nature of the brain.
Two-layer feed forward network
 Just as individuals learn differently, neural networks have different learning rules.
Learning may be supervised or unsupervised.
 Supervised learning requires that when the input stimuli are applied, the desired
output is known a priori.
 The most popular algorithm for adjusting weights during the training phase is called
back propagation of error.
• A feed-forward neural network is the simplest form of an ANN
[Figure: Two-layer feed-forward network — input layer (X1, X2), hidden layer (H1, H2, H3), output layer (O), with weights Wi,j on the connections.]
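As an illustration of the figure above, here is a minimal Python sketch of the forward pass through such a 2-3-1 feed-forward network (the sigmoid activation and the example weight values are assumptions for illustration, not taken from the slides):

```python
import math

def sigmoid(z):
    # Logistic activation squashing the weighted sum into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_ih, w_ho):
    # x: inputs [x1, x2]; w_ih: one row of input weights per hidden neuron;
    # w_ho: hidden-to-output weights
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_ih]
    output = sigmoid(sum(w * h for w, h in zip(w_ho, hidden)))
    return hidden, output

# Example weights (arbitrary, for illustration only)
w_ih = [[0.5, -0.2], [0.3, 0.8], [-0.6, 0.1]]
w_ho = [0.4, -0.7, 0.2]
hidden, out = forward([1.0, 0.5], w_ih, w_ho)
```

Each neuron does exactly what the text describes: it takes the weighted sum of its inputs and passes it through a threshold-like activation.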
Error correction learning
 Error correction learning used with supervised learning is the technique of
comparing the system output to the desired output value and using that
error to direct the training.
 It is formulated as the minimization of an error function such as the total
mean square error between the actual output and the desired output
summed over all available data.
 The most popular learning algorithm for use with error correction learning
is the back propagation algorithm(BP).
 The delta rule is often utilized by the most common class of ANNs called
back propagation neural networks.
Gradient Descent Optimization
 A gradient descent based optimization algorithm such as BP can be used to adjust
connection weights in the ANN iteratively in order to minimize the error.
 BP is a variation on gradient search; the key to BP is a method for calculating the
gradient of the error with respect to the weights for a given input by propagating
error backwards through the network.
 When a neural network is initially presented with a pattern it makes a random
guess as to what it might be.
 It then sees how far its answer was from the actual one and makes an appropriate
adjustment to its connection weights[2].
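A minimal sketch of one gradient-descent/backpropagation training loop for a 2-3-1 sigmoid network (the single training pattern, the learning rate and the random initialization are arbitrary illustrative choices):

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(x, target, w_ih, w_ho, lr=0.5):
    # Forward pass
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_ih]
    o = sigmoid(sum(w * hi for w, hi in zip(w_ho, h)))
    # Backward pass: delta rule at the output, then errors propagated to hidden layer
    delta_o = (o - target) * o * (1 - o)
    delta_h = [delta_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(len(h))]
    # Gradient-descent weight updates
    for j in range(len(w_ho)):
        w_ho[j] -= lr * delta_o * h[j]
    for j, row in enumerate(w_ih):
        for i in range(len(row)):
            row[i] -= lr * delta_h[j] * x[i]
    return 0.5 * (o - target) ** 2          # squared error before the update

random.seed(0)
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
w_ho = [random.uniform(-1, 1) for _ in range(3)]
errors = [train_step([1.0, 0.5], 0.8, w_ih, w_ho) for _ in range(200)]
```

The network starts from a random guess, measures how far its answer is from the desired one, and adjusts its connection weights accordingly, so the error shrinks over the iterations.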
Limitations
 Neural networks are used for solving a variety of problems but they still have
some limitations. One of the most common is associated with neural network
training.
 The BP learning algorithm cannot guarantee an optimal solution.
 In real-world applications, the BP algorithm might converge to a set of
suboptimal weights from which it cannot escape, so the neural network is often
unable to find a desirable solution to the problem at hand.
 Another difficulty is related to selecting an optimal topology for the neural network.
 Evolutionary computations are effective optimization techniques that can guide
both weight optimization and topology selection[6].
Optimization
• The main goal of optimization is to find values of the variables that minimize or
maximize the objective function.
• The main components of optimization problem are objective function which we
want to minimize or maximize and the design variables.
• Modelling is the process of identifying objective function and variables.
• Formulating a good mathematical model of the optimization problem requires
algorithms with robustness, efficiency and accuracy.
• Optimization algorithms are classified in to:
– Local Optimization
– Global Optimization
Issues in evolving Neural Networks
 EC methodologies have mainly been applied to four attributes of neural
networks:
 Network connection weights and network architecture
 Network learning algorithm and evolution of inputs
 The architecture of the network often determines the success or failure of the
application; usually the network architecture is decided by trial and error.
 There is a great need for a method of automatically designing the architecture for a
particular application.
 Both the network architecture and the connection weights need to be adapted
simultaneously or sequentially.
 Thus EC methodologies have been applied to evolve both the network weights and the
network topology.
Evolutionary Computation -EC
 Evolutionary Computation (EC) refers to computer-based problem solving
systems that use computational models of evolutionary process.
 EC is the study of computational systems which use ideas and inspirations
from natural evolution and other biological systems.
 EC is based on biological metaphors[4].
 Two biological metaphors give rise to the two important
classes of population-based optimization algorithms:
• Evolutionary algorithms
• Swarm algorithms
 EC techniques are used in optimization, machine learning and
automatic design.
 GA is a specific class of EC that performs a stochastic search by using the basic
principles of natural selection.
 GAs are algorithms for optimization and learning based loosely on several
features of biological evolution[3].
 GAs are inspired by Darwin's theory of natural evolution.
 Formally introduced in the US in 1975 by John Holland, and referred to as the simple
genetic algorithm (SGA).
 Used to optimize a given objective function, where parameters are encoded in
something analogous to a gene.
 A GA applies biological principles in a computational algorithm to obtain
optimum solutions, and is a robust method for searching for the optimum solution to a
complex problem.
Evolutionary Algorithms
 EAs are optimization methods based on an evolutionary metaphor that have
proved effective in solving difficult problems.
 The 4 main processes in evolutionary algorithms are:
• Initialization process
• Fitness evaluation
• Selection
• Generation of new population
 After initialization, the population is evaluated and stopping criteria are
checked.
 If none of the stopping criteria is met, a new population is generated and the
process is repeated.
Framework of Genetic algorithm
1. t := 0;
2. Generate initial Population P(t) at random;
3. Evaluate the fitness of each individual in P(t);
4. while (not termination condition) do
5. Select parents, Pa(t) from P(t) based on their fitness in P(t);
6. Apply crossover to create offspring from parents: Pa(t) ->O(t)
7. Apply mutation to the offspring: O(t) ->O(t)
8. Evaluate the fitness of each individual in O(t);
9. Select population P(t+1) from current offspring O(t) and parents P(t);
10. t := t+1;
11. end-do
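The numbered framework above can be turned into a runnable sketch (Python; the parameter values, the elitist survivor step of line 9, and the example fitness f(x) = x² are illustrative choices):

```python
import random

def genetic_algorithm(fitness, n_bits=5, pop_size=4, generations=30,
                      cx_prob=1.0, mut_prob=0.05, seed=1):
    rng = random.Random(seed)
    # Steps 1-2: t := 0; generate the initial population P(t) at random
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best_hist = []
    for t in range(generations):                   # step 4: termination condition
        fits = [fitness(ind) for ind in pop]       # steps 3/8: evaluate fitness
        best_hist.append(max(fits))
        # Step 5: fitness-proportionate parent selection
        def pick():
            if sum(fits) == 0:
                return rng.choice(pop)
            return rng.choices(pop, weights=fits, k=1)[0]
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = pick(), pick()
            if rng.random() < cx_prob:             # step 6: one-point crossover
                cut = rng.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):                 # step 7: bitwise mutation
                for i in range(n_bits):
                    if rng.random() < mut_prob:
                        child[i] = 1 - child[i]
                offspring.append(child)
        # Step 9: elitist selection of P(t+1) from parents and offspring
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness), best_hist

def f(ind):
    # Maximize x^2 for x encoded as an n-bit unsigned integer
    x = int("".join(map(str, ind)), 2)
    return x * x

best, hist = genetic_algorithm(f)
```

Because the survivor selection keeps the best of parents and offspring, the best fitness in the population never decreases from one generation to the next.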
Genetic Algorithms (II)
[Figure: GA cycle — a population of individuals (alternative feasible solutions) is evaluated on fitness; individuals are selected by fitness into a mating pool of "fitter" individuals; selected individuals exchange characteristics to create new individuals (crossover); some characteristics are arbitrarily changed (mutation); the resulting individuals form the next generation (heredity).]
Genetic Algorithms (IV)
Basic Tasks
• Generation of initial population
• Evaluation
• Selection (reproduction operation)
• Exchange characteristics to develop new individuals (crossover operation)
• Arbitrarily modify characteristics in new individuals (mutation operation)
Genetic Algorithms (V)
Reproduction / Selection Operator
The purpose is to bias the mating pool (those who can pass on their
traits to the next generation) with fitter individuals
Fitness-proportionate (roulette wheel) selection:
• Assign p as the probability of choosing an individual for the mating pool;
p is proportional to the fitness.
• Choose an individual with probability p and place it in the mating pool.
• Continue till the mating pool size is the same as the initial population's.
Tournament selection:
• Choose n individuals randomly and pick the one with the highest fitness.
• Place n copies of this individual in the mating pool.
• Choose n different individuals and repeat the process till all in the
original population have been chosen.
Genetic Algorithms (VI)
Crossover operator
Parents (crossover point after bit 4):
1 0 0 1 | 1 0 1
1 1 0 0 | 1 1 1
Offspring:
1 0 0 1 | 1 1 1
1 1 0 0 | 1 0 1
Genetic Algorithms (VII)
Mutation
Before: 1 0 0 1 1 0 1  →  After: 1 0 0 0 1 0 1 (the fourth bit is flipped)
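The two operators above can be reproduced in a few lines of Python (function names are illustrative):

```python
def one_point_crossover(p1, p2, point):
    # Swap the tails of two bit strings after the crossover point
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(chrom, flip_positions):
    # Flip the bits at the given 0-based positions (bitwise mutation)
    bits = list(chrom)
    for i in flip_positions:
        bits[i] = '1' if bits[i] == '0' else '0'
    return ''.join(bits)

# Crossover example from the slide: parents 1001101 and 1100111, point 4
c1, c2 = one_point_crossover("1001101", "1100111", 4)   # -> "1001111", "1100101"
# Mutation example from the slide: flipping the fourth bit of 1001101
m = mutate("1001101", [3])                               # -> "1000101"
```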
Components of a GA
A problem to solve…..
• Encoding technique :Representation of individuals (gene, chromosome)
• Initialization procedure (Creation)
• Evaluation function (Environment)
• Selection of parents (Reproduction)
• Genetic operators (Mutation, Recombination)
• Parameter settings (Practice and art)
GA working principle
[Flow chart: Start → create initial random population → evaluate fitness for each member of the population → store best individuals → create mating pool (selection) → create next generation using crossover → perform mutation → if an optimal or good solution is found, stop; otherwise repeat from fitness evaluation.]
Basic Genetic Algorithm
 [Start] algorithm begins with a set of initial solutions (represented by set of
chromosomes) called population.
 [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.
 Repeat until terminating condition is satisfied
[Selection] Select two parent chromosomes from a population according
to their fitness (the better fitness, the bigger chance to be selected).
• [Crossover] Cross over the parents to form new offspring (children). If
no crossover is performed, the offspring are exact copies of the parents.
• [Mutation] Mutate the new offspring at selected position(s) in the
chromosome.
• [Accepting] Generate the new population by placing the new offspring in it.
 Return the best solution in current population
 Simple problem: max x² over {0,1,…,31}
 GA approach:
• Representation: binary code, e.g. 01101 → 13
• Population size: 4
• one-point cross over, bitwise mutation
• Roulette wheel selection
• Random initialization
An example after Goldberg
Initial population
 Encoding:
• code the decision variable ‘x’ into a finite length string. Using a
five-bit unsigned integer, numbers between 0 (00000) and
31(11111) can be obtained.
• The objective function here is f(x) = x², which is to be maximized.
 Initial Population :
• An initial population of size 4 is randomly chosen : {12, 25, 5, 19}
• Then, we should obtain the decoded x values for the initial
population generated:
String # X value Binary Code
1 12 01100
2 25 11001
3 5 00101
4 19 10011
Fitness evaluation
 Objective function
• Calculate the fitness or objective function for each individual.
• This is obtained by simply squaring the ‘x’ value, since the given
function is f(x) = x².
 Probability of selection :
Compute the probability of selection as follows:
• For string 1, fitness f(x1) = 144, and Σf(xi) = 1155.
• The probability that string 1 is selected is 144/1155 = 0.1247 = 12.47%.
• Similarly calculate for all strings.
Roulette Selection
 Roulette selection: Expected count
• The expected and actual count method is proposed for roulette selection.
• The next step is to calculate the expected count
• For string 1, Expected count = Fitness/Average = 144/288.75 = 0.4987
 The expected count gives an idea of which individuals can be selected for further
processing in the mating pool.
 Roulette selection: Actual count :
• The actual count is to be obtained to select the individuals, which would
participate in the crossover cycle using Roulette wheel selection.
Roulette wheel Selection
 The Roulette wheel is formed as follows:
String 1 (x = 12): 12.47%
String 2 (x = 25): 54.11%
String 3 (x = 5): 2.16%
String 4 (x = 19): 31.26%
The maximum fitness is 625 (string 2), with a 54.11% share of the wheel, an expected count of 2.1645 and an actual count of 2.
Mating pool
 String 1 occupies 12.47%, so there is a chance for it to occur at least once; hence its
actual count may be 1.
 With string 2 occupying 54.11% of the Roulette wheel, it has a fair chance of being
selected more than once; thus its actual count can be considered as 2.
 On the other hand, string 3 has the least probability percentage of 2.16%, so its
occurrence in the next cycle is very unlikely. As a result, its actual count is 0.
 String 4, with 31.26%, has at least one chance of occurring when the Roulette wheel is spun;
thus its actual count is 1.
 Based on actual count the mating pool is formed as follows:
String # X value Mating pool
1 12 01100
2 25 11001
2 25 11001
4 19 10011
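The roulette-wheel numbers above can be checked with a short Python sketch (function names are illustrative; the actual counts in the slide come from spinning the wheel, so a seeded spin is shown only as an example):

```python
import random

def roulette_stats(fitnesses):
    total = sum(fitnesses)
    avg = total / len(fitnesses)
    probs = [f / total for f in fitnesses]      # each string's share of the wheel
    expected = [f / avg for f in fitnesses]     # expected count in the mating pool
    return probs, expected

def spin_wheel(fitnesses, pool_size, rng=random):
    # "Actual count": spin the wheel pool_size times, selecting
    # each string with probability proportional to its fitness
    return rng.choices(range(len(fitnesses)), weights=fitnesses, k=pool_size)

# Fitness values from the example: 12^2, 25^2, 5^2, 19^2
probs, expected = roulette_stats([144, 625, 25, 361])
# probs ≈ [0.1247, 0.5411, 0.0216, 0.3126]; expected[0] ≈ 0.4987
pool = spin_wheel([144, 625, 25, 361], 4, random.Random(0))
```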
Crossover
 Crossover operation is performed to produce new offspring (children).
 The crossover probability is assumed to be 1.0.
 The crossover point is chosen randomly, single-point crossover is
performed and new offspring are produced.
String # X value Mating pool Cross point Offsprings code Offsprings x value
1 12 0110|0 4 0110|1 13
2 25 1100|1 4 1100|0 24
2 25 11|001 2 11|011 27
4 19 10|011 2 10|001 17
Mutation
 Mutation operation is performed to produce new offspring after the
crossover operation.
 We select the bit-flipping mutation operation, and new offspring are
produced.
 The mutation probability is assumed to be 0.001.
String # Offspring x value Code before mutation Mutation chromosome Code after mutation x value after mutation
1 13 01101 10000 11101 29
2 24 11000 00000 11000 24
3 27 11011 00000 11011 27
4 17 10001 00100 10101 21
Evaluation
 Once selection, crossover and mutation are performed, the new population is now
ready to be tested. The population and the corresponding fitness values are now ready
for another round producing another generation. More generations are produced until
some stopping criterion is met.
 It can be noted how the maximal and average performance have improved in the new population.
The population's average fitness has improved from 288.75 to 646.75 in one generation.
 The maximum fitness has increased from 625 to 841 during the same period. This example has
shown one generation of a simple genetic algorithm; many more generations can be
produced to reach better solutions.
Testing GA
• It cannot be said with certainty that the genetic algorithm has found
the global optimum. Only by testing the algorithm against analytical
benchmark functions can you verify that it behaves correctly.
• In other cases, you should compare the results with laboratory
data, or find another way to check the answers, for example by
predicting the order of magnitude of the optimal points.
• Watching the fitness of the best solution found so far is a good
indicator; if you have no idea of the global optimum, let the
optimisation progress until the rate of improvement becomes negligible.
Advantages/ disadvantages of GA
• Advantages
– parallelism and solution space is wide
• Disadvantages:
– the difficulty of finding a suitable fitness function
– the difficulty of defining a representation of the problem
– premature convergence can occur
– sensitivity to parameter settings
• Note: an effective GA representation and a meaningful fitness
evaluation are the keys to success in GA applications.
ANN-GA Hybrid
A randomly generated chromosome made of 8 genes
representing 8 weights for a BPN
Steps
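The steps above can be sketched as follows. This is a hedged illustration of an ANN-GA hybrid: the 2-2-1 topology, the reading of the 8 genes (4 input-hidden weights, 2 hidden-output weights, 2 hidden biases), the toy data set and the GA operators are all assumptions, not taken from the slides:

```python
import math, random

rng = random.Random(42)

def net_output(weights, x):
    # Hypothetical 2-2-1 network: genes 0-3 are input->hidden weights,
    # genes 4-5 hidden->output weights, genes 6-7 hidden biases (8 genes total)
    h1 = math.tanh(weights[0] * x[0] + weights[1] * x[1] + weights[6])
    h2 = math.tanh(weights[2] * x[0] + weights[3] * x[1] + weights[7])
    return weights[4] * h1 + weights[5] * h2

DATA = [([0.0, 0.0], 0.0), ([1.0, 0.0], 1.0), ([0.0, 1.0], 1.0)]  # toy data

def error(chrom):
    # Lower summed squared error over the data = fitter chromosome
    return sum((net_output(chrom, x) - t) ** 2 for x, t in DATA)

pop = [[rng.uniform(-1, 1) for _ in range(8)] for _ in range(20)]
history = []
for gen in range(50):
    pop.sort(key=error)                       # rank by error (ascending)
    history.append(error(pop[0]))
    parents = pop[:10]                        # truncation selection
    children = []
    for _ in range(10):                       # arithmetic crossover + Gaussian mutation
        a, b = rng.sample(parents, 2)
        children.append([(ga + gb) / 2 + rng.gauss(0, 0.1)
                         for ga, gb in zip(a, b)])
    pop = parents + children                  # elitism: parents survive
best = min(pop, key=error)
```

Since the best parents always survive, the best network error can only improve over the generations; the GA thus searches the weight space without needing gradients.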
Swarm Intelligence
• Swarm Intelligence has two fundamental concepts:
• self-organization
– Positive feedback: Amplification
– Negative feedback: Balancing
– Fluctuations
– Multiple interactions
• Division of labor
– simultaneous task performance by cooperating specialized individuals
– enables the swarm to respond to changed conditions in the search space.
Swarm algorithm
 Mimicking emergent behaviors observed in social animals on computer systems[4].
• Bacteria
• Immune system
• Ants (ACO) , Honey bees ( ABC)
• Birds (PSO) and other social animals
 Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) are widely
used Swarm Intelligence based methods.
 PSO is inspired by the simulation of social behavior related to bird flocking, fish
schooling and swarming theory.
 ABC is inspired by the simulation of the foraging behavior of real honey bees and
swarming theory.
Particle Swarm Optimization
 Inspired by the simulation of social behavior related to bird flocking, fish
schooling and swarming theory:
- steer toward the center
- match neighbors’ velocity
- avoid collisions
 Suppose a group of birds is randomly searching for food in an area.
• There is only one piece of food in the area being searched.
• None of the birds knows where the food is, but each bird knows how far
away the food is at each iteration.
• So what's the best strategy to find the food? An effective one is to
follow the bird which is nearest to the food.
Overview of basic PSO
 Particle swarm optimization (PSO) is a population-based stochastic
optimization algorithm that searches a solution space to solve an optimization
problem.
 It was developed by Eberhart and Kennedy in 1995, inspired by the social
behavior of bird flocking and fish schooling.
 How can birds or fish exhibit such a coordinated collective behavior?
PSO
 PSO is a robust stochastic optimization technique based on the movement and
intelligence of swarms, applies this concept of social interaction to problem
solving.
 It uses a number of agents (particles) that constitute a swarm moving around in the
search space looking for the best solution.
 Each particle is treated as a point in an N-dimensional space which adjusts its
“flying” according to its own flying experience as well as the flying experience of
other particles.
 Each particle keeps track of the coordinates in the solution space associated
with the best solution (fitness) it has achieved so far, called pbest, and the best
value obtained so far by any particle in its neighborhood, called gbest.
PSO
 In PSO, each single solution is a "bird" in the search space called "particle".
• All particles have fitness values which are evaluated by the fitness function to be
optimized.
 All particles have velocities which direct the flying of the particles. The particles fly
through the problem space by following the current optimum particles.
 Initialize with randomly generated particles. Update through generations in search
for optima.
• Each particle has a velocity and position.
• Update for each particle uses two “best” values.
• pbest: best solution (fitness) it has achieved so far. (The fitness value is also
stored.)
• gbest: best value, obtained so far by any particle in the population.
PSO
 Each particle tries to modify its position using the following information:
• the current positions and the current velocities,
• the distance between the current position and pbest,
• the distance between the current position and the gbest.
 The modification of the particle’s position can be mathematically modeled
according to the following equation:
vi(k+1) = w·vi(k) + c1·rand1·(pbesti − si(k)) + c2·rand2·(gbest − si(k))
where
vi(k) : velocity of agent i at iteration k,
w : weighting (inertia) function,
cj : weighting or learning factor,
rand : uniformly distributed random number between 0 and 1,
si(k) : current position of agent i at iteration k,
pbesti : pbest of agent i, and gbest : gbest of the group
PSO algorithm
 Let particle swarm move towards the best position in search space,
remembering each particle’s best known position and global (swarm’s) best
known position.
 Let
xi – specific particle
vi – particle’s velocity
pi – particle’s (personal) best known position
g – swarm’s (global) best known position
vi ← ωvi + φp·rp·(pi − xi) + φg·rg·(g − xi)
     (inertia)   (cognitive)      (social)
and xi ← xi + vi
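The update equations above translate directly into a runnable PSO sketch (Python; the sphere cost function, the search bounds and the parameter values w = 0.7, c1 = c2 = 1.5 are illustrative assumptions):

```python
import random

def pso(cost, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=3):
    rng = random.Random(seed)
    x = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]                     # personal best positions
    pbest_val = [cost(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # global best
    history = [gbest_val]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                rp, rg = rng.random(), rng.random()
                # v <- w*v + c1*rp*(pbest - x) + c2*rg*(gbest - x)
                v[i][d] = (w * v[i][d]
                           + c1 * rp * (pbest[i][d] - x[i][d])
                           + c2 * rg * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]                  # x <- x + v
            val = cost(x[i])
            if val < pbest_val[i]:                  # update personal best
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:                 # update global best
                    gbest, gbest_val = x[i][:], val
        history.append(gbest_val)
    return gbest, gbest_val, history

sphere = lambda p: sum(xi * xi for xi in p)         # minimum 0 at the origin
best, best_val, hist = pso(sphere)
```

Because gbest is only replaced by strictly better positions, the best cost found can never increase across the iterations.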
ANN weight optimization process using PSO
• The PSO technique is used for the weight optimization of a feed-forward
neural network structure. The network was pre-trained using the PSO to
arrive at the initial network weights.
• The searching process of the PSO-BP algorithm starts by initializing the
starting position and the velocity of the particles. In this case, a particle is
a group of the weights of the feed-forward neural network structure.
• There are 13 weights for the 2-3-1 feed-forward neural network node
topology, and thus the particle consists of 13 real numbers, as shown in Fig1.
Methodology
• The present work integrates the PSO with the Back Propagation
algorithm to form a hybrid-learning algorithm for training the feed
forward neural networks.
• In the proposed work, the PSO and the ANN algorithms are integrated to
calculate the global optimum and increase efficiency. The forecasting
models were developed using historical groundwater level and rainfall
data recorded from three observation wells located in Udupi district, India.
• The water level and rainfall data of the observation wells located
in Brahmavar, Kundapur, and Hebri taluks were used for the years
2000-2013.
• The groundwater in these regions mainly occurs in water-table
conditions. The PSO is used to evolve the neural network weights.
• The particles are evaluated and updated until a new
generation of particles is generated. The Root
Mean Square Error (RMSE) is used as the fitness
function.
• This searching procedure is repeated to find the
global best position in the search space. If a particle's
current fitness is better than its particle best, the particle
best is updated to the current position; the particle best
with the minimum fitness value becomes the global best.
• Based on the pBest, the gBest, and the current best
values, the updated velocity is computed.
• The particle position is updated based on the updated
velocity. The process is repeated for iterations until a
minimum error is obtained.
• The PSO can be applied to train the ANN, and this process is
iterated until we get a minimum error. Thus, the PSO is integrated
with the ANN in order to search for the optimal weights for the network.
• Finally, the network is trained using the updated weights, and the
trained network is used to forecast the groundwater level of the
testing set.
• The analysis is being performed for forecasting the groundwater levels for
the different input combinations as identified by all the three well locations.
Initially nine years (2000-2008) of data is considered as the training set and
the ground water level is forecasted for 2009.
• A comparison was made between the values predicted using the BP and the
Hybrid ANN-PSO algorithms. The forecasted groundwater levels
using the ANN and ANN-PSO models during testing for the wells of the
study area are shown graphically in Fig. 3 to Fig. 8.
Artificial Bee Colony
• The ABC algorithm is one of the most recently introduced swarm-based
optimization algorithms, proposed by Karaboga (2005).
• ABC simulates the intelligent foraging behavior of honeybee swarm.
• Based on inspecting the behavior of honey bees on finding nectar and sharing
the information of food sources to the bees in the hive.
• Observations and studies on honey bee behaviors resulted in a new
generation of optimization algorithms called “Artificial Bee Colony”.
• Karaboga has described the Artificial Bee Colony (ABC) algorithm based on
the foraging behavior of honey bees for numerical optimization problems.
Behavior of Honey Bee Swarm
Three essential components of forage selection:
• Food Sources: The value of a food source depends on many factors such as its
proximity to the nest, its richness or concentration of its energy, and the ease of
extracting this energy.
• Employed Foragers: Associated with a particular food source which they are
currently exploiting or are “employed” at. They carry with them information about
this particular source, its distance and direction from the nest, the profitability of
the source and share this information with a certain probability.
• Unemployed Foragers: Continually on the lookout for a food source to exploit. There
are two types of unemployed foragers: scouts, which search the environment
surrounding the nest for new food sources, and onlookers, which wait in the nest and
establish a food source through the information shared by employed foragers.
Exchange of Information among bees
• The model defines two leading modes of the behavior:
– recruitment to a nectar source
– the abandonment of a source.
• The exchange of information among bees is the most important occurrence
in the formation of collective knowledge.
• The most important part of the hive with respect to exchanging information
is the dancing area.
• Communication among bees related to the quality of food sources takes
place in the dancing area and this dance is called a Waggle dance.
ABC
• Employed foragers share their information with a probability proportional to the
profitability of the food source; the sharing of this information through waggle
dancing lasts longer for more profitable sources.
• An onlooker on the dance floor can watch numerous dances and decides
to employ herself at the most profitable source.
• The bees evaluate the different patches according to the quality of the food and the
amount of energy usage.
• Bees communicate through a waggle dance which contains information about:
– the direction of flower patches (angle between the Sun and the patch)
– the distance from the hive (duration of the dance)
– the quality rating (frequency of the dance)
• Thus ABC is developed based on inspecting the behaviors of real
bees on finding nectar and sharing the information of food sources to the bees in the
hive.
Bees in Nature
• A colony contains 3 groups of bees:
• The employed bees (50%)
– An employed bee stays on a food source and retains the neighborhood of the source
in its memory.
• The onlooker bees (50%)
– An onlooker gets the information about food sources from the employed bees in the
hive and selects one of the food sources to gather nectar.
• The scouts (5-10%)
– A scout is responsible for finding new food (nectar) sources.
• The employed bee whose food source has been exhausted by the bees becomes
a scout. Scouts are the colony’s explorers.
• Number of employed bees=number of food sources
• Food source position=possible solution to the problem
• The amount of nectar of a food source=quality of the solution
• There is a greater probability of onlookers choosing more profitable sources
since more information is circulated about the more profitable sources.
Artificial Bee Colony Algorithm
• Simulates behavior of real bees for solving multidimensional and multimodal
optimization problems.
• The first half of the colony consists of the employed artificial bees and the
second half includes the onlookers.
• The number of employed bees is equal to the number of food sources around
the hive.
• The employed bee whose food source has been exhausted by the bees becomes
a scout.
[Slides with figures and pseudocode for the ABC phases: components of the honey bee swarm, fitness evaluation, the employed bee phase (evaluation and solution generation), the onlooker bee phase (selection condition and generation), the limit parameter, the scout bee phase, selection, and the complete ABC pseudocode and framework.]
ABC algorithm
• Each cycle of search consists of three steps:
– moving the employed and onlooker bees onto the food sources
– calculating their nectar amounts
– determining the scout bees and directing them onto possible food sources.
• A food source position represents a possible solution to the problem to be
optimized.
• The amount of nectar of a food source corresponds to the quality of the solution.
• Onlookers are placed on the food sources by using a probability based selection
process.
• As the nectar amount of a food source increases, the probability value with which
the food source is preferred by onlookers increases, too.
• The scouts are characterized by low search costs and a low average in food
source quality. One bee is selected as the scout bee.
• The selection is controlled by a control parameter called "limit".
• If a solution representing a food source is not improved by a predetermined
number of trials, then that food source is abandoned and the employed bee
is converted to a scout.
• Control parameters of ABC algorithm are:
– Swarm size
– Limit
– number of onlookers: 50% of the swarm
– number of employed bees: 50% of the swarm
– number of scouts: 1
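The cycle described above (employed bee phase, probability-based onlooker phase, and a limit-controlled scout phase) can be sketched in Python. The sphere cost function, the bounds, the fitness transform 1/(1+cost) and the parameter values are illustrative assumptions:

```python
import random

def abc(cost, dim=2, n_sources=10, limit=20, cycles=100, lo=-5.0, hi=5.0, seed=7):
    rng = random.Random(seed)
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_sources)]
    vals = [cost(f) for f in foods]
    trials = [0] * n_sources                       # unsuccessful trials per source
    gi = min(range(n_sources), key=lambda i: vals[i])
    best, best_val = foods[gi][:], vals[gi]

    def try_neighbour(i):
        # v = x_i + phi * (x_i - x_k): perturb one dimension toward/away a peer
        nonlocal best, best_val
        k = rng.choice([j for j in range(n_sources) if j != i])
        d = rng.randrange(dim)
        v = foods[i][:]
        v[d] += rng.uniform(-1, 1) * (foods[i][d] - foods[k][d])
        v[d] = min(hi, max(lo, v[d]))
        val = cost(v)
        if val < vals[i]:                          # greedy selection
            foods[i], vals[i], trials[i] = v, val, 0
            if val < best_val:
                best, best_val = v[:], val
        else:
            trials[i] += 1

    history = [best_val]
    for _ in range(cycles):
        for i in range(n_sources):                 # employed bee phase
            try_neighbour(i)
        for _ in range(n_sources):                 # onlooker bee phase:
            # richer sources (lower cost) are chosen with higher probability
            i = rng.choices(range(n_sources),
                            weights=[1.0 / (1.0 + v) for v in vals], k=1)[0]
            try_neighbour(i)
        worst = max(range(n_sources), key=lambda j: trials[j])
        if trials[worst] > limit:                  # scout phase: abandon the source
            foods[worst] = [rng.uniform(lo, hi) for _ in range(dim)]
            vals[worst], trials[worst] = cost(foods[worst]), 0
        history.append(best_val)
    return best, best_val, history

sphere = lambda p: sum(x * x for x in p)
best, best_val, hist = abc(sphere)
```

The "limit" parameter does exactly what the text describes: a food source that fails to improve for more than `limit` trials is abandoned and its employed bee becomes a scout.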
Flow chart of ABC
[Flow chart: initialise a population of n scout bees → evaluate the fitness of the population → select m sites for neighbourhood search → determine the size of the neighbourhood (patch size ngh) → recruit bees for the selected sites (more bees for the best e sites) → select the fittest bee from each site → assign the (n−m) remaining bees to random search → form the new population of scout bees and repeat.]
Hybrid approaches
• ABC is good at exploration but poor at exploitation.
• There are some studies on the hybridization of the
PSO and ABC algorithms.
• Here, the hybridization of the techniques is driven by the
needs of the PSO algorithm.
• Particle Swarm Optimization has a handicap: it has no
mechanism for regenerating ineffective particles
that cannot improve their Pbest values. On the other
hand, the ABC algorithm contains a scout bee phase that
eliminates this handicap. For this reason,
the scout bee phase is added into Standard PSO to
improve its performance.
• The Standard PSO algorithm doesn’t contain a control
parameter to regenerate insufficient particles, i.e.
particles that cannot improve their Pbest value.
• Particles are updated without any diversity in Standard
PSO, and their adequacy is not controlled. PSO clearly
needs a control parameter to improve its convergence
capability, but this parameter must not increase the
convergence time significantly.
• Consequently, it looks similar to a reasonable idea to
insert the scout bee phase into the Standard PSO
algorithm. By adding the scout bee phase into PSO,
ScPSO is obtained . In ScPSO, all processes
(except limit) are the same with the PSO algorithm.
Pseudocode of ScPSO
-Initialize all particles within the user defined boundaries
(The first best position (Pbest) values are equal to the position of particles)
-Define a limit value within the range [1, (maximum iteration number-1)]
While (iteration number < maximum iteration number)
-Calculate fitness according to the cost function for all particles
-Update the best position values according to fitness values for all particles
-Choose the best Pbest vector as Gbest (the vector achieving the minimum cost)
-Calculate new positions according to following equations for all particles
Vi(t + 1) = ωVi(t)+c1r1(Xpbest(i)(t)-Xi(t))+c2r2(Xgbest(t)-Xi(t))
Xi(t + 1) = Xi(t)+Vi(t + 1)
-If a variable inertia weight is used, change it in accordance with the utilized rule
-Control all particles which exceed the parameter ‘limit’, then regenerate the useless ones
End
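The pseudocode above can be turned into a runnable sketch. The Python version below is illustrative only: the function names, parameter values (w, c1, c2, limit) and the sphere test function are assumptions of this example, not values prescribed by the slides. It applies the standard velocity/position updates and then the scout step, regenerating any particle whose Pbest has not improved for more than `limit` iterations.

```python
import random

def scpso(cost, bounds, n_particles=30, max_iter=200, limit=20,
          w=0.7, c1=1.5, c2=1.5):
    """Sketch of ScPSO: standard PSO plus an ABC-style scout phase."""
    dim = len(bounds)
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                # personal best positions
    pcost = [cost(x) for x in X]
    stall = [0] * n_particles                # iterations without Pbest improvement
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]

    for _ in range(max_iter):
        for i in range(n_particles):
            for d in range(dim):  # Vi(t+1) = w*Vi + c1r1(Pbest-Xi) + c2r2(Gbest-Xi)
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                lo, hi = bounds[d]
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            c = cost(X[i])
            if c < pcost[i]:                 # Pbest improved: reset the stall counter
                pbest[i], pcost[i], stall[i] = X[i][:], c, 0
                if c < gcost:
                    gbest, gcost = X[i][:], c
            else:
                stall[i] += 1
            if stall[i] > limit:             # scout phase: regenerate the useless particle
                X[i] = [random.uniform(lo, hi) for lo, hi in bounds]
                V[i] = [0.0] * dim
                stall[i] = 0
    return gbest, gcost

random.seed(0)
best, best_cost = scpso(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2)
```

On a smooth test function the scout step costs little, since only stalled particles are reset.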
Hybrid systems
• The combination of knowledge-based systems, neural networks and evolutionary
computation forms the core of an emerging approach to building hybrid intelligent
systems.
• Hybridizing the genetic algorithm with other methods, such as gradient descent
methods, helps to achieve a balance between robustness and efficiency.
• Start with a GA, a search heuristic which mimics evolution by taking a population of
strings that encode possible solutions and combining them based on a fitness
function to produce fitter individuals, and switch later to a gradient descent based
method.
• There has been great interest in combining learning and evolution with ANNs in
recent years.
• A GA-based ANN (ANN-GA) model, a hybrid integration of the ANN and GA
algorithms, may perform better by taking advantage of the characteristics
of both.
Evolutionary neural networks
• The architecture of the network often determines the success or failure of the
application.
• There is a great need for a method of automatically designing the architecture for a
particular application.
• Evolutionary computations are effective optimization techniques that can guide
both weight optimization and topology selection.
• Genetic algorithms may be well suited for this task.
• The basic idea behind evolving a suitable network architecture is to conduct a
genetic search in a population of possible architectures.
• The GA performs a global search capable of effectively exploring large search
spaces, and has been used to optimally design ANN parameters including
connection weights, ANN architectures and input selection.
Integrated back-propagation based genetic algorithm
BP/GA algorithm
1. Start: generate a random population of p chromosomes (suitable solutions
for the problem).
2. Extraction: extract the weights for the input-hidden-output layers from each
chromosome x.
3. Fitness: evaluate the fitness f(x) of each chromosome x in the population as
the reciprocal of the cumulative error obtained for each input set.
4. New population: create a new population by repeating the following steps until
the new population is complete:
– Selection: select two parent chromosomes from the population according to their fitness.
– Crossover: cross over the parents to form new offspring.
– Mutation: with a mutation probability, mutate the new offspring at each position in the
chromosome.
– Acceptance: place the new offspring in the new population.
5. Repeat steps 2 to 4 until the stopping condition is met.
6. Test: return the best solution in the current population using the test-set inputs
and the weights.
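The extraction/fitness/evolution loop above can be sketched as follows. This is a hedged illustration, not the slides' exact algorithm: the network size, data set, GA settings and all function names below are assumptions of this example. Each chromosome is a flat list of input-hidden and hidden-output weights, and fitness is the reciprocal of the cumulative squared error.

```python
import math
import random

def decode(chrom, n_in, n_hid):
    """Extraction step: split a flat chromosome into layer weight matrices."""
    cut = n_in * n_hid
    w_ih = [chrom[i * n_in:(i + 1) * n_in] for i in range(n_hid)]
    w_ho = chrom[cut:cut + n_hid]
    return w_ih, w_ho

def predict(chrom, x, n_in, n_hid):
    """Forward pass of a tiny one-hidden-layer network."""
    w_ih, w_ho = decode(chrom, n_in, n_hid)
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w_ih]
    return sum(w * h for w, h in zip(w_ho, hidden))

def fitness(chrom, data, n_in, n_hid):
    """Reciprocal of the cumulative (sum of squared) error."""
    sse = sum((predict(chrom, x, n_in, n_hid) - y) ** 2 for x, y in data)
    return 1.0 / (1.0 + sse)

def evolve(data, n_in=2, n_hid=3, pop_size=40, gens=80, pm=0.1):
    """Selection, crossover, mutation and acceptance, repeated for `gens` generations."""
    n_genes = n_in * n_hid + n_hid
    pop = [[random.uniform(-1, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda c: fitness(c, data, n_in, n_hid), reverse=True)
        new = ranked[:2]                                       # elitism: keep the two fittest
        while len(new) < pop_size:
            p1, p2 = random.sample(ranked[:pop_size // 2], 2)  # fitness-biased selection
            child = [random.choice(pair) for pair in zip(p1, p2)]  # uniform crossover
            for i in range(n_genes):                           # mutation
                if random.random() < pm:
                    child[i] += random.uniform(-1, 1)
            new.append(child)
        pop = new
    return max(pop, key=lambda c: fitness(c, data, n_in, n_hid))

random.seed(0)
data = [((-1.0, -1.0), 0.0), ((-1.0, 1.0), -1.0), ((1.0, -1.0), 1.0),
        ((1.0, 1.0), 0.0), ((0.5, 0.0), 0.25), ((0.0, 0.5), -0.25)]
best = evolve(data)          # evolves weights fitting y = 0.5 * (x1 - x2)
```

Because the fittest chromosomes are carried over unchanged, the best fitness is non-decreasing across generations.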
Hybrid approach
 The original population is a set of N randomly generated chromosomes.
• The fitness of each chromosome is computed by an error-minimization
method.
• The training set of examples is presented to the network and the sum of
squared errors is calculated.
• Fitness is defined as a simple function of the sum of squared errors: the
smaller the sum, the fitter the chromosome.
• The GA attempts to find a set of weights that minimizes the sum of squared
errors.
Hybrid approach
• The new population is given as input to the BPN to compute the fitness of each
chromosome, followed by selection, crossover and mutation to generate the
next population.
• This process is repeated until nearly all the chromosomes converge to
the same fitness value.
• The weights represented by the chromosomes in the final converged
population are the optimized connection weights of the BPN.
ANN weight optimization using GA
1. Encoding a set of weights in a chromosome.
 We must first choose a method of encoding the network's weights into a
chromosome.
 The second step is to define a fitness function for evaluating the chromosome's
performance. This function must estimate the performance of a given neural
network. We can apply here a simple function defined by the sum of squared
errors.
 The training set of examples is presented to the network, and the sum of squared
errors is calculated. The smaller the sum, the fitter the chromosome. The genetic
algorithm attempts to find a set of weights that minimises the sum of squared
errors.
 The third step is to choose the genetic operators: crossover and mutation. A
crossover operator takes two parent chromosomes and creates a single crossover
child with genetic material from both parents. Each gene in the child's
chromosome is represented by the corresponding gene of the randomly selected
parent.
 A mutation operator selects a gene in a chromosome and adds a small random
value between -1 and 1 to each weight in this gene.
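The two operators can be written directly from this description. A small illustrative sketch follows (the function names and the mutation probability are my own choices):

```python
import random

def crossover(parent1, parent2):
    """Create a single child; each gene comes from the corresponding
    gene of a randomly selected parent."""
    return [random.choice(pair) for pair in zip(parent1, parent2)]

def mutate(chrom, p=0.1):
    """With probability p per gene, add a small random value in [-1, 1]."""
    return [w + random.uniform(-1.0, 1.0) if random.random() < p else w
            for w in chrom]

child = crossover([0.1, 0.2, 0.3], [0.4, 0.5, 0.6])
mutant = mutate(child, p=0.5)
```

Every gene of the child is copied from one of the parents, and mutation perturbs each selected weight by at most 1 in either direction.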
Crossover in weight optimisation
Mutation in weight optimisation
Encoding the network architecture
 The network architecture is decided by trial and error; there is a great need
for a method of automatically designing the architecture for a particular
application.
 Genetic algorithms may well be suited for this task.
 The basic idea behind evolving a suitable network architecture is to
conduct a genetic search in a population of possible architectures.
 The connection topology of a neural network can be represented by a
square connectivity matrix.
 We must choose a method of encoding the network's architecture into a
chromosome.
Encoding of the network topology
 Each entry in the matrix defines the type of connection from one neuron
(column) to another (row), where 0 means no connection and 1 denotes
connection for which the weight can be changed through learning.
 To transform the connectivity matrix into a chromosome, we need only to
string the rows of the matrix together.
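For example, stringing the rows of a 3-neuron connectivity matrix together gives a 9-gene chromosome. A minimal sketch (the helper names are my own):

```python
def matrix_to_chromosome(conn):
    """String the rows of the square connectivity matrix together."""
    return [gene for row in conn for gene in row]

def chromosome_to_matrix(chrom, n):
    """Rebuild the n-by-n connectivity matrix from a flat chromosome."""
    return [chrom[i * n:(i + 1) * n] for i in range(n)]

# 1 = trainable connection from the column neuron to the row neuron, 0 = none
conn = [[0, 0, 0],
        [1, 0, 0],
        [1, 1, 0]]
chrom = matrix_to_chromosome(conn)   # [0, 0, 0, 1, 0, 0, 1, 1, 0]
```

The encoding is lossless: the chromosome can be reshaped back into the original matrix.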
The cycle of evolving a neural network topology
Conclusion
 Soft computing is an association of computing methodologies for
constructing computationally intelligent hybrid systems.
 The core techniques of SC are ANN, EC, fuzzy logic and probabilistic
reasoning, and hybridization is one of the central aspects of this field.
 The BP algorithm cannot guarantee an optimal solution. ECs are effective
optimization techniques that can guide both weight optimization and
topology selection.
 Integrated architectures like Neuro-Fuzzy, ANN-GA combinations and
ANN-SI techniques are some of the hybrid approaches used for
performance improvement.
 A direct EC technique may still fail to obtain an optimal solution. This clearly
shows the need for hybridizing EC techniques with other optimization
algorithms, machine learning techniques and heuristics.
Quiz link GA
• https://nptel.ac.in/content/storage2/courses/downloads_new/103103164/noc20_ch19_assigment_5.pdf
More Related Content

Similar to CI_GA_module2_ABC_updatedG.ppt .

AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...
AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...
AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...ijsc
 
An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...
An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...
An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...ijsc
 
IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...
IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...
IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...IRJET Journal
 
Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...
Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...
Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...ijsc
 
Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...
Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...
Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...AI Publications
 
32_Nov07_MachineLear..
32_Nov07_MachineLear..32_Nov07_MachineLear..
32_Nov07_MachineLear..butest
 
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...IAEME Publication
 
Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...IAEME Publication
 
A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data
A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data
A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data ijscai
 
Artificial Intelligence for Automated Decision Support Project
Artificial Intelligence for Automated Decision Support ProjectArtificial Intelligence for Automated Decision Support Project
Artificial Intelligence for Automated Decision Support ProjectValerii Klymchuk
 
Neural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlabNeural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlabijcses
 
Intro to machine learning
Intro to machine learningIntro to machine learning
Intro to machine learningAkshay Kanchan
 
EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...
EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...
EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...ijsc
 
CSA 3702 machine learning module 1
CSA 3702 machine learning module 1CSA 3702 machine learning module 1
CSA 3702 machine learning module 1Nandhini S
 
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...Sarvesh Kumar
 
Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...
Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...
Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...CSCJournals
 
Minimizing Musculoskeletal Disorders in Lathe Machine Workers
Minimizing Musculoskeletal Disorders in Lathe Machine WorkersMinimizing Musculoskeletal Disorders in Lathe Machine Workers
Minimizing Musculoskeletal Disorders in Lathe Machine WorkersWaqas Tariq
 
Lecture7 Ml Machines That Can Learn
Lecture7 Ml Machines That Can LearnLecture7 Ml Machines That Can Learn
Lecture7 Ml Machines That Can LearnKodok Ngorex
 

Similar to CI_GA_module2_ABC_updatedG.ppt . (20)

AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...
AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...
AN EFFICIENT PSO BASED ENSEMBLE CLASSIFICATION MODEL ON HIGH DIMENSIONAL DATA...
 
An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...
An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...
An Efficient PSO Based Ensemble Classification Model on High Dimensional Data...
 
IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...
IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...
IRJET-Performance Enhancement in Machine Learning System using Hybrid Bee Col...
 
Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...
Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...
Evolving Connection Weights for Pattern Storage and Recall in Hopfield Model ...
 
Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...
Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...
Prediction of Euro 50 Using Back Propagation Neural Network (BPNN) and Geneti...
 
32_Nov07_MachineLear..
32_Nov07_MachineLear..32_Nov07_MachineLear..
32_Nov07_MachineLear..
 
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
 
Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...
 
A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data
A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data
A Binary Bat Inspired Algorithm for the Classification of Breast Cancer Data
 
Artificial Intelligence for Automated Decision Support Project
Artificial Intelligence for Automated Decision Support ProjectArtificial Intelligence for Automated Decision Support Project
Artificial Intelligence for Automated Decision Support Project
 
Neural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlabNeural network based numerical digits recognization using nnt in matlab
Neural network based numerical digits recognization using nnt in matlab
 
Intro to machine learning
Intro to machine learningIntro to machine learning
Intro to machine learning
 
PPT
PPTPPT
PPT
 
EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...
EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...
EVOLVING CONNECTION WEIGHTS FOR PATTERN STORAGE AND RECALL IN HOPFIELD MODEL ...
 
CSA 3702 machine learning module 1
CSA 3702 machine learning module 1CSA 3702 machine learning module 1
CSA 3702 machine learning module 1
 
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
Model of Differential Equation for Genetic Algorithm with Neural Network (GAN...
 
Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...
Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...
Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Ne...
 
Minimizing Musculoskeletal Disorders in Lathe Machine Workers
Minimizing Musculoskeletal Disorders in Lathe Machine WorkersMinimizing Musculoskeletal Disorders in Lathe Machine Workers
Minimizing Musculoskeletal Disorders in Lathe Machine Workers
 
Membrane computing
Membrane computingMembrane computing
Membrane computing
 
Lecture7 Ml Machines That Can Learn
Lecture7 Ml Machines That Can LearnLecture7 Ml Machines That Can Learn
Lecture7 Ml Machines That Can Learn
 

More from Athar739197

META Finals '22 .pdf .
META Finals '22 .pdf                     .META Finals '22 .pdf                     .
META Finals '22 .pdf .Athar739197
 
2_Working_principles_of_Microsystems_revised.pdf
2_Working_principles_of_Microsystems_revised.pdf2_Working_principles_of_Microsystems_revised.pdf
2_Working_principles_of_Microsystems_revised.pdfAthar739197
 
1_Introduction_MEMS_&_Microsystems.pdf .
1_Introduction_MEMS_&_Microsystems.pdf  .1_Introduction_MEMS_&_Microsystems.pdf  .
1_Introduction_MEMS_&_Microsystems.pdf .Athar739197
 
CI_SIModule_QGIS.pptx .
CI_SIModule_QGIS.pptx                         .CI_SIModule_QGIS.pptx                         .
CI_SIModule_QGIS.pptx .Athar739197
 
CI_module1.pptx .
CI_module1.pptx                                   .CI_module1.pptx                                   .
CI_module1.pptx .Athar739197
 
LECTURE 9 ROPE DRIVES THEORY.pdf .
LECTURE 9 ROPE DRIVES THEORY.pdf         .LECTURE 9 ROPE DRIVES THEORY.pdf         .
LECTURE 9 ROPE DRIVES THEORY.pdf .Athar739197
 
2021 6th SEM EOM Online Class Introduction.pdf
2021 6th SEM EOM Online Class Introduction.pdf2021 6th SEM EOM Online Class Introduction.pdf
2021 6th SEM EOM Online Class Introduction.pdfAthar739197
 
LECTURE 4 BELTS DRIVES.pdf .
LECTURE 4 BELTS DRIVES.pdf                      .LECTURE 4 BELTS DRIVES.pdf                      .
LECTURE 4 BELTS DRIVES.pdf .Athar739197
 
Sliding contact Problem 2.pdf .
Sliding contact Problem 2.pdf                 .Sliding contact Problem 2.pdf                 .
Sliding contact Problem 2.pdf .Athar739197
 
1 - Introduction to Management-merged.pdf
1 - Introduction to Management-merged.pdf1 - Introduction to Management-merged.pdf
1 - Introduction to Management-merged.pdfAthar739197
 
ENTREPRENEURSHIP EOM.pdf .
ENTREPRENEURSHIP EOM.pdf                 .ENTREPRENEURSHIP EOM.pdf                 .
ENTREPRENEURSHIP EOM.pdf .Athar739197
 
Theory z and US Japan management Onlline classes 2020.pdf
Theory z and US Japan management Onlline classes 2020.pdfTheory z and US Japan management Onlline classes 2020.pdf
Theory z and US Japan management Onlline classes 2020.pdfAthar739197
 
market analysis eom.pdf .
market analysis eom.pdf                    .market analysis eom.pdf                    .
market analysis eom.pdf .Athar739197
 
BusinessPlan eom.pdf ..
BusinessPlan eom.pdf                  ..BusinessPlan eom.pdf                  ..
BusinessPlan eom.pdf ..Athar739197
 
4. Market strategy 5C, 4P, STP.pdf .
4. Market strategy 5C, 4P, STP.pdf         .4. Market strategy 5C, 4P, STP.pdf         .
4. Market strategy 5C, 4P, STP.pdf .Athar739197
 
Controlling.pdf .
Controlling.pdf                            .Controlling.pdf                            .
Controlling.pdf .Athar739197
 
Controlling detailed.pdf .
Controlling detailed.pdf                  .Controlling detailed.pdf                  .
Controlling detailed.pdf .Athar739197
 
Motivation Online classes 2020.pdf .
Motivation Online classes 2020.pdf     .Motivation Online classes 2020.pdf     .
Motivation Online classes 2020.pdf .Athar739197
 

More from Athar739197 (20)

META Finals '22 .pdf .
META Finals '22 .pdf                     .META Finals '22 .pdf                     .
META Finals '22 .pdf .
 
2_Working_principles_of_Microsystems_revised.pdf
2_Working_principles_of_Microsystems_revised.pdf2_Working_principles_of_Microsystems_revised.pdf
2_Working_principles_of_Microsystems_revised.pdf
 
1_Introduction_MEMS_&_Microsystems.pdf .
1_Introduction_MEMS_&_Microsystems.pdf  .1_Introduction_MEMS_&_Microsystems.pdf  .
1_Introduction_MEMS_&_Microsystems.pdf .
 
ERGO 2.pptx .
ERGO 2.pptx                                              .ERGO 2.pptx                                              .
ERGO 2.pptx .
 
CI_SIModule_QGIS.pptx .
CI_SIModule_QGIS.pptx                         .CI_SIModule_QGIS.pptx                         .
CI_SIModule_QGIS.pptx .
 
CI_module1.pptx .
CI_module1.pptx                                   .CI_module1.pptx                                   .
CI_module1.pptx .
 
Brakes.pdf .
Brakes.pdf                                  .Brakes.pdf                                  .
Brakes.pdf .
 
LECTURE 9 ROPE DRIVES THEORY.pdf .
LECTURE 9 ROPE DRIVES THEORY.pdf         .LECTURE 9 ROPE DRIVES THEORY.pdf         .
LECTURE 9 ROPE DRIVES THEORY.pdf .
 
2021 6th SEM EOM Online Class Introduction.pdf
2021 6th SEM EOM Online Class Introduction.pdf2021 6th SEM EOM Online Class Introduction.pdf
2021 6th SEM EOM Online Class Introduction.pdf
 
LECTURE 4 BELTS DRIVES.pdf .
LECTURE 4 BELTS DRIVES.pdf                      .LECTURE 4 BELTS DRIVES.pdf                      .
LECTURE 4 BELTS DRIVES.pdf .
 
Sliding contact Problem 2.pdf .
Sliding contact Problem 2.pdf                 .Sliding contact Problem 2.pdf                 .
Sliding contact Problem 2.pdf .
 
1 - Introduction to Management-merged.pdf
1 - Introduction to Management-merged.pdf1 - Introduction to Management-merged.pdf
1 - Introduction to Management-merged.pdf
 
ENTREPRENEURSHIP EOM.pdf .
ENTREPRENEURSHIP EOM.pdf                 .ENTREPRENEURSHIP EOM.pdf                 .
ENTREPRENEURSHIP EOM.pdf .
 
Theory z and US Japan management Onlline classes 2020.pdf
Theory z and US Japan management Onlline classes 2020.pdfTheory z and US Japan management Onlline classes 2020.pdf
Theory z and US Japan management Onlline classes 2020.pdf
 
market analysis eom.pdf .
market analysis eom.pdf                    .market analysis eom.pdf                    .
market analysis eom.pdf .
 
BusinessPlan eom.pdf ..
BusinessPlan eom.pdf                  ..BusinessPlan eom.pdf                  ..
BusinessPlan eom.pdf ..
 
4. Market strategy 5C, 4P, STP.pdf .
4. Market strategy 5C, 4P, STP.pdf         .4. Market strategy 5C, 4P, STP.pdf         .
4. Market strategy 5C, 4P, STP.pdf .
 
Controlling.pdf .
Controlling.pdf                            .Controlling.pdf                            .
Controlling.pdf .
 
Controlling detailed.pdf .
Controlling detailed.pdf                  .Controlling detailed.pdf                  .
Controlling detailed.pdf .
 
Motivation Online classes 2020.pdf .
Motivation Online classes 2020.pdf     .Motivation Online classes 2020.pdf     .
Motivation Online classes 2020.pdf .
 

Recently uploaded

Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineeringmalavadedarshan25
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZTE
 
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Serviceranjana rawat
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSSIVASHANKAR N
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSCAESB
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxAsutosh Ranjan
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).pptssuser5c9d4b1
 
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...
High Profile Call Girls Nashik Megha 7001305949 Independent Escort Service Na...Call Girls in Nagpur High Profile
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINESIVASHANKAR N
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 

Recently uploaded (20)

Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineering
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
 
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
 
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
(RIA) Call Girls Bhosari ( 7001035870 ) HI-Fi Pune Escorts Service
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
CI_GA_module2_ABC_updatedG.ppt

  • 1. Introduction  The idea behind soft computing is to model the cognitive behavior of the human mind.  Soft computing, introduced by Zadeh, is an innovative approach to constructing computationally intelligent hybrid systems.  Soft computing is an association of computing methodologies which collectively provide a foundation for the conception and deployment of intelligent systems[1].  Soft computing is an approach for constructing systems which: • are computationally intelligent and possess human-like expertise in a particular domain; • adapt to a changing environment and learn to do better.
  • 2. Some Domains of Intelligence in Biological Systems (Computational Perspective): Evolution, Competition, Reproduction, Swarming, Communication, Learning
  • 3. Soft Computing Models  Components of SC include: • Fuzzy Logic (FL) • Artificial Neural Network (ANN) • Evolutionary Computation (EC), based on the origin of the species • Evolutionary Algorithm (EA): copying ideas from nature • Swarm algorithms or Swarm Intelligence (SI): groups of species or animals exhibiting collective behavior • Hybrid Systems  Soft computing models employ different techniques like ANN, FL, EAs and SI in a complementary rather than a competitive way.  Integrated architectures like Neuro-Fuzzy, ANN-EA and ANN-SI combinations are some of the hybrid approaches used for performance improvements.
  • 4. Biological Basis: Neural Networks & EC  The complex adaptive biological structure of the human brain facilitates the performance of complex tasks.  The processing element in an ANN is generally considered to be very roughly analogous to a biological neuron.  Neural networks also have ties with genetics, the "branch of biology that deals with the heredity and variation of organisms"[3].  Chromosomes: structures in cell bodies that transmit genetic information; individual patterns in an EA correspond to chromosomes in biological systems.  The genotype completely specifies an organism; in EC, a structure specifies a system.  Swarm Intelligence is the newest member of EC: nature-inspired algorithms which mimic insects' problem-solving abilities[4].
  • 5. Artificial Neural Network  The basic concept of an artificial neural network (ANN) is derived from an analogy with the biological nervous system of the human brain.  An ANN is a massively parallel system composed of many neurons, where synapses are actually variable weights specifying the connections between individual neurons[2].  The neurons continuously evaluate their output by looking at their inputs, calculating the weighted sum and comparing it to a threshold to decide whether they should fire.  The learning algorithm presents the inputs and adjusts the weights to produce the required output.  ANNs are algorithms for optimization and learning based loosely on concepts inspired by the nature of the brain.
  • 6. Two-layer feed-forward network  Just as individuals learn differently, neural networks have different learning rules. Learning may be supervised or unsupervised.  Supervised learning requires that when the input stimuli are applied, the desired output is known a priori.  The most popular algorithm for adjusting weights during the training phase is called back-propagation of error. • The feed-forward neural network is the simplest form of an ANN. [Figure: a 2-3-1 feed-forward network with input layer (X1, X2), hidden layer (H1, H2, H3), output layer (O), and weights Wi,j on the connections]
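A forward pass through such a 2-3-1 network can be sketched as follows. This is a minimal illustration rather than code from the slides; the sigmoid activation and the weight values are assumptions made for the example.

```python
import math

def sigmoid(z):
    # logistic activation: squashes the weighted sum into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_ih, w_ho):
    # hidden layer: each H_j computes the weighted sum of the inputs
    hidden = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_ih]
    # output layer: O computes the weighted sum of the hidden activations
    return sigmoid(sum(w * h for w, h in zip(w_ho, hidden)))

# illustrative weights for a 2-3-1 topology (X1, X2 -> H1..H3 -> O)
w_ih = [[0.5, -0.2], [0.3, 0.8], [-0.6, 0.1]]   # one row per hidden neuron
w_ho = [0.7, -0.4, 0.2]
y = forward([1.0, 0.0], w_ih, w_ho)
```

The output is a single value in (0, 1); a real network would compare it against a threshold or target during training.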
  • 7. Error correction learning  Error correction learning, used with supervised learning, is the technique of comparing the system output to the desired output value and using that error to direct the training.  It is formulated as the minimization of an error function, such as the total mean square error between the actual output and the desired output, summed over all available data.  The most popular learning algorithm for use with error correction learning is the back-propagation algorithm (BP).  The delta rule is often utilized by the most common class of ANNs, called back-propagation neural networks.
  • 8. Gradient Descent Optimization  A gradient descent based optimization algorithm such as BP can be used to adjust connection weights in the ANN iteratively in order to minimize the error.  BP is a variation on gradient search; the key to BP is a method for calculating the gradient of the error with respect to the weights for a given input by propagating the error backwards through the network.  When a neural network is initially presented with a pattern, it makes a random guess as to what it might be.  It then sees how far its answer was from the actual one and makes an appropriate adjustment to its connection weights[2].
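As a sketch of this error-driven adjustment, the delta rule for a single linear neuron moves each weight in proportion to the error times the input; the training data and learning rate below are illustrative assumptions.

```python
# Delta-rule (gradient-descent) updates for one linear neuron y = w*x + b,
# minimizing the squared error E = 0.5 * (d - y)^2 per sample.
def train(samples, lr=0.1, epochs=100):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, d in samples:
            y = w * x + b          # actual output
            err = d - y            # desired minus actual
            w += lr * err * x      # -dE/dw = (d - y) * x
            b += lr * err          # -dE/db = (d - y)
    return w, b

# learn y = 2x + 1 from three consistent points
w, b = train([(0, 1), (1, 3), (2, 5)])
```

Because the data is exactly fittable, the error (and hence the updates) shrink to zero and the weights settle near w = 2, b = 1.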
  • 9. Limitations  Neural networks are used for solving a variety of problems, but they still have some limitations. One of the most common is associated with neural network training.  The BP learning algorithm cannot guarantee an optimal solution.  In real-world applications, the BP algorithm might converge to a set of suboptimal weights from which it cannot escape, so the neural network is often unable to find a desirable solution to the problem at hand.  Another difficulty is related to selecting an optimal topology for the neural network.  Evolutionary computations are effective optimization techniques that can guide both weight optimization and topology selection[6].
  • 10. Optimization • The main goal of optimization is to find values of the variables that minimize or maximize the objective function. • The main components of an optimization problem are the objective function, which we want to minimize or maximize, and the design variables. • Modelling is the process of identifying the objective function and variables. • Formulating a good mathematical model of the optimization problem requires algorithms with robustness, efficiency and accuracy. • Optimization algorithms are classified into: – Local Optimization – Global Optimization
  • 11. Issues in evolving Neural Networks  EC methodologies have mainly been applied to four attributes of neural networks:  network connection weights and network architecture,  network learning algorithm and evolution of inputs.  The architecture of the network often determines the success or failure of the application, and usually the network architecture is decided by trial and error.  There is a great need for a method of automatically designing the architecture for a particular application.  Both the network architecture and the connection weights need to be adapted simultaneously or sequentially.  Thus EC methodologies have been applied to evolve both the network weights and the network topology.
  • 12. Evolutionary Computation - EC  Evolutionary Computation (EC) refers to computer-based problem solving systems that use computational models of the evolutionary process.  EC is the study of computational systems which use ideas and inspirations from natural evolution and other biological systems.  EC is based on biological metaphors[4].  Two biological metaphors, which give the two important classes of population-based optimization algorithms, are: • Evolutionary algorithms • Swarm algorithms  EC techniques are used in optimization, machine learning and automatic design.
  • 13. Evolutionary Algorithms  GA is a specific class of EC that performs a stochastic search by using the basic principles of natural selection.  GAs are algorithms for optimization and learning based loosely on several features of biological evolution[3].  GAs are inspired by Darwin's theory of natural evolution.  Formally introduced in the US in 1975 by John Holland and referred to as the simple genetic algorithm (SGA).  Used to optimize a given objective function, where parameters are encoded in something analogous to a gene.  A GA applies biological principles in a computational algorithm to obtain the optimum solutions and is a robust method for searching for the optimum solution to a complex problem.
  • 14. Evolutionary Algorithms  EAs are optimization methods based on an evolutionary metaphor that have proved effective in solving difficult problems.  The four main processes in evolutionary algorithms are: • Initialization • Fitness evaluation • Selection • Generation of a new population  After initialization, the population is evaluated and the stopping criteria are checked.  If none of the stopping criteria is met, a new population is generated and the process is repeated.
  • 15. Framework of Genetic algorithm
1. t := 0;
2. Generate initial population P(t) at random;
3. Evaluate the fitness of each individual in P(t);
4. while (not termination condition) do
5.   Select parents Pa(t) from P(t) based on their fitness in P(t);
6.   Apply crossover to create offspring from parents: Pa(t) -> O(t)
7.   Apply mutation to the offspring: O(t) -> O(t)
8.   Evaluate the fitness of each individual in O(t);
9.   Select population P(t+1) from current offspring O(t) and parents P(t);
10.  t := t+1;
11. end-do
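A runnable sketch of this framework; the parameter values are illustrative, and roulette-wheel selection with one-point crossover is assumed, matching the operators discussed later in the deck.

```python
import random

def genetic_algorithm(fitness, n_bits=5, pop_size=4, pc=0.9, pm=0.01,
                      generations=50):
    # initialize a random population of bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        total = sum(scores)

        def pick():
            # roulette-wheel (fitness-proportionate) parent selection
            r = random.uniform(0, total)
            acc = 0.0
            for ind, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return ind
            return pop[-1]

        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick()[:], pick()[:]
            if random.random() < pc:               # one-point crossover
                cut = random.randint(1, n_bits - 1)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):                 # bitwise mutation
                nxt.append([b ^ 1 if random.random() < pm else b for b in child])
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

def f(bits):
    # decode the 5-bit string and return x^2 (the deck's later worked example)
    x = int("".join(map(str, bits)), 2)
    return x * x

best = genetic_algorithm(f)
```

With this tiny population the result is stochastic, but the loop mirrors steps 1-11 above one-to-one.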
  • 16. Genetic Algorithms (II) [Diagram: the GA cycle. A population of individuals (alternative feasible solutions) is evaluated on fitness; individuals are selected based on fitness into a mating pool of "fitter" individuals; selected individuals exchange characteristics (heredity) to create new individuals; some characteristics are arbitrarily changed; the result is the next generation of individuals.]
  • 17. Genetic Algorithms (IV) Basic tasks: • Generation of initial population • Evaluation • Selection (reproduction operation) • Exchange characteristics to develop new individuals (crossover operation) • Arbitrarily modify characteristics in new individuals (mutation operation)
  • 18. Genetic Algorithms (V) Reproduction / Selection Operator  The purpose is to bias the mating pool (those who can pass on their traits to the next generation) towards fitter individuals.  Fitness-proportionate scheme: assign p as the probability of choosing an individual for the mating pool, with p proportional to the fitness; choose an individual with probability p and place it in the mating pool; continue till the mating pool size is the same as the initial population's.  Tournament scheme: choose n individuals randomly, pick the one with the highest fitness, and place n copies of this individual in the mating pool; choose n different individuals and repeat the process till all in the original population have been chosen.
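The tournament scheme can be sketched as follows; for simplicity this version copies one winner per tournament into the pool (rather than n copies), and the population and fitness function are made-up examples.

```python
import random

def tournament_select(pop, fitness, n=2):
    # repeatedly choose n individuals at random and copy the fittest
    # into the mating pool, until the pool matches the population size
    pool = []
    while len(pool) < len(pop):
        contenders = random.sample(pop, n)     # n distinct individuals
        pool.append(max(contenders, key=fitness))
    return pool

pop = [3, 7, 1, 9, 4, 6]                       # individuals encoded as integers
pool = tournament_select(pop, fitness=lambda x: x * x)
```

Note the selection pressure: with 2-way tournaments over distinct individuals, the weakest member can never enter the pool.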
  • 19. Genetic Algorithms (VI) Crossover operator [Figure: one-point crossover. Parents 1001101 and 1100111, with the crossover point after bit 4, produce offspring 1001111 and 1100101.]
  • 20. Genetic Algorithms (VII) Mutation [Figure: mutation. A single bit of 1001101 is flipped, giving 1000101.]
  • 35. Components of a GA  A problem to solve… • Encoding technique: representation of individuals (gene, chromosome) • Initialization procedure (creation) • Evaluation function (environment) • Selection of parents (reproduction) • Genetic operators (mutation, recombination) • Parameter settings (practice and art)
  • 36. GA working principle [Flowchart: Start → create initial random population → evaluate fitness for each member of the population → store best individuals → create mating pool (selection) → create next generation using crossover → perform mutation → if an optimal or good solution has been found, stop; otherwise return to fitness evaluation.]
  • 37. Basic Genetic Algorithm  [Start] The algorithm begins with a set of initial solutions (represented by a set of chromosomes) called the population.  [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.  Repeat until the terminating condition is satisfied: • [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance to be selected). • [Crossover] Cross over the parents to form new offspring (children). If no crossover is performed, the offspring are exact copies of the parents. • [Mutation] Mutate the new offspring at selected position(s) in the chromosome. • [Accepting] Generate the new population by placing the new offspring.  Return the best solution in the current population.
  • 38. An example after Goldberg  Simple problem: maximize x^2 over {0, 1, …, 31}  GA approach: • Representation: binary code, e.g. 01101 ↔ 13 • Population size: 4 • One-point crossover, bitwise mutation • Roulette wheel selection • Random initialization
  • 39. Initial population  Encoding: • Code the decision variable 'x' into a finite-length string. Using a five-bit unsigned integer, numbers between 0 (00000) and 31 (11111) can be obtained. • The objective function here is f(x) = x^2, which is to be maximized.  Initial population: • An initial population of size 4 is randomly chosen: {12, 25, 5, 19} • Then we obtain the decoded x values for the initial population generated:
String # | X value | Binary code
1 | 12 | 01100
2 | 25 | 11001
3 | 5  | 00101
4 | 19 | 10011
  • 40. Fitness evaluation  Objective function: • Calculate the fitness or objective function for each individual. • This is obtained by simply squaring the 'x' value, since the given function is f(x) = x^2.  Probability of selection: compute the probability of selection as follows: • For string 1, fitness f(x1) = 144, and Σf(xi) = 1155. • The probability that string 1 is selected is 144/1155 = 0.1247 = 12.47%. • Similarly, calculate for all strings.
  • 41. Roulette Selection  Roulette selection: expected count • The expected and actual count method is used for roulette selection. • The next step is to calculate the expected count. • For string 1, expected count = fitness/average = 144/288.75 = 0.4987.  The expected count gives an idea of which individuals can be selected for further processing in the mating pool.  Roulette selection: actual count • The actual count is obtained to select the individuals which will participate in the crossover cycle, using roulette wheel selection.
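The slide's numbers can be reproduced directly; this small sketch uses the deck's own population {12, 25, 5, 19} and f(x) = x^2.

```python
# Population and fitness from the worked example: f(x) = x^2
population = [12, 25, 5, 19]
fitness = [x * x for x in population]              # [144, 625, 25, 361]
total = sum(fitness)                               # 1155
average = total / len(population)                  # 288.75
probs = [f / total for f in fitness]               # selection probabilities
expected = [f / average for f in fitness]          # expected counts
```

String 1 then gets probability 144/1155 ≈ 12.47% and expected count 144/288.75 ≈ 0.4987, as on the slides.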
  • 42. Roulette wheel Selection  The roulette wheel is formed as follows: String 1 (12): 12.47%; String 2 (25): 54.11%; String 3 (5): 2.16%; String 4 (19): 31.26%. The maximum-fitness string (625) holds 54.11% of the wheel, with expected count 2.1645 and actual count 2.
  • 43. Mating pool  String 1 occupies 12.47%, so there is a chance for it to occur at least once; hence its actual count may be 1. With string 2 occupying 54.11% of the roulette wheel, it has a fair chance of being selected more than once, so its actual count can be taken as 2.  On the other hand, string 3 has the least probability, 2.16%, so its chance of occurring in the next cycle is very poor; as a result, its actual count is 0.  String 4, with 31.26%, has at least one chance of occurring while the roulette wheel is spun, so its actual count is 1.  Based on the actual counts, the mating pool is formed as follows:
String # | X value | Mating pool
1 | 12 | 01100
2 | 25 | 11001
2 | 25 | 11001
4 | 19 | 10011
  • 44. Crossover  The crossover operation is performed to produce new offspring (children).  The crossover probability is assumed to be 1.0.  The crossover point is chosen randomly, and based on it single-point crossover is performed and new offspring are produced:
• String 1 (12): mating pool 0110|0, cross point 4 → offspring 01101 (13)
• String 2 (25): mating pool 1100|1, cross point 4 → offspring 11000 (24)
• String 2 (25): mating pool 11|001, cross point 2 → offspring 11011 (27)
• String 4 (19): mating pool 10|011, cross point 2 → offspring 10001 (17)
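The offspring in the table can be checked with a tiny single-point crossover routine (a sketch; cut positions are counted from the left, as on the slide):

```python
def one_point_crossover(a, b, cut):
    # swap the tails of two bit strings after the cut point
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

c1, c2 = one_point_crossover("01100", "11001", 4)   # strings 1 and 2, cut = 4
c3, c4 = one_point_crossover("11001", "10011", 2)   # strings 2 and 4, cut = 2
```

Decoding the results reproduces the table: 01101 → 13, 11000 → 24, 11011 → 27, 10001 → 17.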
  • 45. Mutation  The mutation operation is performed to produce new offspring after the crossover operation.  We select the bit-flipping operation to be performed, and new offspring are produced.  The mutation probability is assumed to be 0.001.
• Offspring 1 (13): code 01101, mutation chromosome 10000 → 11101 (29)
• Offspring 2 (24): code 11000, mutation chromosome 00000 → 11000 (24)
• Offspring 3 (27): code 11011, mutation chromosome 00000 → 11011 (27)
• Offspring 4 (17): code 10001, mutation chromosome 00100 → 10101 (21)
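The flipping operation in the table is a bitwise XOR of each offspring with its mutation chromosome; a sketch that reproduces the table's values:

```python
def mutate(chromosome, mask):
    # flip every bit of the chromosome where the mutation mask holds a 1
    return "".join(str(int(c) ^ int(m)) for c, m in zip(chromosome, mask))

m1 = mutate("01101", "10000")   # offspring 1: 13 -> 29
m2 = mutate("11000", "00000")   # offspring 2: unchanged, stays 24
m4 = mutate("10001", "00100")   # offspring 4: 17 -> 21
```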
  • 46. Evaluation  Once selection, crossover and mutation are performed, the new population is ready to be tested. The population and the corresponding fitness values are now ready for another round, producing another generation; more generations are produced until some stopping criterion is met.  Note how the maximal and average performance have improved in the new population: the population's average fitness has improved from 288.75 to 646.75 in one generation, and the maximum fitness has increased from 625 to 841 during the same period.  This example has shown one generation of a simple genetic algorithm; many generations can be produced to obtain more optimal solutions.
  • 47. Example of Genetic Algorithm
  • 53. Testing GA • It cannot be said with certainty that the genetic algorithm has found the global minimum value. Only by testing the algorithm on analytical benchmark functions can you verify that the algorithm is correct. • In other cases, you should compare the results with laboratory data, or find another way to make sure the answers are correct: for example, a way to predict the order of magnitude of the optimal points. • Looking at the fitness of the best solution found so far can be a good sign; but in general, if you have no idea of the global optimum, let the optimisation progress until the rate of improvement is negligible.
  • 56. Advantages / disadvantages of GA • Advantages: – parallelism – the solution space searched is wide • Disadvantages: – the problem of finding a fitness function – the definition of a representation of the problem – premature convergence can occur – parameter sensitivity – An effective GA representation and a meaningful fitness evaluation are the keys to success in GA applications.
  • 58. Some randomly generated chromosomes made of 8 genes representing 8 weights for a BPN
  • 62. Steps
  • 64. Swarm Intelligence • Swarm Intelligence has two fundamental concepts: • Self-organization – positive feedback: amplification – negative feedback: balancing – fluctuations – multiple interactions • Division of labor – simultaneous task performance by cooperating specialized individuals – enables the swarm to respond to changed conditions in the search space.
  • 65. Swarm algorithms  Mimic emergent behaviors observed in social animals on computer systems[4]: • bacteria • the immune system • ants (ACO), honey bees (ABC) • birds (PSO) and other social animals.  Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) are widely used Swarm Intelligence based methods.  PSO is inspired by the simulation of social behavior related to bird flocking, fish schooling and swarming theory.  ABC is inspired by the simulation of foraging behavior related to real honey bees and swarming theory.
  • 66. Particle Swarm Optimization  Inspired by the simulation of social behavior related to bird flocking, fish schooling and swarming theory: - steer toward the center - match neighbors' velocity - avoid collisions  Suppose a group of birds is randomly searching for food in an area: • There is only one piece of food in the area being searched. • None of the birds knows where the food is, but in each iteration they know how far away the food is. • So what is the best strategy to find the food? The effective one is to follow the bird which is nearest to the food.
  • 67. Overview of basic PSO  Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm used to find a solution to an optimization problem in a search space.  It was developed by Eberhart and Kennedy in 1995, inspired by the social behavior of bird flocking and fish schooling.  How can birds or fish exhibit such coordinated collective behavior?
  • 68. PSO  PSO is a robust stochastic optimization technique based on the movement and intelligence of swarms; it applies this concept of social interaction to problem solving.  It uses a number of agents (particles) that constitute a swarm moving around in the search space looking for the best solution.  Each particle is treated as a point in an N-dimensional space which adjusts its "flying" according to its own flying experience as well as the flying experience of other particles.  Each particle keeps track of the coordinates in the solution space associated with the best solution (fitness) it has achieved so far, called pbest, and the best value obtained so far by any particle in the neighborhood of that particle, called gbest.
  • 69. PSO  In PSO, each single solution is a "bird" in the search space, called a "particle". • All particles have fitness values, which are evaluated by the fitness function to be optimized.  All particles have velocities which direct their flight; the particles fly through the problem space by following the current optimum particles.  Initialize with randomly generated particles and update through generations in search of optima. • Each particle has a velocity and a position. • The update for each particle uses two "best" values: • pbest: the best solution (fitness) it has achieved so far (the fitness value is also stored); • gbest: the best value obtained so far by any particle in the population.
  • 70. PSO  Each particle tries to modify its position using the following information: • the current positions and the current velocities, • the distance between the current position and pbest, • the distance between the current position and gbest.  The modification of the particle's velocity can be mathematically modeled by the following equation:
v_i^(k+1) = w·v_i^k + c1·rand1·(pbest_i − s_i^k) + c2·rand2·(gbest − s_i^k)
where v_i^k is the velocity of agent i at iteration k, w is the weighting (inertia) function, c1 and c2 are the weighting (learning) factors, rand1 and rand2 are uniformly distributed random numbers between 0 and 1, s_i^k is the current position of agent i at iteration k, pbest_i is the pbest of agent i, and gbest is the gbest of the group.
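The velocity update can be written directly from the equation above; the coefficient values and the example positions here are illustrative assumptions.

```python
import random

def update_velocity(v, s, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    # per dimension: v_k+1 = w*v_k + c1*rand1*(pbest - s) + c2*rand2*(gbest - s)
    return [w * vi
            + c1 * random.random() * (pb - si)
            + c2 * random.random() * (gb - si)
            for vi, si, pb, gb in zip(v, s, pbest, gbest)]

v_new = update_velocity(v=[0.0, 0.0], s=[1.0, 2.0],
                        pbest=[1.5, 1.0], gbest=[2.0, 2.0])
```

In the first dimension both pbest and gbest lie above the current position, so the new velocity component is non-negative; in the second, only pbest pulls (downward), so it is non-positive.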
  • 72. PSO algorithm  Let the particle swarm move towards the best position in the search space, remembering each particle's best known position and the global (swarm's) best known position.  Let x_i be a specific particle, v_i the particle's velocity, p_i the particle's (personal) best known position, and g the swarm's (global) best known position. Then
v_i ← ω·v_i + φp·rp·(p_i − x_i) + φg·rg·(g − x_i)   (inertia + cognitive + social)
and x_i ← x_i + v_i
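Putting the two update rules into a full loop gives a minimal PSO; the sphere test function, the search bounds, and the parameter values are illustrative assumptions, not from the slides.

```python
import random

def pso(f, dim=2, n_particles=10, iters=100, w=0.7, cp=1.5, cg=1.5):
    # track each particle's personal best P and the swarm's global best g,
    # then apply the velocity and position updates every iteration
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal best positions
    g = min(P, key=f)[:]                       # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                rp, rg = random.random(), random.random()
                V[i][d] = (w * V[i][d] + cp * rp * (P[i][d] - X[i][d])
                           + cg * rg * (g[d] - X[i][d]))
                X[i][d] += V[i][d]             # position update
            if f(X[i]) < f(P[i]):              # improve personal best
                P[i] = X[i][:]
                if f(P[i]) < f(g):             # improve global best
                    g = P[i][:]
    return g

sphere = lambda x: sum(xi * xi for xi in x)    # minimum at the origin
best = pso(sphere)
```

On this smooth 2-D function the swarm reliably contracts toward the origin within a hundred iterations.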
  • 88. ANN weight optimization process using PSO • The PSO technique is used for the weight optimization of a feed-forward neural network structure. The network is pre-trained using the PSO to arrive at the initial network weights. The searching process of the PSO-BP algorithm starts by initializing the positions and velocities of the particles. In this case, a particle is a group of the weights of the feed-forward neural network structure. There are 13 weights for the 2-3-1 feed-forward neural network node topology, and thus each particle consists of 13 real numbers, as shown in Fig. 1.
  • 89. ANN weight optimization process using PSO
  • 90. Methodology • The present work integrates the PSO with the back-propagation algorithm to form a hybrid learning algorithm for training feed-forward neural networks. • In the proposed work, the PSO and ANN algorithms are integrated to increase the efficiency of finding the global optimum. The forecasting models were developed using the historical groundwater level and rainfall data recorded from three observation wells located in Udupi district, India. • The water level and rainfall data of the observation wells located in Brahmavar, Kundapur, and Hebri taluks were used for the years 2000-2013. • The groundwater in these regions mainly occurs under water table conditions. The PSO is used to evolve the neural network weights.
  • 92. • The particles are evaluated and updated until a new generation of particles is generated. The Root Mean Square Error (RMSE) is used as the fitness function. • This searching procedure is repeated to find the global best position in the search space. If a particle's current fitness is better than its personal best, the current position replaces pBest; the pBest with the minimum fitness value among all particles is taken as gBest. • Based on the pBest, the gBest, and the current values, the updated velocity is computed. • The particle position is updated based on the updated velocity. The process is repeated for iterations until a minimum error is obtained.
  • 93. • The PSO can be applied to train the ANN, and this process is iterated until a minimum error is reached. Thus, the PSO is integrated with the ANN in order to search for the optimal weights for the network. • Finally, the network is trained using the updated weights, and the trained network is used to forecast the groundwater level of the testing set.
  • 94. • The analysis is performed to forecast the groundwater levels for the different input combinations identified at all three well locations. Initially, nine years (2000-2008) of data are taken as the training set and the groundwater level is forecasted for 2009.
  • 95. • A comparison was made between the values predicted using the BP and the hybrid ANN-PSO algorithms. The forecasted groundwater levels using the ANN and ANN-PSO models during testing, for the wells of the study area, are shown graphically in Fig. 3 to Fig. 8.
  • 99. Artificial Bee Colony • The ABC algorithm is one of the most recently introduced swarm-based optimization algorithms, proposed by Karaboga (2005). • ABC simulates the intelligent foraging behavior of a honeybee swarm. • It is based on observing the behavior of honey bees in finding nectar and sharing the information about food sources with the bees in the hive. • Observations and studies of honey bee behavior resulted in a new generation of optimization algorithms called the "Artificial Bee Colony". • Karaboga described the Artificial Bee Colony (ABC) algorithm, based on the foraging behavior of honey bees, for numerical optimization problems.
  • 100. Behavior of Honey Bee Swarm  Three essential components of forage selection: • Food Sources: the value of a food source depends on many factors, such as its proximity to the nest, its richness or concentration of energy, and the ease of extracting this energy. • Employed Foragers: associated with a particular food source which they are currently exploiting, or at which they are "employed". They carry with them information about this particular source (its distance and direction from the nest and the profitability of the source) and share this information with a certain probability. • Unemployed Foragers: continually on the lookout for a food source to exploit. There are two types of unemployed foragers: scouts, searching the environment surrounding the nest for new food sources, and onlookers, waiting in the nest and establishing a food source through the information shared by employed foragers.
  • 101. Exchange of Information among bees • The model defines two leading modes of behavior: – recruitment to a nectar source – abandonment of a source. • The exchange of information among bees is the most important occurrence in the formation of collective knowledge. • The most important part of the hive with respect to exchanging information is the dancing area. • Communication among bees related to the quality of food sources takes place in the dancing area, and this dance is called a waggle dance.
  • 102. ABC • Employed foragers share their information with a probability proportional to the profitability of the food source, and the sharing of this information through waggle dancing lasts longer for more profitable sources. • An onlooker on the dance floor can watch numerous dances and decide to employ itself at the most profitable source. • The bees evaluate the different patches according to the quality of the food and the amount of energy usage. • Bees communicate through the waggle dance, which contains information about: – the direction of flower patches (the angle between the Sun and the patch) – the distance from the hive (the duration of the dance) – the quality rating (the frequency of the dance). • Thus ABC was developed by observing the behaviors of real bees in finding nectar and sharing the information about food sources with the bees in the hive.
  • 103. Bees in Nature • The colony contains 3 groups of bees: • The employed bees (50%) – an employed bee stays on a food source and keeps the neighborhood of the source in its memory. • The onlooker bees (50%) – an onlooker gets the information about food sources from the employed bees in the hive and selects one of the food sources to gather nectar from. • The scouts (5-10%) – a scout is responsible for finding new food (new nectar) sources. • The employed bee whose food source has been exhausted by the bees becomes a scout. Scouts are the colony's explorers. • Number of employed bees = number of food sources. • Food source position = a possible solution to the problem. • The amount of nectar of a food source = the quality of the solution. • There is a greater probability of onlookers choosing more profitable sources, since more information is circulated about the more profitable sources.
  • 104. Artificial Bee Colony Algorithm • Simulates behavior of real bees for solving multidimensional and multimodal optimization problems. • The first half of the colony consists of the employed artificial bees and the second half includes the onlookers. • The number of employed bees is equal to the number of food sources around the hive. • The employed bee whose food source has been exhausted by the bees becomes a scout.
  • 105. Components of Honey bee swarm
  • 109. Employed Bee Phase Implementation
  • 110. Evaluation & solution generation
  • 117. Limit
  • 122. Framework of ABC Algorithm
  • 123. ABC algorithm • Each cycle of the search consists of three steps: – moving the employed and onlooker bees onto the food sources – calculating their nectar amounts – determining the scout bees and directing them onto possible food sources. • A food source position represents a possible solution to the problem to be optimized. • The amount of nectar of a food source corresponds to the quality of the solution. • Onlookers are placed on the food sources using a probability-based selection process. • As the nectar amount of a food source increases, the probability with which the food source is preferred by onlookers increases too.
  • 124. • The scouts are characterized by low search costs and a low average in food source quality. One bee is selected as the scout bee. • The selection is controlled by a control parameter called "limit". • If a solution representing a food source is not improved by a predetermined number of trials, then that food source is abandoned and the employed bee is converted to a scout. • Control parameters of ABC algorithm are: – Swarm size – Limit – number of onlookers: 50% of the swarm – number of employed bees: 50% of the swarm – number of scouts: 1
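The employed/onlooker/scout cycle and control parameters described above can be sketched as a minimal ABC implementation. Function and parameter names (`abc_minimize`, `colony`, `limit`, `cycles`) are illustrative assumptions, not taken from the slides:

```python
import random

def abc_minimize(cost, dim, bounds, colony=20, limit=10, cycles=100):
    """Minimal Artificial Bee Colony sketch (names/defaults illustrative)."""
    lo, hi = bounds
    n_sources = colony // 2                      # employed bees = food sources
    foods = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_sources)]
    costs = [cost(f) for f in foods]
    trials = [0] * n_sources                     # unsuccessful trials per source

    def try_neighbor(i):
        # Perturb one dimension of source i toward a random partner source k.
        k = random.choice([j for j in range(n_sources) if j != i])
        d = random.randrange(dim)
        cand = foods[i][:]
        cand[d] += random.uniform(-1, 1) * (foods[i][d] - foods[k][d])
        cand[d] = max(lo, min(hi, cand[d]))
        c = cost(cand)
        if c < costs[i]:                         # greedy selection
            foods[i], costs[i], trials[i] = cand, c, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(n_sources):               # employed bee phase
            try_neighbor(i)
        fits = [1.0 / (1.0 + c) for c in costs]  # nectar amount (fitness)
        total = sum(fits)
        for _ in range(n_sources):               # onlooker phase: roulette wheel
            r, acc = random.uniform(0, total), 0.0
            for i, f in enumerate(fits):
                acc += f
                if acc >= r:
                    try_neighbor(i)
                    break
        worst = max(range(n_sources), key=lambda i: trials[i])
        if trials[worst] > limit:                # scout phase (one scout at most)
            foods[worst] = [random.uniform(lo, hi) for _ in range(dim)]
            costs[worst], trials[worst] = cost(foods[worst]), 0

    best = min(range(n_sources), key=lambda i: costs[i])
    return foods[best], costs[best]
```

Note how the "limit" control parameter appears only in the scout phase: a source abandoned after `limit` unsuccessful trials is replaced by a random one, which is what gives ABC its exploration ability.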
  • 125. Flow chart of ABC (Bees Algorithm): Initialise a population of n scout bees → Evaluate the fitness of the population → Select m sites for neighbourhood search → Determine the size of the neighbourhood (patch size ngh) → Recruit bees for the selected sites (more bees for the best e sites) → Neighbourhood search → Select the fittest bee from each site → Assign the (n–m) remaining bees to random search → New population of scout bees → repeat from the fitness evaluation.
  • 126. Movement of the Onlookers
  • 151. Hybrid approaches • ABC is good at exploration but poor at exploitation. • Several studies address the hybridization of the PSO and ABC algorithms. • Here the hybridization is driven by a need of the PSO algorithm: Standard PSO has a handicap, the absence of regeneration for ineffective particles that cannot improve their Pbest values. The ABC algorithm, on the other hand, contains a scout bee phase that eliminates exactly this handicap. For this reason the scout bee phase is added into Standard PSO to upgrade its performance.
  • 153. • The Standard PSO algorithm does not contain a control parameter to regenerate insufficient particles, i.e. the ones that cannot improve their Pbest value. • Particles are updated without any diversity control in Standard PSO, and their adequacy is not checked. PSO clearly needs a control parameter to improve its convergence capability, but this parameter must not increase the convergence time significantly. • Consequently, it seems reasonable to insert the scout bee phase into the Standard PSO algorithm; doing so yields ScPSO. In ScPSO, all processes except the limit-based regeneration are the same as in the PSO algorithm.
  • 154. Pseudocode for ScPSO
    - Initialize all particles within the user-defined boundaries (the first best positions, Pbest, equal the particles' initial positions)
    - Define a limit value within the range [1, maximum iteration number − 1]
    - While (iteration number < maximum iteration number):
      - Calculate fitness according to the cost function for all particles
      - Update the best position (Pbest) values according to the fitness values for all particles
      - Choose the best Pbest vector as Gbest (the vector achieving the minimum cost)
      - Calculate new velocities and positions for all particles:
        Vi(t + 1) = ωVi(t) + c1r1(Xpbest(i)(t) − Xi(t)) + c2r2(Xgbest(t) − Xi(t))
        Xi(t + 1) = Xi(t) + Vi(t + 1)
      - If a variable inertia weight is used, change it in accordance with the utilized rule
      - Check all particles whose trial count exceeds the parameter 'limit', then regenerate those stalled particles
    - End
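The ScPSO pseudocode above can be turned into a runnable sketch. Parameter defaults (`w`, `c1`, `c2`, `limit`) and the per-particle trial counter are illustrative assumptions consistent with the described scout phase:

```python
import random

def scpso_minimize(cost, dim, bounds, swarm=20, limit=15, iters=100,
                   w=0.7, c1=1.5, c2=1.5):
    """Sketch of ScPSO: Standard PSO plus an ABC-style scout phase."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    pbest = [x[:] for x in X]              # first Pbest = initial positions
    pcost = [cost(x) for x in X]
    trials = [0] * swarm                   # iterations without Pbest improvement
    g = min(range(swarm), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]

    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):           # standard velocity/position update
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = max(lo, min(hi, X[i][d] + V[i][d]))
            c = cost(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i], trials[i] = X[i][:], c, 0
                if c < gcost:
                    gbest, gcost = X[i][:], c
            else:
                trials[i] += 1
            if trials[i] > limit:          # scout phase borrowed from ABC:
                X[i] = [random.uniform(lo, hi) for _ in range(dim)]
                V[i] = [0.0] * dim         # regenerate the stalled particle
                trials[i] = 0
    return gbest, gcost
```

The only difference from Standard PSO is the final `if trials[i] > limit` block; everything else is the familiar inertia-weight update.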
  • 155. Hybrid systems • The combination of knowledge-based systems, neural networks and evolutionary computation forms the core of an emerging approach to building hybrid intelligent systems. • The hybridization of genetic algorithms with other methods, such as gradient descent, helps to achieve a balance between robustness and efficiency. • Start with a GA, a search heuristic which mimics evolution by taking a population of strings that encode possible solutions and combining them based on a fitness function to produce fitter individuals, and switch later to a gradient descent based method. • There has been great interest in combining learning and evolution with ANNs in recent years. • A GA-based ANN (ANN-GA) model, a hybrid integration of the ANN and GA algorithms, may achieve better performance by taking advantage of the characteristics of both.
  • 156. Evolutionary neural networks • The architecture of the network often determines the success or failure of the application. • There is a great need for a method of automatically designing the architecture for a particular application. • Evolutionary computation offers effective optimization techniques that can guide both weight optimization and topology selection. • Genetic algorithms may be well suited for this task. • The basic idea behind evolving a suitable network architecture is to conduct a genetic search in a population of possible architectures. • The GA performs a global search capable of effectively exploring a large search space, and has been used for optimally designing ANN parameters, including connection weights, ANN architectures and input selection.
  • 157. Integrated Back propagation based genetic algorithm
  • 158. BP/GA algorithm
    1. Start: generate a random population of p chromosomes (candidate solutions for the problem).
    2. Extraction: extract the weights for the input–hidden–output layers from each chromosome x.
    3. Fitness: evaluate the fitness f(x) of each chromosome x in the population as the reciprocal of the cumulative error obtained for each input set.
    4. New population: create a new population by repeating the following steps until the new population is complete:
       – Selection: select two parent chromosomes from the population according to their fitness.
       – Crossover: cross over the parents to form new offspring.
       – Mutation: with a mutation probability, mutate the new offspring at each position in the chromosome.
       – Acceptance: place the new offspring in the new population.
    5. Repeat the extraction, fitness and new-population steps until the stopping condition is met.
    6. Test: return the best solution in the current population using the test-set inputs and the weights.
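The extraction step can be made concrete with a small helper that splits a flat chromosome into layer weight matrices. The layer sizes and the row-major layout here are illustrative assumptions; the slides do not fix a concrete encoding:

```python
def extract_weights(chromosome, n_in, n_hidden, n_out):
    """Split a flat chromosome into input->hidden and hidden->output
    weight matrices (row-major layout assumed for illustration)."""
    cut = n_in * n_hidden
    # Input->hidden weights: n_in rows of n_hidden values each.
    w_ih = [chromosome[i * n_hidden:(i + 1) * n_hidden] for i in range(n_in)]
    # Hidden->output weights: n_hidden rows of n_out values each.
    flat_ho = chromosome[cut:cut + n_hidden * n_out]
    w_ho = [flat_ho[i * n_out:(i + 1) * n_out] for i in range(n_hidden)]
    return w_ih, w_ho
```

For a 2-3-1 network the chromosome holds 2×3 + 3×1 = 9 genes, decoded back into the two weight matrices before each fitness evaluation.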
  • 159. Hybrid approach • The original population is a set of N chromosomes generated randomly. • The fitness of each chromosome is computed from an error-minimization criterion: the training set of examples is presented to the network and the sum of squared errors is calculated. • Fitness is a simple function of this sum of squared errors: the smaller the sum, the fitter the chromosome. • The GA attempts to find a set of weights that minimizes the sum of squared errors.
  • 160. Hybrid approach • The new population is given as input to the BPN to compute the fitness of each chromosome, followed by selection, crossover and mutation to generate the next population. • This process is repeated until more or less all the chromosomes converge to the same fitness value. • The weights represented by the chromosomes in the final converged population are the optimized connection weights of the BPN.
  • 161. ANN weight optimization using GA 1. Encoding a set of weights in a chromosome.  We must first choose a method of encoding the network's set of weights into a chromosome.
  • 162.  The second step is to define a fitness function for evaluating the chromosome's performance. This function must estimate the performance of a given neural network. We can apply here a simple function defined by the sum of squared errors.  The training set of examples is presented to the network, and the sum of squared errors is calculated. The smaller the sum, the fitter the chromosome. The genetic algorithm attempts to find a set of weights that minimises the sum of squared errors.  The third step is to choose the genetic operators: crossover and mutation. A crossover operator takes two parent chromosomes and creates a single crossover child with genetic material from both parents. Each gene in the child's chromosome is represented by the corresponding gene of the randomly selected parent.  A mutation operator selects a gene in a chromosome and adds a small random value between -1 and 1 to each weight in this gene.
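The fitness function and the two genetic operators described above can be sketched directly. The `predict` callback and the `1/(1 + SSE)` form (used to avoid division by zero when the error reaches zero) are illustrative assumptions:

```python
import random

def crossover(parent1, parent2):
    """Uniform crossover as described above: each gene of the single
    child comes from a randomly selected parent."""
    return [random.choice(pair) for pair in zip(parent1, parent2)]

def mutate(chromosome, rate=0.1):
    """With probability `rate`, add a small random value in [-1, 1]
    to a weight; other genes are copied unchanged."""
    return [w + random.uniform(-1, 1) if random.random() < rate else w
            for w in chromosome]

def fitness(chromosome, train_set, predict):
    """Fitness as the reciprocal of the sum of squared errors over the
    training set; smaller error -> fitter chromosome.  `predict` is an
    assumed helper that runs the network encoded by `chromosome`."""
    sse = sum((predict(chromosome, x) - y) ** 2 for x, y in train_set)
    return 1.0 / (1.0 + sse)
```

A GA loop would then repeatedly select parents by fitness, apply `crossover` and `mutate`, and keep the best chromosomes.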
  • 163. Crossover in weight optimisation
  • 164. Mutation in weight optimisation
  • 165. Encoding the network architecture  The network architecture is usually decided by trial and error; there is a great need for a method of automatically designing the architecture for a particular application.  Genetic algorithms may be well suited for this task.  The basic idea behind evolving a suitable network architecture is to conduct a genetic search in a population of possible architectures.  The connection topology of a neural network can be represented by a square connectivity matrix.  We must choose a method of encoding the network's architecture into a chromosome.
  • 166. Encoding of the network topology  Each entry in the matrix defines the type of connection from one neuron (column) to another (row), where 0 means no connection and 1 denotes connection for which the weight can be changed through learning.  To transform the connectivity matrix into a chromosome, we need only to string the rows of the matrix together.
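Stringing the matrix rows together, as described above, is a one-line flattening; the inverse mapping rebuilds the matrix from a chromosome. The function names here are illustrative:

```python
def matrix_to_chromosome(conn):
    """Flatten a square connectivity matrix row by row into a chromosome
    string (0 = no connection, 1 = trainable connection)."""
    return ''.join(str(bit) for row in conn for bit in row)

def chromosome_to_matrix(chrom, n):
    """Inverse mapping: rebuild the n x n connectivity matrix from the
    flat chromosome string."""
    bits = [int(c) for c in chrom]
    return [bits[i * n:(i + 1) * n] for i in range(n)]
```

For example, the 2×2 matrix [[0, 1], [1, 0]] becomes the chromosome "0110", and decoding it recovers the original topology.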
  • 167. The cycle of evolving a neural network topology
  • 168. Conclusion  Soft computing is an association of computing methodologies for constructing computationally intelligent hybrid systems.  The core techniques of SC are ANN, EC, fuzzy logic and probabilistic reasoning, and hybridization is one of the central aspects of this field.  The BP algorithm cannot guarantee an optimal solution. ECs are effective optimization techniques that can guide both weight optimization and topology selection.  Integrated architectures like Neuro-Fuzzy, ANN-GA combinations and ANN-SI techniques are some of the hybrid approaches used for performance improvement.  A direct EC technique could fail to obtain an optimal solution. This clearly shows the need for hybridization of EC techniques with other optimization algorithms, machine learning techniques and heuristics.