Universität Paderborn, Fachbereich 17 - Mathematik/Informatik
Elaboration for the course Evolutionary Algorithms
Self-adaptive Evolutionary
Algorithms
Bartlomiej Gloger
Matriculation no.: 6054499
Advisors: Oliver Kramer, Chuan-Kang Ting
Date: January 28, 2004
Contents

1 Introduction
  1.1 Motivation
  1.2 Overview
2 Evolutionary Algorithms
  2.1 Overview
  2.2 Self-adaptiveness
    2.2.1 Deterministic parameter control
    2.2.2 Adaptive parameter control
    2.2.3 Self-adaptive parameter control
  2.3 Examples of self-adaptive Evolutionary Algorithms
    2.3.1 Self-adaptive Evolutionary Algorithms adapting one parameter
    2.3.2 Combining self-adaptive parameters
3 Summary
  3.1 Conclusion
  3.2 Future prospects
References
Chapter 1
Introduction
This elaboration, created in the context of the course "Evolutionary Algorithms" held at the University of Paderborn, covers the field of self-adaptive Evolutionary Algorithms (EAs).
1.1 Motivation
Many parameters are used in a variety of Evolutionary Algorithms, among them the initial population size, the number of crossover points, and the probabilities for mutation or crossover. Each of these parameters is often tuned and adjusted "by hand". Typically one parameter is adjusted at a time, which may lead to sub-optimal choices, since it is often not known how the parameters interact. Tuning multiple parameters in parallel causes the number of tests to rise exponentially.
Another approach is to transfer the parameter settings found for a given problem to a new, similar one, but there is no guarantee that this transfer works. Moreover, the rigid form of static parameters contradicts the dynamic nature of EAs: e.g. a large mutation step in the early generations may lead to faster approximation, and a small step in the late generations to a more accurate solution. This is not possible with fixed parameters.
The nature of the parameter setting problem is typical for problems solved by EAs, as it qualifies as an optimization problem itself. So an intuitive approach is to evolve the parameters with the algorithm, too. Many different procedures have been researched to adapt the parameters, and the parameter choice differs strongly from case to case, but the main idea is to no longer choose the parameters semi-arbitrarily (the choices were often made from experience) but to let them adapt themselves to the problem.
1.2 Overview
This work is organized as follows. The next chapter gives a short introduction
to evolutionary algorithms followed by the definition of terms used to describe
self-adaptive EAs. The chapter ends with examples of the use of self-adaptive
parameters in previous work. Chapter 3 gives a summary and ends with future
prospects.
Chapter 2
Evolutionary Algorithms
In the first part of this chapter I will describe the principles behind Evolutionary Algorithms. This description is followed by the idea of self-adaptiveness and its application to EAs. The next part of the chapter gives examples of self-adaptive EAs which concentrate on different operators. The last part considers combining the adaptation of several parameters.
2.1 Overview
Evolutionary computation and Evolutionary Algorithms use computational models of evolutionary processes from biology as key elements in the design and implementation of computer-based problem solving systems. They take their inspiration from natural selection and survival of the fittest. EAs differ from more traditional optimization techniques in that they search from a population of solutions, not from a single point. Each iteration of an EA usually involves a competitive selection that weeds out poor solutions. The solutions with high fitness are recombined with other solutions by swapping parts of one solution with another. Solutions are also mutated by making a small, more or less random change to a single element of the solution. Recombination and mutation are used to generate new solutions that are biased towards regions of the search space in which good solutions have already been seen. Figure 2.1 shows the usual way the evolutionary process from biology is mapped to computer science.
Several different types of evolutionary search methods were developed independently. These include
• Genetic Programming (GP)
• Evolutionary Programming (EP)
• Evolutionary Strategies (ES)
• Genetic Algorithms (GA)
Figure 2.1: Most Evolutionary Algorithms use this sequence of steps: Selection,
Reproduction, Mutation and Evaluation.
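
This cycle can be made concrete in a few lines of code. The following is a minimal sketch, not taken from the text: the fitness function, population size, and mutation width are illustrative assumptions.

```python
import random

def fitness(x):
    # Hypothetical objective: maximize -sum(x_i^2), optimum at the origin.
    return -sum(v * v for v in x)

def evolve(pop_size=20, dim=5, generations=100):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: competitive, weeds out the poorer half of the solutions.
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        # Reproduction: swap parts of one solution with another (uniform crossover).
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            children.append([random.choice(pair) for pair in zip(a, b)])
        # Mutation: a small, more or less random change to a single element.
        for child in children:
            child[random.randrange(dim)] += random.gauss(0, 0.1)
        # Evaluation happens implicitly via fitness() in the next selection.
        pop = parents + children
    return max(pop, key=fitness)
```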
Genetic Programming creates and evolves programs. It is well suited for problems that require finding a function that can be expressed simply in functional form. Evolutionary Programming focuses on optimizing continuous functions without recombination, while Evolutionary Strategies focus on optimizing continuous functions with recombination. Both are well suited for optimizing continuous functions and use a real-valued representation [Fog97]. Genetic Algorithms focus on optimizing general combinatorial problems, though they have occasionally been applied to continuous problems, too. They usually use a binary representation.
The different steps in a typical EA cycle (selection, reproduction, mutation and evaluation) each have many parameters. Questions arise such as how many individuals the initial population should have, which crossover operator to choose (e.g. uniform vs. 2-point crossover), or what value the mutation probability should have. These parameters are usually chosen experimentally and given to the algorithm "by hand", e.g. via a configuration file. The quality of the algorithm relies on the quality of these parameters: the better the parameters, the faster the EA reaches a solution, the better it handles difficult situations such as local minima in a minimization problem, and it may even deliver better results.
Determining good parameters is a very time-consuming task. Often many iterations are needed before e.g. a good mutation rate is found, as the parameters are problem-dependent and can even interact with each other, complicating the search. Also, the problem description often gives no clues, e.g. to how large a population should be.
The attributes of the search for good parameters qualify it as an EA problem. Therefore attempts were made to apply an EA to its own parameters. This idea is called parameter control and is explained in the following section.
2.2 Self-adaptiveness
Before going into detail about how self-adaptiveness works, I want to define and classify the term. In [EHM99] Eiben, Hinterding and Michalewicz give a terminology which I will use throughout this elaboration.
In the choice of parameters they distinguish two major forms: parameter tuning and parameter control. Parameter tuning describes the process of setting parameters before the run of the EA, while parameter control refers to changing the parameters during the run. They subclassify parameter control into three categories: deterministic, adaptive and self-adaptive parameter control (see Figure 2.2).
Figure 2.2: Classification of methods for choosing parameters. Parameter tuning refers to setting parameters "by hand" before the run, while parameter control refers to changing the parameters during the run, e.g. based on some function of an algorithm variable such as the number of generations.
In the following three sections I will give an example of each of these parameter control approaches, based on the mutation step size of a Gaussian mutation that draws from the normal distribution N(0, σ). The mean is set to zero, and the standard deviation σ can be interpreted as the mutation step size. The mutation is applied as follows:

x_i' = x_i + N(0, σ)

where x_i is each component of an individual x.
2.2.1 Deterministic parameter control
The first possibility for parameter control is to parameterize σ with a variable t, leading to a function σ(t). This gives us the possibility to change σ during the run, which, as shown in [Bäc92], may improve the algorithm's speed. One possible choice of t is the generation number: the longer the algorithm runs, the smaller the mutation step size becomes. The idea is that at the beginning of the algorithm we want a high diversity of the individuals in order not to miss some solutions, while in later generations we want higher convergence, so a smaller value for σ(t) can be chosen.
Still, this approach needs much input from outside (it is also called extrinsic evolution). The function σ(t) is fully determined for each value of t and therefore predictable.

Deterministic parameter control modifies the parameter without using any feedback from the search.
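
A minimal sketch of such a schedule follows; the linear decay and all constants are illustrative assumptions, not prescribed by the text.

```python
import random

def sigma(t, sigma0=1.0, t_max=1000):
    # Deterministic control: the step size depends only on the generation
    # number t, never on feedback from the search.
    return sigma0 * (1.0 - 0.9 * t / t_max)

def mutate(x, t):
    # x_i' = x_i + N(0, sigma(t)) for each component of the individual.
    return [xi + random.gauss(0, sigma(t)) for xi in x]
```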
2.2.2 Adaptive parameter control
One step further in letting the algorithm find parameter values for itself is to incorporate information from the search process into the mutation step size σ. In [Rec73] the '1/5 success rule' for (1+1)-evolution strategies is presented. It states that the ratio of successful mutations to all mutations should be 1/5: if the ratio is greater, the mutation step size should be increased, and vice versa. A successful mutation is one that produces an offspring better than its parent. The ratio is determined as an average over a fixed number of generations. This approach is still heuristic, yet σ(t) is no longer deterministic.

Adaptive parameter control modifies the parameter with some feedback from the search (usually using a heuristic).
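
A sketch of the resulting update rule; the adjustment factor 0.85 is a common choice in the literature, assumed here rather than taken from the text.

```python
def update_sigma(sigma, successes, trials, factor=0.85):
    # 1/5 success rule: if more than 1/5 of the observed mutations produced
    # an offspring better than its parent, increase the step size; if fewer,
    # decrease it. The ratio is averaged over a fixed number of generations.
    rate = successes / trials
    if rate > 0.2:
        return sigma / factor  # larger steps: the search is too cautious
    if rate < 0.2:
        return sigma * factor  # smaller steps: the search overshoots
    return sigma
```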
2.2.3 Self-adaptive parameter control
The main idea of letting the algorithm set its own parameters can be implemented as follows. Assume that an individual has the following form:

<x_1, x_2, ..., x_n>

The mutation step size can be included in the individual itself as a gene, resulting in the following form:

<x_1, x_2, ..., x_n, σ>

This additional gene is transformed during the mutation, too, undergoing an evolution similar to that of the individual. Usually the exponential function is used to transform the σ value:

σ' = σ · e^(N(0, τ0))
x_i' = x_i + N(0, σ')
Through this self-adaptation no external input is needed for the parameter; its values are set by the algorithm itself, and each individual has its own σ. A finer approach is to give each gene its own σ, leading to the following representation:

<x_1, x_2, ..., x_n, σ_1, σ_2, ..., σ_n>

In this form each gene gets its own step size, and the individuals gain more degrees of freedom to adapt themselves to the shape of the fitness function.
Note that the adaptation of the parameter (here the mutation step size) happens before the fitness is evaluated. Getting a good parameter value therefore does not directly raise an individual's fitness but only its performance over time.

Self-adaptive parameter control encodes the parameter within each individual, evolving it with the individual.
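
As a sketch, the per-individual variant can be written as follows; the value of τ0 and the tuple representation are assumptions for illustration.

```python
import math
import random

def self_adaptive_mutation(individual, tau0=0.1):
    # The individual carries its own step size: individual = (x, sigma).
    x, sigma = individual
    # The strategy parameter is transformed first, via the exponential ...
    new_sigma = sigma * math.exp(random.gauss(0, tau0))
    # ... and the object variables are then mutated with the new step size,
    # before any fitness evaluation takes place.
    new_x = [xi + random.gauss(0, new_sigma) for xi in x]
    return (new_x, new_sigma)
```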
2.3 Examples of self-adaptive Evolutionary Algorithms
In this part of the chapter I will describe some examples of self-adaptive evolutionary algorithms. The first section describes different approaches to self-adaptation using different operators. In the last section an example of combining different self-adaptive parameters is given.
2.3.1 Self-adaptive Evolutionary Algorithms adapting one parameter
An interesting approach to a self-adaptive crossover operator in a genetic algorithm is found in [Spe95]. Spears decided to let the GA be self-selective with respect to its choice of crossover operator. He argues that since 2-point crossover is the least disruptive operator and uniform crossover the most disruptive one (this spectrum was his main motivation for letting the algorithm decide), it is reasonable to have the GA select from only those two possibilities. Although high disruption may stress exploration at the expense of exploitation, there are situations in which minimizing disruption hinders the adaptive search process by overemphasizing exploitation at the expense of needed exploration, for example when the population size is too small to provide the necessary sampling accuracy for complex search spaces [JS91].
The implementation appends one bit to the end of every individual in the GA population. This bit decides whether uniform crossover or 2-point crossover is used. Also, since the approach is tightly coupled, all genetic operators are allowed to manipulate this extra bit (including crossover itself). There are two ways to use this bit. Local adaptation considers only the bits of the two individuals involved: if both bits are equal, the respective crossover is performed; if not, one of the two crossovers is chosen at random. Global adaptation uses the bits of the whole population to probabilistically determine which crossover operator to perform on each individual (if 10 of 100 individuals have the uniform crossover bit set, each individual uses uniform crossover with probability 10%). An important result is that local adaptation slightly outperforms global adaptation, although the difference between the two methods is not statistically significant. The consequence is to tie the crossover information directly to the individual, using local adaptation, in order to improve the algorithm.
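
A sketch of the local adaptation variant; the bit-list representation and the function name are assumptions, not Spears' code.

```python
import random

def adaptive_crossover(parent_a, parent_b):
    # The last bit of each individual selects the operator:
    # 1 = 2-point crossover, 0 = uniform crossover. The bit itself is part
    # of the string, so crossover can manipulate it too.
    bit_a, bit_b = parent_a[-1], parent_b[-1]
    use_two_point = bit_a if bit_a == bit_b else random.choice((0, 1))
    if use_two_point:
        i, j = sorted(random.sample(range(len(parent_a)), 2))
        return parent_a[:i] + parent_b[i:j] + parent_a[j:]
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]
```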
Another approach, presented in [ESKT97], uses multi-parent reproduction. An adaptive mechanism based on subpopulations is used to determine the number of parents. The idea is similar to coevolution: different pools of individuals (species) search by different strategies (here the species differ only in the crossover operator). The adaptive population redistribution is designed to grow successful species and shrink the others. The crossover operator used is N-parent diagonal crossover (see Figure 2.3).
Figure 2.3: 3-parent diagonal crossover with three children (left) and one child (right).
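
A sketch of the operator (the function name and list representation are assumptions): with N parents and N-1 cut points, child k takes its j-th segment from parent (k + j) mod N, combining the parents' material diagonally.

```python
import random

def diagonal_crossover(parents):
    # N-parent diagonal crossover producing N children (cf. Figure 2.3).
    # All parents are assumed to have the same length, which must be >= N.
    n, length = len(parents), len(parents[0])
    cuts = [0] + sorted(random.sample(range(1, length), n - 1)) + [length]
    children = []
    for k in range(n):
        child = []
        for j in range(n):
            donor = parents[(k + j) % n]  # shift the donor by one per segment
            child.extend(donor[cuts[j]:cuts[j + 1]])
        children.append(child)
    return children
```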
The results were two-edged. On the one hand, the experiments showed that multi-parent crossover is superior to traditional two-parent crossover. On the other hand, the adaptive mechanism was not able to reward better crossovers according to their performance. Yet the algorithm was comparable in performance to the non-adaptive variant and thus made time-consuming comparisons in search of the best operator unnecessary, which is, as stated in the introduction, a goal of self-adaptive EAs.
A meta approach was used by [Gre86]: the population size parameter was determined by another genetic algorithm, so each generation of the meta-GA set off a whole run of the actual GA with adapted parameters. The outcome was that the optimal population size for the actual algorithm was somewhere between 30 and 50 individuals.
In systems where the search space is bounded by constraints, adaptation of the evaluation function has been tested. Assume the evaluation function has the following form:

eval(x) = f(x) + W · penalty(x)

where W is a weight which determines how strongly an individual is penalized if it violates a constraint. The value of W can be adapted in a similar fashion to the standard deviation σ described earlier.
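
A sketch of how W could be self-adapted with the same log-normal update used for σ above; all names and the value of τ0 are assumptions, and f and penalty are problem-specific functions.

```python
import math
import random

def evaluate(x, w, f, penalty):
    # eval(x) = f(x) + W * penalty(x), with W carried by the individual.
    return f(x) + w * penalty(x)

def mutate_weight(w, tau0=0.1):
    # Adapt the penalty weight like the mutation step size sigma.
    return w * math.exp(random.gauss(0, tau0))
```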
A typical self-adaptive parameter of evolutionary strategies, the mutation rate, was also self-adapted in a GA by Bäck [Bäc92]. He uses an extended bitstring representation in which the mutation probability pm is appended to each individual. Each mutation rate is applied to itself and then to each individual gene (see Figure 2.4). The experiments show that this approach yields a significant improvement over the standard GA without self-adaptation.
Figure 2.4: Each mutation rate is applied to itself and to each of its corresponding genes.
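
A sketch of this scheme; the binary-fraction decoding of pm is an assumed detail, not necessarily Bäck's exact encoding.

```python
import random

def decode_rate(bits):
    # Interpret the rate bits as a fraction in (0, 1], avoiding a zero rate.
    value = sum(b << i for i, b in enumerate(reversed(bits)))
    return max(value / (2 ** len(bits) - 1), 1.0 / 2 ** len(bits))

def self_adaptive_bit_mutation(genes, rate_bits):
    # As in Figure 2.4: the current rate first mutates its own encoding,
    # then the decoded new rate is applied to each gene.
    p = decode_rate(rate_bits)
    new_rate_bits = [b ^ (random.random() < p) for b in rate_bits]
    p_new = decode_rate(new_rate_bits)
    new_genes = [g ^ (random.random() < p_new) for g in genes]
    return new_genes, new_rate_bits
```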
2.3.2 Combining self-adaptive parameters
In [SF96] the representation of an individual is chosen to have four parts (in a binary representation) in order to incorporate the adaptation of both the mutation operator and the crossover operator. Each individual has the usual problem encoding, a mutation rate, and two additional linkage bits. These bits are interpreted as connecting points where new offspring individuals, called blocks, can attach. The approach is similar to [SM87], where the crossover operator uses "punctuation marks" to encode crossover points; that operator was confined to only two parents, while the blocks enjoy the benefits of multi-parent recombination [ERR94] without the necessity of tuning the type of recombination to the nature of the problem.
An important result of this combined adaptation is that the algorithm does not seem to suffer from a great "learning overhead" on simple problems, and on more complex functions it discovers significantly higher optima than the majority of other algorithms. The authors state that this can be attributed to the synergistic effects of simultaneously adapting both the recombination strategy and the mutation rates.
It is interesting to note that most papers on combining parameters use self-adaptation. In [EHM99] Eiben et al. presume that feedback-based heuristics are even more difficult to handle than static parameter tuning.
Chapter 3
Summary
3.1 Conclusion
The effectiveness of an evolutionary algorithm depends on many factors, e.g. representation, operators, etc. The number and variety of the parameters and the possible choices make the selection of a good setting for an EA very difficult. The "No Free Lunch" theorem [WM97] even states that there exists no perfect EA:

No search algorithm is superior on ALL problems.

A corollary to this theorem is that there is no set of parameters that is superior on all problems. Yet this theorem assumes no knowledge about the problem and incorporates ALL problems into its proof, even completely random ones, so its practical relevance is minimal.
Still, adaptation provides the opportunity to customize the evolutionary algorithm to the needs of the problem and to change the strategy parameters during the search. This makes it possible to incorporate information about the search space into the algorithm and even to let the algorithm select the appropriate information. This can be considered as two separate searches: the first is the usual search in the problem space; the second searches the parameter space to find an optimal configuration of the EA. Yet most of the presented algorithms take only a small part of this space into account; e.g. if the mutation rate pm is self-adapted, only a small part of the parameter space is covered, and the other parameters remain static and tuned. Similarly, the meta-GA [Gre86] is confined to its own search space.
Another advantage of self-adaptive EAs is that, e.g. in mutation rate adaptation, each individual gets its own rate, raising the diversity of the population. This raises the natural drift, meaning that the tendency to converge to local minima is not as high as it would be with standard EAs. This idea was used in [Spe95]: the choice of the disruptive uniform crossover corresponds to exploration of the search space, or diversity in the population, while the least disruptive 2-point crossover matches exploitation (e.g. staying at the same place in the search space, having found a solution), or convergence to some solution. In this way the algorithm adapted itself with respect to whether it should explore or not.
The fact that self-adaptive EAs do not always deliver the hoped-for outcomes and complicate the EA has led some researchers to disregard them. But these critics often do not take the time spent on parameter tuning into account.
3.2 Future prospects
One of the difficulties of optimizing the parameter settings of EAs is that the interactions between these parameters are often unpredictable. Thus the combination of self-adaptive parameters is an area where research can result in new insights and improvements to EAs.
Very little runtime analysis has been done on EAs, and even less with respect to self-adaptive EAs. A framework in which the efficiency of self-adaptive EAs can be measured and compared is absent. A collection of functions to optimize is available, yet it does not cover all forms of problems tackled by EAs.
There is also no treatment of the problems on which self-adaptation excels and the situations in which it fails, which should be an interesting topic, too.
References
[Bäc92] T. Bäck. Self-adaptation in genetic algorithms. In F. J. Varela and P. Bourgine, editors, Toward a Practice of Autonomous Systems: Proceedings of the First European Conference on Artificial Life, pages 263–271. MIT Press, 1992.

[EHM99] Ágoston Endre Eiben, Robert Hinterding, and Zbigniew Michalewicz. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3(2):124–141, 1999.

[ERR94] Ágoston E. Eiben, P.-E. Raué, and Zs. Ruttkay. Genetic algorithms with multi-parent recombination. In Yuval Davidor, Hans-Paul Schwefel, and Reinhard Männer, editors, Parallel Problem Solving from Nature – PPSN III, pages 78–87, Berlin, 1994. Springer.

[ESKT97] A. Eiben, I. Sprinkhuizen-Kuyper, and B. Thijssen. Competing crossovers in an adaptive GA framework, 1997.

[Fog97] David B. Fogel. Evolutionary computation: A new transactions. IEEE Transactions on Evolutionary Computation, 1(1):1–2, 1997.

[Gre86] J. Grefenstette. Optimization of control parameters for genetic algorithms. IEEE Transactions on Systems, Man, and Cybernetics, 16(1):122–128, 1986.

[JS91] Kenneth A. De Jong and William M. Spears. An analysis of the interacting roles of population size and crossover in genetic algorithms. In Parallel Problem Solving from Nature – Proceedings of the 1st Workshop, PPSN I, 1991.

[Rec73] Ingo Rechenberg. Evolutionsstrategie: Optimierung Technischer Systeme nach Prinzipien der Biologischen Evolution. Frommann-Holzboog Verlag, 1973.

[SF96] Jim E. Smith and Terence C. Fogarty. Adaptively parameterised evolutionary systems: Self adaptive recombination and mutation in a genetic algorithm. In H. Voigt, W. Ebeling, and I. Rechenberg, editors, Parallel Problem Solving from Nature – PPSN IV, pages 441–450, Berlin, 1996. Springer.

[SM87] J. D. Schaffer and A. Morishima. An adaptive crossover distribution mechanism for genetic algorithms. In J. Grefenstette, editor, Proceedings of the Second International Conference on Genetic Algorithms, pages 36–40, Cambridge, 1987. Lawrence Erlbaum.

[Spe95] William Spears. Adapting crossover in a genetic algorithm. In Proceedings of the Fourth Annual Conference on Evolutionary Programming, 1995.

[WM97] David H. Wolpert and William G. Macready. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82, April 1997.
More Related Content

What's hot

A Review on Feature Selection Methods For Classification Tasks
A Review on Feature Selection Methods For Classification TasksA Review on Feature Selection Methods For Classification Tasks
A Review on Feature Selection Methods For Classification Tasks
Editor IJCATR
 
Modelling the expected loss of bodily injury claims using gradient boosting
Modelling the expected loss of bodily injury claims using gradient boostingModelling the expected loss of bodily injury claims using gradient boosting
Modelling the expected loss of bodily injury claims using gradient boosting
Gregg Barrett
 
Deep learning MindMap
Deep learning MindMapDeep learning MindMap
Deep learning MindMap
Ashish Patel
 
Introduction to modeling_and_simulation
Introduction to modeling_and_simulationIntroduction to modeling_and_simulation
Introduction to modeling_and_simulation
Aysun Duran
 
Feature Selection in Machine Learning
Feature Selection in Machine LearningFeature Selection in Machine Learning
Feature Selection in Machine Learning
Upekha Vandebona
 
Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...
Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...
Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...
Sagar Deogirkar
 
Evaluation measures for models assessment over imbalanced data sets
Evaluation measures for models assessment over imbalanced data setsEvaluation measures for models assessment over imbalanced data sets
Evaluation measures for models assessment over imbalanced data sets
Alexander Decker
 
System model.Chapter One(GEOFFREY GORDON)
System model.Chapter One(GEOFFREY GORDON)System model.Chapter One(GEOFFREY GORDON)
System model.Chapter One(GEOFFREY GORDON)Towfiq218
 
Discrete And Continuous Simulation
Discrete And Continuous SimulationDiscrete And Continuous Simulation
Discrete And Continuous SimulationNguyen Chien
 
Multi-objective Optimization of PID Controller using Pareto-based Surrogate ...
Multi-objective Optimization of PID Controller using  Pareto-based Surrogate ...Multi-objective Optimization of PID Controller using  Pareto-based Surrogate ...
Multi-objective Optimization of PID Controller using Pareto-based Surrogate ...
IJECEIAES
 
Integrating Fuzzy Dematel and SMAA-2 for Maintenance Expenses
Integrating Fuzzy Dematel and SMAA-2 for Maintenance ExpensesIntegrating Fuzzy Dematel and SMAA-2 for Maintenance Expenses
Integrating Fuzzy Dematel and SMAA-2 for Maintenance Expenses
inventionjournals
 
Multi Objective Optimization
Multi Objective OptimizationMulti Objective Optimization
Multi Objective Optimization
Nawroz University
 
A Novel Hybrid Voter Using Genetic Algorithm and Performance History
A Novel Hybrid Voter Using Genetic Algorithm and Performance HistoryA Novel Hybrid Voter Using Genetic Algorithm and Performance History
A Novel Hybrid Voter Using Genetic Algorithm and Performance History
Waqas Tariq
 
ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...
ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...
ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...
Ajay Kumar
 
Data Mining using SAS
Data Mining using SASData Mining using SAS
Data Mining using SAS
Tanu Puri
 
AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...
AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...
AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...
ijcsit
 
Modeling & Simulation Lecture Notes
Modeling & Simulation Lecture NotesModeling & Simulation Lecture Notes
Modeling & Simulation Lecture Notes
FellowBuddy.com
 
Multitasking: a Review
Multitasking: a ReviewMultitasking: a Review
Multitasking: a Review
ijiert bestjournal
 

What's hot (18)

A Review on Feature Selection Methods For Classification Tasks
A Review on Feature Selection Methods For Classification TasksA Review on Feature Selection Methods For Classification Tasks
A Review on Feature Selection Methods For Classification Tasks
 
Modelling the expected loss of bodily injury claims using gradient boosting
Modelling the expected loss of bodily injury claims using gradient boostingModelling the expected loss of bodily injury claims using gradient boosting
Modelling the expected loss of bodily injury claims using gradient boosting
 
Deep learning MindMap
Deep learning MindMapDeep learning MindMap
Deep learning MindMap
 
Introduction to modeling_and_simulation
Introduction to modeling_and_simulationIntroduction to modeling_and_simulation
Introduction to modeling_and_simulation
 
Feature Selection in Machine Learning
Feature Selection in Machine LearningFeature Selection in Machine Learning
Feature Selection in Machine Learning
 
Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...
Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...
Comparative Study of Machine Learning Algorithms for Sentiment Analysis with ...
 
Evaluation measures for models assessment over imbalanced data sets
Evaluation measures for models assessment over imbalanced data setsEvaluation measures for models assessment over imbalanced data sets
Evaluation measures for models assessment over imbalanced data sets
 
System model.Chapter One(GEOFFREY GORDON)
System model.Chapter One(GEOFFREY GORDON)System model.Chapter One(GEOFFREY GORDON)
System model.Chapter One(GEOFFREY GORDON)
 
Discrete And Continuous Simulation
Discrete And Continuous SimulationDiscrete And Continuous Simulation
Discrete And Continuous Simulation
 
Multi-objective Optimization of PID Controller using Pareto-based Surrogate ...
Multi-objective Optimization of PID Controller using  Pareto-based Surrogate ...Multi-objective Optimization of PID Controller using  Pareto-based Surrogate ...
Multi-objective Optimization of PID Controller using Pareto-based Surrogate ...
 
Integrating Fuzzy Dematel and SMAA-2 for Maintenance Expenses
Integrating Fuzzy Dematel and SMAA-2 for Maintenance ExpensesIntegrating Fuzzy Dematel and SMAA-2 for Maintenance Expenses
Integrating Fuzzy Dematel and SMAA-2 for Maintenance Expenses
 
Multi Objective Optimization
Multi Objective OptimizationMulti Objective Optimization
Multi Objective Optimization
 
A Novel Hybrid Voter Using Genetic Algorithm and Performance History
A Novel Hybrid Voter Using Genetic Algorithm and Performance HistoryA Novel Hybrid Voter Using Genetic Algorithm and Performance History
A Novel Hybrid Voter Using Genetic Algorithm and Performance History
 
ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...
ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...
ADVANCED OPTIMIZATION TECHNIQUES META-HEURISTIC ALGORITHMS FOR ENGINEERING AP...
 
Data Mining using SAS
Data Mining using SASData Mining using SAS
Data Mining using SAS
 
AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...
AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...
AN IMPROVE OBJECT-ORIENTED APPROACH FOR MULTI-OBJECTIVE FLEXIBLE JOB-SHOP SCH...
 
Modeling & Simulation Lecture Notes
Modeling & Simulation Lecture NotesModeling & Simulation Lecture Notes
Modeling & Simulation Lecture Notes
 
Multitasking: a Review
Multitasking: a ReviewMultitasking: a Review
Multitasking: a Review
 

Viewers also liked

Udahnyatudella
UdahnyatudellaUdahnyatudella
Udahnyatudelladella1214
 
Team work & performance
Team work & performanceTeam work & performance
Team work & performance
Mohamad Abd Rabbo
 
Action Plan for Change
Action Plan for ChangeAction Plan for Change
Action Plan for Change
Damoon Nozari
 
Fuerzas Armadas: Principales Marcas Materiales Oficina
Fuerzas Armadas: Principales Marcas Materiales OficinaFuerzas Armadas: Principales Marcas Materiales Oficina
Fuerzas Armadas: Principales Marcas Materiales Oficina
Tech-K
 
Drug discoveries by serendipity
Drug discoveries by serendipityDrug discoveries by serendipity
Drug discoveries by serendipity
Ramisa Tasnia
 
Heart Health Benefits of Dark Chocolate
Heart Health Benefits of Dark ChocolateHeart Health Benefits of Dark Chocolate
Heart Health Benefits of Dark Chocolate
DuPage Medical Group
 
Anemia aplasica MEDICINA INTERNA R2
Anemia aplasica  MEDICINA INTERNA R2Anemia aplasica  MEDICINA INTERNA R2
Anemia aplasica MEDICINA INTERNA R2
Rafael Roberto cruz Ramirez
 
Antidepresivos 26 02-16
Antidepresivos 26 02-16 Antidepresivos 26 02-16
Antidepresivos 26 02-16
lapedrera
 
Калентарно-тематичне планування для 5 класу
Калентарно-тематичне планування для 5 класуКалентарно-тематичне планування для 5 класу
Калентарно-тематичне планування для 5 класу
VsimPPT
 
Fibrilación Auricular
Fibrilación AuricularFibrilación Auricular
Fibrilación Auricular
Las Sesiones de San Blas
 
via subcutanea uso en Atención Primaria
via subcutanea uso en Atención Primariavia subcutanea uso en Atención Primaria
via subcutanea uso en Atención Primaria
Javier Blanquer
 
Riesgo biológico primaria 1
Riesgo biológico primaria 1 Riesgo biológico primaria 1
Riesgo biológico primaria 1
Las Sesiones de San Blas
 
Antidepresivos
AntidepresivosAntidepresivos
AntidepresivosHans Eguia
 
4. técnicas basadas en el condicionamiento clásico
4. técnicas basadas en el condicionamiento clásico4. técnicas basadas en el condicionamiento clásico
4. técnicas basadas en el condicionamiento clásico
Laura O. Eguia Magaña
 
Cibernetica primer y segundo orden
Cibernetica primer y segundo ordenCibernetica primer y segundo orden
Cibernetica primer y segundo orden
mayra talamantes arredondo
 

Viewers also liked (19)

Test
TestTest
Test
 
My resumé
My resuméMy resumé
My resumé
 
Udahnyatudella
UdahnyatudellaUdahnyatudella
Udahnyatudella
 
Energy by Slideshop
Energy by SlideshopEnergy by Slideshop
Energy by Slideshop
 
CIBC Co-op Student Achievement Award
CIBC Co-op Student Achievement AwardCIBC Co-op Student Achievement Award
CIBC Co-op Student Achievement Award
 
Team work & performance
Team work & performanceTeam work & performance
Team work & performance
 
Action Plan for Change
Action Plan for ChangeAction Plan for Change
Action Plan for Change
 
Fuerzas Armadas: Principales Marcas Materiales Oficina
Fuerzas Armadas: Principales Marcas Materiales OficinaFuerzas Armadas: Principales Marcas Materiales Oficina
Fuerzas Armadas: Principales Marcas Materiales Oficina
 
Drug discoveries by serendipity
Drug discoveries by serendipityDrug discoveries by serendipity
Drug discoveries by serendipity
 
Heart Health Benefits of Dark Chocolate
Heart Health Benefits of Dark ChocolateHeart Health Benefits of Dark Chocolate
Heart Health Benefits of Dark Chocolate
 
Anemia aplasica MEDICINA INTERNA R2
Anemia aplasica  MEDICINA INTERNA R2Anemia aplasica  MEDICINA INTERNA R2
Anemia aplasica MEDICINA INTERNA R2
 
Antidepresivos 26 02-16
Antidepresivos 26 02-16 Antidepresivos 26 02-16
Antidepresivos 26 02-16
 
Калентарно-тематичне планування для 5 класу
Калентарно-тематичне планування для 5 класуКалентарно-тематичне планування для 5 класу
Калентарно-тематичне планування для 5 класу
 
Fibrilación Auricular
Fibrilación AuricularFibrilación Auricular
Fibrilación Auricular
 
via subcutanea uso en Atención Primaria
via subcutanea uso en Atención Primariavia subcutanea uso en Atención Primaria
via subcutanea uso en Atención Primaria
 
Riesgo biológico primaria 1
Riesgo biológico primaria 1 Riesgo biológico primaria 1
Riesgo biológico primaria 1
 
Antidepresivos
AntidepresivosAntidepresivos
Antidepresivos
 
4. técnicas basadas en el condicionamiento clásico
4. técnicas basadas en el condicionamiento clásico4. técnicas basadas en el condicionamiento clásico
4. técnicas basadas en el condicionamiento clásico
 
Cibernetica primer y segundo orden
Cibernetica primer y segundo ordenCibernetica primer y segundo orden
Cibernetica primer y segundo orden
 

Similar to Selfadaptive report

Application Issues For Multiobjective Evolutionary Algorithms
Application Issues For Multiobjective Evolutionary AlgorithmsApplication Issues For Multiobjective Evolutionary Algorithms
Application Issues For Multiobjective Evolutionary Algorithms
Amy Isleb
 
H012225053
H012225053H012225053
H012225053
IOSR Journals
 
Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...
IAEME Publication
 
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
IAEME Publication
 
Performance Comparision of Machine Learning Algorithms
Performance Comparision of Machine Learning AlgorithmsPerformance Comparision of Machine Learning Algorithms
Performance Comparision of Machine Learning Algorithms
Dinusha Dilanka
 
Adapted Branch-and-Bound Algorithm Using SVM With Model Selection
Adapted Branch-and-Bound Algorithm Using SVM With Model SelectionAdapted Branch-and-Bound Algorithm Using SVM With Model Selection
Adapted Branch-and-Bound Algorithm Using SVM With Model Selection
IJECEIAES
 
Biology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering OptimizationBiology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering Optimization
Xin-She Yang
 
Optimazation
OptimazationOptimazation
Optimazation
Ahmed M. Elkholy
 
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
ijscai
 
Integrated bio-search approaches with multi-objective algorithms for optimiza...
Integrated bio-search approaches with multi-objective algorithms for optimiza...Integrated bio-search approaches with multi-objective algorithms for optimiza...
Integrated bio-search approaches with multi-objective algorithms for optimiza...
TELKOMNIKA JOURNAL
 
Cuckoo Search: Recent Advances and Applications
Cuckoo Search: Recent Advances and ApplicationsCuckoo Search: Recent Advances and Applications
Cuckoo Search: Recent Advances and Applications
Xin-She Yang
 
O044049498
O044049498O044049498
O044049498
IJERA Editor
 
Sequential estimation of_discrete_choice_models
Sequential estimation of_discrete_choice_modelsSequential estimation of_discrete_choice_models
Sequential estimation of_discrete_choice_models
YoussefKitane
 
Sequential estimation of_discrete_choice_models__copy_-4
Sequential estimation of_discrete_choice_models__copy_-4Sequential estimation of_discrete_choice_models__copy_-4
Sequential estimation of_discrete_choice_models__copy_-4
YoussefKitane
 
The potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delayThe potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delay
Pieter Rautenbach
 
Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...
Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...
Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...
Xin-She Yang
 
Wip 13
Wip 13Wip 13
Machine learning
Machine learningMachine learning
Machine learning
business Corporate
 
operation research notes
operation research notesoperation research notes
operation research notesRenu Thakur
 

Similar to Selfadaptive report (20)

Application Issues For Multiobjective Evolutionary Algorithms
Application Issues For Multiobjective Evolutionary AlgorithmsApplication Issues For Multiobjective Evolutionary Algorithms
Application Issues For Multiobjective Evolutionary Algorithms
 
H012225053
H012225053H012225053
H012225053
 
Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...Comparison between the genetic algorithms optimization and particle swarm opt...
Comparison between the genetic algorithms optimization and particle swarm opt...
 
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
COMPARISON BETWEEN THE GENETIC ALGORITHMS OPTIMIZATION AND PARTICLE SWARM OPT...
 
Performance Comparision of Machine Learning Algorithms
Performance Comparision of Machine Learning AlgorithmsPerformance Comparision of Machine Learning Algorithms
Performance Comparision of Machine Learning Algorithms
 
Adapted Branch-and-Bound Algorithm Using SVM With Model Selection
Adapted Branch-and-Bound Algorithm Using SVM With Model SelectionAdapted Branch-and-Bound Algorithm Using SVM With Model Selection
Adapted Branch-and-Bound Algorithm Using SVM With Model Selection
 
Biology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering OptimizationBiology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering Optimization
 
Optimazation
OptimazationOptimazation
Optimazation
 
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
 
Integrated bio-search approaches with multi-objective algorithms for optimiza...
Integrated bio-search approaches with multi-objective algorithms for optimiza...Integrated bio-search approaches with multi-objective algorithms for optimiza...
Integrated bio-search approaches with multi-objective algorithms for optimiza...
 
Cuckoo Search: Recent Advances and Applications
Cuckoo Search: Recent Advances and ApplicationsCuckoo Search: Recent Advances and Applications
Cuckoo Search: Recent Advances and Applications
 
O044049498
O044049498O044049498
O044049498
 
Sequential estimation of_discrete_choice_models
Sequential estimation of_discrete_choice_modelsSequential estimation of_discrete_choice_models
Sequential estimation of_discrete_choice_models
 
Sequential estimation of_discrete_choice_models__copy_-4
Sequential estimation of_discrete_choice_models__copy_-4Sequential estimation of_discrete_choice_models__copy_-4
Sequential estimation of_discrete_choice_models__copy_-4
 
The potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delayThe potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delay
 
Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...
Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...
Applications and Analysis of Bio-Inspired Eagle Strategy for Engineering Opti...
 
Wip 13
Wip 13Wip 13
Wip 13
 
Machine learning
Machine learningMachine learning
Machine learning
 
call for papers, research paper publishing, where to publish research paper, ...
call for papers, research paper publishing, where to publish research paper, ...call for papers, research paper publishing, where to publish research paper, ...
call for papers, research paper publishing, where to publish research paper, ...
 
operation research notes
operation research notesoperation research notes
operation research notes
 

Recently uploaded

Basic Industrial Engineering terms for apparel
Basic Industrial Engineering terms for apparelBasic Industrial Engineering terms for apparel
Basic Industrial Engineering terms for apparel
top1002
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
Kamal Acharya
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
SamSarthak3
 
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABSDESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
itech2017
 
MCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdfMCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdf
Osamah Alsalih
 
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERS
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERSCW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERS
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERS
veerababupersonal22
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
MdTanvirMahtab2
 
Fundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptxFundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptx
manasideore6
 
weather web application report.pdf
weather web application report.pdfweather web application report.pdf
weather web application report.pdf
Pratik Pawar
 
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
thanhdowork
 
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
AJAYKUMARPUND1
 
Investor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptxInvestor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptx
AmarGB2
 
Unbalanced Three Phase Systems and circuits.pptx
Unbalanced Three Phase Systems and circuits.pptxUnbalanced Three Phase Systems and circuits.pptx
Unbalanced Three Phase Systems and circuits.pptx
ChristineTorrepenida1
 
6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)
ClaraZara1
 
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdfTop 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Teleport Manpower Consultant
 
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Dr.Costas Sachpazis
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
VENKATESHvenky89705
 
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdfWater Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation & Control
 
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
Amil Baba Dawood bangali
 
Understanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine LearningUnderstanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine Learning
SUTEJAS
 

Recently uploaded (20)

Basic Industrial Engineering terms for apparel
Basic Industrial Engineering terms for apparelBasic Industrial Engineering terms for apparel
Basic Industrial Engineering terms for apparel
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
 
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABSDESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
 
MCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdfMCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdf
 
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERS
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERSCW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERS
CW RADAR, FMCW RADAR, FMCW ALTIMETER, AND THEIR PARAMETERS
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
 
Fundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptxFundamentals of Electric Drives and its applications.pptx
Fundamentals of Electric Drives and its applications.pptx
 
weather web application report.pdf
weather web application report.pdfweather web application report.pdf
weather web application report.pdf
 
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
RAT: Retrieval Augmented Thoughts Elicit Context-Aware Reasoning in Long-Hori...
 
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
 
Investor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptxInvestor-Presentation-Q1FY2024 investor presentation document.pptx
Investor-Presentation-Q1FY2024 investor presentation document.pptx
 
Unbalanced Three Phase Systems and circuits.pptx
Unbalanced Three Phase Systems and circuits.pptxUnbalanced Three Phase Systems and circuits.pptx
Unbalanced Three Phase Systems and circuits.pptx
 
6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)6th International Conference on Machine Learning & Applications (CMLA 2024)
6th International Conference on Machine Learning & Applications (CMLA 2024)
 
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdfTop 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
 
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
 
road safety engineering r s e unit 3.pdf
road safety engineering  r s e unit 3.pdfroad safety engineering  r s e unit 3.pdf
road safety engineering r s e unit 3.pdf
 
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdfWater Industry Process Automation and Control Monthly - May 2024.pdf
of self-adaptive EAs which concentrate on different operators. The last part considers the combination of parameter adaptation.

2.1 Overview

Evolutionary computation and Evolutionary Algorithms use computational models of evolutionary processes from biology as key elements in the design and implementation of computer-based problem solving systems. They take their inspiration from natural selection and the survival of the fittest. EAs differ from more traditional optimization techniques in that they search from a population of solutions, not from a single point. Each iteration of an EA usually involves a competitive selection that weeds out poor solutions. Solutions with high fitness are recombined with others by swapping parts of one solution with another. Solutions are also mutated by making a small, more or less random change to a single element of the solution. Recombination and mutation are used to generate new solutions that are biased towards regions of the search space in which good solutions have already been seen. Figure 2.1 shows the usual way of transferring evolution from biology to computer science.

Several different types of evolutionary search methods were developed independently. These include

• Genetic Programming (GP)
• Evolutionary Programming (EP)
• Evolutionary Strategies (ES)
• Genetic Algorithms (GA)
Figure 2.1: Most Evolutionary Algorithms use this sequence of steps: Selection, Reproduction, Mutation and Evaluation.

Genetic Programming creates and evolves programs. It is well suited for problems that require the determination of a function that can be simply expressed in a functional form. Evolutionary Programming focuses on optimizing continuous functions without recombination, while Evolutionary Strategies focus on optimizing continuous functions with recombination. Both are well suited for optimizing continuous functions and use a real-valued representation [Fog97]. Genetic Algorithms focus on optimizing general combinatorial problems, though they have occasionally been applied to continuous problems, too. They usually use a binary representation.

The different steps in a typical EA cycle (Selection, Reproduction, Mutation and Evaluation) each have many parameters. Questions arise such as how many individuals the initial population should have, which crossover operator to choose (e.g. uniform vs. 2-point crossover) or what value the mutation probability should have. These parameters are usually chosen experimentally and given to the algorithm "by hand"1. The quality of the algorithm relies on the quality of these parameters: the better the parameters are, the faster the EA reaches a solution, the better it can handle difficult situations2, and it may even deliver better results.

Determining good parameters is a very time-consuming task. Often many iterations are needed before e.g. a good mutation rate is found, as the parameters are problem-dependent and can even interact with each other, complicating the search. Also, the problem description often gives no clues e.g. to how large a population should be.

The attributes of the search for good parameters qualify it as an EA problem. Therefore attempts were made to apply EAs to their own parameters. This idea is called parameter control and is explained in the following section. The sketch below makes explicit where such hand-set parameters enter a typical EA cycle.

1 The algorithm is initialized e.g. by a config file.
2 Local minima in a minimization problem.
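To make this concrete, here is a minimal EA skeleton in Python (the original contains no code; all names and constant values are illustrative assumptions) in which the hand-set parameters appear as fixed constants that would normally be tuned before the run:

```python
import random

# Hand-set parameters: in a classic EA these are fixed before the run
# (parameter tuning) and never change during the search.
POP_SIZE = 50      # how many individuals?
P_CROSSOVER = 0.7  # how often to recombine?
SIGMA = 0.1        # mutation step size

def fitness(x):
    # Example problem: minimize the sphere function (negated to maximize).
    return -sum(xi * xi for xi in x)

def ea_cycle(dim=10, generations=100):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(POP_SIZE)]
    for _ in range(generations):
        # Selection: keep the better half (simple truncation selection).
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP_SIZE // 2]
        # Reproduction (1-point crossover) and Gaussian mutation.
        children = []
        while len(children) < POP_SIZE:
            a, b = random.sample(parents, 2)
            if random.random() < P_CROSSOVER:
                cut = random.randrange(1, dim)
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            children.append([xi + random.gauss(0, SIGMA) for xi in child])
        pop = children  # Evaluation happens via fitness() in the next loop.
    return max(pop, key=fitness)
```

Every upper-case constant here is exactly the kind of value the following section is about: tuned by hand in this sketch, controlled by the algorithm in the approaches below.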
2.2 Self-adaptiveness

Before I go into detail on how self-adaptiveness works, I want to define and classify the term. In [EHM99] Eiben, Hinterding and Michalewicz give a terminology which I will use throughout this elaboration. In the choice of parameters they distinguish two major forms: parameter tuning and parameter control. Parameter tuning describes the process of setting parameters before the run of the EA, while parameter control refers to changing the parameters during the run. They subclassify parameter control into three categories: deterministic, adaptive and self-adaptive parameter control (see Figure 2.2).

Figure 2.2: Classification of methods for how parameters are chosen. Parameter tuning refers to setting parameters "by hand" before the run, while parameter control refers to setting the parameters during the run, e.g. based on some function that depends on an algorithm variable like the number of generations.

In the following three sections I will give an example of each of these parameter control approaches, based on the mutation step size parameter of a Gaussian mutation which uses the normal distribution N(0, σ). The mean is set to zero, and the standard deviation σ can be interpreted as the mutation step size. The mutation is applied as follows:

    x'_i = x_i + N(0, σ)

where x_i is each component of an individual x.

2.2.1 Deterministic parameter control

The first possibility for parameter control is to parameterize σ with a variable t, leading to a function σ(t). Now we get the possibility to change σ during the run; as shown in [Bäc92], this may improve the algorithm's speed. One possible choice for t is the generation number: the longer the algorithm runs, the smaller the mutation step size becomes. The idea is that at the beginning of the algorithm we want a high diversity of individuals in order not to miss some solutions, while in later generations we want higher convergence, so a smaller value for σ(t) can be chosen. Still, this approach needs much input from outside3. The function σ(t) is fixed in advance for each value of t and is therefore predictable. A sketch of such a schedule follows below.

Deterministic parameter control modifies the parameter without using any feedback from the search.

3 This approach is also called extrinsic evolution.
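As an illustration, the following sketch decreases the step size deterministically with the generation number t (the hyperbolic decay schedule and its constants are my assumptions; the text only requires that σ(t) be a fixed, predictable function of t):

```python
import random

def sigma_schedule(t, sigma0=1.0, decay=0.01):
    # Deterministic control: sigma depends only on the generation
    # number t, never on feedback from the search.
    return sigma0 / (1.0 + decay * t)

def mutate_deterministic(x, t):
    sigma = sigma_schedule(t)
    return [xi + random.gauss(0, sigma) for xi in x]

# Early generations mutate coarsely, late ones finely:
# t = 0   -> sigma = 1.0
# t = 900 -> sigma = 0.1
```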
2.2.2 Adaptive parameter control

One step further in letting the algorithm find parameter values for itself is to incorporate information from the search process into the mutation step size σ. In [Rec73] the '1/5 success rule' for (1+1)-evolution strategies is presented. It states that the ratio of successful mutations to all mutations should be 1/5: if the ratio is greater, the mutation step size should be increased, and vice versa. A successful mutation is one which produces an offspring that is better than its parent. The ratio is determined as an average over a fixed number of generations. The approach is still heuristic, yet σ(t) is no longer deterministic. A sketch of the rule is given below.

Adaptive parameter control modifies the parameter with some feedback from the search (usually using a heuristic).
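A minimal sketch of the 1/5 success rule for a (1+1)-ES might look as follows (the adjustment factor of 0.85 and the window of 20 generations are conventional choices, not values taken from the text):

```python
import random

def one_plus_one_es(fitness, x, sigma=1.0, generations=1000, window=20, c=0.85):
    successes = 0
    for t in range(1, generations + 1):
        child = [xi + random.gauss(0, sigma) for xi in x]
        if fitness(child) > fitness(x):  # successful mutation
            x = child
            successes += 1
        if t % window == 0:
            # Adaptive control: feedback from the search adjusts sigma.
            if successes / window > 1 / 5:
                sigma /= c  # too many successes -> take larger steps
            elif successes / window < 1 / 5:
                sigma *= c  # too few successes -> take smaller steps
            successes = 0
    return x, sigma
```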
2.2.3 Self-adaptive parameter control

The main idea, letting the algorithm set its own parameters, can be implemented as follows. Assume that an individual has the form

    <x_1, x_2, ..., x_n>

The mutation step size can be included in the individual itself as an additional gene, resulting in

    <x_1, x_2, ..., x_n, σ>

This additional gene is transformed during mutation, too, and thus undergoes an evolution similar to that of the individual. Usually the exponential function is used to transform the σ value:

    σ' = σ · e^{N(0, τ_0)}
    x'_i = x_i + N(0, σ')

Through this self-adaptation no external input is needed for the parameter; its values are set by the algorithm itself, and each individual has its own σ. Another, finer approach is to give each gene its own σ, leading to the representation

    <x_1, x_2, ..., x_n, σ_1, σ_2, ..., σ_n>

In this form each gene gets its own step size and the individuals gain a larger degree of freedom in adapting themselves to the shape of the fitness function. Note that the adaptation of the parameter (here the mutation step size) happens before the individual's fitness is evaluated. That means a good parameter value does not directly raise the individual's fitness, but only its performance over time. A sketch of this scheme is given below.

Self-adaptive parameter control encodes the parameter within each individual and evolves it together with the individual.
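The following sketch shows the individual-level variant just described (illustrative; the choice τ_0 = 1/√n is a common convention, not stated in the text):

```python
import math
import random

def self_adaptive_mutation(individual, tau0):
    # The individual carries its own step size: (object genes, sigma).
    x, sigma = individual
    # The sigma gene is mutated first, with the log-normal rule ...
    new_sigma = sigma * math.exp(random.gauss(0, tau0))
    # ... and the freshly mutated sigma is then applied to the genes.
    new_x = [xi + random.gauss(0, new_sigma) for xi in x]
    return (new_x, new_sigma)

# Example usage: tau0 is often chosen relative to the dimension n.
n = 10
parent = ([random.uniform(-5, 5) for _ in range(n)], 1.0)
child = self_adaptive_mutation(parent, tau0=1.0 / math.sqrt(n))
```

Note that sigma is changed before the offspring is evaluated, which is exactly why a good sigma pays off only indirectly, through the fitness of the offspring it produces over time.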
2.3 Examples of self-adaptive Evolutionary Algorithms

In this part of the chapter I will describe some examples of self-adaptive evolutionary algorithms. The first section describes different approaches to self-adaptation using different operators. The last section gives an example of a combination of different self-adapting parameters.

2.3.1 Self-adaptive Evolutionary Algorithms adapting one parameter

An interesting approach to a self-adaptive crossover operator in a genetic algorithm is found in [Spe95]. Spears decided to let the GA be self-selective with respect to its choice of crossover operator. He argues that since 2-point crossover is the least disruptive4 and uniform crossover the most disruptive operator, it is reasonable to have the GA select from only those two possibilities. Although high disruption may stress exploration at the expense of exploitation, there are situations in which minimizing disruption hinders the adaptive search process by overemphasizing exploitation at the expense of needed exploration. An example is when the population size is too small to provide the necessary sampling accuracy for complex search spaces [JS91].

The implementation appends one bit to the end of every individual in the GA population. This bit decides whether it is better to use uniform crossover or 2-point crossover. Also, since the approach is tightly coupled, all genetic operators are allowed to manipulate this extra bit (including crossover). There are two possibilities for how to use this bit. Local adaptation uses the bits of only the two individuals involved: if both bits are equal, the respective crossover is performed; if not, a random crossover is chosen. Global adaptation uses the last bits of the whole population to probabilistically determine which crossover operator to perform on each individual5. An important result is that local adaptation outperforms global adaptation, although the difference between the two methods is not statistically significant. The consequence is to tie the crossover information directly to the individual, using local adaptation, in order to improve the algorithm. A sketch of the local variant follows below.

4 His main motivation to let the algorithm decide.
5 If 10 of 100 individuals have the uniform crossover bit set, then each individual has a probability of 10% to use uniform crossover.
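A sketch of the local adaptation variant (illustrative; Spears' exact implementation details are not reproduced here):

```python
import random

def crossover_with_choice(a, b):
    # Each individual carries one extra bit at its end:
    # 1 = prefer uniform crossover, 0 = prefer 2-point crossover.
    # Local adaptation: if the parents agree, use their choice;
    # otherwise pick one of the two operators at random.
    use_uniform = bool(a[-1]) if a[-1] == b[-1] else random.random() < 0.5
    n = len(a)
    if use_uniform:
        child = [a[i] if random.random() < 0.5 else b[i] for i in range(n)]
    else:
        p, q = sorted(random.sample(range(1, n), 2))  # two cut points
        child = a[:p] + b[p:q] + a[q:]
    # The choice bit is part of the genome, so the preference itself
    # is inherited and evolves with the individual.
    return child
```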
Another approach is presented in [ESKT97], which uses multi-parent reproduction. An adaptive mechanism based on subpopulations is used to determine the number of parents. The idea is similar to coevolution: different pools of individuals (species) search by different strategies6, and the adaptive population redistribution is designed to grow successful species and shrink the others. The crossover operator used is N-parent diagonal crossover (see Figure 2.3).

Figure 2.3: 3-parent diagonal crossover with three children (left) and one child (right).

The results were double-edged. On the one hand, the experiments showed that multi-parent crossover is superior to traditional two-parent crossover. On the other hand, the adaptive mechanism was not able to reward better crossovers according to their performance. Yet the algorithm was comparable in performance to the non-adaptive variant and thus made time-consuming comparisons in search of the best operator unnecessary, which is, as stated in the introduction, a goal of self-adaptive EAs.

A meta approach was used by [Gre86]: the population size parameter was determined by another genetic algorithm, so each generation of the meta-GA set off a whole run of the actual GA with adapted parameters. The outcome was that the optimal population size for the actual algorithm was somewhere between 30 and 50 individuals.

In systems where the search space is bounded by constraints, the adaptation of the evaluation function has been tested. Assume the evaluation function has the form

    eval(x) = f(x) + W · penalty(x)

where W is a weight which determines how strongly an individual is penalized if it violates a constraint. The value of W can be adapted in a similar fashion to the standard deviation σ described earlier.

The mutation rate, a parameter typically self-adapted in evolutionary strategies, was also self-adapted in a GA by Bäck [Bäc92]. He uses an extended bitstring representation of the mutation probability pm, appended to each individual. Each mutation rate is applied to itself and then to each individual gene (see Figure 2.4). The experiments show that this approach reaches a significant improvement over the standard GA without self-adaptation. A sketch of the scheme is given below.

Figure 2.4: Each mutation rate is applied to itself and to each of its corresponding genes.

6 In this case the species differ only in the crossover operator.
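A sketch of this scheme (simplified: Bäck encodes pm itself as a bitstring, which is condensed here into a decoding helper; the bit lengths are my assumptions):

```python
import random

def decode_rate(bits):
    # Interpret the appended bits as a mutation probability in (0, 1].
    value = int("".join(map(str, bits)), 2)
    return (value + 1) / (2 ** len(bits))

def self_adaptive_bit_mutation(genome, rate_bits):
    # Step 1: the rate bits are mutated with the rate they encode.
    pm = decode_rate(rate_bits)
    new_rate_bits = [b ^ (random.random() < pm) for b in rate_bits]
    # Step 2: the object genes are mutated with the *new* rate.
    new_pm = decode_rate(new_rate_bits)
    new_genome = [g ^ (random.random() < new_pm) for g in genome]
    return new_genome, new_rate_bits
```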
2.3.2 Combining self-adaptive parameters

In [SF96] the representation of an individual is chosen to have four parts7 in order to incorporate an adaptation of both the mutation operator and the crossover operator. Each individual has the usual problem encoding, a mutation rate, and now two additional linkage bits. These bits are interpreted as connecting points to which new offspring individuals, called blocks, can attach. The approach is similar to [SM87], where the crossover operator uses "punctuation marks" to encode crossover points; that scheme was confined to only two parents, while the blocks enjoy the benefits of multi-parent recombination [ERR94] without the necessity of tuning the type of recombination to the nature of the problem. A rough sketch of the representation closes this section.

An important result of this combined adaptation is that the algorithm does not seem to suffer from a great "learning overhead" on simple problems, and on more complex functions it discovers significantly higher optima than the majority of other algorithms. They state that this can be attributed to the synergistic effects of simultaneously adapting both the recombination strategy and the mutation rates.

It is interesting to note that most papers on combining parameters use self-adaptation. In [EHM99] Eiben et al. presume that feedback-based heuristics are even more difficult to handle than static parameter tuning.

7 In a binary representation.
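As a rough illustration of such a four-part individual (names and types are my assumptions; the original uses a pure binary encoding for all parts):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CombinedIndividual:
    genome: List[int]                 # the usual problem encoding
    mutation_rate: float              # self-adapted, as in Baeck's scheme
    linkage_bits: List[int] = field(default_factory=lambda: [0, 0])
    # Two linkage bits, interpreted as connection points where
    # "blocks" from other parents may attach during recombination.
```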
Chapter 3

Summary

3.1 Conclusion

The effectiveness of an evolutionary algorithm depends on many factors, e.g. representation, operators, etc. The number and variety of the parameters and the possible choices make the selection of good settings for an EA very difficult. The "No-Free-Lunch" theorem [WM97] even states that there exists no perfect EA: no search algorithm is superior on ALL problems. A corollary to this theorem is that there is no set of parameters that is superior on all problems. Yet this theorem assumes no knowledge about the problem and incorporates ALL problems into its proof, even completely random ones, so its practical relevance is minimal.

Still, adaptation provides the opportunity to customize the evolutionary algorithm to the needs of the problem and to change the strategy parameters during the search. This makes it possible to incorporate information about the search space into the algorithm and even to let the algorithm select the appropriate information. This can be considered as two separate searches: the first is the usual search in the problem space; the second searches the parameter space for an optimal configuration of the EA. Yet most of the presented algorithms take only one small part of this space into account; e.g. if the mutation rate pm is self-adapted, it covers only a small part of the parameter space, while the other parameters remain static and tuned. Similarly, the meta-GA [Gre86] is confined to its own search space.

Another advantage of self-adaptive EAs is that, e.g. in mutation rate adaptation, each individual gets its own rate, raising the diversity of the population. This raises the natural drift, meaning that convergence to local minima is not as strong as it would be with standard EAs. This idea was used in [Spe95], where the choice of the disruptive uniform crossover corresponds to exploration of the search space (diversity in the population), while the least disruptive 2-point crossover corresponds to exploitation1 (convergence to some solution). In this way the algorithm itself adapted whether it should explore or not.

1 E.g. staying at the same place in the search space, having found a solution.
The fact that self-adaptive EAs do not always deliver the hoped-for outcomes and complicate the EA has led some researchers to disregard them. But often they do not take the time spent on parameter tuning into account.

3.2 Future prospects

One of the difficulties of optimizing the parameter settings of EAs is that the interactions between these parameters are often unpredictable. Thus the combination of self-adaptive parameters is an area where research can result in new insights and improvements to EAs.

Very little runtime analysis has been done on the topic of EAs, and even less with respect to self-adaptive EAs. A framework in which the efficiency of self-adaptive EAs can be measured and compared is absent. A collection of functions to optimize is available, yet it does not cover all forms of problems tackled by EAs. There is also no treatment of the problems on which self-adaptation excels and the situations in which it fails, which would be an interesting topic, too.
References

[Bäc92] T. Bäck. Self-adaptation in genetic algorithms. In F. J. Varela and P. Bourgine, editors, Proceedings of the First European Conference on Artificial Life, pages 263–271, 1992.

[EHM99] Ágoston Endre Eiben, Robert Hinterding, and Zbigniew Michalewicz. Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3(2):124–141, 1999.

[ERR94] Ágoston E. Eiben, P.-E. Raué, and Zs. Ruttkay. Genetic algorithms with multi-parent recombination. In Yuval Davidor, Hans-Paul Schwefel, and Reinhard Männer, editors, Parallel Problem Solving from Nature – PPSN III, pages 78–87, Berlin, 1994. Springer.

[ESKT97] A. Eiben, I. Sprinkhuizen-Kuyper, and B. Thijssen. Competing crossovers in an adaptive GA framework. 1997.

[Fog97] David B. Fogel. Evolutionary computation: A new transactions. IEEE Transactions on Evolutionary Computation, 1(1):1–2, 1997.

[Gre86] J. Grefenstette. Optimization of control parameters for genetic algorithms. IEEE Transactions on Systems, Man, and Cybernetics, 16(1):122–128, 1986.

[JS91] Kenneth A. De Jong and William M. Spears. An analysis of the interacting roles of population size and crossover in genetic algorithms. In Parallel Problem Solving from Nature – Proceedings of the 1st Workshop, PPSN 1, 1991.

[Rec73] Ingo Rechenberg. Evolutionsstrategie: Optimierung Technischer Systeme nach Prinzipien der Biologischen Evolution. Frommann-Holzboog Verlag, 1973.

[SF96] Jim E. Smith and Terence C. Fogarty. Adaptively parameterised evolutionary systems: Self adaptive recombination and mutation in a genetic algorithm. In H. Voigt, W. Ebeling, and I. Rechenberg, editors, Parallel Problem Solving from Nature – PPSN IV, pages 441–450, Berlin, 1996. Springer.

[SM87] J. D. Schaffer and A. Morishima. An adaptive crossover distribution mechanism for genetic algorithms. In J. Grefenstette, editor, Proceedings of the Second International Conference on Genetic Algorithms, pages 36–40, Cambridge, 1987. Lawrence Erlbaum.

[Spe95] William Spears. Adapting crossover in a genetic algorithm. In Proceedings of the Fourth Annual Conference on Evolutionary Programming, 1995.

[WM97] David H. Wolpert and William G. Macready. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82, April 1997.