Three rotation-invariant variants of the 3SOME algorithm
1. Three Variants of Three Stage Optimal
Memetic Exploration for Handling
Non-Separable Fitness Landscapes
Fabio Caraffini, Giovanni Iacca, Ferrante Neri, and Ernesto
Mininno
De Montfort University
United Kingdom
05.09.2012
(UKCI2012, Edinburgh)
2. Outline
Separability in Computational Intelligence Optimisation
Three Stage Optimal Memetic Exploration (3SOME)
Rotation Invariant Shrinking 3SOME (RIS-3SOME)
Micro-Population Differential Evolution 3SOME (µDE-3SOME)
3SOME with (1+1) Covariance Matrix Adaptation Evolution Strategy ((1+1)-CMA-ES-3SOME)
Experiments and results
Conclusions
3. Separability in Computational Intelligence Optimisation
A mathematical function f(x), with x ∈ R^N and N ∈ N, is separable if and only if:

arg min_x f(x) = ( arg min_{x1} f(x1, ...), arg min_{x2} f(..., x2, ...), ..., arg min_{xN} f(..., xN) )   (1)
A function with at most m non-independent variables is said to be m-nonseparable.
From an algorithmic point of view, functions can be grouped depending on the number of their non-independent variables.
⇒ In Computer Science, separability is a “fuzzy” property.
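The definition in Eq. (1) can be exercised on a toy example: for a fully separable function, minimising each variable independently (with the others held fixed) recovers the global minimiser. A minimal Python sketch, using a shifted quadratic that is an illustrative assumption, not a function from the slides:

```python
import numpy as np

def f(x):
    # Separable test function: a sum of independently shifted quadratics.
    shift = np.array([1.0, -2.0, 3.0])
    return float(np.sum((x - shift) ** 2))

def coordinate_min(f, n, lo=-5.0, hi=5.0, grid=2001):
    """Minimise each variable on a grid while holding the others at 0.
    Valid only because f is separable: each arg min is independent."""
    best = np.zeros(n)
    for i in range(n):
        candidates = np.linspace(lo, hi, grid)
        vals = []
        for c in candidates:
            x = np.zeros(n)
            x[i] = c
            vals.append(f(x))
        best[i] = candidates[int(np.argmin(vals))]
    return best

x_star = coordinate_min(f, 3)  # assembles the global minimiser coordinate-wise
```

On a non-separable function this coordinate-wise procedure would generally fail, which is exactly the difficulty the three variants below address.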
4. Three Stage Optimal Memetic Exploration
3SOME¹ is:
a Memetic Computing optimisation algorithm consisting of 3 heterogeneous memes
memory saving and fast (suitable for real-time and embedded applications)
based on the Ockham’s Razor in MC principle:
“simple algorithms can display a performance which is as good as that of complex algorithms¹”
¹ G. Iacca, F. Neri, E. Mininno, Y.S. Ong, M.H. Lim, “Ockham’s Razor in Memetic Computing: Three Stage Optimal Memetic Exploration”, Information Sciences, Elsevier, Volume 188, pages 17-43, April 2012
5. Three Stage Optimal Memetic Exploration (Stage 1)
Elite xe = current best solution; Trial xt = current trial solution
Long distance exploration: first, a solution xt is randomly generated; then the exponential crossover (as in Differential Evolution) with xe is performed, with a low inheritance factor.
Long distance exploration iterates until a new promising solution is found (S → “Success”).
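The long distance stage can be sketched as follows. This is a minimal reading of the slide, not the authors' exact code: the low inheritance factor is modelled by a small crossover rate `cr`, so the random trial inherits only a short contiguous block of genes from the elite; the sphere function and all parameter values are illustrative assumptions.

```python
import random

def exponential_crossover(elite, trial, cr):
    """Copy a contiguous (wrap-around) block of genes from `elite` into
    `trial`; the block length is geometrically distributed with rate cr."""
    n = len(trial)
    child = list(trial)
    i = random.randrange(n)
    child[i] = elite[i]
    copied = 1
    while random.random() < cr and copied < n:
        i = (i + 1) % n
        child[i] = elite[i]
        copied += 1
    return child

def long_distance(f, elite, bounds, cr=0.1, max_iter=1000):
    """Iterate: random restart + low-inheritance crossover with the elite,
    until a trial improves on the elite (the 'S' transition)."""
    lo, hi = bounds
    fe = f(elite)
    for _ in range(max_iter):
        trial = [random.uniform(lo, hi) for _ in range(len(elite))]
        trial = exponential_crossover(elite, trial, cr)
        if f(trial) < fe:
            return trial  # success: control passes to middle distance
    return elite  # budget exhausted without improvement

random.seed(1)
def sphere(x): return sum(v * v for v in x)
result = long_distance(sphere, [2.0, 2.0, 2.0], (-5.0, 5.0))
```

Because the trial is mostly random, this stage performs global exploration while still carrying a small amount of information from the elite.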
6. Three Stage Optimal Memetic Exploration (Stage 2)
Middle distance exploration: a hypercube of side δ, centred on the solution xe, is constructed. For K · (problem dimension) times, a trial point xt is generated within the hypercube; then the exponential crossover is performed with a high inheritance factor.
Middle distance exploration is continued while successful.
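A simplified sketch of one activation of the middle distance stage. Note the simplification: the high-inheritance exponential crossover is replaced here by plain greedy acceptance of hypercube samples; `k` and the test setup are illustrative assumptions.

```python
import random

def middle_distance(f, elite, delta, k=4):
    """Sample k*n trial points inside a hypercube of side delta centred
    on the elite, keeping any improvement (crossover step omitted)."""
    n = len(elite)
    best, fbest = list(elite), f(elite)
    success = False
    for _ in range(k * n):
        trial = [best[j] + random.uniform(-delta / 2, delta / 2)
                 for j in range(n)]
        ftrial = f(trial)
        if ftrial < fbest:
            best, fbest, success = trial, ftrial, True
    return best, fbest, success  # success flag decides whether to repeat

random.seed(0)
def sphere(x): return sum(v * v for v in x)
best, fbest, success = middle_distance(sphere, [1.0, 1.0], 0.5)
```

The returned `success` flag reflects the slide's rule: the stage is repeated while it keeps improving the elite.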
7. Three Stage Optimal Memetic Exploration (Stage 3)
Short distance exploration: xe is refined by a deterministic steepest-descent local search.
Short distance exploration has a fixed budget. If it fails (F), the first operator is then activated; if it succeeds (S), the intermediate one.
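The deterministic local search can be sketched as a per-coordinate descent with step halving, in the spirit of the S operator (the exact move schedule of the 3SOME paper may differ; the budget, radius, and test function are illustrative assumptions):

```python
def short_distance(f, x, radius, budget=300):
    """Deterministic exploratory search: try each coordinate at -radius,
    then +radius/2; if no coordinate improves, halve the radius.
    Runs on a fixed budget of fitness evaluations."""
    x = list(x)
    fx = f(x)
    evals = 0
    while evals < budget:
        improved = False
        for i in range(len(x)):
            for step in (-radius, radius / 2):
                cand = list(x)
                cand[i] += step
                fc = f(cand)
                evals += 1
                if fc < fx:
                    x, fx = cand, fc
                    improved = True
                    break  # move on to the next coordinate
                if evals >= budget:
                    break
            if evals >= budget:
                break
        if not improved:
            radius /= 2  # refine the search granularity
    return x, fx

def sphere(v): return sum(c * c for c in v)
x_opt, f_opt = short_distance(sphere, [0.3, -0.4], 0.4)
```

Being fully deterministic, this stage either converges onto a local optimum within its budget (S) or exhausts the budget (F), matching the two transitions on the slide.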
8. Rotation Invariant Shrinking 3SOME
Handling non-separability, a first approach:
RI-Shrinking: a hypercube of side δ, centred on the solution xe, is constructed. Three points (xv, xr and xs) are sampled within the hypercube, in order to apply the rotation-invariant movement given by DE/current-to-rand/1:
xt = xe + K(xv − xe) + F · K(xr − xs), K ∼ U[0, 1]   (2)
If xe has not been improved, the hypercube is halved. This procedure ends when the volume falls under a threshold.
Shrinking is restarted if successful.
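Eq. (2) translates directly into code. The move is a pure linear combination of whole vectors, with no coordinate-wise crossover, so rotating the search space rotates the offspring identically; that is the source of the rotation invariance. A minimal sketch (the value of F is an illustrative assumption):

```python
import random

def current_to_rand_1(xe, xv, xr, xs, F=0.5):
    """DE/current-to-rand/1 move of Eq. (2):
    xt = xe + K*(xv - xe) + F*K*(xr - xs), with K ~ U[0, 1]."""
    K = random.random()
    return [e + K * (v - e) + F * K * (r - s)
            for e, v, r, s in zip(xe, xv, xr, xs)]

# Degenerate check: if all sampled points coincide with the elite,
# both difference terms vanish and the trial equals the elite.
out = current_to_rand_1([1.0, 2.0], [1.0, 2.0], [1.0, 2.0], [1.0, 2.0])
```

In RIS-3SOME, xv, xr and xs would be drawn from the current hypercube around xe rather than passed in explicitly.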
9. Micro-Population Differential Evolution 3SOME
(µDE − 3SOME)
A slightly more complex structure:
µDE: a DE with a micro-population replaces the second operator. The worst individual is replaced with xe.
DE/current-to-rand/1 is applied N times, N = problem dimension.
µDE is repeated if successful.
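One activation of the µDE operator can be sketched as below. This is a loose reading of the slide: the index-selection details (e.g. whether v, r, s must differ from i) and the value of F are illustrative assumptions.

```python
import random

def mu_de_step(f, pop, xe, F=0.7):
    """One µDE activation (sketch): inject the elite in place of the worst
    individual, then apply DE/current-to-rand/1 n times (n = dimension)."""
    n = len(xe)
    pop = [list(p) for p in pop]
    worst = max(range(len(pop)), key=lambda i: f(pop[i]))
    pop[worst] = list(xe)  # elite replaces the worst individual
    for _ in range(n):
        i = random.randrange(len(pop))
        # three distinct donors (may include i -- a simplification)
        v, r, s = random.sample(range(len(pop)), 3)
        K = random.random()
        trial = [pop[i][j] + K * (pop[v][j] - pop[i][j])
                 + F * K * (pop[r][j] - pop[s][j]) for j in range(n)]
        if f(trial) <= f(pop[i]):  # one-to-one survivor selection
            pop[i] = trial
    return pop

random.seed(3)
def sphere(x): return sum(v * v for v in x)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(5)]
xe = [0.1, 0.1, 0.1]
new_pop = mu_de_step(sphere, pop, xe)
```

Since individuals are only ever replaced by trials that are no worse, the best fitness in the micro-population can never exceed that of the injected elite.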
10. 3SOME with 1+1 Covariance Matrix Adaptation Evolution
Strategy ((1 + 1)CMA − ES − 3SOME)
A memory-expensive approach:
(1+1)-CMA-ES² replaces the middle distance exploration.
At each iteration a new point xt is sampled from a multivariate normal distribution N(xe, C).
C, which represents the dependencies between the variables, is updated after every iteration.
(1+1)-CMA-ES is repeated if successful.
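The sampling step can be sketched as follows. This is a deliberately simplified (1+1)-ES, not Igel et al.'s full algorithm: the covariance/Cholesky updates are omitted and C is held fixed, keeping only 1/5th-rule-style step-size adaptation; all parameter values are illustrative assumptions.

```python
import numpy as np

def es_1p1_step(f, xe, fe, C, sigma, rng):
    """One (1+1)-ES iteration with a fixed covariance matrix C:
    sample xt ~ N(xe, sigma^2 * C) and accept on improvement."""
    A = np.linalg.cholesky(C)  # C = A @ A.T, so A maps N(0, I) to N(0, C)
    xt = xe + sigma * A @ rng.standard_normal(len(xe))
    ft = f(xt)
    if ft < fe:
        return xt, ft, sigma * 1.5   # success: move elite, widen the step
    return xe, fe, sigma * 0.85      # failure: shrink the step

# usage on the sphere function (illustrative)
rng = np.random.default_rng(0)
def sphere(x): return float(np.sum(x ** 2))
xe, fe, sigma = np.array([2.0, 2.0]), 8.0, 0.5
for _ in range(200):
    xe, fe, sigma = es_1p1_step(sphere, xe, fe, np.eye(2), sigma, rng)
```

In the actual (1+1)-CMA-ES, C would additionally be adapted after every iteration from the observed successful steps, which is what captures the dependencies between variables on non-separable landscapes.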
² C. Igel, T. Suttorp, and N. Hansen, “A computational efficient covariance matrix update and a (1+1)-CMA for evolution strategies”, in Proceedings of the Genetic and Evolutionary Computation Conference.
11. Experiments and Results
Experimental setup:
BBOB2010³ at 10, 20, 40 and 100 dimensions
100 runs, each performed for 5000 × n fitness evaluations
³ N. Hansen, A. Auger, S. Finck, R. Ros et al., “Real-parameter black-box optimization benchmarking 2010: Noiseless functions definitions”, INRIA, Tech. Rep. RR-6829, 2010
12. Experiments and Results
Numerical results show that all of the variants improve upon 3SOME’s performance.
RIS-3SOME and µDE-3SOME have similar performances.
(1+1)-CMA-ES-3SOME outperforms 3SOME also on some separable ill-conditioned problems.
13. Experiments and Results
Fitness trends for f1 (sphere) in 10 dimensions
[Plot: fitness value vs. fitness evaluations for 3SOME, µDE-3SOME, (1+1)-CMA-ES-3SOME and RIS-3SOME]
14. Experiments and Results
Fitness trends for f10 (Ellipsoid with distortion) in 40 dimensions
[Plot: fitness value vs. fitness evaluations for 3SOME, (1+1)-CMA-ES-3SOME, µDE-3SOME and RIS-3SOME]
16. Conclusions
All of the variants improve upon 3SOME’s performance on non-separable problems, without a performance deterioration on the other problems.
The three proposed solutions are statistically very similar on non-separable problems.
In accordance with the Ockham’s Razor principle, the simplest solution has proven to be as effective as the others.