Many multi-objective optimisation problems incorporate computationally or financially expensive objective functions. State-of-the-art algorithms therefore construct one or more surrogate models of the mapping from parameter space to objective functions to guide the choice of the next solution to evaluate expensively. Starting from an initial set of solutions, an infill criterion — a surrogate-based indicator of quality — is extremised to determine which solution to evaluate next, until the budget of expensive evaluations is exhausted. Many successful infill criteria depend on multi-dimensional integration, which may make the infill criteria themselves impractically expensive. We propose a computationally cheap infill criterion based on the minimum probability of improvement over the estimated Pareto set. We also present a range of set-based scalarisation methods modelling hypervolume contribution, dominance ratio and distance measures; these permit the use of straightforward expected improvement as a cheap infill criterion. We investigated the performance of these novel strategies on standard multi-objective test problems, and compared them with the popular SMS-EGO and ParEGO methods. Unsurprisingly, our experiments show that the best strategy is problem dependent, but in many cases a cheaper strategy is at least as good as more expensive alternatives.
Preprint repository: https://ore.exeter.ac.uk/repository/handle/10871/27157
Alternative Infill Strategies for Expensive Multi-Objective Optimisation
Alma Rahat
Richard Everson
Jonathan Fieldsend
Department of Computer Science
University of Exeter
United Kingdom
Supported by Engineering and Physical Sciences Research Council (EPSRC), UK
Genetic and Evolutionary Computation Conference (GECCO), Berlin
18 July 2017
Rahat, Everson and Fieldsend Expensive Multi-Objective Optimisation GECCO, Berlin, 18 July 2017 1 / 12
Expensive Optimisation Problems
[Cartoon: ingredients vector x = (cheese, . . . ) → bake cake → taste f(x)]
Expensive (computationally and/or financially) function evaluations.
Limited budget on function evaluations.
Analytical model and gradients may not be available.
Solution: surrogate-assisted optimisation.
Efficient Global Optimisation (EGO)
[Figure: 1-D illustration of the GP posterior p(f̂|D) and the infill criterion EI(x) over the search space x]
Initial samples (e.g. Latin Hypercube): D = {(x_i, f(x_i))}
Fit a Gaussian process (GP) model: p(f̂(x)|D)
Define infill criterion: expected improvement, EI(x)
Sub-problem: max_x EI(x)
Repeat until budget is exhausted (here, 10 function evaluations)
Infill criterion is a surrogate-based measure of utility.
Computation time for the infill criterion matters:
1 sec/evaluation × 100000 evaluations ≈ 1.15 days
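The expected-improvement step above can be sketched as follows (a minimal sketch for a minimisation problem; the GP predictive mean and standard deviation are assumed to come from any standard library, and the closed form EI = (f_best − μ)Φ(z) + σφ(z) is used):

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Expected improvement EI(x) for minimisation, given the GP
    predictive mean `mu` and standard deviation `sigma` at candidate
    points, and the best observed objective value `f_best`."""
    sigma = np.maximum(sigma, 1e-12)   # guard against zero predictive variance
    z = (f_best - mu) / sigma
    # EI = (f_best - mu) * Phi(z) + sigma * phi(z)
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
```

In the EGO loop, `mu` and `sigma` would come from the fitted GP, and the sub-problem max_x EI(x) is handed to an inner optimiser.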
Multi-Objective EGO
Multi-Surrogate Approaches (effective but more expensive)
Model each objective function independently.
Infill criterion: S-metric, Expected Hypervolume Improvement, etc.
[Diagram: x → Expensive Problem → f1(x), f2(x); surrogates p(f̂1|D), p(f̂2|D) → Infill Criterion]
Mono-Surrogate Approaches (cheap but less effective)
Model a scalarised function, e.g. ParEGO (augmented Chebyshev).
Infill criterion: expected improvement in the scalarised function.
[Diagram: x → Expensive Problem → f1(x), f2(x) → scalarisation g(x); surrogate p(ĝ|D) → Infill Criterion]
Goal: a cheap multi-surrogate infill criterion, or an effective scalarisation.
Minimum Probability of Improvement (MPoI)
[Figure: MPoI landscape over the objective space (f1(x), f2(x)); contour values range from 0.00 to 0.93]
Multi-surrogates: multi-variate predictive distribution with means (μ̂1(x_i), μ̂2(x_i)) and standard deviations σ̂1(x_i), σ̂2(x_i).
Probability of dominance: P(x_j ≺ x_i) = ∏_{m=1}^{M} P(f̂_m(x_j) < f̂_m(x_i))
Probability of improvement: P(x_i ≺ x_j or x_i ∥ x_j) = 1 − P(x_j ≺ x_i)
Multi-surrogate infill criterion: minimum probability of improvement over the estimated Pareto set P*:
min_{x ∈ P*} [1 − P(x ≺ x_i)]
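The MPoI formulas above can be sketched as follows (a sketch under the assumption that the current Pareto-set members have already been expensively evaluated, so only the candidate carries predictive uncertainty; the slide's more general two-distribution form would replace σ̂ by the combined standard deviation of the difference):

```python
import numpy as np
from scipy.stats import norm

def min_prob_improvement(mu_cand, sigma_cand, pareto_front):
    """MPoI: min over x in P* of 1 - P(x dominates the candidate x').

    mu_cand, sigma_cand : (M,) GP predictive mean/std for candidate x'
    pareto_front        : (N, M) objective vectors of the estimated
                          Pareto set (treated as noise-free here)
    """
    sigma = np.maximum(sigma_cand, 1e-12)
    # P(f_m(x) < f_hat_m(x')) for each Pareto member x and objective m
    z = (mu_cand - pareto_front) / sigma
    p_dominates = norm.cdf(z).prod(axis=1)   # P(x dominates x') per member
    return float(np.min(1.0 - p_dominates))
```

A candidate predicted to improve on every Pareto member scores near 1; a clearly dominated candidate scores near 0, so the criterion is maximised by the inner optimiser.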
Minimum Signed Distance (MSD)
[Figure: MSD landscape over the objective space (f1(x), f2(x)); contour values range from −1.69 to 0.73]
Mono-surrogate approach: set-based scalarisation function.
Scalarisation using a distance measure: g_d(x, X) = min_{x' ∈ P*} d(x, x')
Mono-surrogate scalarisation distance measure: d(x, x') = Σ_{m=1}^{M} (f_m(x) − f_m(x'))
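The two MSD formulas can be combined into a single scalariser (a minimal sketch; in the mono-surrogate approach a GP is then fitted to these scalar values and plain expected improvement is applied to it):

```python
import numpy as np

def msd_scalarisation(y, pareto_front):
    """g_d(x, X) = min over x' in P* of d(x, x'), with the signed
    distance d(x, x') = sum_m (f_m(x) - f_m(x')).

    y            : (M,) objective vector of solution x
    pareto_front : (N, M) objective vectors of the estimated Pareto set
    Negative values mean x improves on some Pareto member in aggregate,
    so smaller g_d is better for minimisation.
    """
    return float(np.min(np.sum(y - pareto_front, axis=1)))
```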
Experiment Setup
65 initial samples.
Budget: 250 function evaluations.
Infill criteria optimisation:
  Optimiser: Bipop-CMA-ES.
  Budget: 20000 function evaluations per dimension.
Statistical tests:
  11 simulation runs, matched samples.
  Friedman test to determine if a difference exists.
  Wilcoxon rank-sum test with Bonferroni correction to identify the winner.
  Mann-Whitney U test to compare with Latin Hypercube samples.

Problem   Parameters (n)   Objectives (M)
DTLZ1     6                3
DTLZ2     6                3
DTLZ5     6                6
DTLZ7     6                4
WFG1      6                2
WFG2      6                2
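The statistical recipe above might be sketched with scipy as follows (an illustrative sketch, not the authors' code; the function name and the dict-of-arrays data layout are assumptions):

```python
from scipy import stats

def compare_strategies(results, alpha=0.05):
    """Compare matched samples of a quality indicator across strategies.

    results : dict mapping strategy name -> matched indicator values
              (e.g. 11 hypervolumes, one per simulation run).
    Recipe from the slide: Friedman test for any difference, then
    pairwise Wilcoxon rank-sum tests with Bonferroni correction.
    Returns None if no difference is detected, else a dict mapping
    each strategy pair to whether it differs significantly.
    """
    names = list(results)
    _, p = stats.friedmanchisquare(*(results[n] for n in names))
    if p >= alpha:
        return None  # Friedman test detects no difference
    pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]
    corrected = alpha / len(pairs)  # Bonferroni correction
    return {(a, b): stats.ranksums(results[a], results[b]).pvalue < corrected
            for a, b in pairs}
```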
Performance Comparison: Computation Time
[Figure: infill-criterion computation time (seconds, log scale, 10^-4 to 10^0) versus number of objectives (2 to 6), in three panels for |P*| = 10, 50 and 100; curves for Mono-Surrogate, MPoI and SMS-EGO]
Summary
[Figure: hypervolume comparisons on DTLZ2 and UF1 across strategies msd2 to msd8, optSAF and SMSEGO]
Fast alternative strategies perform as well as SMS-EGO in half the test problems and outperform ParEGO.
Overall rank: SMS-EGO, HypI, DomRank, MPoI, MSD, ParEGO, LHS.
Performance is problem dependent.
Current and future work: choosing the best infill strategy from all available strategies during optimisation.
Python code available at: https://bitbucket.org/arahat/gecco-2017