Multi-criteria meta-parameter tuning for mono-objective stochastic metaheuristics

Johann Dréo, "Multi-criteria meta-parameter tuning for mono-objective stochastic metaheuristics", 2nd International Conference on Metaheuristics and Nature Inspired Computing, 30 October 2008

One of the main difficulties in applying a stochastic metaheuristic to an optimization problem is choosing the best parameter setting. Our work suggests that this can be considered as a bi-objective problem, where one tries to optimize both the speed and the precision of the underlying algorithm. Moreover, this objective function is perturbed by noise, due to the stochastic nature of the metaheuristic, which necessitates an appropriate estimation of the measurement bias. In this article, we propose to use a bi-objective metaheuristic along with a simple noise-estimation method to find the Pareto front of the best parameter sets. The method has the advantages of aggregating the several parameters of the studied metaheuristic into a single one, permitting the study of their relative influence on its behaviour, and allowing the comparison of several metaheuristics.
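To make the measurement concrete, here is a minimal sketch of the bi-objective meta-evaluation described above; it is not the paper's implementation. A candidate parameter setting is scored by running the inner stochastic solver several times and taking the median number of evaluations (speed) and the median final error (precision). The toy simulated-annealing solver, the 2-D Rosenbrock test function, and every name below (`rosenbrock`, `simulated_annealing`, `meta_objectives`) are illustrative assumptions, chosen to mirror the examples in the slides.

```python
import numpy as np

def rosenbrock(x):
    """2-D Rosenbrock test function; global minimum f(1, 1) = 0."""
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def simulated_annealing(temperature, max_evals=10_000, target=1e-6, rng=None):
    """Toy inner solver with a single 'stable temperature' parameter.

    Returns (evaluations used, best error found); stopping as soon as the
    target error is reached is what makes 'speed' a meaningful objective.
    """
    if rng is None:
        rng = np.random.default_rng()
    x = rng.uniform(-2.0, 2.0, size=2)
    fx = rosenbrock(x)
    best = fx
    for evals in range(1, max_evals + 1):
        y = x + rng.normal(scale=0.1, size=2)
        fy = rosenbrock(y)
        best = min(best, fy)
        # Metropolis acceptance at a constant ("stable") temperature.
        if fy < fx or rng.random() < np.exp(-(fy - fx) / temperature):
            x, fx = y, fy
        if best <= target:
            return evals, best
    return max_evals, best

def meta_objectives(temperature, n_runs=10, seed=0):
    """The two meta-objectives: median speed and median error over n_runs."""
    rng = np.random.default_rng(seed)
    runs = [simulated_annealing(temperature, rng=rng) for _ in range(n_runs)]
    speeds, errors = zip(*runs)
    return float(np.median(speeds)), float(np.median(errors))
```

Any point in parameter space now maps to a (speed, precision) pair whose noise is damped by the median, which is exactly the kind of estimate the meta-optimizer needs.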

Transcript of "Multi-criteria meta-parameter tuning for mono-objective stochastic metaheuristics"

1. Multi-objective meta-parameter tuning for mono-objective stochastic metaheuristics
   - Johann Dréo
   - THALES Research & Technology
2. Introduction
   - Multi-objective method
   - Parameter tuning
   - Stochastic metaheuristics
   - Performance profiles
   (Dreo & Siarry, 2004; photo: http://www.flickr.com/photos/k23/2792398403/)
3. Stochastic metaheuristics
4. Examples of stochastic metaheuristics
5. Parameter setting
6. Meta-parameter tuning
7. As a mono-objective problem
   - Parameter setting: improve performance
   (Photo: http://www.flickr.com/photos/sigfrid/223626315/)
8. As a multi-objective problem
   - Parameter setting: what is performance? -> a multi-objective problem
   (Photo: http://www.flickr.com/photos/jesusdq/345379863/)
9. Multi-objective problem
   - Performance?
     - Precision
     - Speed
     - Robustness
       - Precision
       - Speed
     - Stability (← benchmark)
   (Photo: http://www.flickr.com/photos/matthewfch/1688409628/)
10. Multi-objective problem
    - Performance?
      - Precision
      - Speed
      - Robustness
        - Precision
        - Speed
      - Stability (← benchmark)
11. Meta-parameter tuning
    [Diagram labels: Mono-objective problem; Stochastic metaheuristic]
12. Meta-parameter tuning
    [Diagram labels: Multi-objective parameter tuning problem; Mono-objective problem; Stochastic metaheuristic]
13. Meta-parameter tuning
    [Diagram labels: Multi-objective parameter tuning problem; Mono-objective problem; Stochastic metaheuristic; Meta-optimizer]
14. Complexity
    [Diagram labels: Difficult; Easier; 1 time; Multi-objective parameter tuning problem; Mono-objective problem; Stochastic metaheuristic; Meta-optimizer]
15. Methodology
    [Diagram labels: Speed / Precision; Median estimation; Mono-objective problem; Stochastic metaheuristic; NSGA-2]
16. Methodology
    [Diagram labels: Speed / Precision; Median estimation; Mono-objective problem; Stochastic metaheuristic; NSGA-2]
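Wiring this measurement into NSGA-2, as on the methodology slides, can be done with any multi-objective library. Below is a hedged sketch using pymoo (an assumption, not the tool used in the talk), reusing `meta_objectives` from the sketch after the abstract, with the NSGA-2 budget reported in the examples (50 individuals, 20 iterations); the temperature bounds are illustrative.

```python
# Assumes pymoo is installed and that meta_objectives() from the earlier
# sketch is in scope.
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class TuningProblem(ElementwiseProblem):
    """Meta-level problem: one variable (the temperature), two objectives
    (median speed, median error), both to be minimized."""

    def __init__(self):
        super().__init__(n_var=1, n_obj=2,
                         xl=np.array([1e-3]), xu=np.array([10.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        speed, error = meta_objectives(temperature=float(x[0]))
        out["F"] = [speed, error]

res = minimize(TuningProblem(), NSGA2(pop_size=50), ("n_gen", 20), seed=1)
print(res.F)  # non-dominated (speed, error) pairs: the performance front
print(res.X)  # the parameter settings that produce them
```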
17. Results plots
    - Performance profile / front
    [Plot axes: Speed vs. Precision]
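The plotted front is simply the non-dominated subset of all measured (speed, precision) pairs. A minimal numpy filter is enough to extract it for plotting; the name `pareto_front` is ours, not from the talk.

```python
import numpy as np

def pareto_front(points):
    """Keep the non-dominated (speed, error) points, both minimized:
    a point is dropped if another point is no worse on both objectives
    and strictly better on at least one."""
    pts = np.asarray(points, dtype=float)
    keep = [
        i for i, p in enumerate(pts)
        if not np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
    ]
    return pts[keep]

# Example: four (speed, error) measurements; only the first two survive.
print(pareto_front([(100, 0.5), (500, 0.01), (600, 0.02), (550, 0.6)]))
```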
18. Some results
19. Example
    - 2 continuous EDAs (CEDA, CHEDA), + Nelder-Mead search
      - Sampling density parameter
    - Rosenbrock, 2 dimensions
    - Median estimated over 10 runs
    - 10,000 evaluations max.
    - NSGA-2: 20 iterations, 50 individuals
    - 10 runs
    - 3 days of computation
20. Example
    - + simulated annealing
      - Stable-temperature parameter
    - Rosenbrock, 2 dimensions
    - Median estimated over 10 runs
    - 10,000 evaluations max.
    - NSGA-2: 20 iterations, 50 individuals
    - 10 runs
    - 1 day of computation
21. Example
    - + genetic algorithm
      - Population parameter
    - Rosenbrock, 2 dimensions
    - Median estimated over 10 runs
    - 10,000 evaluations max.
    - NSGA-2: 20 iterations, 50 individuals
    - 10 runs
    - 1 day of computation
22. [Plot: fronts of SA, JGEN, CEDA and CHEDA; axes: Speed vs. Precision]
23. Behaviour exploration
    - Genetic algorithm, population size
    [Plot axes: Speed vs. Precision]
24. Performance front
    - Temporal planner, "Divide & Evolve > CPT", version "GOAL"
      - 2 mutation parameters
    - IPC "rovers" problem, instance 06
    - Median estimated over 10 runs
    - NSGA-2: 10 iterations, 5 individuals
    - 30 runs
    - 1 week of computation per run
25. Performance front in parameter space
    [Plots: objective space (Speed vs. Precision) and parameter space (M1 vs. M2)]
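Slide 25's projection of the front back into parameter space needs no extra machinery: plot `res.X` next to `res.F` from the NSGA-2 sketch above. A hedged matplotlib sketch, assuming a run over two parameters such as the planner's two mutation parameters (M1, M2):

```python
# Assumes `res` comes from an NSGA-2 run over *two* parameters (M1, M2),
# unlike the one-parameter temperature sketch above.
import matplotlib.pyplot as plt

fig, (ax_obj, ax_par) = plt.subplots(1, 2, figsize=(9, 4))

ax_obj.scatter(res.F[:, 0], res.F[:, 1])  # the performance front
ax_obj.set_xlabel("Speed (median evaluations)")
ax_obj.set_ylabel("Precision (median error)")

ax_par.scatter(res.X[:, 0], res.X[:, 1])  # the same points, in parameter space
ax_par.set_xlabel("M1")
ax_par.set_ylabel("M2")

plt.tight_layout()
plt.show()
```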
26. Previous parameter settings
27. Conclusion
28. Drawbacks
    - Computation cost
    - Stochastic multi-objective algorithm -> additional bias
    (Photo: http://www.flickr.com/photos/orvaratli/2690949652/)
29. Drawbacks
    - Computation cost
    - Stochastic multi-objective algorithm -> additional bias
    - Valid only for:
      - the algorithm implementation
      - the problem instance
      - the stopping criterion (error, time, or improvement < ε over t steps)
    (Photo: http://www.flickr.com/photos/orvaratli/2690949652/)
30. Drawbacks
    - Computation cost
    - Stochastic multi-objective algorithm -> additional bias
    - Valid only for:
      - the algorithm implementation
      - the problem instance
      - the stopping criterion (error, time, or improvement < ε over t steps)
    - Fronts often convex -> aggregations?
    - No benchmarking
    (Photo: http://www.flickr.com/photos/orvaratli/2690949652/)
31. Advantages
    - Performance profiles
      - Objective space
      - Parameter space
      - Quantification of expert knowledge
32. Advantages
    - Performance profiles
      - Objective space
      - Parameter space
      - Quantification of expert knowledge
    - Automatic parameter tuning
      - One step before use
      - N parameters -> 1 parameter
      - More degrees of freedom
33. Advantages
    - Performance profiles
      - Objective space
      - Parameter space
      - Quantification of expert knowledge
    - Automatic parameter tuning
      - One step before use
      - N parameters -> 1 parameter
      - More degrees of freedom
    - Algorithm comparison
      - More meaningful statistical tests
34. Advantages
    - Performance profiles
      - Objective space
      - Parameter space
      - Quantification of expert knowledge
    - Automatic parameter tuning
      - One step before use
      - N parameters -> 1 parameter
      - More degrees of freedom
    - Algorithm comparison
      - More meaningful statistical tests
    - Behaviour understanding
35. Perspectives
    - Include robustness
    - Include dispersion estimation
    - Include benchmarking
    - Multi-objective SPO, F-Race
    - Regressions in parameter space
      - Performance / parameters
      - Behaviour models?
    - Links?
      - Fitness landscape / performance profiles
      - Run-time distributions
      - Taillard's significance plots
      - ...
    (Photo: http://www.flickr.com/photos/colourcrazy/2065575762/)
36. [email_address]
    (Photo: http://www.flickr.com/photos/earlg/275371357/)