Multi-criteria meta-parameter tuning for mono-objective stochastic metaheuristics

Johann Dréo, "Multi-criteria meta-parameter tuning for mono-objective stochastic metaheuristics", 2nd International Conference on Metaheuristics and Nature Inspired Computing, 30 October 2008

One of the main difficulties in applying a stochastic metaheuristic to an optimization problem is choosing the best parameter setting. Our work suggests that this can be treated as a bi-objective problem, in which one tries to optimize both the speed and the precision of the underlying algorithm. Moreover, this objective function is perturbed by noise, due to the stochastic nature of the metaheuristic, which necessitates an appropriate estimation of the measurement bias. In this article, we propose to use a bi-objective metaheuristic, along with a simple method of noise estimation, to find the Pareto front of the best parameter sets. The method has the advantages of aggregating the several parameters of the studied metaheuristic into a single one, permitting the study of their relative influence on its behavior, and allowing several metaheuristics to be compared.
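As a concrete illustration of the bi-objective formulation above, the following minimal sketch scores one candidate parameter setting by running a tuned algorithm several times and taking the median number of evaluations (speed) and the median final error (precision). The metaheuristic shown is a deliberately crude random-search stand-in, not one of the algorithms studied in the talk; all function names and budgets are illustrative.

```python
import random
from statistics import median

def rosenbrock(x):
    """2-D Rosenbrock function, the test problem used in the talk's examples."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def toy_metaheuristic(step_size, max_evals=10_000, target=1e-3):
    """Crude stochastic search with one tunable parameter (step_size).
    Returns (evaluations used, best error): one noisy speed/precision sample."""
    best = [random.uniform(-2.0, 2.0), random.uniform(-2.0, 2.0)]
    best_f = rosenbrock(best)
    evals = 1
    for evals in range(2, max_evals + 1):
        candidate = [xi + random.gauss(0.0, step_size) for xi in best]
        f = rosenbrock(candidate)
        if f < best_f:
            best, best_f = candidate, f
        if best_f <= target:
            break
    return evals, best_f

def bi_objective_score(step_size, runs=10):
    """Median over repeated runs as a simple estimation of the noisy measure:
    objective 1 = speed (evaluations), objective 2 = precision (final error)."""
    samples = [toy_metaheuristic(step_size) for _ in range(runs)]
    return median(s[0] for s in samples), median(s[1] for s in samples)

if __name__ == "__main__":
    for step in (0.01, 0.1, 1.0):
        print(step, bi_objective_score(step))
```

Each parameter setting thus maps to one point in the speed/precision plane; the Pareto front of such points is what the meta-optimizer described in the slides searches for.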

  • Transcript

    • 1. Multi-objective meta-parameter tuning for mono-objective stochastic metaheuristics
        • Johann Dréo
        • THALES Research & Technology
    • 2. Introduction
        • Multi-objective method
        • Parameter tuning
        • Stochastic metaheuristics
        • Performance profiles
      http://www.flickr.com/photos/k23/2792398403/ Dréo & Siarry, 2004
    • 3. Stochastic metaheuristics
    • 4. Examples of stochastic metaheuristics
    • 5. Parameter setting
    • 6. Meta-parameter tuning
    • 7. As a mono-objective problem
      • Parameter setting:
      • Improve performance
      http://www.flickr.com/photos/sigfrid/223626315/
    • 8. As a multi-objective problem
      • Parameter setting:
      • What is performance?
      • -> multi-objective problem
      http://www.flickr.com/photos/jesusdq/345379863/
    • 9. Multi-objective problem
      • Performance?
        • Precision
        • Speed
        • Robustness
          • Precision
          • Speed
        • Stability (← benchmark)
      http://www.flickr.com/photos/matthewfch/1688409628/
    • 10. Multi-objective problem
      • Performance?
        • Precision
        • Speed
        • Robustness
          • Precision
          • Speed
        • Stability (← benchmark)
    • 11. Meta-parameter tuning (diagram: a stochastic metaheuristic solving a mono-objective problem)
    • 12. Meta-parameter tuning (diagram: the multi-objective parameter tuning problem layered on top of the mono-objective problem and the stochastic metaheuristic)
    • 13. Meta-parameter tuning (diagram: a meta-optimizer added to solve the multi-objective parameter tuning problem)
    • 14. Complexity (diagram labels: "Difficult", "Easier", "1 time" over the stack meta-optimizer / multi-objective parameter tuning problem / mono-objective problem / stochastic metaheuristic)
    • 15. Methodology (diagram: NSGA-2 as the meta-optimizer, median estimation of Speed / Precision over the stochastic metaheuristic solving the mono-objective problem)
    • 16. Methodology (same diagram as slide 15; a code sketch of this wiring follows below)
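Slides 11 to 16 describe a two-level stack: a meta-optimizer (NSGA-2) searches the parameter space of the stochastic metaheuristic, scoring each setting by the median speed/precision estimate. One possible present-day wiring is sketched below; it assumes the pymoo library for NSGA-II and reuses the hypothetical bi_objective_score() helper from the earlier sketch, so it illustrates the layering rather than reproducing the tool used in the talk.

```python
# Sketch of the meta-optimization layer, assuming the pymoo library and the
# hypothetical bi_objective_score() from the earlier sketch (assumed in scope).
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class TuningProblem(ElementwiseProblem):
    """Meta-level problem: decision variable = the metaheuristic's parameter,
    objectives = (median evaluations, median error), both minimized."""
    def __init__(self):
        super().__init__(n_var=1, n_obj=2,
                         xl=np.array([1e-3]), xu=np.array([2.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        speed, precision = bi_objective_score(float(x[0]), runs=10)
        out["F"] = [speed, precision]

# Budget mirroring the example slides: 50 individuals, 20 generations.
result = minimize(TuningProblem(), NSGA2(pop_size=50), ("n_gen", 20), seed=1)
print(result.X)  # non-dominated parameter settings
print(result.F)  # their (speed, precision) values: the performance front
```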
    • 17. Results plots (plot axes: Speed, Precision)
        • Performance profile / front
    • 18. Some results
    • 19. Example
        • 2 continuous EDAs (CEDA, CHEDA)
          • Sampling density parameter
        • Rosenbrock, 2 dimensions
        • Median estimated with 10 runs
        • 10 000 max eval.
        • NSGA-2
          • 20 iter., 50 indiv.
        • 10 runs
        • 3 days computation
      + Nelder-Mead Search
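The CEDA/CHEDA implementations benchmarked on this slide are not reproduced here; the toy continuous EDA below only shows what kind of algorithm is being tuned and how a single "sampling density" knob enters its loop. All names and defaults are illustrative, and the objective (e.g. the Rosenbrock function from the first sketch) is passed in by the caller.

```python
import random
from statistics import mean, stdev

def toy_continuous_eda(density, objective, dim=2, pop=50, keep=10,
                       max_evals=10_000, target=1e-3):
    """Toy continuous EDA: sample a Gaussian model, keep the best points,
    re-estimate the model. `density` widens or narrows the sampling spread,
    playing the role of the tuned 'sampling density parameter'.
    Not the CEDA/CHEDA code from the talk, only an illustrative stand-in."""
    mu = [0.0] * dim
    sigma = [2.0] * dim
    evals, best = 0, float("inf")
    while evals + pop <= max_evals:
        sample = [[random.gauss(mu[d], density * sigma[d]) for d in range(dim)]
                  for _ in range(pop)]
        scored = sorted(sample, key=objective)
        evals += pop
        best = min(best, objective(scored[0]))
        elite = scored[:keep]
        for d in range(dim):  # re-estimate the Gaussian model from the elite
            column = [point[d] for point in elite]
            mu[d] = mean(column)
            sigma[d] = max(stdev(column), 1e-12)
        if best <= target:
            break
    return evals, best

# Example call (rosenbrock as defined in the first sketch):
# print(toy_continuous_eda(density=0.5, objective=rosenbrock))
```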
    • 20. Example
      • + simulated annealing
          • stable temperature parameter
        • Rosenbrock, 2 dimensions
        • Median estimated with 10 runs
        • 10 000 max eval.
        • NSGA-2
          • 20 iter., 50 indiv.
        • 10 runs
        • 1 day computation
    • 21. Example
      • + genetic algorithm
          • population parameter
        • Rosenbrock, 2 dimensions
        • Median estimated with 10 runs
        • 10 000 max eval.
        • NSGA-2
          • 20 iter., 50 indiv.
        • 10 runs
        • 1 day computation
    • 22. (plot: performance fronts of SA, JGEN, CEDA and CHEDA in the Speed / Precision plane)
    • 23. Behaviour exploration (plot: Speed / Precision front; a plotting sketch follows this slide)
        • Genetic algorithm
        • Population size
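A plot like the ones on slides 17, 22 and 23 can be drawn directly from the meta-optimizer's output. The sketch below assumes matplotlib and a hypothetical `front` list of (speed, precision, parameter value) tuples; colouring the front by the tuned parameter (here the GA population size) is what makes the behaviour exploration visible.

```python
# Hedged plotting sketch: assumes matplotlib and a hypothetical `front` made of
# (speed, precision, population_size) tuples from the meta-optimization.
import matplotlib.pyplot as plt

def plot_front(front):
    speed = [p[0] for p in front]
    precision = [p[1] for p in front]
    population_size = [p[2] for p in front]
    points = plt.scatter(speed, precision, c=population_size, cmap="viridis")
    plt.colorbar(points, label="population size (tuned parameter)")
    plt.xlabel("Speed (number of evaluations)")
    plt.ylabel("Precision (final error)")
    plt.title("Performance front coloured by the tuned parameter")
    plt.show()
```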
    • 24. Performance front
        • Temporal planner, "Divide & Evolve > CPT", version "GOAL"
          • 2 mutation parameters
        • IPC "rovers" problem, instance 06
        • Median estimated with 10 runs
        • NSGA-2
          • 10 iter., 5 indiv.
        • 30 runs
        • 1 week computation for 1 run
    • 25. Performance front in parameters space (plot: the Speed / Precision front mapped to parameters M1 and M2)
    • 26. Previous parameters settings
    • 27. Conclusion
    • 28. Drawbacks
        • Computation cost
        • Stochastic multi-objective algorithm -> additional bias
      http://www.flickr.com/photos/orvaratli/2690949652/
    • 29. Drawbacks
        • Computation cost
        • Stochastic multi-objective algorithm -> additional bias
        • Valid only for:
          • Algorithm implementation
          • Problem instance
          • Stopping criterion
            • Error
            • Time
            • t steps, improvement < ε
      http://www.flickr.com/photos/orvaratli/2690949652/
    • 30. Drawbacks
        • Computation cost
        • Stochastic multi-objective algorithm -> additional bias
        • Valid only for:
          • Algorithm implementation
          • Problem instance
          • Stopping criterion
            • Error
            • Time
            • t steps, improvement < ε
        • Fronts often convex -> aggregations?
        • No benchmarking
      http://www.flickr.com/photos/orvaratli/2690949652/
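The stopping criterion listed among these drawbacks matters because a tuned setting is only valid for the rule it was tuned with. The sketch below is one illustrative way to express the three families named on the slide (target error, wall-clock budget, and t steps with improvement below ε); the class and argument names are assumptions, not code from the talk.

```python
import time

class StoppingCriterion:
    """The three stopping rules listed on slide 30 (illustrative names only):
    target error, wall-clock budget, and stagnation over `patience` steps."""
    def __init__(self, target_error=None, time_budget=None,
                 patience=None, min_improvement=0.0):
        self.target_error = target_error
        self.time_budget = time_budget
        self.patience = patience
        self.min_improvement = min_improvement
        self.start = time.monotonic()
        self.best = float("inf")
        self.stalled = 0

    def should_stop(self, current_error):
        if self.best - current_error < self.min_improvement:
            self.stalled += 1          # "t steps, improvement < epsilon"
        else:
            self.stalled = 0
        self.best = min(self.best, current_error)
        if self.target_error is not None and current_error <= self.target_error:
            return True                # error-based stop
        if (self.time_budget is not None
                and time.monotonic() - self.start >= self.time_budget):
            return True                # time-based stop
        if self.patience is not None and self.stalled >= self.patience:
            return True                # stagnation-based stop
        return False
```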
    • 31. Advantages
        • Performance profiles
          • Objectives space
          • Parameters space
          • Quantification of expert knowledge
    • 32. Advantages
        • Performance profiles
          • Objectives space
          • Parameters space
          • Quantification of expert knowledge
        • Automatic parameter tuning
          • One step before use
          • N parameters -> 1 parameter
          • More degrees of freedom
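One way to read "N parameters -> 1 parameter": once the Pareto front of settings is known, a single scalar trade-off in [0, 1] is enough to pick a complete setting, whatever the number of original parameters. The sketch below assumes a hypothetical `front` list of dicts carrying the measured objectives next to the original parameters; the numeric values are made up purely for illustration.

```python
def pick_setting(front, trade_off):
    """trade_off = 0.0 -> fastest non-dominated setting,
    trade_off = 1.0 -> most precise one (hypothetical data layout)."""
    ordered = sorted(front, key=lambda setting: setting["speed"])
    index = round(trade_off * (len(ordered) - 1))
    return ordered[index]

# Illustrative, made-up front for a single-parameter algorithm:
front = [
    {"speed": 900,  "precision": 1e-1, "step_size": 0.8},
    {"speed": 2500, "precision": 1e-3, "step_size": 0.2},
    {"speed": 8000, "precision": 1e-6, "step_size": 0.05},
]
print(pick_setting(front, trade_off=0.5))  # balanced speed/precision choice
```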
    • 33. Advantages
        • Performance profiles
          • Objectives space
          • Parameters space
          • Quantification of expert knowledge
        • Automatic parameter tuning
          • One step before use
          • N parameters -> 1 parameter
          • More degrees of freedom
        • Algorithms comparison
          • Statistical tests more meaningful
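A possible reading of "statistical tests more meaningful": once each algorithm is compared at its own tuned, non-dominated setting (and at a matched speed budget), a test on the error samples reflects the algorithms themselves rather than unlucky default parameters. The sketch assumes scipy and uses made-up error samples purely for illustration.

```python
# Hedged sketch: comparing two *tuned* algorithms at the same speed budget.
# Assumes scipy; errors_a / errors_b stand for hypothetical per-run final
# errors obtained with each algorithm's Pareto-optimal setting at that budget.
from scipy.stats import mannwhitneyu

errors_a = [1.2e-3, 9.5e-4, 2.1e-3, 1.7e-3, 8.8e-4]  # illustrative values
errors_b = [3.4e-3, 2.9e-3, 4.1e-3, 3.0e-3, 2.6e-3]  # illustrative values

statistic, p_value = mannwhitneyu(errors_a, errors_b, alternative="two-sided")
print(f"U = {statistic}, p = {p_value:.3f}")
```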
    • 34. Advantages
        • Performance profiles
          • Objectives space
          • Parameters space
          • Quantification of expert knowledge
        • Automatic parameter tuning
          • One step before use
          • N parameters -> 1 parameter
          • More degrees of freedom
        • Algorithms comparison
          • Statistical tests more meaningful
        • Behaviour understanding
    • 35. Perspectives
        • Include robustness
        • Include dispersion estimation
        • Include benchmarking
        • Multi-objective SPO, F-Race
        • Regressions in parameters space
          • Performances / parameters
          • Behaviour models?
        • Links?
          • Fitness Landscape / Performance profiles
          • Run time distribution
          • Taillard's significance plots
          • ...
      http://www.flickr.com/photos/colourcrazy/2065575762/
    • 36. [email_address] http://www.flickr.com/photos/earlg/275371357/