Complexity of multiobjective optimization

1. Comparison-Based Complexity of Multi-Objective Optimization
   Paper by O. Teytaud. Presented by M. Schoenauer.
   TAO, Inria, LRI, UMR CNRS 8623, Université Paris-Sud
2. Outline
   - Multiobjective optimization
   - Complexity upper bounds
   - Complexity lower bounds
   - Discussion
3. Evolutionary multi-objective optimization
   - Generate a population
   - Evaluate fitness values
   - Select the "best" (various rules are possible here)
   - Generate offspring
   - Go back to the evaluation step
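A minimal Python sketch of one such loop may help. It assumes a (μ, λ) scheme with dominance-count selection and Gaussian mutation; these are illustrative choices, not the paper's specific algorithm. Note that fitness values are only ever compared, never read as numbers, which is the comparison-based property the lower bounds exploit.

```python
import random

def dominates(fa, fb):
    """Pareto dominance: fa at least as good everywhere, strictly better somewhere."""
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

def emo_loop(objectives, dim, mu=5, lam=20, sigma=0.1, iters=200):
    # Generate a population
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(lam)]
    for _ in range(iters):
        # Evaluate fitness values
        fits = [[f(x) for f in objectives] for x in pop]
        # Select the "best": the mu points dominated by the fewest others
        # (one of the "various rules possible"; only comparisons are used)
        order = sorted(range(lam),
                       key=lambda i: sum(dominates(fits[j], fits[i]) for j in range(lam)))
        parents = [pop[i] for i in order[:mu]]
        # Generate offspring (Gaussian mutation), then go back to evaluation
        pop = [[x + random.gauss(0, sigma) for x in random.choice(parents)]
               for _ in range(lam)]
    return pop

# Example with two sphere objectives, as in the upper-bound slides below:
f1 = lambda x: sum(xi ** 2 for xi in x)
f2 = lambda x: sum((xi - 1) ** 2 for xi in x)
approx_front = emo_loop([f1, f2], dim=3)
```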
4. Model of fitness functions?
   - No assumption? ==> unrealistically pessimistic results, unreadable lower bounds
   - Let's do as in earlier work ==> quadratic convex objective functions
5. Outline
   - Multiobjective optimization
   - Complexity upper bounds
   - Complexity lower bounds
   - Discussion
6. Complexity upper bounds
   - Each objective function = a sphere
   - Below, just a short overview of the algorithms ==> the real algorithms are a bit more tricky
   - For finding the whole Pareto front: optimize each objective separately; the PF is the convex hull
   - For finding a single point: optimize any single objective
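A toy sketch of these two strategies under the sphere model (the centers are hypothetical, and, as the slide says, the real algorithms are trickier): each sphere f_i(x) = ||x - c_i||² is minimized at its center c_i, so a single monoobjective run yields one Pareto point, and the whole Pareto set in the search space is the convex hull of the centers.

```python
import numpy as np

# Sphere model (hypothetical centers, for illustration): f_i(x) = ||x - c_i||^2.
centers = np.array([[0.0, 0.0, 0.0],   # minimizer of objective 1
                    [1.0, 0.0, 0.0]])  # minimizer of objective 2

# Finding a single Pareto point: optimize any single objective;
# for a sphere, the optimum is its center.
one_pareto_point = centers[0]

# Finding the whole Pareto set: optimize each objective separately;
# for spheres, the Pareto set is the convex hull of the d optima
# (here, with d = 2, the segment between the two centers).
t = np.linspace(0.0, 1.0, 11)[:, None]
pareto_set_samples = (1 - t) * centers[0] + t * centers[1]
```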
7. Finding one point of the Pareto set
   - d objective functions, in dimension N
   - One point at distance at most ε of the PF
   - Cost = O((N + 1 - d) log(1/ε))
   - Proof: M log(1/ε) in monoobjective optimization, where M is the codimension of the set of optima (Gelly & Teytaud, 2006)
8. Finding the whole Pareto set
   - d objective functions, in dimension N
   - An approximation at distance at most ε of the PF, for the Hausdorff metric
   - Cost = O(N d log(1/ε))
   - Proof: d times the monoobjective case
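To make the scaling of these two upper bounds concrete, here is a small sketch; the constant C is hypothetical, since the slides only state the O().

```python
import math

def single_point_cost(N, d, eps, C=1.0):
    # O((N + 1 - d) log(1/eps)); C is a hypothetical constant hiding the O()
    return C * (N + 1 - d) * math.log(1 / eps)

def whole_front_cost(N, d, eps, C=1.0):
    # O(N d log(1/eps)): d monoobjective runs, each O(N log(1/eps))
    return C * N * d * math.log(1 / eps)

print(single_point_cost(10, 2, 1e-3))  # ~ (10+1-2) * log(1000) ~ 62
print(whole_front_cost(10, 2, 1e-3))   # ~ 10*2     * log(1000) ~ 138
```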
9. Outline
   - Multiobjective optimization
   - Complexity upper bounds
   - Complexity lower bounds
     - Proof technique (monoobjective)
     - The MO case
10-17. Lower-bound proof technique (monoobjective)
   We want to know how many iterations we need for reaching precision ε in an evolutionary algorithm.
   - Key observation: (most) evolutionary algorithms are comparison-based
   - Let's consider (for simplicity) a deterministic, selection-based, non-elitist algorithm
   - First idea: how many different branches do we have in a run? We select μ points among λ, therefore at most K = λ! / (μ! (λ - μ)!) different branches per iteration
   - Second idea: how many different answers should we be able to give? Use packing numbers: at least N(ε) different possible answers
   - Conclusion: the number n of iterations should verify K^n ≥ N(ε), i.e. n ≥ log N(ε) / log K
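The counting argument converts directly into a formula. A minimal sketch follows, with the packing-number model N(ε) ~ (1/ε)^N used purely as an assumption for illustration:

```python
import math

def min_iterations(mu, lam, N_eps):
    # Each iteration has at most K = C(lam, mu) selection outcomes (branches),
    # and n iterations must distinguish N(eps) answers:
    # K**n >= N(eps)  ==>  n >= log N(eps) / log K
    K = math.comb(lam, mu)
    return math.log(N_eps) / math.log(K)

# Assumed packing-number model for illustration: N(eps) ~ (1/eps)**N,
# which recovers the N log(1/eps) scaling of the monoobjective bound.
N, eps = 10, 1e-3
print(min_iterations(mu=5, lam=20, N_eps=(1 / eps) ** N))  # ~ 7.2 iterations
```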
18. Outline
   - Multiobjective optimization
   - Complexity upper bounds
   - Complexity lower bounds
     - Proof technique (monoobjective)
     - The MO case
19. How to apply this in MOO?
   - Covering numbers can also be computed for the Hausdorff distance ==> plus a little bit of boring maths
   - Leads to the expected bounds:
     - N d log(1/ε) for the whole Pareto set (Hausdorff)
     - (N + 1 - d) log(1/ε) for pointwise convergence (distance to one point of the Pareto set)
   - N = dimension, d = number of objectives
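A hedged sketch of the tightness claim: feeding the stated scalings of log N(ε), namely N d log(1/ε) under Hausdorff and (N + 1 - d) log(1/ε) pointwise, into the same counting argument recovers the upper bounds up to the constant factor log C(λ, μ). The exact packing constants are assumptions here, not taken from the paper.

```python
import math

def moo_lower_bound(N, d, eps, mu, lam, whole_front=True):
    # log N(eps) under the assumed packing scalings, divided by log K:
    exponent = N * d if whole_front else (N + 1 - d)
    return exponent * math.log(1 / eps) / math.log(math.comb(lam, mu))

# The (N d) and (N + 1 - d) factors match the upper bounds ==> tight.
print(moo_lower_bound(10, 2, 1e-3, mu=5, lam=20))                     # whole front
print(moo_lower_bound(10, 2, 1e-3, mu=5, lam=20, whole_front=False))  # one point
```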
20. Results in multiobjective cases
   - The proof method is not new (Fournier & Teytaud, Algorithmica 2010)
   - Its application to MOO is new: tight bounds
   - But no result in the case of surrogate models (as for the corresponding results in the monoobjective case); in fact, the problem becomes unrealistically easy with surrogate models
21. Outline
   - Multiobjective optimization
   - Complexity upper bounds
   - Complexity lower bounds
   - Discussion
22. Discussion
   - Sorry for not being here ==> really impossible
   - Discussion ==> all email questions welcome
   - Tight bounds thanks to a realistic model
   - Combining previous papers: complexity bounds + a relevant model
   - Maybe an extension: using VC-dimension; this paper did it (single objective)