Complexity of multiobjective optimization
Transcript

  • 1. Comparison-Based Complexity of Multi-Objective Optimization. Paper by O. Teytaud; presented by M. Schoenauer. TAO, Inria, LRI, UMR CNRS 8623, Université Paris-Sud.
  • 2. Outline: Multiobjective optimization; Complexity upper bounds; Complexity lower bounds; Discussion.
  • 3. Evolutionary multi-objective optimization: Generate a population. Evaluate fitness values. Select the "best" (various rules are possible here). Generate offspring. Go back to the evaluation step.
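The loop above can be sketched as a minimal comparison-based (μ, λ) evolution strategy. Everything below (Gaussian mutation, the shrinking step size, the sphere fitness) is an illustrative assumption, not the exact algorithm analyzed in the paper:

```python
import random

def evolve(fitness, dim, mu=5, lam=20, sigma=0.3, iters=200, seed=1):
    """Minimal (mu, lam) evolution strategy: non-elitist, comparison-based."""
    rng = random.Random(seed)
    # Generate an initial population of mu parents.
    parents = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(mu)]
    for _ in range(iters):
        # Generate offspring: Gaussian mutation of randomly chosen parents.
        offspring = [[x + rng.gauss(0, sigma) for x in rng.choice(parents)]
                     for _ in range(lam)]
        # Select the mu "best" -- only comparisons of fitness values are used.
        offspring.sort(key=fitness)
        parents = offspring[:mu]
        sigma *= 0.97  # shrink the mutation step (gives the log(1/eps) scaling)
    return parents[0]

sphere = lambda x: sum(xi * xi for xi in x)
best = evolve(sphere, dim=3)
```

Note that selection uses only the ordering of fitness values, which is exactly the comparison-based property exploited by the lower bounds later in the talk.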
  • 4. Model of fitness functions? With no assumption: unrealistically pessimistic results and unreadable lower bounds. Let's instead do as in earlier work ==> quadratic convex objective functions.
  • 5. Outline: Multiobjective optimization; Complexity upper bounds; Complexity lower bounds; Discussion.
  • 6. Complexity upper bounds. Each objective function = a sphere. Below is just a short overview of the algorithms ==> the real algorithms are a bit more tricky. For finding the whole Pareto front: optimize each objective separately; the PF is the convex hull. For finding a single point: optimize any single objective.
  • 7. Finding one point of the Pareto set: d objective functions, in dimension N; one point at distance at most ε from the PF. Cost = O((N + 1 − d) log(1/ε)). Proof: M log(1/ε) in mono-objective optimization, where M is the codimension of the set of optima (Gelly & Teytaud, 2006).
  • 8. Finding the whole Pareto set: d objective functions, in dimension N; a set at distance at most ε from the PF for the Hausdorff metric. Cost = O(Nd log(1/ε)). Proof: d times the mono-objective case.
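For intuition on the sphere case used in these upper bounds: with sphere objectives ‖x − c_i‖², the Pareto set is the convex hull of the centers, so optimizing each objective separately yields its extreme points. A small self-contained check (the two centers are made-up illustrative values):

```python
import itertools

# Two sphere objectives in dimension 2 (centers chosen for illustration).
centers = [(0.0, 0.0), (1.0, 0.0)]
f = [lambda x, c=c: sum((xi - ci) ** 2 for xi, ci in zip(x, c))
     for c in centers]

def dominates(a, b):
    """a Pareto-dominates b: no worse on all objectives, better on one."""
    fa, fb = [fi(a) for fi in f], [fi(b) for fi in f]
    return all(u <= v for u, v in zip(fa, fb)) and any(u < v for u, v in zip(fa, fb))

# For spheres, the Pareto set is the convex hull of the centers:
# sample convex combinations of the two per-objective optima.
pareto = [tuple((1 - t) * a + t * b for a, b in zip(*centers))
          for t in (i / 10 for i in range(11))]

# Check: no sampled Pareto point dominates another.
assert not any(dominates(p, q) for p, q in itertools.permutations(pareto, 2))
```

Moving along the segment trades one objective against the other, so no sampled point dominates any other, consistent with the convex-hull claim on slide 6.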
  • 9. Outline: Multiobjective optimization; Complexity upper bounds; Complexity lower bounds: proof technique (mono-objective), the MO case.
  • 10. We want to know how many iterations we need to reach precision ε in an evolutionary algorithm. Key observation: (most) evolutionary algorithms are comparison-based. Let's consider (for simplicity) a deterministic selection-based non-elitist algorithm. First idea: how many different branches do we have in a run? We select μ points among λ; therefore, at most K = λ! / (μ! (λ − μ)!) different branches. Second idea: how many different answers should we be able to give? Use packing numbers: at least N(ε) different possible answers. Conclusion: the number n of iterations should verify K^n ≥ N(ε). Frédéric Lemoine, MIG, 11/07/2008.
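The counting argument gives a concrete number: each iteration selects μ of λ points, so there are K = C(λ, μ) possible selection outcomes, and the algorithm must be able to output at least N(ε) distinguishable answers, hence K^n ≥ N(ε). A sketch, assuming the crude packing estimate N(ε) ≈ (1/ε)^N on a unit domain:

```python
import math

def min_iterations(N, eps, lam, mu):
    """Lower bound on iterations n from K**n >= N(eps):
    n >= log N(eps) / log K, with K = C(lam, mu) branches per iteration
    and N(eps) ~ (1/eps)**N packing points (crude estimate)."""
    K = math.comb(lam, mu)
    log_answers = N * math.log(1.0 / eps)  # log of (1/eps)**N
    return math.ceil(log_answers / math.log(K))

# The bound grows like N * log(1/eps), matching the upper bounds' scaling.
n = min_iterations(N=10, eps=1e-3, lam=20, mu=5)
```

The bound is linear in the dimension N and logarithmic in 1/ε, which is how it meets the O(N log(1/ε))-type upper bounds up to the log K factor.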
  • 18. Outline: Multiobjective optimization; Complexity upper bounds; Complexity lower bounds: proof technique (mono-objective), the MO case.
  • 19. How to apply this in MOO? Covering numbers can also be computed for the Hausdorff distance ==> plus a little bit of boring maths. This leads to the expected bounds: Nd log(1/ε) for the whole Pareto set (Hausdorff); (N + 1 − d) log(1/ε) for pointwise convergence (distance to one point of the Pareto set). N = dimension, d = number of objectives.
  • 20. Results in multiobjective cases. The proof method is not new (Fournier & Teytaud, Algorithmica 2010); its application to MOO is new: tight bounds, but no result in the case of surrogate models (as for the corresponding results in the mono-objective case); in fact, the problem becomes unrealistically easy with surrogate models...
  • 21. Outline: Multiobjective optimization; Complexity upper bounds; Complexity lower bounds; Discussion.
  • 22. Sorry for not being here ==> really impossible. Discussion ==> all email questions welcome. Tight bounds thanks to a realistic model, combining previous papers: complexity bounds + relevant model. Maybe an extension: using VC-dimension; this paper did it (single objective).
