On weighted averaging in optimization

@inproceedings{teytaud:inria-00451416,
hal_id = {inria-00451416},
url = {http://hal.inria.fr/inria-00451416},
title = {{Bias and variance in continuous EDA}},
author = {Teytaud, Fabien and Teytaud, Olivier},
abstract = {{Estimation of Distribution Algorithms are based on statistical estimates. We show that when combining classical tools from statistics, namely bias/variance decomposition, reweighting and quasi-randomization, we can strongly improve the convergence rate. All modifications are easy, compliant with most algorithms, and experimentally very efficient in particular in the parallel case (large offsprings).}},
language = {English},
affiliation = {TAO - INRIA Futurs , Laboratoire de Recherche en Informatique - LRI , TAO - INRIA Saclay - Ile de France},
booktitle = {{EA 09}},
address = {Strasbourg, France},
audience = {international},
year = {2009},
month = May,
pdf = {http://hal.inria.fr/inria-00451416/PDF/decsigma.pdf},
}

Transcript

    • 1. Why one must use reweighting. F. Teytaud, O. Teytaud, Montréal, 2009. Tao, Inria Saclay Ile-De-France, LRI (Université Paris Sud, France), UMR CNRS 8623, I&A team, Digiteo, Pascal Network of Excellence.
    • 2. Outline: Idea of averaging in evolutionary algorithms; this idea introduces a bias; how to remove this bias; the results; conclusions. Teytaud and Teytaud Gecco 09 is great
    • 3. Idea in ES: average of selected points = good approximation of optimum.
    • 4. Idea in ES: Let's assume this is true (for the moment)... nonetheless, there's a bias.
    • 5. EMNA (P. Larranaga and J.-A. Lozano, 2001). While (not finished): - generate population - select best individuals - estimate mean / variance (and possibly covariance)
    • 6. EMNA (P. Larranaga and J.-A. Lozano, 2001)
    • 7. EMNA (P. Larranaga and J.-A. Lozano, 2001). While (not finished): - generate population - select best individuals - estimate mean / variance (and possibly covariance). Highly parallel (more than most ES; T. et al, EvoStar 2001). Very simple. Can handle covariance matrix easily.
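The EMNA loop on this slide can be sketched in a few lines. The code below is my own illustrative 1-D rendering, not the authors' implementation; the function name, parameter values, and sphere objective are all invented for the example.

```python
# Minimal 1-D EMNA sketch: sample a Gaussian, keep the best mu points,
# re-estimate mean and standard deviation from the selected points.
# All parameter values are invented for illustration.
import math
import random

def emna(f, mean=5.0, sigma=3.0, lam=100, mu=25, iters=30, seed=0):
    rng = random.Random(seed)
    for _ in range(iters):
        # generate population
        pop = [rng.gauss(mean, sigma) for _ in range(lam)]
        # select best individuals
        sel = sorted(pop, key=f)[:mu]
        # estimate mean / variance from the selected points
        mean = sum(sel) / mu
        sigma = math.sqrt(sum((x - mean) ** 2 for x in sel) / mu)
    return mean

# On a 1-D sphere objective the estimated mean approaches the optimum.
est = emna(lambda x: (x - 1.0) ** 2)
```

Only the mean and standard deviation are refit here; the full algorithm can estimate a covariance matrix the same way, which is what makes EMNA simple and highly parallel.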
    • 8. Please wake up during 3 slides :-) Idea of averaging in evolutionary algorithms; this idea introduces a bias; how to remove this bias; the results; conclusions.
    • 9. Bias due to bad (Gaussian) distribution [figure: high-density region]
    • 10. Bias due to bad distribution
    • 11. Bias due to bad distribution
    • 12. Bias due to bad distribution [figure: AVERAGE, biased by the distribution]
    • 13. Corrected by weighted average [figure: AVERAGE, the one we really want!]
    • 14. Outline: Idea of averaging in evolutionary algorithms; this idea introduces a bias; how to remove this bias; the results; conclusions.
    • 15. American Election of 1936 (fun). Literary Digest: pop size = 2 000 000 ==> predicts Landon. Gallup: pop size = 50 000 ==> predicts Roosevelt (and was proved right).
    • 16. American Election of 1936. Literary Digest: pop size = 2 300 000 ==> predicts Landon. Gallup: pop size = 50 000 ==> predicts Roosevelt (and was proved right). The Literary Digest failed because of biased sampling (much more affluent people, and many more Republicans, among Literary Digest readers). Correction: weight of individual = real density / biased density.
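The correction quoted on this slide can be checked with a toy calculation. All numbers below are invented for illustration: group "A" is oversampled in the survey, so each respondent is reweighted by its group's real density divided by its biased sampling density.

```python
# Toy reweighting: the survey oversamples group "A", so a naive average
# is biased; weighting by real density / biased density recovers the
# population value. All numbers are invented.
pop_share = {"A": 0.3, "B": 0.7}      # real density in the population
sample_share = {"A": 0.8, "B": 0.2}   # biased density in the survey
support = {"A": 0.9, "B": 0.2}        # per-group support for a candidate

naive = sum(sample_share[g] * support[g] for g in support)
weight = {g: pop_share[g] / sample_share[g] for g in support}
corrected = sum(sample_share[g] * weight[g] * support[g] for g in support)
true_value = sum(pop_share[g] * support[g] for g in support)
# naive (~0.76) overstates support; corrected matches true_value (~0.41)
```

This is exactly the mechanism reused in the next slides: the "real density" is the distribution we want the average under, and the "biased density" is the Gaussian we actually sampled from.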
    • 17. REMNA (reweighted EMNA) [figure: Gaussian density, and inverse Gaussian density = weight for removing the bias!]
    • 18. REMNA (reweighted EMNA). Very simple modification: - compute weight(individual) = 1 / density - compute mean, variance, covariance with these weights ==> not only for Gaussians ==> ok for all surrogate models / EDA ==> just an application of standard statistics
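The two-step modification on this slide can be sketched as follows. This is my own minimal 1-D rendering under invented names, not the authors' code: each selected point is weighted by the inverse of the sampling density before the mean and variance are re-estimated.

```python
# Reweighted estimation step: weight(individual) = 1 / sampling density,
# then compute the weighted mean and variance of the selected points.
# Names and the 1-D restriction are my own simplification.
import math

def gauss_pdf(x, mean, sigma):
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def reweighted_estimate(selected, mean, sigma):
    w = [1.0 / gauss_pdf(x, mean, sigma) for x in selected]  # weight = 1 / density
    s = sum(w)
    new_mean = sum(wi * x for wi, x in zip(w, selected)) / s
    new_var = sum(wi * (x - new_mean) ** 2 for wi, x in zip(w, selected)) / s
    return new_mean, math.sqrt(new_var)

# Points far from the sampling mean are undersampled, hence upweighted:
m, sd = reweighted_estimate([-1.0, 0.5, 2.0], mean=0.0, sigma=1.0)
```

Here the plain average of the selected points would be 0.5, while the reweighted mean is pulled toward the undersampled point at 2.0, which is the bias correction the slide describes. Nothing is specific to Gaussians: any sampling density whose value at each point is known can be inverted the same way.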
    • 19. REMNA
    • 20. Outline: Idea of averaging in evolutionary algorithms; this idea introduces a bias; how to remove this bias; the results: less premature convergence; conclusions.
    • 21. The results. We do not prove that "there's no more premature convergence." We just show that, for a fixed generation, "IF the center of the level set is the optimum, THEN the asymptotic value of the estimated optimum = the optimum." ==> is the condition really necessary?
    • 22. Yes: center = optimum, and yes: situation better. Yes and yes.
    • 23. No: center ≠ optimum, but yes: situation better (assumption not really necessary). No... but yes.
    • 24. Results: convergence rate with λ = d²
    • 25. Conclusions. Idea of averaging in evolutionary algorithms; this idea introduces a bias; how to remove this bias; the results; conclusions.
    • 26. Conclusions: Reduces the risk of premature convergence. No proof on the complete algorithm (just step-wise consistency). Empirically quite good for EMNA (should be tested on other EDA / surrogates). Simple, sound, widely applicable. Bias of step-size adaptation not yet analyzed (==> seemingly works quite well!)
    • 27. Related work. Papers from D.V. Arnold et al. around reweighting for improved convergence rate (sphere, ridge) of ES (to be combined?). Work from the CMA-ES community around weights for improved convergence rate in CMA-ES. Thanks! Questions?
