
Population Sizing for Entropy-based Model Building in Genetic Algorithms


This paper presents a population-sizing model for entropy-based model building in genetic algorithms. Specifically, the population size required for building an accurate model is investigated, and the effect of selection pressure on population sizing is incorporated. The proposed model indicates that the population size required for building an accurate model scales as Θ(m log m), where m is the number of substructures and is proportional to the problem size. Experiments are conducted to verify the derivations, and the results agree with the proposed model.



  1. Population Sizing for Entropy-based Model Building in Genetic Algorithms T.-L. Yu1, K. Sastry2, D. E. Goldberg2, & M. Pelikan3 1Department of Electrical Engineering National Taiwan University, Taiwan 2Illinois Genetic Algorithms Laboratory University of Illinois at Urbana-Champaign, IL, USA 3Missouri Estimation of Distribution Algorithms Laboratory University of Missouri at St. Louis, MO, USA Supported by AFOSR FA9550-06-1-0096, NSF DMR 03-25939, and CAREER ECS-0547013.
  2. Motivation • Facetwise population sizing in GEC – Initial supply [Goldberg et al. 2001] – Decision-making [Goldberg et al. 1992] – Gambler’s ruin [Harik et al. 1997] • EDAs: model building is essential. • Population sizing for model building [Pelikan et al. 2003] • Better explanation and modeling are needed.
  3. Roadmap • Entropy-based model building • Mutual information • The effect of selection • Distribution of mutual information under limited sampling • Building an accurate model • The effect of selection pressure • Conclusion
  4. Entropy-based model building & Mutual information • Entropy: a measurement of uncertainty. • Loss of entropy → gain in certainty → mutual information • Bivariate: MIMIC, BMDA • Multivariate: eCGA, BOA, EBNA, DSMGA • Most multivariate model-building methods start from bivariate dependency detection.
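The last bullet, bivariate dependency detection as the starting point of multivariate model building, can be illustrated with a short sketch. This is not code from the paper; `mutual_information`, `dependent_pairs`, and the thresholding scheme are illustrative assumptions:

```python
import math
from collections import Counter

def mutual_information(pop, i, j):
    """Empirical mutual information (in bits) between gene positions i and j.

    pop is a list of equal-length binary strings (e.g. the selected
    population). Illustrative sketch, not the paper's exact routine.
    """
    n = len(pop)
    pij = Counter((s[i], s[j]) for s in pop)   # joint counts
    pi = Counter(s[i] for s in pop)            # marginal counts at i
    pj = Counter(s[j] for s in pop)            # marginal counts at j
    mi = 0.0
    for (a, b), c in pij.items():
        # p(a,b) * log2( p(a,b) / (p(a) p(b)) ), written with raw counts
        mi += (c / n) * math.log2(c * n / (pi[a] * pj[b]))
    return mi

def dependent_pairs(pop, threshold):
    """Bivariate dependency detection: keep pairs whose MI exceeds a threshold."""
    ell = len(pop[0])
    return [(i, j) for i in range(ell) for j in range(i + 1, ell)
            if mutual_information(pop, i, j) > threshold]
```

A multivariate model builder would then merge or refine these detected pairs into larger linkage groups.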
  5. Mutual information • Definition • Some facts:
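The definition and facts on this slide were lost in extraction; the standard identities, which the surrounding slides rely on, are:

```latex
% Definition of mutual information between genes X_i and X_j:
I(X_i; X_j) = \sum_{x_i, x_j} p(x_i, x_j) \log \frac{p(x_i, x_j)}{p(x_i)\, p(x_j)}

% Some standard facts:
I(X_i; X_j) = H(X_i) + H(X_j) - H(X_i, X_j) \quad\text{(loss of joint entropy)}

I(X_i; X_j) \ge 0, \quad\text{with equality iff } X_i \text{ and } X_j \text{ are independent.}
```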
  6. Base: Bipolar Royal Road • Additively separable bipolar Royal Road (fitness plotted against block unitation u, with optima at u = 0 and u = k) • Given the minimal signal, this is the most difficult case for model building. • Analytical simplicity, no gene-wise bias.
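The formulas and the fitness plot on this slide did not survive extraction. Assuming the usual Royal-Road convention, where a block contributes only when fully converged (which matches the "minimal signal" remark), a sketch of the fitness function is:

```python
def bipolar_royal_road(x, k):
    """Additively separable bipolar Royal Road (illustrative sketch).

    x is a binary string whose length is a multiple of k. Each k-bit block
    contributes 1 only when fully converged, i.e. at unitation u = 0 or
    u = k (the two optima of a block); any partially converged block
    contributes nothing, which is the minimal-signal case.
    """
    total = 0
    for b in range(0, len(x), k):
        u = x[b:b + k].count('1')   # unitation of the block
        total += 1 if u in (0, k) else 0
    return total
```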
  7. The effect of selection • 00******** and 11******** increase • 10******** and 01******** decrease • Define
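The growth of the converged schemata under selection can be checked with a small simulation. The parameters here are assumptions for illustration (truncation selection with s = 2, block size k = 4, m = 5 blocks), not the paper's experimental setup:

```python
import random

def schema_proportions(pop, k):
    """Share of blocks whose first two bits agree (00.../11...) vs disagree."""
    same = diff = 0
    for x in pop:
        for b in range(0, len(x), k):
            if x[b] == x[b + 1]:
                same += 1
            else:
                diff += 1
    return same / (same + diff), diff / (same + diff)

def truncation_selection(pop, fitness, s=2):
    """Keep the best 1/s of the population."""
    ranked = sorted(pop, key=fitness, reverse=True)
    return ranked[:len(pop) // s]

random.seed(1)
k, m, n = 4, 5, 400
pop = [''.join(random.choice('01') for _ in range(k * m)) for _ in range(n)]
# Royal-Road-style fitness: count fully converged blocks.
fit = lambda x: sum(1 for b in range(0, len(x), k)
                    if x[b:b + k].count('1') in (0, k))

before, _ = schema_proportions(pop, k)
after, _ = schema_proportions(truncation_selection(pop, fit), k)
# After selection, 00... and 11... occupy a larger share than before.
```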
  8. Growth of schemata and M.I. • Growth in mutual information
  9. Limited sampling • In GAs, finite population → limited sampling • Define two random variables: – Signal of mutual information between two independent genes under n random samples. – Signal of mutual information between two dependent genes under n random samples. • Ideally:
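The finite-sample effect that separates these two random variables can be seen directly: for two independent genes the true mutual information is zero, yet the empirical estimate is positive on average, roughly (r-1)(s-1)/(2n) nats. A small Monte Carlo sketch with illustrative parameters:

```python
import math
import random
from collections import Counter

def empirical_mi(pairs):
    """Empirical mutual information (in nats) of a list of (a, b) samples."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum((c / n) * math.log(c * n / (pa[a] * pb[b]))
               for (a, b), c in pab.items())

random.seed(0)
n, trials = 50, 2000
ests = []
for _ in range(trials):
    # Two independent fair bits: true mutual information is exactly 0.
    samples = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(n)]
    ests.append(empirical_mi(samples))
bias = sum(ests) / trials   # close to (r-1)(s-1)/(2n) = 1/(2n) = 0.01 nats
```

The positive bias means the independent-pair estimate overlaps the dependent-pair estimate when n is too small, which is exactly why the population must be sized for model building.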
  10. Distribution of mutual information [Hutter and Zaffalon, 2004]
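The formulas on this slide were lost. For reference, the leading-order result consistent with Hutter and Zaffalon (2004), for variables with r and s states and n samples, is:

```latex
% Leading-order mean of the empirical mutual information \hat I:
E[\hat I] \approx I + \frac{(r-1)(s-1)}{2n}

% For two independent genes (I = 0), 2n\hat I (in nats) is asymptotically
% chi-square distributed with (r-1)(s-1) degrees of freedom:
2n \hat I \xrightarrow{d} \chi^2_{(r-1)(s-1)}
```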
  11. Empirical verification
  12. Building an accurate model • Define • Decision error • Building an accurate model • Finally
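The derivation on this slide did not survive extraction. The Θ(m log m) scaling from the abstract can be recovered at a high level by the following sketch; the exponential-decay assumption is labeled and this is a plausible reconstruction, not necessarily the paper's exact argument:

```latex
% Assumption: the probability of misclassifying one gene pair decays as
\epsilon(n) \le e^{-c\,n/m},
% where the 1/m reflects the per-pair signal shrinking with problem size.
% Requiring all \Theta(m^2) pairwise decisions to be correct with
% probability at least 1 - \delta, a union bound gives
m^2 e^{-c\,n/m} \le \delta
\quad\Longrightarrow\quad
n \ge \frac{m}{c}\ln\frac{m^2}{\delta} = \Theta(m \log m).
```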
  13. Verification of O(2^{2k}): DSMGA, m = 10
  14. Verification of O(m log m): eCGA, DSMGA
  15. Effect of selection pressure • Quantitative: order statistics • Qualitative: consider truncation selection • Higher s – More growth of H_opt – Fewer effective samples
  16. Empirical results on selection pressure • Future work: empirically, larger k → larger s*
  17. Summary and Conclusions • Refine the required population sizing for model building – From – To • Correct to • Preliminarily incorporate selection pressure into the population-sizing model. – Qualitatively show the existence of s*