# The Bayesian Optimization Algorithm with Substructural Local Search

This work studies the utility of using substructural neighborhoods for local search in the Bayesian optimization algorithm (BOA). The probabilistic model of BOA, which automatically identifies important problem substructures, is used to define the structure of the neighborhoods used in local search. Additionally, a surrogate fitness model is considered to evaluate the improvement of the local search steps. The results show that performing substructural local search in BOA significantly reduces the number of generations necessary to converge to optimal solutions and thus provides substantial speedups.


- 1. The Bayesian Optimization Algorithm with Substructural Local Search Claudio Lima, Martin Pelikan, Kumara Sastry, Martin Butz, David Goldberg, and Fernando Lobo
- 2. Overview: Motivation; Bayesian Optimization Algorithm (BOA); Modeling Fitness in BOA; Substructural Neighborhoods; BOA with Substructural Hillclimbing; Results; Conclusions; Future Work. OBUPM 2006
- 3. Motivation: Probabilistic models of EDAs allow better recombination of subsolutions. Can we get more from these models? Yes: efficiency enhancement of EDAs through evaluation relaxation and local search in substructural neighborhoods.
- 4. Bayesian Optimization Algorithm (Pelikan, Goldberg, and Cantú-Paz, 1999): uses Bayesian networks to model good solutions. Model structure => acyclic directed graph, where nodes represent variables and edges represent conditional dependencies. Model parameters => conditional probabilities, stored either as conditional probability tables based on the observed frequencies or as local structures (decision trees or graphs).
- 5. Learning a Bayesian Network: start with an empty network (independence assumption); perform the operation that improves the metric the most (edge addition, edge removal, or edge reversal); the metric quantifies the likelihood of the model with respect to the data (good solutions); stop when no more improvement is possible.
- 6. A 3-bit Example. Model structure: a directed acyclic graph with edges X2 -> X1 and X3 -> X1. Model parameters as a conditional probability table for P(X1=1 | X2 X3): 00 -> 0.20, 01 -> 0.20, 10 -> 0.15, 11 -> 0.45. The equivalent decision tree splits on X2 first: for X2=0, P(X1=1) = 0.20 regardless of X3; for X2=1, P(X1=1) = 0.15 if X3=0 and 0.45 if X3=1.
- 7. Modeling Fitness in BOA: Bayesian networks extended to store a surrogate fitness model (Pelikan & Sastry, 2004). The surrogate fitness is learned from a proportion of the population, and is used to estimate the fitness of the remaining individuals (therefore reducing evaluations).
- 8. The same 3-bit Example, extended with fitness information. For each row of the conditional probability table the model also stores the average fitness contribution of each value of X1: X2X3=00 -> f(X1=0) = -0.49, f(X1=1) = 0.53; 01 -> -0.38, 0.51; 10 -> -0.55, 0.47; 11 -> -0.52, 0.62. In the decision tree, the X2=0 leaf stores f(X1=0) = -0.48 and f(X1=1) = 0.54; the (X2=1, X3=0) leaf stores -0.55 and 0.47; the (X2=1, X3=1) leaf stores -0.52 and 0.62. Estimated fitness: obtained by summing the stored per-variable fitness contributions.
- 9. Why Substructural Neighborhoods? An efficient mutation operator should search in the correct neighborhood. Oftentimes this is done by incorporating domain- or problem-specific knowledge; however, efficiency typically does not generalize beyond a small number of applications. Bitwise local search has more general applicability, but with inferior results.
- 10. Substructural Neighborhoods: neighborhoods defined by the probabilistic model of EDAs. This exploits the underlying problem structure while not losing generality of application, and exploration of the neighborhoods respects the dependencies between variables. If [X1 X2 X3] form a linkage group, the neighborhood considered is 000, 001, 010, ..., 111.
- 11. Substructural Local Search: for uniformly-scaled decomposable problems, substructural local search scales as O(2^k m^1.5) (Sastry & Goldberg, 2004), versus O(m^k log m) for the bitwise hillclimber. The extended compact GA with substructural local search is more robust than either single-operator-based approach (Lima et al., 2005).
- 12. Substructural Neighborhoods in BOA: the model is more complex than in eCGA. What is a linkage group? Which dependencies should be considered? Is order relevant? Example: the topology of 3 different substructural neighborhoods for variable X2 (shown on the slide).
- 13. BOA + Substructural Hillclimbing: after model sampling, each offspring undergoes local search with a certain probability p_ls. The current model is used to define the neighborhoods, and the choice of the best subsolutions uses the surrogate fitness model, so the cost of performing local search is minimal.
- 14. Substructural Hillclimbing in BOA
- 15. Substructural Hillclimbing in BOA: use reverse ancestral ordering of the variables. Two different versions of the substructural hillclimber (step 3): evaluated fitness and estimated fitness. The result of local search is evaluated.
- 16. Experiments: additively decomposable problems. Two important bounds: onemax and concatenated k-bit traps, with many things in between.
- 17. Onemax Results (l=50)
- 18. Onemax Results (l=50): the correctness of the substructural neighborhoods is not relevant, but the choice of subsolutions relies on the accuracy of the surrogate fitness model. More importantly, when using estimated fitness, the acceptance of the best subsolutions also depends on the surrogate.
- 19. 10x5-bit Trap Results (l=50)
- 20. 10x5-bit Trap Results (l=50): correct identification of the problem substructure is crucial. The different versions of the hillclimber perform similarly (for small p_ls), but the cost of using evaluated fitness increases significantly with p_ls (and with problem size). There is a phase transition in the population size required.
- 21. Scalability Results (5-bit traps)
- 22. Scalability Results (5-bit traps): substantial speedups are obtained (η = 6 for l = 140). The speedup scales as O(l^0.45) for l < 80; for bigger problem sizes the speedup is more moderate. p_ls = 5x10^-4 is adequate for the range of problems tested, but the optimal proportion should decrease for larger problem sizes.
- 23. More on Scalability...
- 24. Scalability Issues: the optimal proportion of local search slowly decreases with problem size. Exploration of substructural neighborhoods is sensitive to the accuracy of the model structure: spurious linkage size grows with problem size, yet BOA's sampling ability is not affected because the conditional probabilities nearly express independence between spurious and linked variables.
- 25. Future Work: model the optimal proportion of local search p_ls. Obtain more accurate model structures: only accept pairwise dependencies that improve the metric beyond some threshold (significance test), and study the improvement function of the metric. Consider other neighborhood topologies and overlapping substructures.
- 26. Conclusions: incorporating substructural local search in BOA leads to significant speedups. The use of surrogate fitness in local search provides effective learning of substructures with minimal cost in evaluations. The importance of designing and hybridizing competent operators has been empirically demonstrated.
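
The EDA loop behind slides 3-4 can be sketched in a few lines. This is an illustrative toy, not the authors' code: for brevity the "Bayesian network" is the degenerate edgeless case (a univariate frequency model), which is enough to show the select / build-model / sample cycle that BOA follows; all names (`boa`, `p_one`, etc.) are ours.

```python
import random

def boa(fitness, n_bits, pop_size=200, n_gens=30, truncation=0.5):
    """Minimal EDA loop in the style of BOA (toy sketch, edgeless model)."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(n_gens):
        # Selection: keep the top fraction of the population.
        pop.sort(key=fitness, reverse=True)
        selected = pop[: int(pop_size * truncation)]
        # "Model building": estimate P(x_i = 1) from the selected solutions.
        p_one = [sum(ind[i] for ind in selected) / len(selected)
                 for i in range(n_bits)]
        # "Model sampling": generate offspring from the model.
        offspring = [[1 if random.random() < p_one[i] else 0
                      for i in range(n_bits)]
                     for _ in range(pop_size - len(selected))]
        pop = selected + offspring
    return max(pop, key=fitness)

best = boa(sum, n_bits=20)   # onemax: fitness = number of ones
```

Real BOA replaces the univariate model with a learned Bayesian network, which is what makes the substructural neighborhoods of the later slides possible.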
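
The greedy structure learning of slide 5 can be sketched as repeated "add the edge with the best score gain, stop when no gain exceeds a threshold". As a stand-in for BOA's Bayesian metric we use empirical mutual information, which is an assumption for illustration only (and this sketch only adds edges; real BOA also considers removals and reversals).

```python
import math

def mutual_information(data, i, j):
    """Empirical mutual information between binary variables i and j."""
    n = len(data)
    counts = {}
    for row in data:
        counts[(row[i], row[j])] = counts.get((row[i], row[j]), 0) + 1
    pi = [sum(1 for r in data if r[i] == v) / n for v in (0, 1)]
    pj = [sum(1 for r in data if r[j] == v) / n for v in (0, 1)]
    mi = 0.0
    for (a, b), c in counts.items():
        pij = c / n
        if pij > 0 and pi[a] > 0 and pj[b] > 0:
            mi += pij * math.log(pij / (pi[a] * pj[b]))
    return mi

def greedy_edges(data, n_vars, threshold=0.05):
    """Greedily add the edge whose score gain is largest; stop when no
    candidate improves the score by more than `threshold`."""
    edges = []
    candidates = [(i, j) for i in range(n_vars) for j in range(i + 1, n_vars)]
    while True:
        scored = [(mutual_information(data, i, j), i, j)
                  for (i, j) in candidates if (i, j) not in edges]
        best = max(scored, default=None)
        if best is None or best[0] < threshold:
            return edges
        edges.append((best[1], best[2]))

# X0 and X1 always agree, X2 is independent: only the (0, 1) edge is found.
data = [[0, 0, 0], [1, 1, 0], [0, 0, 1], [1, 1, 1]] * 5
edges = greedy_edges(data, n_vars=3)   # → [(0, 1)]
```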
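
The two parameter representations on slide 6 (full conditional probability table versus decision tree) encode the same distribution; the tree merely merges the rows that agree. A minimal check, using the slide's own numbers:

```python
# Conditional probability table for P(X1 = 1 | X2, X3), from slide 6.
cpt = {(0, 0): 0.20, (0, 1): 0.20, (1, 0): 0.15, (1, 1): 0.45}

def p_x1_tree(x2, x3):
    """The same distribution as a decision tree: the X2 = 0 branch needs no
    split on X3, so the tree stores 3 leaves instead of 4 table rows."""
    if x2 == 0:
        return 0.20          # rows (0,0) and (0,1) share this leaf
    return 0.15 if x3 == 0 else 0.45
```

The local structure pays off when a variable has many parents: a table over p parents has 2^p rows, while the tree only grows where the distribution actually differs.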
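
Slide 8's extension of the model with fitness information can be read as: each leaf of a variable's decision tree stores, besides P(X1=1), the average fitness contribution of each value of X1 among the solutions that reach that leaf. A sketch using the slide's numbers (the exact form of Pelikan & Sastry's additive estimate is summarized here as "sum of per-variable contributions", which is our simplification):

```python
# Leaves of the decision tree for X1 from slide 8, keyed by (x2, x3);
# the X2 = 0 leaf ignores X3, encoded here with None.
leaves = {
    (0, None): {"p1": 0.20, "f0": -0.48, "f1": 0.54},
    (1, 0):    {"p1": 0.15, "f0": -0.55, "f1": 0.47},
    (1, 1):    {"p1": 0.45, "f0": -0.52, "f1": 0.62},
}

def x1_contribution(x1, x2, x3):
    """Fitness contribution of X1 given its parents: read off the leaf that
    (x2, x3) falls into -- one term of the additive surrogate estimate."""
    leaf = leaves[(0, None)] if x2 == 0 else leaves[(1, x3)]
    return leaf["f1"] if x1 == 1 else leaf["f0"]
```

Because these contributions come for free with the model, estimating a candidate's fitness needs no extra function evaluation, which is exactly what makes the local search of slide 13 cheap.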
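
The neighborhood definition of slide 10 is easy to make concrete: for a linkage group of k variables, the neighbors of a solution are all 2^k settings of those variables with everything else held fixed. A sketch of the idea (our names, not the paper's implementation):

```python
from itertools import product

def substructural_neighbors(solution, linkage_group):
    """All 2^k settings of the variables in `linkage_group`, with the rest
    of the solution held fixed."""
    neighbors = []
    for values in product((0, 1), repeat=len(linkage_group)):
        cand = list(solution)
        for pos, v in zip(linkage_group, values):
            cand[pos] = v
        neighbors.append(cand)
    return neighbors

# Linkage group [X1 X2 X3] as on the slide: 8 neighbors 000..., 001..., ...,
# 111..., each keeping the remaining bits unchanged.
nbrs = substructural_neighbors([0, 1, 0, 1, 1], linkage_group=[0, 1, 2])
```

Contrast with the bitwise neighborhood, which for the same solution contains only the 5 single-bit flips and cannot cross a deceptive trap in one move.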
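
Slides 13-15 can be combined into one sketch of the substructural hillclimber: sweep the linkage groups, and within each group keep the best of its 2^k subsolutions according to a surrogate. Here `surrogate` is any callable scoring a full candidate (an assumption for simplicity; in the paper the estimated-fitness variant reads contributions off the Bayesian network, and the groups come from the learned model rather than being given).

```python
from itertools import product

def substructural_hillclimb(solution, linkage_groups, surrogate):
    """One pass of substructural local search over the given linkage groups."""
    current = list(solution)
    for group in linkage_groups:
        best, best_score = current, surrogate(current)
        for values in product((0, 1), repeat=len(group)):
            cand = list(current)
            for pos, v in zip(group, values):
                cand[pos] = v
            score = surrogate(cand)
            if score > best_score:
                best, best_score = cand, score
        current = best
    return current

def trap3(x):
    """Two concatenated deceptive 3-bit traps: bitwise moves are misled
    toward 000, but substructure-wise moves reach 111 directly."""
    def t(u):
        return 3 if u == 3 else 2 - u
    return t(sum(x[0:3])) + t(sum(x[3:6]))

result = substructural_hillclimb([0, 1, 0, 1, 0, 0],
                                 [[0, 1, 2], [3, 4, 5]],
                                 trap3)   # → [1, 1, 1, 1, 1, 1]
```

The trap example shows why slide 20 finds correct substructure identification crucial: with the right groups one sweep solves the problem, while a bitwise hillclimber from the same start would descend into the deceptive 000 attractor.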
