
# Estimation of Distribution Algorithms Tutorial

Probabilistic model-building genetic algorithms (PMBGAs), also called estimation of distribution algorithms (EDAs) and iterated density-estimation algorithms (IDEAs), replace the traditional variation operators of genetic and evolutionary algorithms by (1) building a probabilistic model of promising solutions and (2) sampling the built model to generate new candidate solutions.

Replacing traditional crossover and mutation operators by building and sampling a probabilistic model of promising solutions enables the use of machine learning techniques for automatic discovery of problem regularities and exploitation of these regularities for effective exploration of the search space. Using machine learning in optimization enables the design of optimization techniques that can automatically adapt to the given problem. There are many successful applications of PMBGAs, for example, Ising spin glasses in 2D and 3D, graph partitioning, MAXSAT, feature subset selection, forest management, groundwater remediation design, telecommunication network design, antenna design, and scheduling.

This tutorial provides a gentle introduction to PMBGAs with an overview of the major research directions in this area. Strengths and weaknesses of different PMBGAs are discussed, and suggestions are provided to help practitioners choose the most suitable PMBGA for their problem.

The video of this tutorial presented at GECCO-2008 can be found at
http://medal.cs.umsl.edu/blog/?p=293

### Estimation of Distribution Algorithms Tutorial

1. Probabilistic Model-Building Genetic Algorithms, a.k.a. Estimation of Distribution Algorithms, a.k.a. Iterated Density Estimation Algorithms. Martin Pelikan, Missouri Estimation of Distribution Algorithms Laboratory (MEDAL), Department of Mathematics and Computer Science, University of Missouri–St. Louis, martin@martinpelikan.net, http://martinpelikan.net/. Copyright is held by the author/owner(s). GECCO'12 Companion, July 7–11, 2012, Philadelphia, PA, USA. ACM 978-1-4503-1178-6/12/07.
2. Foreword. Motivation: genetic and evolutionary computation (GEC) is popular; toy problems work great, but difficulties arise in practice; new representations and operators must be designed, tuned, and so on. This talk: discusses a promising direction in GEC; combines machine learning and GEC; creates practical and powerful optimizers. — Martin Pelikan, Probabilistic Model-Building GAs
3. Overview. Introduction: black-box optimization via probabilistic modeling. Probabilistic model-building GAs: discrete representation; continuous representation; computer programs (PMBGP); permutations. Conclusions.
4. Problem Formulation. Input: what do potential solutions look like, and how is their quality evaluated? Output: the best solution (the optimum). Important: no additional knowledge about the problem is assumed.
5. Why View the Problem as a Black Box? Advantages: separates the problem definition from the optimizer; makes it easy to solve new problems; economy argument. Difficulties: almost no prior problem knowledge; problem specifics must be learned automatically; noise, multiple objectives, interactive evaluation.
6. Representations Considered Here. Start with: solutions are n-bit binary strings. Later: real-valued vectors; program trees; permutations.
7. Typical Situation. Previously visited solutions and their evaluations: 00100 → 1, 11011 → 4, 01101 → 0, 10111 → 3. Question: which solution should be generated next?
8. Many Answers. Hill climber: start with a random solution; flip the bit that improves the solution most; finish when no more improvement is possible. Simulated annealing: introduce the Metropolis criterion. Probabilistic model-building GAs: take inspiration from GAs and machine learning (ML).
9. Probabilistic Model-Building GAs. [Diagram: current population → selected population → probabilistic model → new population.] Crossover and mutation are replaced with learning and sampling a probabilistic model.
10. Other Names for PMBGAs. Estimation of distribution algorithms (EDAs) (Mühlenbein & Paass, 1996). Iterated density estimation algorithms (IDEAs) (Bosman & Thierens, 2000).
11. Implicit vs. Explicit Model. GAs and PMBGAs perform a similar task: generate new solutions using a probability distribution based on the selected solutions. GAs: variation defines an implicit probability distribution of the target population given the original population and the variation operators (crossover and mutation). PMBGAs: an explicit probabilistic model of the selected candidate solutions is built and sampled.
12. What Models to Use? Start with a simple example: the probability vector for binary strings. Later: dependency-tree models (COMIT); Bayesian networks (BOA); Bayesian networks with local structures (hBOA).
13. Probability Vector. Assume n-bit binary strings. Model: probability vector p = (p1, …, pn), where pi is the probability of a 1 in position i. Learn p: compute the proportion of 1s in each position. Sample p: generate a 1 in position i with probability pi.
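The learn/sample steps above fit in a few lines. A minimal sketch (function names are mine, not from the tutorial), representing solutions as lists of 0/1:

```python
import random

def learn_prob_vector(selected):
    """p[i] = proportion of 1s in position i among the selected solutions."""
    n = len(selected[0])
    return [sum(s[i] for s in selected) / len(selected) for i in range(n)]

def sample_prob_vector(p, count):
    """Generate `count` new strings; bit i is sampled as 1 with probability p[i]."""
    return [[1 if random.random() < pi else 0 for pi in p] for _ in range(count)]
```

This is essentially one generation of UMDA: select, learn the vector, sample a new population.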
14. Example: Probability Vector (Mühlenbein & Paass, 1996; Baluja, 1994). [Diagram: current population → selected population → probability vector (e.g., 1.0, 0.5, 0.5, 0.0, 1.0) → new population.]
15. Probability Vector PMBGAs. PBIL (Baluja, 1995): incremental updates to the probability vector. Compact GA (Harik, Lobo & Goldberg, 1998): also incremental updates, but a better analogy with populations. UMDA (Mühlenbein & Paass, 1996): what was shown here. DEUM (Shakya et al., 2004). All variants perform similarly.
16. Probability Vector Dynamics. Bits that perform better get more copies, and are combined in new ways, but the context of each bit is ignored. Example problem 1: onemax, f(X1, X2, …, Xn) = X1 + X2 + … + Xn (the number of ones).
17. Probability Vector on Onemax. [Plot: probability-vector entries (0 to 1) over generations 0–50.]
18. Probability Vector: Ideal Scale-up. O(n log n) evaluations until convergence (Harik, Cantú-Paz, Goldberg & Miller, 1997; Mühlenbein & Schlierkamp-Voosen, 1993). Other algorithms: hill climber, O(n log n) (Mühlenbein, 1992); GA with uniform crossover, approx. O(n log n); GA with one-point crossover, slightly slower.
19. When Does the Probability Vector Fail? Example problem 2: concatenated traps. Partition the input string into disjoint groups of 5 bits. Each group contributes via a trap function (ones = number of ones): trap(ones) = 5 if ones = 5, and 4 − ones otherwise. Concatenated trap = sum of the single traps. Optimum: the string 111…1.
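The trap definition above is simple to code directly; a minimal sketch (names are mine):

```python
def trap5(block):
    """Single 5-bit trap: deceptive everywhere except the optimum 11111."""
    ones = sum(block)
    return 5 if ones == 5 else 4 - ones

def concatenated_trap(bits):
    """Sum of disjoint 5-bit traps over the whole string (length a multiple of 5)."""
    return sum(trap5(bits[i:i + 5]) for i in range(0, len(bits), 5))
```

Note the deception: any block short of 11111 is rewarded for having fewer ones, which is exactly why position-wise statistics mislead.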
20. Trap-5. [Plot: trap(u) against the number of ones u; fitness falls from 4 at u = 0 to 0 at u = 4, then jumps to 5 at u = 5.]
21. Probability Vector on Traps. [Plot: probability-vector entries (0 to 1) over generations 0–50.]
22. Why Failure? Onemax: optimum in 111…1; a 1 outperforms a 0 on average. Traps: the optimum is 11111, but f(0****) = 2 and f(1****) = 1.375, so single bits are misleading.
23. How to Fix It? Consider 5-bit statistics instead of 1-bit ones; then 11111 would outperform 00000. Learn the model: compute p(00000), p(00001), …, p(11111). Sample the model: sample 5 bits at a time; generate 00000 with probability p(00000), 00001 with p(00001), and so on.
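The block-wise learn/sample procedure above can be sketched as follows, assuming the correct 5-bit partition is known (names are mine):

```python
import random
from collections import Counter

def learn_block_marginal(selected, k=5):
    """For each disjoint k-bit block, estimate the joint distribution over its settings."""
    n = len(selected[0])
    models = []
    for start in range(0, n, k):
        counts = Counter(tuple(s[start:start + k]) for s in selected)
        total = sum(counts.values())
        models.append({block: c / total for block, c in counts.items()})
    return models

def sample_block_marginal(models):
    """Sample each block as a whole according to its estimated joint probabilities."""
    out = []
    for model in models:
        blocks, probs = zip(*model.items())
        out.extend(random.choices(blocks, weights=probs)[0])
    return out
```

Because whole blocks are sampled together, 11111 can never be destroyed by recombining misleading single-bit statistics.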
24. Correct Model on Traps: Dynamics. [Plot: probability of 11111 over generations 0–50.]
25. Good News: Good Stats Work Great! Optimum in O(n log n) evaluations — the same performance as on onemax! Others: hill climber, O(n^5 log n) = much worse; GA with uniform crossover, O(2^n) = intractable; GA with k-point crossover, O(2^n) (without tight linkage).
26. Challenge. If we could learn and use the relevant context for each position — find non-misleading statistics and use them as in the probability vector — then we could solve problems decomposable into statistics of order at most k with at most O(n^2) evaluations! And there are many such problems (Simon, 1968).
27. What's Next? COMIT: use tree models. Extended compact GA: cluster bits into groups. Bayesian optimization algorithm (BOA): use Bayesian networks (more general).
28. Beyond Single Bits: COMIT (Baluja & Davies, 1997). Example tree model: P(X=1) = 75%; P(Y=1|X=0) = 30%, P(Y=1|X=1) = 25%; P(Z=1|X=0) = 86%, P(Z=1|X=1) = 75%.
29. How to Learn a Tree Model? Mutual information: I(Xi, Xj) = Σ_{a,b} P(Xi=a, Xj=b) log [ P(Xi=a, Xj=b) / (P(Xi=a) P(Xj=b)) ]. Goal: find the tree that maximizes mutual information between connected nodes; this minimizes the Kullback–Leibler divergence. Algorithm: Prim's algorithm for maximum spanning trees.
30. Prim's Algorithm. Start with a graph with no edges; add an arbitrary node to the tree; iterate: attach a new node to the current tree, preferring edges with large mutual information (greedy approach). Complexity: O(n^2).
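The two slides above combine into a short procedure: estimate pairwise mutual information from the selected population, then grow the tree greedily. A minimal sketch under those assumptions (names are mine; a real implementation would cache the MI values rather than recompute them):

```python
import math
from collections import Counter

def mutual_information(samples, i, j):
    """Empirical mutual information between bit positions i and j."""
    n = len(samples)
    pij = Counter((s[i], s[j]) for s in samples)
    pi = Counter(s[i] for s in samples)
    pj = Counter(s[j] for s in samples)
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

def max_mi_tree(samples):
    """Prim's algorithm: repeatedly attach the outside node with highest MI to the tree."""
    n = len(samples[0])
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        u, v = max(((u, v) for u in in_tree for v in range(n) if v not in in_tree),
                   key=lambda e: mutual_information(samples, *e))
        edges.append((u, v))  # v's parent in the tree model is u
        in_tree.add(v)
    return edges
```

Each returned edge (u, v) says that variable v is modeled by the conditional distribution P(Xv | Xu).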
31. Variants of PMBGAs with Tree Models. COMIT (Baluja & Davies, 1997): tree models. MIMIC (De Bonet et al., 1996): chain distributions. BMDA (Pelikan & Mühlenbein, 1998): forest distribution (independent trees or a tree).
32. Beyond Pairwise Dependencies: ECGA. Extended compact GA (ECGA) (Harik, 1999). Consider groups of string positions; for example, one group over two positions (00: 16%, 01: 45%, 10: 35%, 11: 4%), one over a single position (0: 86%, 1: 14%), and one over three positions (000: 17%, 001: 2%, …, 111: 24%).
33. Learning the Model in ECGA. Start with each bit in a separate group; each iteration merges the two groups that yield the best improvement.
34. How to Compute Model Quality? ECGA uses minimum description length (MDL): minimize the number of bits needed to store the model plus the data, MDL(M, D) = D_Model + D_Data. Each frequency needs 0.5 log N bits: D_Model = Σ_{g∈G} 2^{|g|−1} log N. Each solution X needs −log p(X) bits: D_Data = −N Σ_X p(X) log p(X).
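The MDL score above can be sketched numerically. This toy version (names are mine) charges 0.5·log2 N bits for each of the 2^|g| − 1 independent frequencies per group — one common way to count the model bits — plus the empirical entropy of the data under the model:

```python
import math
from collections import Counter

def mdl_score(groups, population):
    """ECGA-style MDL: model bits plus compressed-data bits (lower is better)."""
    N = len(population)
    model_bits = sum((2 ** len(g) - 1) * 0.5 * math.log2(N) for g in groups)
    data_bits = 0.0
    for g in groups:
        counts = Counter(tuple(ind[i] for i in g) for ind in population)
        data_bits -= sum(c * math.log2(c / N) for c in counts.values())
    return model_bits + data_bits
```

On perfectly correlated bits the score correctly prefers merging them into one group, since the shorter data description outweighs the extra model frequencies.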
35. Sampling the Model in ECGA. Sample groups of bits at a time, based on the observed probabilities/proportions. One can also apply population-based crossover, similar to uniform crossover but with respect to the model.
36. Building-Block-Wise Mutation in ECGA. Sastry & Goldberg (2004); Lima et al. (2005). Basic idea: use the ECGA model builder to identify the decomposition; use the best solution for BB-wise mutation; for each k-bit partition (building block), evaluate the remaining 2^k − 1 instantiations of this BB and keep the best one. Result (for order-k separable problems): BB-wise mutation is O(√k log n) times faster than ECGA! But only for separable problems (and similar ones).
37. What's Next? We saw: the probability vector (no edges); tree models (some edges); marginal product models (groups of variables). Next: Bayesian networks, which can represent all of the above and more.
38. Bayesian Optimization Algorithm (BOA). Pelikan, Goldberg & Cantú-Paz (1998). Use a Bayesian network (BN) as the model. Bayesian network: an acyclic directed graph; nodes are variables (string positions); edges encode conditional dependencies; conditional independencies are implicit.
39. Example: Bayesian Network (BN). [Figure: a Bayesian network illustrating conditional dependencies and conditional independencies.]
40. BOA. [Diagram: current population → selected population → Bayesian network → new population.]
41. Learning BNs. Two things again: a scoring metric (like MDL in ECGA) and a search procedure (in ECGA done by merging).
42. Learning BNs: Scoring Metrics. Bayesian metrics, e.g., Bayesian–Dirichlet with likelihood equivalence: BD(B) = p(B) ∏_{i=1}^{n} ∏_{π_i} [ Γ(m'(π_i)) / Γ(m(π_i) + m'(π_i)) ] ∏_{x_i} [ Γ(m(x_i, π_i) + m'(x_i, π_i)) / Γ(m'(x_i, π_i)) ]. Minimum description length metrics, e.g., the Bayesian information criterion (BIC): BIC(B) = Σ_{i=1}^{n} ( −H(X_i | Π_i) N − 2^{|Π_i|} (log₂ N) / 2 ).
43. Learning BNs: Search Procedure. Start with an empty network (like ECGA); execute the primitive operator that improves the metric the most (greedy); repeat until no more improvement is possible. Primitive operators: edge addition (most important), edge removal, edge reversal.
44. Learning BNs: Example. [Figure: greedy construction of a Bayesian network.]
45. BOA and Problem Decomposition. Conditions for factoring problem decomposition into a product of prior and conditional probabilities of small order are given in Mühlenbein, Mahnig & Rodriguez (1999). In practice, an approximate factorization that can be learned automatically is sufficient. Learning makes the complete theory intractable.
46. BOA Theory: Population Sizing. Initial supply (Goldberg et al., 2001): have enough stuff to combine — O(2^k). Decision making (Harik et al., 1997): decide well between competing partial solutions — O(n log n). Drift (Thierens, Goldberg & Pereira, 1998): don't lose less salient stuff prematurely — O(n). Model building (Pelikan et al., 2000, 2002): find a good model — O(n^1.05).
47. BOA Theory: Number of Generations. Two extreme cases, everything in between. Uniform scaling: onemax model (Mühlenbein & Schlierkamp-Voosen, 1993) — O(√n). Exponential scaling: domino convergence (Thierens, Goldberg & Pereira, 1998) — O(n).
48. Good News: Challenge Met! Theory: population sizing (Pelikan et al., 2000, 2002) — initial supply, decision making, drift, model building — O(n) to O(n^1.05); number of iterations (Pelikan et al., 2000, 2002) — uniform and exponential scaling — O(n^0.5) to O(n). BOA solves order-k decomposable problems in O(n^1.55) to O(n^2) evaluations!
49. Theory vs. Experiment (5-bit Traps). [Plot: number of evaluations, experiment vs. theory, against problem size 100–250.]
50. BOA Siblings. Estimation of Bayesian Networks Algorithm (EBNA) (Etxeberria & Larrañaga, 1999). Learning Factorized Distribution Algorithm (LFDA) (Mühlenbein, Mahnig & Rodriguez, 1999).
51. Another Option: Markov Networks. MN-FDA, MN-EDA (Santana, 2003, 2005). Similar to Bayesian networks but with undirected edges.
52. Yet Another Option: Dependency Networks. Estimation of dependency networks algorithm (EDNA) (Gámez, Mateo & Puerta, 2007): use a dependency network as the model; the dependency network is learned from pairwise interactions; Gibbs sampling generates new solutions. Dependency network: the parents of a variable are all variables influencing that variable; a dependency network can contain cycles.
53. Model Comparison. BMDA → ECGA → BOA: model expressiveness increases.
54. From Single Level to Hierarchy. Single-level decomposition is powerful, but what if it is not enough? Learn from humans and nature: decompose the problem over multiple levels; use solutions from the lower level as basic building blocks; solve the problem hierarchically.
55. Hierarchical Decomposition. [Diagram: a car decomposes into engine, braking system, and electrical system; the engine decomposes into fuel system, valves, and ignition system.]
56. Three Keys to Hierarchy Success. Proper decomposition: the problem must be decomposed properly on each level. Chunking: large-order solutions must be represented and manipulated compactly. Preservation of alternative solutions: alternative partial solutions (chunks) must be preserved.
57. Hierarchical BOA (hBOA). Pelikan & Goldberg (2000, 2001). Proper decomposition: use Bayesian networks as in BOA. Chunking: use local structures in Bayesian networks. Preservation of alternative solutions: use restricted tournament replacement (RTR); other niching methods can be used as well.
58. Local Structures in BNs. Look at one conditional dependency: 2^k probabilities for k parents. Why not use more powerful representations for conditional probabilities? Example with parents X2, X3: P(X1=0|X2X3) is 26% for 00, 44% for 01, 15% for 10, and 15% for 11.
59. Local Structures in BNs. The same dependency stored as a decision tree: test X2 first; if X2 = 1, the probability is 15%; otherwise test X3, giving 26% (X3 = 0) or 44% (X3 = 1).
60. Restricted Tournament Replacement. Used in hBOA for niching. Insert each new candidate solution x like this: pick a random subset of the original population; find the solution y most similar to x in the subset; replace y by x if x is better than y.
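The RTR insertion rule above is short enough to sketch directly. A minimal version for binary strings (names are mine; Hamming distance stands in for the similarity measure, and `window` is the size of the random subset):

```python
import random

def rtr_insert(population, fitness, x, fx, window):
    """Restricted tournament replacement: x replaces the most similar
    solution in a random window, but only if x is better."""
    idx = random.sample(range(len(population)), window)
    # Hamming distance as the similarity measure for binary strings.
    closest = min(idx, key=lambda i: sum(a != b for a, b in zip(population[i], x)))
    if fx > fitness[closest]:
        population[closest] = x
        fitness[closest] = fx
```

Because x competes only against its nearest neighbor in the window, dissimilar niches are not displaced, which is what preserves alternative partial solutions.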
61. Hierarchical Traps: The Ultimate Test. Combine traps on multiple levels; each level contributes to fitness; groups of bits map to the next level.
62. hBOA on Hierarchical Traps. [Plot: number of evaluations against problem size 27–729, following roughly O(n^1.63 log n).]
63. PMBGAs Are Not Just Optimizers. PMBGAs provide us with two things: the optimum or its approximation, and a sequence of probabilistic models. The probabilistic models encode populations of increasing quality and tell us a lot about the problem at hand. Can we use this information?
64. Efficiency Enhancement for PMBGAs. Sometimes O(n^2) is not enough: high-dimensional problems (thousands of variables); expensive evaluation (fitness) functions. Solution: efficiency enhancement techniques.
65. Efficiency Enhancement Types. Seven efficiency enhancement types for PMBGAs: parallelization; hybridization; time continuation; fitness evaluation relaxation; prior knowledge utilization; incremental and sporadic model building; learning from experience.
66. Multi-objective PMBGAs. Methods from multi-objective GAs have been adopted: multi-objective mixture-based IDEAs (Thierens & Bosman, 2001); another multi-objective BOA, from SPEA2 and mBOA (Laumanns & Ocenasek, 2002); multi-objective hBOA, from NSGA-II and hBOA (Khan, Goldberg & Pelikan, 2002; Pelikan, Sastry & Goldberg, 2005); regularity model-based multiobjective EDA (RM-MEDA) (Zhang, Zhou & Jin, 2008).
67. Promising Results with Discrete PMBGAs. Artificial classes of problems; physics; bioinformatics; computational complexity and AI; others.
68. Results: Artificial Problems. Decomposition: concatenated traps (Pelikan et al., 1998). Hierarchical decomposition: hierarchical traps (Pelikan & Goldberg, 2001). Other sources of difficulty: exponential scaling, noise (Pelikan, 2002).
69. BOA on Concatenated 5-bit Traps. [Plot: number of evaluations, experiment vs. theory, against problem size 100–250.]
70. hBOA on Hierarchical Traps. [Plot: number of evaluations against problem size 27–729, following roughly O(n^1.63 log n).]
71. Results: Physics. Spin glasses (Pelikan et al., 2002, 2006, 2008; Hoens, 2005; Santana, 2005; Shakya et al., 2006): ±J and Gaussian couplings; 2D and 3D spin glasses; Sherrington–Kirkpatrick (SK) spin glass. Silicon clusters (Sastry, 2001): Gong potential (3-body).
72. hBOA on Ising Spin Glasses (2D). [Plot: number of evaluations against the number of spins 64–400, following roughly O(n^1.51).]
73. Results on 2D Spin Glasses. The number of evaluations is O(n^1.51); the overall time is O(n^3.51). Compare O(n^3.51) to O(n^3.5) for the best problem-specific method (Galluccio & Loebl, 1999). Great results also with Gaussian couplings.
74. hBOA on Ising Spin Glasses (3D). [Plot: number of evaluations against the number of spins 64–343; experimental average roughly O(n^3.63).]
75. hBOA on SK Spin Glass. [Figure: results on the Sherrington–Kirkpatrick spin glass.]
76. Results: Computational Complexity, AI. MAXSAT, SAT (Pelikan, 2002): random 3-CNF from the phase transition; morphed graph coloring; conversion from spin glass. Feature subset selection (Inza et al., 2001; Cantú-Paz, 2004).
77. Results: Some Others. Military antenna design (Santarelli et al., 2004); groundwater remediation design (Arst et al., 2004); forest management (Ducheyne et al., 2003); nurse scheduling (Li & Aickelin, 2004); telecommunication network design (Rothlauf, 2002); graph partitioning (Ocenasek & Schwarz, 1999; Mühlenbein & Mahnig, 2002; Baluja, 2004); portfolio management (Lipinski, 2005, 2007); quantum excitation chemistry (Sastry et al., 2005); maximum clique (Zhang et al., 2005); cancer chemotherapy optimization (Petrovski et al., 2006); minimum vertex cover (Pelikan et al., 2007); protein folding (Santana et al., 2007); side chain placement (Santana et al., 2007).
78. Discrete PMBGAs: Summary. No interactions: univariate models — PBIL, UMDA, cGA. Some pairwise interactions: tree models — COMIT, MIMIC, BMDA. Multivariate interactions: multivariate models — BOA, EBNA, LFDA. Hierarchical decomposition: hBOA.
79. Discrete PMBGAs: Recommendations. Easy problems: use univariate models (PBIL, UMDA, cGA). Somewhat difficult problems: use bivariate models (MIMIC, COMIT, BMDA). Difficult problems: use multivariate models (BOA, EBNA, LFDA). Most difficult problems: use hierarchical decomposition (hBOA).
80. Real-Valued PMBGAs. New challenge: an infinite domain for each variable — how to model it? Two approaches: discretize and apply a discrete model/PMBGA, or create a model for real-valued variables by estimating a probability density function (pdf).
81. PBIL Extensions: First Step. SHCwL: stochastic hill climbing with learning (Rudlof & Köppen, 1996). Model: a single-peak Gaussian for each variable; means evolve based on parents (promising solutions); deviations are equal and decrease over time. Problems: no interactions; single Gaussians can model only one attractor; the same deviation for every variable.
82. Use Different Deviations. Sebag & Ducoulombier (1998). Some variables have higher variance, so use a separate standard deviation for each variable.
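The per-variable Gaussian model above (independent mean and standard deviation for each variable, no interactions) can be sketched as follows (names are mine):

```python
import random
import statistics

def learn_gaussians(selected):
    """One univariate Gaussian per variable: (mean, its own standard deviation)."""
    columns = list(zip(*selected))
    return [(statistics.mean(c), statistics.stdev(c)) for c in columns]

def sample_gaussians(model, count):
    """Sample each variable independently from its Gaussian."""
    return [[random.gauss(mu, sigma) for mu, sigma in model] for _ in range(count)]
```

This is the simplest continuous EDA; covariance (next slide) and mixtures (later slides) extend exactly this learn/sample pair.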
83. Use Covariance. Covariance allows rotation of single-peak Gaussians. EGNA (Larrañaga et al., 2000); IDEA (Bosman & Thierens, 2000).
84. How Many Peaks? One Gaussian vs. a kernel around each point; the kernel distribution is similar to evolution strategies (ES). IDEA (Bosman & Thierens, 2000).
85. Mixtures: Between One and Many. Mixture distributions provide a transition between one Gaussian and Gaussian kernels. Mixture types: over one variable (Gallagher, Frean & Downs, 1999); over all variables (Pelikan & Goldberg, 2000; Bosman & Thierens, 2000); over partitions of variables (Bosman & Thierens, 2000; Ahn, Ramakrishna & Goldberg, 2004).
86. Mixed BOA (mBOA). Mixed BOA (Ocenasek & Schwarz, 2002). Local distributions: a decision tree (DT) for every variable; internal DT nodes encode tests on other variables (discrete: equal to a constant; continuous: less than a constant); for discrete variables, DT leaves represent probabilities; for continuous variables, DT leaves contain a normal kernel distribution.
87. Real-Coded BOA (rBOA). Ahn, Ramakrishna & Goldberg (2003). Probabilistic model: the underlying structure is a Bayesian network; the local distributions are mixtures of Gaussians. Also extended to multiobjective problems (Ahn, 2005).
88. Aggregation Pheromone System (APS). Tsutsui (2004). Inspired by aggregation pheromones. Basic idea: good solutions emit aggregation pheromones; new candidate solutions are generated based on the density of aggregation pheromones; the aggregation pheromone density encodes a mixture distribution.
89. Adaptive Variance Scaling. Adaptive variance in mBOA (Ocenasek et al., 2004). Normal IDEAs (Bosman et al., 2006, 2007): correlation-triggered adaptive variance scaling; standard-deviation ratio (SDR) triggered variance scaling.
90. Real-Valued PMBGAs: Discretization. Idea: transform into the discrete domain. Fixed models: 2^k equal-width bins encoded with a k-bit binary string (Goldberg, 1989; Bosman & Thierens, 2000; Pelikan et al., 2003). Adaptive models: equal-height histograms of 2^k bins; k-means clustering on each variable (Pelikan, Goldberg & Tsutsui, 2003; Cantú-Paz, 2001).
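The fixed equal-width scheme above is a simple mapping both ways; a minimal sketch (names are mine), decoding a bin back to its midpoint:

```python
def discretize(x, lo, hi, k):
    """Map a real value in [lo, hi] to one of 2^k equal-width bins, as a k-bit string."""
    bins = 2 ** k
    idx = min(int((x - lo) / (hi - lo) * bins), bins - 1)
    return [int(b) for b in format(idx, f'0{k}b')]

def undiscretize(bits, lo, hi):
    """Return the midpoint of the bin that a k-bit string encodes."""
    bins = 2 ** len(bits)
    idx = int(''.join(map(str, bits)), 2)
    return lo + (idx + 0.5) * (hi - lo) / bins
```

With this pair, any discrete PMBGA can be run on the bit strings and its samples decoded back into real vectors; the adaptive variants differ only in how the bin boundaries are chosen.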
91. Real-Valued PMBGAs: Summary. Discretization: fixed or adaptive. Real-valued models: single or multiple peaks? The same variance or a different variance per variable? Covariance or no covariance? Mixtures? Treat entire vectors, subsets of variables, or single variables?
92. Real-Valued PMBGAs: Recommendations. Multimodality? Use multiple peaks. Decomposability? Model all variables, subsets, or single variables accordingly. Strong linear dependencies? Use covariance. Partial differentiability? Combine with gradient search.
93. PMBGP (Genetic Programming). New challenges: structured, variable-length representation; possibly infinitely many values; position independence (or not); low correlation between solution quality and solution structure (Looks, 2006). Approaches: use explicit probabilistic models for trees, or use models based on grammars.
94. PIPE. Probabilistic incremental program evolution (Salustowicz & Schmidhuber, 1997). Store frequencies of operators/terminals in the nodes of a maximum tree (e.g., at one node: sin 15%, + 35%, − 35%, X 15%). Sampling generates a tree from top to bottom.
95. eCGP. Sastry & Goldberg (2003). ECGA adapted to program trees: a maximum tree as in PIPE, but with nodes partitioned into groups.
96. BOA for GP. Looks, Goertzel & Pennachin (2004). Combinatory logic + BOA: trees are translated into uniform structures; labels appear only in leaves; BOA builds a model over the symbols in different nodes. Complexity build-up: modeling is limited to the maximum-sized structure seen; complexity builds up by a special operator.
97. MOSES. Looks (2006). Evolve demes of programs; each deme represents similar structures; apply a PMBGA to each deme (e.g., hBOA); introduce new demes and delete old ones; use normal forms to reduce complexity.
98. PMBGP with Grammars. Use grammars/stochastic grammars as models; grammars restrict the class of programs. Some representatives: program evolution with explicit learning (Shan et al., 2003); grammar-based EDA for GP (Bosman & de Jong, 2004); stochastic grammar GP (Tanev, 2004); adaptive constrained GP (Janikow, 2004).
99. PMBGP: Summary. Interesting starting points are available, but a lot of work remains to be done. There is much to learn from the discrete domain, but also some completely new challenges. Research is in progress.
100. PMBGAs for Permutations. New challenges: relative order; absolute order; permutation constraints. Two basic approaches: random-key and real-valued PMBGAs, or explicit probabilistic models for permutations.
101. Random Keys and PMBGAs. Bengoetxea et al. (2000); Bosman et al. (2001). Random keys (Bean, 1997): a candidate solution is a vector of real values; the ascending ordering of the values gives a permutation. Any real-valued PMBGA (or GEA) can be used: IDEAs (Bosman & Thierens, 2002); EGNA (Larrañaga et al., 2001). Strengths and weaknesses: good — any real-valued PMBGA can be used; bad — the encoding is redundant.
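The random-key decoding described above is a one-liner: the permutation is the argsort of the key vector (function name is mine):

```python
def keys_to_permutation(keys):
    """Random keys: ascending order of the real-valued keys gives the permutation."""
    return sorted(range(len(keys)), key=lambda i: keys[i])
```

The redundancy noted above is visible here: infinitely many key vectors (e.g., any strictly increasing vector) decode to the same permutation.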
102. Direct Modeling of Permutations
- Edge-histogram based sampling algorithm (EHBSA) (Tsutsui, Pelikan, & Goldberg, 2003)
  - Permutations of n elements
  - Model is a matrix A = (a_ij), i, j = 1, 2, ..., n
  - a_ij represents the probability of edge (i, j)
  - Uses a template to reduce exploration
  - Applicable also to scheduling
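A simplified sketch of the edge-histogram idea follows. It builds a symmetric edge matrix from a population of cyclic permutations and samples a new permutation edge by edge; the epsilon smoothing and the omission of EHBSA's template mechanism are simplifying assumptions for illustration:

```python
import random

def edge_histogram(population, n, epsilon=0.1):
    # A[i][j] counts how often j follows i in the population's cyclic
    # tours, plus a small epsilon so every edge stays possible.
    A = [[epsilon] * n for _ in range(n)]
    for perm in population:
        for k in range(n):
            i, j = perm[k], perm[(k + 1) % n]
            A[i][j] += 1.0
            A[j][i] += 1.0  # treat edges as undirected (symmetric histogram)
    return A

def sample_permutation(A, n, rng=random):
    # Start from a random element and repeatedly pick the next element
    # with probability proportional to its edge weight from the current one.
    current = rng.randrange(n)
    perm = [current]
    unused = set(range(n)) - {current}
    while unused:
        cands = sorted(unused)
        weights = [A[current][j] for j in cands]
        current = rng.choices(cands, weights=weights)[0]
        perm.append(current)
        unused.remove(current)
    return perm
```

Because the model stores only pairwise edge frequencies, it captures relative-order regularities (which elements tend to be adjacent) rather than absolute positions, which is why EHBSA suits problems like TSP-style routing and scheduling.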
103. ICE: Modify Crossover from Model
- ICE
  - Bosman & Thierens (2001).
  - Represent permutations with random keys.
  - Learn a multivariate model to factorize the problem.
  - Use the learned model to modify crossover.
- Performance
  - Typically outperforms IDEAs and other PMBGAs that learn and sample random keys.
104. Multivariate Permutation Models
- Basic approach
  - Use any standard multivariate discrete model.
  - Restrict sampling to permutations in some way.
  - Bengoetxea et al. (2000); Pelikan et al. (2007).
- Strengths and weaknesses
  - Explicit multivariate models can discover regularities.
  - The high-order alphabet requires large samples for good models.
  - Sampling can introduce unwanted bias.
  - Inefficient encoding for problems with only relative-ordering constraints, which can be encoded more simply.
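The "restrict sampling to permutations" step can be illustrated with the simplest possible discrete model, positionwise marginals; the model choice is an assumption for illustration (the cited work uses richer multivariate models), but the restriction mechanism is the same:

```python
import random

def sample_restricted(P, rng=random):
    # P[k][v] = estimated probability that position k holds value v
    # (e.g., counted over the selected population). We sample position
    # by position, renormalizing each marginal over the values not yet
    # used so the result is always a valid permutation. This forced
    # renormalization is exactly where sampling can drift away from
    # the learned distribution, i.e., the unwanted bias noted above.
    n = len(P)
    unused = set(range(n))
    perm = []
    for k in range(n):
        cands = sorted(unused)
        weights = [P[k][v] for v in cands]
        v = rng.choices(cands, weights=weights)[0]
        perm.append(v)
        unused.remove(v)
    return perm
```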
105. Conclusions
- Competent PMBGAs exist
  - Scalable solution to broad classes of problems.
  - Solution to previously intractable problems.
  - Algorithms ready for new applications.
- PMBGAs do more than just solve the problem
  - They provide us with sequences of probabilistic models.
  - The probabilistic models tell us a lot about the problem.
- Consequences for practitioners
  - Robust methods with few or no parameters.
  - Capable of learning how to solve the problem.
  - But can incorporate prior knowledge as well.
  - Can solve previously intractable problems.
106. Starting Points
- World Wide Web
- Books and surveys
  - Larrañaga & Lozano (Eds.) (2001). Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation. Kluwer.
  - Pelikan et al. (2002). A survey of optimization by building and using probabilistic models. Computational Optimization and Applications, 21(1), pp. 5-20.
  - Pelikan (2005). Hierarchical BOA: Towards a New Generation of Evolutionary Algorithms. Springer.
  - Lozano, Larrañaga, Inza, & Bengoetxea (Eds.) (2006). Towards a New Evolutionary Computation: Advances on Estimation of Distribution Algorithms. Springer.
  - Pelikan, Sastry, & Cantu-Paz (Eds.) (2006). Scalable Optimization via Probabilistic Modeling: From Algorithms to Applications. Springer.
107. Online Code (1/2)
- BOA, BOA with decision graphs, dependency-tree EDA
  http://medal-lab.org/
- ECGA, xi-ary ECGA, BOA, and BOA with decision trees/graphs
  http://www.illigal.org/
- mBOA
  http://jiri.ocenasek.com/
- PIPE
  http://www.idsia.ch/~rafal/
- Real-coded BOA
  http://www.evolution.re.kr/
108. Online Code (2/2)
- Demos of APS and EHBSA
  http://www.hannan-u.ac.jp/~tsutsui/research-e.html
- RM-MEDA: A Regularity Model Based Multiobjective EDA; Differential Evolution + EDA hybrid
  http://cswww.essex.ac.uk/staff/qzhang/mypublication.htm
- Naive Multi-objective Mixture-based IDEA (MIDEA); Normal IDEA-Induced Chromosome Elements Exchanger (ICE); Normal Iterated Density-Estimation Evolutionary Algorithm (IDEA)
  http://homepages.cwi.nl/~bosman/code.html