Efficiency Enhancement of Probabilistic Model Building Genetic Algorithms
This paper presents two different efficiency-enhancement techniques for probabilistic model building genetic algorithms. The first technique proposes the use of a mutation operator that performs local search in the sub-solution neighborhood identified through the probabilistic model. The second technique proposes building and using an internal probabilistic model of the fitness along with the probabilistic model of variable interactions. The fitness values of some offspring are estimated using the probabilistic model, thereby avoiding computationally expensive function evaluations. The scalability of the aforementioned techniques is analyzed using facetwise models for convergence time and population sizing. The speed-up obtained by each of the methods is predicted and verified with empirical results. The results show that for additively separable problems the competent mutation operator requires O(k^0.5 log m) fewer function evaluations than its selectorecombinative counterpart, where k is the building-block size and m is the number of building blocks. The results also show that the use of an internal probabilistic fitness model reduces the required number of function evaluations to as low as 1-10% and yields a speed-up of 2-50.

Transcript

  • 1. Efficiency Enhancement of Probabilistic Model Building Genetic Algorithms. Kumara Sastry¹, David E. Goldberg¹, and Martin Pelikan². ¹Illinois Genetic Algorithms Laboratory, University of Illinois at Urbana-Champaign. ²Department of Mathematics and Computer Science, University of Missouri at St. Louis. Supported by AFOSR F49620-03-1-0129, NSF/DMR at MCC DMR-99-76550, NSF/ITR at CPSD DMR-0121695, and DOE at FS-MRL DEFG02-91ER45439.
  • 2. Motivation
    - Probabilistic model building genetic algorithms (PMBGAs) solve hard problems quickly, reliably, and accurately: intractability → tractability
    - Polynomial (usually subquadratic) scalability, yet still daunting for complex problems: 1000 variables at 10 seconds/evaluation → ~120 days!
    - Efficiency-enhancement techniques: tractability → practicality
    - Evaluation relaxation via an endogenous fitness-estimation model
    - Local search in linkage neighborhoods
  • 3. Outline
    - Background and purpose
    - Evaluation relaxation using an internal probabilistic fitness-estimation model: methodology, scalability and speed-up analysis
    - Incorporation of a competent mutation operator: methodology, scalability and speed-up analysis
    - Future work, summary, and conclusions
  • 4. Background and Purpose
    - Efficiency enhancement of GAs: parallelization [Cantú-Paz, 2001], time utilization [Goldberg, 1999; Goldberg & Srivastava, 2001], hybridization [Davis, 1990; Goldberg & Voessner, 1999], evaluation relaxation [Sastry, 2002]
    - Solve boundedly difficult problems in polynomial time
    - Propose two principled efficiency-enhancement techniques that use the probabilistic models built by PMBGAs: develop and use an inexpensive fitness-estimation model, and perform local search in the linkage neighborhood
  • 5. Evaluation Relaxation in PMBGAs
    - Replace accurate but expensive fitness evaluation with less accurate but inexpensive fitness estimation
    - Fitness inheritance [Smith et al., 1994; Sastry et al., 2000]: a simple method that yields a limited speed-up of about 1.25
    - Develop and use an internal probabilistic fitness model: automatic discovery of fitness structure; estimate sub-structure (building-block) fitnesses; individual fitness as a function of sub-structural fitnesses
    - Illustrated using the extended compact genetic algorithm (eCGA)
  • 6. Internal Sub-Structural Fitness Model
    - Identify key sub-structures of the search problem
    - Estimate the fitness of sub-structure instances: fitness of a sub-structure instance = average fitness of all evaluated individuals containing that schema instance − average fitness of all evaluated individuals in the population
    - Individual fitness as a function of sub-structural fitnesses: the sum of the fitness estimates of its sub-structure instances (see the formulas below)
    - Other, more complex estimators, such as neural networks, can also be used
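    In formula form (our notation, not the slides': I_i denotes a linkage partition, x_{I_i} the schema instance that individual x carries in that partition, and the bars denote averages over evaluated individuals), the slide's two definitions read:

      \hat{f}_{I_i}(x_{I_i}) \;=\; \bar{f}\,(\text{individuals containing } x_{I_i}) \;-\; \bar{f}\,(\text{population}),
      \qquad
      \hat{f}(x) \;=\; \sum_{i} \hat{f}_{I_i}(x_{I_i})

    Because every estimate shares the same population-average offset, adding that offset back in or leaving it out does not change the ranking used by tournament selection.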
  • 7. Illustration of Sub-Structural Fitness Estimation
    - Use the model-building procedure of the extended compact GA: partition genes into (mutually) independent groups

      Model structure                                                        Metric
      [X0] [X1] [X2] [X3] [X4] [X5] [X6] [X7] [X8] [X9] [X10] [X11]         1.0000
      [X0] [X1] [X2] [X3] [X4 X5] [X6] [X7] [X8] [X9] [X10] [X11]           0.9933
      [X0] [X1] [X2] [X3] [X4 X5 X7] [X6] [X8] [X9] [X10] [X11]             0.9819
      [X0] [X1] [X2] [X3] [X4 X5 X6 X7] [X8] [X9] [X10] [X11]               0.9644
      [X0] [X1] [X2] [X3] [X4 X5 X6 X7] [X8 X9 X10 X11]                     0.9273
      [X0 X1 X2 X3] [X4 X5 X6 X7] [X8 X9 X10 X11]                           0.8895

    - Use population data to estimate probabilities p(X0X1X2X3), p(X4X5X6X7), p(X8X9X10X11) and sub-structure fitnesses f(X0X1X2X3), f(X4X5X6X7), f(X8X9X10X11) (a minimal sketch follows below)
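    A minimal sketch (ours, not the authors' code) of how the marginal probabilities and sub-structure fitness deviations above could be estimated from a population of evaluated, binary-coded individuals:

      from collections import defaultdict

      def estimate_substructure_models(population, fitnesses, partitions):
          """Estimate marginal probabilities and fitness deviations per sub-structure.

          population : list of bit vectors (lists/tuples of 0/1), all evaluated
          fitnesses  : list of floats, fitnesses of the evaluated individuals
          partitions : gene-index groups, e.g. [(0,1,2,3), (4,5,6,7), (8,9,10,11)]
          Returns {partition: {schema_instance: (probability, fitness_deviation)}}.
          """
          n = len(population)
          f_bar = sum(fitnesses) / n                      # population average fitness
          models = {}
          for part in partitions:
              counts = defaultdict(int)
              fit_sums = defaultdict(float)
              for x, f in zip(population, fitnesses):
                  instance = tuple(x[i] for i in part)    # schema instance in this partition
                  counts[instance] += 1
                  fit_sums[instance] += f
              models[part] = {
                  inst: (counts[inst] / n,                          # p(instance)
                         fit_sums[inst] / counts[inst] - f_bar)     # avg fitness of matches - pop. avg
                  for inst in counts
              }
          return models

      def estimate_fitness(x, models, f_bar):
          """Estimate an individual's fitness as the population average plus the
          sum of its sub-structure fitness deviations (no function evaluation)."""
          est = f_bar
          for part, table in models.items():
              inst = tuple(x[i] for i in part)
              est += table.get(inst, (0.0, 0.0))[1]   # unseen instances contribute nothing here
          return est

    In the evaluation-relaxation scheme, a newly sampled offspring would then receive estimate_fitness with probability p_i (the proportion of inheritance) and a true function evaluation otherwise.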
  • 8. Scalability of the Evaluation-Relaxation Scheme
    - Error in fitness estimation modeled as additive Gaussian noise
    - Convergence time: selection-intensity based models [Mühlenbein & Schlierkamp-Voosen, 1993; Thierens & Goldberg, 1993; Bäck, 1994; Miller & Goldberg, 1995 & 1996; Sastry & Goldberg, 2002] (see the convergence-time model below)
    - Cumulative effects of fitness estimation are neglected
    - [Figure: convergence-time ratio t_c(p_i)/t_c(p_i = 0) versus proportion of inheritance p_i, for ten 4-bit traps (s = 4, 8) and 100-bit OneMax (s = 4, 8); empirical points follow theory]
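    For reference, the selection-intensity convergence-time model of Miller & Goldberg that these analyses build on, in its additive-Gaussian-noise form (ℓ is the string length, I the selection intensity, σ_N² the noise variance, σ_f² the fitness variance; the exact variant used in the paper may differ):

      t_{conv} \;=\; \frac{\pi\,\sqrt{\ell}}{2I}\,\sqrt{1 + \frac{\sigma_N^2}{\sigma_f^2}}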
  • 9. Scalability of the Evaluation-Relaxation Scheme
    - Population-sizing model [Goldberg et al., 1992; Harik et al., 1997; Pelikan et al., 2001] accounts for building-block supply, decision-making, and accurate model building (see the sizing approximation below)
    - [Figure: population-size ratio n(p_i)/n(p_i = 0) versus proportion of inheritance p_i, for ten 4-bit traps (s = 4, 8) and 100-bit OneMax (s = 4, 8); empirical points follow theory]
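    For reference, the gambler's-ruin sizing approximation from Harik et al. (1997) that underlies this model, where σ_BB is the building-block fitness standard deviation, d the signal between competing BBs, α the failure probability, and m' = m − 1; the PMBGA refinements cited above add model-building accuracy requirements on top of this:

      n \;\approx\; 2^{k-1}\,\ln\!\left(\frac{1}{\alpha}\right)\,\frac{\sigma_{BB}\sqrt{\pi\,m'}}{d}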
  • 10. Scalability of the Evaluation-Relaxation Scheme
    - Scalability: number of function evaluations required to obtain a solution of quality 1 − 1/m
    - Only a proportion 1 − p_i of the individuals is evaluated
    - The cost of building and using the fitness-estimation model is neglected
    - [Figure: function-evaluation ratio n_fe(p_i)/n_fe(p_i = 0) versus proportion of inheritance p_i, for ten 4-bit traps (s = 4, 8) and 100-bit OneMax (s = 4, 8); empirical points follow theory]
  • 11. Scalability of the Evaluation-Relaxation Scheme
    - Speed-up: ratio of the number of function evaluations without evaluation relaxation to that with it
    - Only 1-15% of the individuals need evaluation; optimal p_i ≈ 0.85-0.99
    - Speed-up of 1.75-53
    - [Figure: predicted and empirical speed-up versus proportion of inheritance p_i, for ten 4-bit traps (s = 4, 8) and 100-bit OneMax (s = 4, 8)]
  • 12. Designing a Competent Mutation Operator
    - Systematic induction of good neighborhood information, based on a population of promising candidate solutions
    - Polynomial (subquadratic) scalability
    - Perform a scalability analysis of the proposed algorithm: derive and use facetwise models; verify on nearly decomposable problems of bounded difficulty (additively separable problems)
    - Compare with a competent selectorecombinative GA
  • 13. Designing a Competent Mutation Operator
    - Inducing good neighborhoods from a sample of candidate solutions, using the linkage-learning procedures developed for selectorecombinative GAs
    - Neighborhood: entire space vs. linkage sub-space; search: hillclimbing vs. random/enumerative
    - Consider linkage partitions rather than an arbitrary left-to-right order, and choose the best schema in each partition among the 2^k possible ones
    - [Figure: bitwise hillclimbing over the full search space contrasted with schema search within linkage partitions]
  • 14. How to Induce Neighborhoods?
    - Induce good neighborhoods as linkage groups
    - Use the model-building procedure of the extended compact GA: partition genes into (mutually) independent groups; start with the lowest-complexity model; greedily search for the least-complex, most-accurate model (a sketch of this greedy search follows below)

      Model structure                                                        Metric
      [X0] [X1] [X2] [X3] [X4] [X5] [X6] [X7] [X8] [X9] [X10] [X11]         1.0000
      [X0] [X1] [X2] [X3] [X4 X5] [X6] [X7] [X8] [X9] [X10] [X11]           0.9933
      [X0] [X1] [X2] [X3] [X4 X5 X7] [X6] [X8] [X9] [X10] [X11]             0.9819
      [X0] [X1] [X2] [X3] [X4 X5 X6 X7] [X8] [X9] [X10] [X11]               0.9644
      [X0] [X1] [X2] [X3] [X4 X5 X6 X7] [X8 X9 X10 X11]                     0.9273
      [X0 X1 X2 X3] [X4 X5 X6 X7] [X8 X9 X10 X11]                           0.8895
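    A minimal sketch (ours) of the greedy eCGA model-building search referred to above, using the usual MDL-style combined-complexity criterion (model complexity plus compressed-population complexity, lower is better); the metric column in the table above appears to be a normalized variant of such a criterion, so the absolute numbers differ:

      import math
      from collections import Counter

      def combined_complexity(population, partitions):
          """MDL-style eCGA criterion: model complexity + compressed population
          complexity (lower is better)."""
          n = len(population)
          model = sum(2 ** len(g) - 1 for g in partitions) * math.log2(n + 1)
          data = 0.0
          for g in partitions:
              counts = Counter(tuple(x[i] for i in g) for x in population)
              # entropy (in bits) of this partition's marginal distribution, times n
              data += n * -sum((c / n) * math.log2(c / n) for c in counts.values())
          return model + data

      def greedy_ecga_model(population, num_genes):
          """Greedy model search: start from all-singleton partitions and keep
          merging the pair of groups that most improves the criterion."""
          partitions = [(i,) for i in range(num_genes)]
          best = combined_complexity(population, partitions)
          improved = True
          while improved:
              improved = False
              best_merge, best_score = None, best
              for a in range(len(partitions)):
                  for b in range(a + 1, len(partitions)):
                      merged = partitions[:a] + partitions[a + 1:b] + partitions[b + 1:]
                      merged.append(partitions[a] + partitions[b])
                      score = combined_complexity(population, merged)
                      if score < best_score:
                          best_merge, best_score = merged, score
              if best_merge is not None:
                  partitions, best, improved = best_merge, best_score, True
          return partitions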
  • 15. Algorithmic Description of the Selectomutative GA
    1. Initialize the population (usually at random)
    2. Evaluate the fitness of the individuals
    3. Select promising solutions (e.g., tournament selection)
    4. Build the probabilistic model: optimize the structure and parameters of the probabilistic model to best fit the selected individuals (automatic discovery of linkage neighborhoods)
    5. Apply enumerative BB-wise mutation: search for the best building block in each partition (see the sketch below)
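    A minimal sketch (ours, assuming a fitness function f to be maximized) of the enumerative BB-wise mutation in step 5: for each linkage partition, all 2^k instantiations of that building block are tried in the current best individual and the best one is kept.

      from itertools import product

      def bb_wise_mutation(individual, partitions, f):
          """Greedy building-block-wise mutation: for each linkage group, enumerate
          all 2^k settings of those genes and keep the best (maximizing f)."""
          best = list(individual)
          best_fit = f(best)
          for group in partitions:                      # one linkage partition at a time
              for setting in product((0, 1), repeat=len(group)):
                  trial = list(best)
                  for gene, allele in zip(group, setting):
                      trial[gene] = allele
                  fit = f(trial)
                  if fit > best_fit:                    # keep the best BB instance found
                      best, best_fit = trial, fit
          return best, best_fit

    On an additively separable problem the partitions do not interact, which is why this per-partition enumeration suffices and why the operator stays cheap.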
  • 16. Scalability of the Competent Mutation Operator
    - Population size required for accurate modeling of the neighborhood information [Pelikan, Sastry, and Goldberg, 2002]: O(2^k m^1.05) ≤ n ≤ O(2^k m^2.1)
    - [Figure: empirical verification; population size n versus number of building blocks m on log-log axes, for sub-component (BB) sizes k = 4 and k = 5]
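    To get a feel for the spread in this bound, a worked example (our numbers, simply plugging k = 4 and m = 20 into the bound and ignoring the hidden constants):

      2^{4} \cdot 20^{1.05} \;\approx\; 3.7 \times 10^{2},
      \qquad
      2^{4} \cdot 20^{2.1} \;\approx\; 8.6 \times 10^{3}

    so the lower and upper estimates differ by more than an order of magnitude even for a modest number of building blocks.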
  • 17. Scalability of the Competent Mutation Operator
    - [Figure: number of function evaluations n_fe versus number of building blocks m on log-log axes; k = 4, d = 0.25 and k = 5, d = 0.20 both follow O(m^1.5)]
  • 18. Competent Crossover vs. Competent Mutation
    - Competent mutation: O(2^k m^1.5) function evaluations
    - Competent crossover: O(2^k m^1.5 log m) function evaluations
    - Speed-up: ratio of the crossover scalability to that of mutation
    - [Figure: speed-up versus number of building blocks m; k = 4, d = 0.25 and k = 5, d = 0.20 both follow O(k^0.5 log m)]
  • 19. Future Work
    - Hybridization of competent crossover and mutation: consider a hybrid GA with an oscillating population
    - Problems with overlapping building blocks: the effect is similar to that of exogenous noise, so the speed-up is likely to be lower; more complex model-building schemes (e.g., BOA, DSMGA)
    - Problems with non-uniform salience: account for sequentiality
    - Hierarchical problems: account for the effect of niching
  • 20. Summary & Conclusions
    - Evaluation relaxation using an internal fitness model: uses sub-structural fitness estimates as its basis; only 1-15% of the individuals need fitness evaluation; yields a speed-up of 1.75-53
    - Competent mutation operator: systematic induction of good neighborhoods; neighborhood: entire search space → linkage subspace; search: hillclimbing → random/enumerative
    - Subquadratic scalability of O(2^k m^1.5), versus the O(e^k m^k) scalability of simple mutation [Mühlenbein, 1991]
  • 21. Test Problems: m-k Deceptive Trap Functions
    - Building-block identification is critical to GA success
    - m concatenated k-bit deceptive trap functions [Ackley, 1987; Goldberg, 1987; Deb & Goldberg, 1993] (a sketch follows below)
    - [Figure: trap sub-function fitness versus the number of ones in the building block; the deceptive attractor is at all zeros and the global optimum at all ones]
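    A minimal sketch (ours) of one common form of the k-bit trap used in these experiments, parameterized by the signal d between the global optimum and the deceptive attractor (d = 0.25 for k = 4 and d = 0.20 for k = 5 in these slides); the exact normalization in the paper may differ:

      def trap(bits, d=0.25):
          """Fully deceptive k-bit trap: fitness 1 at all ones (global optimum),
          otherwise decreasing from 1 - d as the number of ones grows."""
          k = len(bits)
          u = sum(bits)                              # number of ones in the building block
          return 1.0 if u == k else (1.0 - d) * (1.0 - u / (k - 1))

      def m_k_trap(x, k=4, d=0.25):
          """Additively separable test function: m concatenated k-bit traps."""
          return sum(trap(x[i:i + k], d) for i in range(0, len(x), k))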
  • 22. Algorithmic Description of eCGA
    1. Initialize the population (usually at random)
    2. Evaluate the fitness of the individuals
    3. Select promising solutions (e.g., tournament selection)
    4. Build the probabilistic model: optimize the structure and parameters of the probabilistic model to best fit the selected individuals (automatic identification of building blocks)
    5. Sample the model to create new candidate solutions (effective exchange of building blocks)
    6. Repeat steps 2-5 until some convergence criterion is met (a sketch of the overall loop follows below)
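    A minimal sketch (ours) of the eCGA generation loop in steps 2-6, with the model-building step abstracted as a function argument (it could be the greedy search sketched after slide 14); the tournament size, population size, and stopping rule below are illustrative choices, not the paper's settings:

      import random
      from collections import Counter

      def ecga(f, num_genes, build_model, pop_size=500, tour_size=8, max_gens=100):
          """Extended compact GA: select, build a marginal product model, and sample
          new candidates by drawing each linkage group from its marginal distribution."""
          pop = [[random.randint(0, 1) for _ in range(num_genes)] for _ in range(pop_size)]
          for _ in range(max_gens):
              fits = [f(x) for x in pop]                                 # step 2: evaluate
              parents = [max(random.sample(list(zip(pop, fits)), tour_size),
                             key=lambda pf: pf[1])[0]
                         for _ in range(pop_size)]                       # step 3: tournament selection
              partitions = build_model(parents, num_genes)               # step 4: model structure
              marginals = [Counter(tuple(x[i] for i in g) for x in parents) for g in partitions]
              new_pop = []
              for _ in range(pop_size):                                  # step 5: sample the model
                  child = [0] * num_genes
                  for g, marg in zip(partitions, marginals):
                      setting = random.choices(list(marg), weights=list(marg.values()))[0]
                      for gene, allele in zip(g, setting):
                          child[gene] = allele
                  new_pop.append(child)
              if len({tuple(x) for x in new_pop}) == 1:                  # stop once the population converges
                  return new_pop[0]
              pop = new_pop
          return max(pop, key=f)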
  • 23. Scalability of eCGA
    - Population sizing [Pelikan, Goldberg, & Cantú-Paz, 2000; Pelikan, Sastry, & Goldberg, 2003] accounts for BB supply, decision-making, and model-accuracy requirements
    - Key factors: BB fitness variance, signal between competing BBs, failure probability, error tolerance, signal-to-noise ratio, number of competing sub-components, sub-component (BB) size, and number of components (BBs)
    - [Figure: population size n versus number of building blocks m on log-log axes, for k = 4, d = 0.25 and k = 5, d = 0.20, with theoretical fits]
  • 24. Scalability of eCGA
    - Number of function evaluations, with error probability α = 1/m
    - [Figure: number of function evaluations n_fe versus number of building blocks m on log-log axes; k = 4, d = 0.25 and k = 5, d = 0.20 both follow O(m^1.5 log m)]
  • 25. Related Work
    - Mutation operators in genetic algorithms: random walk locally around individuals
    - Mutation operators in evolution strategies [Rechenberg, 1973; Schwefel, 1977; Bäck, 1996; Beyer, 1996; Hansen & Ostermeier, 2001]: neighborhood information based on the entire search space
    - Local search literature [Barnes et al., 2003; Watson, 2003; Vaughn et al., 2000; Armstrong & Jacobson, 2004; Glover & Laguna, 1997]: emphasis on good neighborhood operators, but lacking systematic methods for inducing good neighborhoods
    - Mutation with known linkage neighborhoods [Sastry & Goldberg, 2004]: improves the O(e^k m^k) scalability of simple mutation to O(2^k m), but neglects the cost of inducing the linkage neighborhoods
