Order Or Not: Does Parallelization of Model Building in hBOA Affect Its Scalability?

It has been shown that model building in the hierarchical Bayesian optimization algorithm (hBOA) can be efficiently parallelized by randomly generating an ancestral ordering of the nodes of the network prior to learning the network structure and allowing only dependencies consistent with the generated ordering. However, it has not been thoroughly shown whether this restriction of the probabilistic models affects the scalability of hBOA on important classes of problems. This presentation demonstrates that although the use of a random ancestral ordering restricts the structure of the considered models to allow efficient parallelization of model building, its effects on hBOA performance and scalability are negligible.


  1. Order or Not: Does Parallelization of Model Building in hBOA Affect Its Scalability?
     Martin Pelikan and James D. Laury, Jr.
     Missouri Estimation of Distribution Algorithms Laboratory (MEDAL)
     University of Missouri, St. Louis, MO
     http://medal.cs.umsl.edu/
     pelikan@cs.umsl.edu
  2. Motivation
     Background
     - hBOA can solve nearly decomposable and hierarchical problems scalably, often in O(n^2) evaluations or faster.
     - O(n^2) is sometimes not enough; we may need efficiency enhancement techniques: parallelization, evaluation relaxation, hybridization, ...
     Parallelization of model building
     - Among the most powerful efficiency enhancements, but it modifies the model-building procedure.
     - The speedups obtained were analyzed, but the effects of modifying the model-building procedure on hBOA scalability were not analyzed thoroughly.
     Purpose
     - Study the effects of model-building parallelization on hBOA performance.
  3. Outline
     1. Hierarchical BOA (hBOA) basics.
     2. Parallelization of model building in hBOA.
     3. Experiments.
     4. Summary and conclusions.
  4. Hierarchical Bayesian Optimization Algorithm (hBOA)
     Hierarchical BOA (Pelikan & Goldberg, 2000, 2001)
     - Build and sample a Bayesian network instead of using crossover and mutation.
     - Use local structures and niching.
     [Figure: current population -> selected population -> Bayesian network -> new population]
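The loop on this slide can be summarized in a few lines of Python. The following is a minimal sketch of one hBOA generation, not the authors' code: the Bayesian-network learning and sampling steps are passed in as callables (learn_model and sample_model are assumed interfaces), and niching is done with restricted tournament replacement (RTR), the niching method referenced later in the experiments.

```python
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def rtr_replace(population, offspring, fitness, window):
    # For each new solution, pick `window` random members of the population,
    # find the most similar one, and replace it if the new solution is better.
    for child in offspring:
        candidates = random.sample(range(len(population)), window)
        closest = min(candidates, key=lambda i: hamming(population[i], child))
        if fitness(child) > fitness(population[closest]):
            population[closest] = child

def hboa_generation(population, fitness, learn_model, sample_model, window):
    # Binary tournament selection of promising solutions.
    selected = [max(random.sample(population, 2), key=fitness)
                for _ in range(len(population))]
    model = learn_model(selected)                   # structure + parameters
    offspring = sample_model(model, len(selected))  # new candidate solutions
    rtr_replace(population, offspring, fitness, window)
    return population
```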
  5. Bayesian Network
     A Bayesian network has two parts:
     Structure
     - The structure determines the edges of the network.
     [Figure: example network over X0, X1, X2, X3, X4, X5]
     Parameters
     - The parameters specify the conditional probability of each variable given its parents (the variables it depends on).
     - Example: p(X2 | X0, X1).
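As a toy illustration (not from the presentation), the two parts can be represented directly: a structure that maps each variable to its parents, and conditional probability tables estimated from the selected population. The example below reproduces p(X2 | X0, X1) from the slide; the variable and function names are illustrative.

```python
from collections import Counter

def estimate_cpt(population, var, parents):
    """Estimate P(X_var = 1 | parent assignment) with a Laplace correction."""
    counts = Counter()  # key: (parent values, value of X_var)
    for solution in population:
        key = tuple(solution[p] for p in parents)
        counts[(key, solution[var])] += 1
    cpt = {}
    for key in {k for (k, _) in counts}:
        ones = counts[(key, 1)] + 1
        zeros = counts[(key, 0)] + 1
        cpt[key] = ones / (ones + zeros)
    return cpt

# Structure for the slide's example: X2 has parents X0 and X1.
structure = {2: (0, 1)}
population = [[0, 1, 1, 0, 1, 0], [1, 1, 1, 1, 0, 0], [0, 0, 0, 1, 1, 1]]
print(estimate_cpt(population, var=2, parents=structure[2]))
```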
  6. Efficiency Enhancement of hBOA
     Potential bottlenecks
     1. Model building.
     2. Evaluation of candidate solutions.
     Efficiency enhancements
     1. Parallelization.
     2. Hybridization.
     3. Time continuation.
     4. Fitness evaluation relaxation.
     5. Prior knowledge utilization.
     6. Incremental and sporadic model building.
     7. Learning from experience.
  7. Parallel hBOA
     What to parallelize?
     1. Model building.
     2. Fitness evaluation.
     Parallelizing evaluation
     - Same situation as for other evolutionary algorithms.
     - Use a master-slave architecture to distribute evaluations.
     This work
     - Focus on the parallelization of model building.
  8. Model Building: Basics
     Components of model building
     1. Learn the structure (dependencies).
     2. Learn the parameters (conditional probabilities).
     Learning the structure
     - Start with an empty network.
     - Add one edge at a time (without creating cycles).
     - Choose the edge that improves the network the most.
     Learning the parameters
     - Compute frequencies from the selected solutions.
     - Estimate probabilities from the computed frequencies.
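The greedy structure-learning procedure described above can be sketched as follows. Note that hBOA actually scores networks with a Bayesian metric over decision-tree local structures; the BIC-style score below is a simplified stand-in, and all function names are my own.

```python
import math
from itertools import product

def bic_term(pop, var, parents):
    # Log-likelihood of X_var given its parents, minus a complexity penalty.
    n = len(pop)
    ll = 0.0
    for pa_vals in product((0, 1), repeat=len(parents)):
        rows = [s for s in pop if tuple(s[p] for p in parents) == pa_vals]
        for v in (0, 1):
            c = sum(1 for s in rows if s[var] == v)
            if c:
                ll += c * math.log(c / len(rows))
    return ll - 0.5 * math.log(n) * (2 ** len(parents))

def creates_cycle(parents, frm, to):
    # Adding frm -> to creates a cycle iff `to` is already an ancestor of `frm`.
    stack, seen = [frm], set()
    while stack:
        node = stack.pop()
        if node == to:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(parents[node])
    return False

def learn_structure(pop, n_vars, max_parents):
    parents = {i: [] for i in range(n_vars)}
    score = {i: bic_term(pop, i, []) for i in range(n_vars)}
    while True:
        best = None
        for frm, to in product(range(n_vars), repeat=2):
            if frm == to or frm in parents[to] or len(parents[to]) >= max_parents:
                continue
            if creates_cycle(parents, frm, to):
                continue
            gain = bic_term(pop, to, parents[to] + [frm]) - score[to]
            if gain > 0 and (best is None or gain > best[0]):
                best = (gain, frm, to)
        if best is None:          # no edge improves the network any further
            return parents
        gain, frm, to = best
        parents[to].append(frm)   # greedily add the best edge
        score[to] += gain
```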
  9. Complexity Analysis of Model Building
     Notation
     - Upper bound on the number of parents: k
     - Population size: N
     - Number of bits: n
     Analysis
     - Learn structure: O(k n^2 N).
     - Learn parameters: O(k n N).
     Bottom line
     - Learning the structure is the most expensive part.
     - We should parallelize the greedy algorithm for learning the structure.
  10. Parallelizing Structure Learning
      Master-slave approach (Ocenasek & Schwarz, 2000)
      1. The master sends each node to a separate slave processor.
      2. The master distributes the population to all slave processors.
      3. Each slave processor determines the parents of its node.
      4. The master collects the results and learns the parameters.
      Variations
      - Each processor deals with multiple nodes.
      - Parameters are also learned locally on the slave processors.
  11. Ordering Nodes to Avoid Cycles
      Motivation
      - The master-slave approach looks great, but a Bayesian network cannot contain cycles!
      - Communication would be required after each new edge added on any slave, which leads to a LOT of communication.
      Ordering nodes (Ocenasek & Schwarz, 2000)
      - Order the nodes randomly before learning the structure.
      - Only allow edges consistent with the ordering.
      - No need to check for cycles.
      - Each processor can deal with the edges into a single node.
      - Communication is needed only before/after the learning.
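A sketch of the ordering-based scheme, under the assumption that each "slave" is a worker process and that the per-node scoring function gain(pop, node, parents) is supplied by the caller (for example, the BIC-style bic_term from the earlier sketch; it must be a module-level function so it can be pickled, and calls into the pool should sit under an `if __name__ == "__main__":` guard when run as a script). Because each variable may only take parents that precede it in the random ancestral ordering, the per-node searches are independent and acyclic by construction, so no cycle checks or mid-learning communication are needed.

```python
import random
from multiprocessing import Pool

def best_parents(task):
    """Greedily pick parents for one node from its predecessors only."""
    node, predecessors, pop, max_parents, gain = task
    parents = []
    while len(parents) < max_parents:
        candidates = [(gain(pop, node, parents + [c]), c)
                      for c in predecessors if c not in parents]
        if not candidates:
            break
        best_gain, best = max(candidates)
        if best_gain <= 0:
            break
        parents.append(best)
    return node, parents

def parallel_structure(pop, n_vars, max_parents, gain, workers=4):
    # Generate the random ancestral ordering once, before learning.
    order = list(range(n_vars))
    random.shuffle(order)
    position = {v: i for i, v in enumerate(order)}
    # One task per node; only predecessors in the ordering may become parents,
    # so the tasks never interact and the resulting network has no cycles.
    tasks = [(v, [u for u in range(n_vars) if position[u] < position[v]],
              pop, max_parents, gain) for v in range(n_vars)]
    with Pool(workers) as pool:  # the pool plays the role of the slave processors
        return dict(pool.map(best_parents, tasks))
```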
  12. Ordering Nodes: Example
      Consider the ordering (X1, X3, X2, X4, X0, X5).
      [Figure: a network consistent with the ordering vs. an inconsistent one]
  13. Example Speedups with Parallel hBOA
      Results from Ocenasek et al. (2004)
      - Nearly linear speedups even for relatively small problems.
      - No need for parallel computers with shared memory.
  14. Effects of Node Ordering
      Positive effects
      - Straightforward parallelization with little communication.
      - Slightly faster model learning even on a sequential computer.
      Negative effects
      - Model structures are restricted.
      - Models may become less accurate.
      - Population size and the number of generations might increase.
  15. Description of Experiments
      Test functions
      - Concatenated trap of order 3 and 5.
      - Hierarchical trap.
      - Ising spin glass (2D).
      Scalability analysis
      - Number of function evaluations with respect to problem size.
      Parameters
      - Population size determined by bisection to ensure 30/30 successful runs.
      - Number of generations upper bounded by n.
      - Window size in RTR: w = min{n, N/20}.
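The bisection procedure for choosing the population size is not spelled out on the slide; the following is a hedged sketch of the usual approach: double the population size until hBOA succeeds in all 30 independent runs, then bisect between the last failing and the first succeeding size. The callable solves(pop_size) is hypothetical and stands for one hBOA run that reports success or failure; the stopping tolerance is an assumption.

```python
def bisection_population_size(solves, runs=30, start=16):
    """Approximate the smallest population size that succeeds `runs` times in a row."""
    def reliable(pop_size):
        return all(solves(pop_size) for _ in range(runs))

    # Double the population size until hBOA becomes reliable...
    high = start
    while not reliable(high):
        high *= 2
    # ...then bisect between the last failing and the first succeeding size.
    low = high // 2
    while high - low > max(1, low // 16):  # stop within roughly 6% (assumed tolerance)
        mid = (low + high) // 2
        if reliable(mid):
            high = mid
        else:
            low = mid
    return high
```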
  16. Results on dec-3
      [Plot: number of evaluations vs. problem size (30-150 bits) for hBOA and hBOA with ordered nodes]
      - hBOA with node ordering needs slightly more evaluations.
      - But the differences are relatively small.
  17. Results on trap-5
      [Plot: number of evaluations vs. problem size (30-150 bits) for hBOA and hBOA with ordered nodes]
      - Results are mixed (neither approach is consistently better).
      - But the differences are again relatively small.
  18. Results on hTrap
      [Plot: number of evaluations vs. problem size (27, 81, 243 bits) for hBOA and hBOA with ordered nodes]
      - hBOA with node ordering in fact outperforms standard hBOA.
      - But the differences are again relatively small.
  19. Results on 2D Ising Spin Glass
      [Plot: number of evaluations vs. problem size (36-196 bits) for hBOA and hBOA with ordered nodes]
      - hBOA with node ordering needs slightly more evaluations.
      - But the differences are relatively small.
  20. Summary and Conclusions
      Summary
      - Tested the scalability of parallel model building in hBOA.
      - Analyzed the results.
      Conclusions
      - Parallelization leads to a slight increase in the number of function evaluations.
      - But the negative effects are negligible compared to the gains.
      - Bottom line: if model building is the bottleneck, parallelize the model building!
  21. Acknowledgments
      - NSF; NSF CAREER grant ECS-0547013.
      - University of Missouri: High Performance Computing Collaboratory sponsored by Information Technology Services; Research Award; Research Board.
