
Explaining Optimization in Genetic Algorithms with Uniform Crossover

  1. Explaining Optimization in Genetic Algorithms with Uniform Crossover. Keki Burjorjee (kekib@cs.brandeis.edu). Foundations of Genetic Algorithms Conference XII (2013).
  2. Uniform Crossover. First used by Ackley (1987). Studied and popularized by Syswerda (1989) and Spears & De Jong (1990, 1991). Numerous accounts of optimization in GAs with UX. In practice, UX frequently outperforms crossover with tight linkage (Fogel, 2006).
  6. Optimization in GAs with UX: Unexplained. Cannot be explained within the rubric of the BBH (Building Block Hypothesis). No viable alternative has been proposed.
  7. Optimization in GAs with UX: Unexplained. The Hyperclimbing Hypothesis: a scientific explanation for optimization in GAs with UX.
  8. What Does “Scientific” Mean? Logical Positivism (Proof or Bust): scientific truth is absolute; emphasis on verifiability (i.e., mathematical proof). The Popperian method: scientific truth is provisional; emphasis on falsifiability (testable predictions).
  10. Popperian Method. The logic behind the Popperian method is the contrapositive: (Theory ⇒ Phenomenon) ⇔ (¬Phenomenon ⇒ ¬Theory). Additional tightening by Popper: an unexpected phenomenon earns the theory more credence, e.g., gravitational lensing → general theory of relativity.
  12. Additional Requirements in a Science of EC. Weak assumptions about the distribution of fitness (this is just Occam's Razor). Upfront proof of concept (avoid another “Royal Roads moment”). Nice to have: identification of a core computational efficiency.
  15. Hyperclimbing Hypothesis. GAs with UX perform optimization by efficiently implementing a global search heuristic called hyperclimbing.
  16. Outline. (1) Highlight the symmetry of UX. (2) Describe the hyperclimbing heuristic. (3) Provide proof of concept: show a GA implementing hyperclimbing efficiently. (4) Make a prediction and validate it on MAX-3SAT and the Sherrington-Kirkpatrick spin glasses problem. (5) Outline future work.
  17. Variation in GAs (illustration, built up step by step in the deck). Two parent strings, e.g. 010010101...0 and 100110011...1. Uniform crossover draws a mask X1, X2, ..., Xℓ; each child locus inherits from one parent or the other according to the mask (arrows), yielding, e.g., 000010011...0. Bitwise mutation then draws a mask Y1, Y2, ..., Yℓ and flips the indicated loci (double arrows), yielding the final offspring, e.g., 000110001...0.
  22. Variation in GAs. Crossover mask X1, ..., Xℓ; mutation mask Y1, ..., Yℓ. UX: X1, ..., Xℓ are independent. Absence of positional bias (Eshelman et al., 1989): attributes can be arbitrarily permuted. Suppose Y1, ..., Yℓ are independent and independent of the crossover mask; then if locus i is immaterial to fitness, it can be spliced out. (A sketch of these masks in code follows below.)
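To make the two masks concrete, here is a minimal Python sketch of UX-plus-mutation variation as pictured above. The string length and the 0.005 mutation rate are arbitrary placeholders, not values from the talk.

```python
import random

def uniform_crossover(p1, p2):
    """Uniform crossover: each child bit is drawn from one of the two
    parents independently with probability 1/2 (the mask X1..Xl)."""
    mask = [random.random() < 0.5 for _ in p1]
    return [a if m else b for m, a, b in zip(mask, p1, p2)]

def mutate(child, rate=0.005):
    """Bitwise mutation: each locus flips independently (the mask Y1..Yl)."""
    return [b ^ (random.random() < rate) for b in child]

parent1 = [random.randint(0, 1) for _ in range(20)]
parent2 = [random.randint(0, 1) for _ in range(20)]
offspring = mutate(uniform_crossover(parent1, parent2))
print(offspring)
```

Because every mask bit is drawn independently, the loci of the two masks can be permuted without changing the distribution of offspring, which is the symmetry the slide highlights.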
  23. The Hyperclimbing Heuristic.
  24. Hyperclimbing: A Stochastic Search Heuristic. The setting: f : {0, 1}^ℓ → ℝ; f may be stochastic.
  26. Schema Partitions and Schemata. Given an index set I ⊂ {1, ..., ℓ} with |I| = k, I partitions the search space into 2^k schemata; the schema partition is denoted by I, and the coarseness of the partition is determined by k (the order). E.g., for I = {3, 7}, the four schemata are (A3 = 0, A7 = 0), (A3 = 0, A7 = 1), (A3 = 1, A7 = 0), (A3 = 1, A7 = 1). For fixed k, the number of schema partitions of order k is (ℓ choose k) ∈ Ω(ℓ^k).
  30. The Effect of a Schema Partition. Effect of I: the variance of the average fitness of the schemata in I. If I ⊂ I′, then Effect(I) ≤ Effect(I′). (A Monte Carlo sketch follows below.)
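The effect of a partition can be estimated directly from its definition. A minimal Monte Carlo sketch, assuming f takes a 0/1 list and that plain random sampling within each schema is an acceptable estimator; the sample size is a placeholder of ours.

```python
import itertools, random, statistics

def effect(f, I, length, samples_per_schema=200):
    """Estimate Effect(I): the variance of the average fitness of the
    2^|I| schemata in the partition induced by the index set I."""
    schema_means = []
    for bits in itertools.product([0, 1], repeat=len(I)):
        total = 0.0
        for _ in range(samples_per_schema):
            x = [random.randint(0, 1) for _ in range(length)]
            for locus, b in zip(I, bits):  # clamp the defining loci of this schema
                x[locus] = b
            total += f(x)
        schema_means.append(total / samples_per_schema)
    return statistics.pvariance(schema_means)

# Example: a fitness that depends only on loci 3 and 7 has a detectable
# effect for I = (3, 7) and a near-zero effect for unrelated loci.
f = lambda x: float(x[3] == x[7])
print(effect(f, (3, 7), length=20))   # ~0.25
print(effect(f, (0, 1), length=20))   # ~0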
  32. How a Hyperclimbing Heuristic Works. Finds a coarse schema partition with a detectable effect; limits future sampling to a schema with above-average fitness; this raises the expected fitness of all future samples.
  36. How a Hyperclimbing Heuristic Works. Limiting future sampling to some schema is equivalent to fixing the defining loci of that schema in the population. The hyperclimbing heuristic now recurses: search occurs over the unfixed loci, and so on. (A sketch of the full loop follows below.)
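As a concreteness aid, here is a minimal sketch of the heuristic just described, under assumptions of ours rather than the talk's: order-k partitions are searched exhaustively (which is itself Ω(ℓ^k); the point of the hypothesis is that a GA with UX does this implicitly and cheaply), and "detectable" is a fixed threshold on the spread of conditional schema means.

```python
import itertools, random

def hyperclimb(f, length, k=2, threshold=0.1, samples=300, max_rounds=10):
    """Sketch of hyperclimbing: find a small (order-k) partition of the
    unfixed loci with a detectable conditional effect, fix its loci to the
    schema with the highest conditional mean fitness, and recurse over the
    remaining loci. No backtracking."""
    fixed = {}  # locus -> fixed bit
    def sample(clamp):
        x = [random.randint(0, 1) for _ in range(length)]
        for i, b in {**fixed, **clamp}.items():
            x[i] = b
        return f(x)
    for _ in range(max_rounds):
        free = [i for i in range(length) if i not in fixed]
        best = None
        for I in itertools.combinations(free, k):  # exhaustive stand-in, Omega(l^k)
            means = {
                bits: sum(sample(dict(zip(I, bits))) for _ in range(samples)) / samples
                for bits in itertools.product([0, 1], repeat=k)
            }
            spread = max(means.values()) - min(means.values())
            if spread > threshold and (best is None or spread > best[0]):
                best = (spread, I, max(means, key=means.get))
        if best is None:
            break  # no coarse conditional effect is detectable; stop
        _, I, bits = best
        fixed.update(zip(I, bits))  # limit all future sampling to this schema
    return fixed
```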
  39. Fitness Distribution Assumption. ∃ a small set of unfixed loci, I, such that the conditional effect of I is detectable (conditional upon the loci that have already been fixed), while the unconditional effect of I may be undetectable. E.g., the effect of 1*#*0*#* is detectable, but the effect of **#***#* is undetectable ('*' = wildcard, '#' = defined locus). Called staggered coarse conditional effects.
  42. The Fitness Distribution Assumption is Weak. The staggered-coarse-conditional-effects assumption is weaker than the fitness distribution assumption in the BBH, which assumes unstaggered coarse unconditional effects. Weaker assumptions are more likely to be satisfied in practice, which is good because we aspire to explain optimization in all GAs with UX.
  44. The Hyperclimbing Heuristic is in Good Company. Hyperclimbing is an example of a global decimation heuristic. Global decimation heuristics iteratively reduce the size of a search space: they use non-local information to find and fix partial solutions, with no backtracking. E.g., survey propagation (Mézard et al., 2002), a state-of-the-art solver for large, random SAT problems.
  47. Proof of Concept.
  48. 4-Bit Stochastic Learning Parities Problem.
      fitness(x) ∼ N(+0.25, 1) if x_i1 ⊕ x_i2 ⊕ x_i3 ⊕ x_i4 = 1
      fitness(x) ∼ N(−0.25, 1) otherwise
      (Figure: the two fitness densities, Gaussians centered at −0.25 and +0.25.)
      A sketch of this fitness function follows below.
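A minimal sketch of this fitness function. The string length of 26 matches the example strings on the next slide, but the positions of the four effective loci are hypothetical placeholders of ours.

```python
import random

EFFECTIVE = (4, 12, 17, 23)  # i1..i4: hypothetical positions of the effective loci

def parities_fitness(x, loci=EFFECTIVE):
    """N(+0.25, 1) if the XOR of the four effective loci is 1, else N(-0.25, 1)."""
    parity = x[loci[0]] ^ x[loci[1]] ^ x[loci[2]] ^ x[loci[3]]
    return random.gauss(0.25 if parity else -0.25, 1.0)

x = [random.randint(0, 1) for _ in range(26)]
print(''.join(map(str, x)), '->', round(parities_fitness(x), 5))
```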
  49. 4-Bit Stochastic Learning Parities Problem: The Game.
      10111001011101001111000001 → +0.5689
      10001010110110100011110000 → −0.25565
      11101100100010111101101110 → −0.37747
      01110101001111100000110001 → −0.29589
      00000010001000100110011101 → −1.4751
      00001100000100010001100000 → −0.234
      01000001111000111100110100 → +0.11844
      11010001000101000011000110 → +0.31481
      00000011100010001100000111 → +1.4435
      00100111000111000001110000 → −0.35097
      10100011011010101010100001 → +0.62323
      00011011101010100010100000 → +0.79905
      00101010101100101110100000 → +0.94089
      01010110000000110110110011 → −0.99209
      11100101000110010110110101 → +0.21204
      00011111011101110101000111 → +0.23788
      00001111111101011110111010 → −1.0078
      ...
      00001011101111000000111100 → +1.0823
      11011100111100010100111101 → −0.1315
  50. The same table, with arrows marking the columns of the four effective loci.
  51. Solving the 4-Bit Stochastic Learning Parities Problem. Naive approach: visit loci in groups of four; check for differentiation in the multivariate marginal fitness values. Time complexity is Ω(ℓ^4).
      Expected marginal fitness of (x_i1, x_i2, x_i3, x_i4):
      0000 → −0.25   0001 → +0.25   0010 → +0.25   0011 → −0.25
      0100 → +0.25   0101 → −0.25   0110 → −0.25   0111 → +0.25
      1000 → +0.25   1001 → −0.25   1010 → −0.25   1011 → +0.25
      1100 → −0.25   1101 → +0.25   1110 → +0.25   1111 → −0.25
  52. Solving the 4-Bit Stochastic Learning Parities Problem. Visiting loci in groups of three or fewer won't work: every univariate, bivariate, and trivariate expected marginal fitness is 0.0. (A sketch of this marginal check follows below.)
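The naive detection procedure from the last two slides, sketched in Python: estimate the multivariate marginal fitness of a group of loci by binning random samples on the bits at those loci. Locus positions and sample counts are placeholders of ours.

```python
import random
from itertools import product

EFFECTIVE = (4, 12, 17, 23)  # hypothetical positions of the effective loci

def parities_fitness(x, loci=EFFECTIVE):
    parity = x[loci[0]] ^ x[loci[1]] ^ x[loci[2]] ^ x[loci[3]]
    return random.gauss(0.25 if parity else -0.25, 1.0)

def marginals(group, length=26, samples=40000):
    """Average fitness of random strings, binned by the bits at `group`."""
    sums = {bits: 0.0 for bits in product([0, 1], repeat=len(group))}
    counts = dict.fromkeys(sums, 0)
    for _ in range(samples):
        x = [random.randint(0, 1) for _ in range(length)]
        key = tuple(x[i] for i in group)
        sums[key] += parities_fitness(x)
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

print(marginals(EFFECTIVE))    # differentiates: means near +/-0.25
print(marginals((4, 12, 17)))  # any strict subset stays flat near 0.0
```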
  53. Watch On YouTube: dynamics of an SGA on the 4-bit learning parities problem, #loci = 200.
  54. Symmetry: Analytic Conclusion. The expected number of generations for the red dots to diverge is constant with respect to the location of the red dots and the number of blue dots. The dynamics of a blue dot are invariant to the location of the red dots and the number of blue dots.
  55. Watch On YouTube: dynamics of an SGA on the 4-bit learning parities problem, #loci = 1000.
  56. Watch On YouTube: dynamics of an SGA on the 4-bit learning parities problem, #loci = 10000.
  57. Staircase Functions. Staircase function descriptor: (m, n, δ, σ, ℓ, L, V), where L is an m × n matrix of distinct loci and V is an m × n binary matrix of target values, e.g.
      L = [  3  87 ...  93        V = [ 1 0 ... 1
            42  58 ...  72              1 0 ... 0
             ...                         ...
            67  73 ...  81 ]            0 1 ... 1 ]
      A staircase function is a stochastic function f : {0, 1}^ℓ → ℝ.
      (Figure: expected fitness rises in steps 0, δ, 2δ, ..., (m−1)δ, mδ.)
      A sketch under an assumed reading follows below.
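A sketch of one plausible reading of this descriptor, assumed by us rather than quoted from the paper: step i is "climbed" iff the string carries the values in row i of V at the loci in row i of L, steps count only while climbed in order from the first, and fitness is Gaussian with mean (steps climbed) × δ and standard deviation σ.

```python
import random

def staircase_fitness(x, L, V, delta=0.2, sigma=1.0):
    """Assumed reading: mean fitness rises by delta per consecutive step
    climbed from the first; step i is climbed iff x matches row i of V
    at the loci in row i of L."""
    steps = 0
    for loci, values in zip(L, V):
        if all(x[i] == v for i, v in zip(loci, values)):
            steps += 1
        else:
            break  # steps must be climbed in order
    return random.gauss(steps * delta, sigma)

# The l=50, m=5, n=4 example from the later slides (0-indexed loci here).
L = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15], [16, 17, 18, 19]]
V = [[1, 1, 1, 1]] * 5
x = [random.randint(0, 1) for _ in range(50)]
print(staircase_fitness(x, L, V))
```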
  58. Staircase Function. (Figure: square plot with both axes ticked at 64, 128, 192, 256.)
  62. Staircase Functions. δ = 0.2, σ = 1.
  63. ℓ = 50, m = 5, n = 4, δ = 0.2, σ = 1.
      L = [  1  2  3  4        V = [ 1 1 1 1
             5  6  7  8              1 1 1 1
             9 10 11 12              1 1 1 1
            13 14 15 16              1 1 1 1
            17 18 19 20 ]            1 1 1 1 ]
      Watch On YouTube.
  64. ℓ = 50, m = 5, n = 4, δ = 0.2, σ = 1; L and V as on the previous slide. (Figure; #trials = 100, error bars show standard error.)
  65. ℓ = 50, m = 5, n = 4, δ = 0.2, σ = 1.
      L = [ 35 25 37 46
            31  5 32 20
            21 33 50 42
            40 27 30  3
            12 14 48  2 ]   V = all ones (5 × 4)
      Watch On YouTube.
  66. ℓ = 5000, m = 5, n = 4, δ = 0.2, σ = 1.
      L = [ 2050 4659 1931 3284
            2130 2404  205 4418
             143 1284   53  437
            2061  904 2650 3080
            3503  724 4822 4444 ]   V = all ones (5 × 4)
      Watch On YouTube.
  67. ℓ = 5000, m = 5, n = 4, δ = 0.2, σ = 1; L as on the previous slide.
      V = [ 0 0 1 0
            1 1 0 0
            1 1 0 1
            1 1 0 1
            0 0 1 1 ]
      Watch On YouTube.
  68. Prediction & Validation.
  69. ℓ = 500, m = 125, n = 4, δ = 0.2, σ = 1.
      L = [   1   2   3   4        V = all ones (125 × 4)
              5   6   7   8
              ...
            497 498 499 500 ]
      Watch On YouTube.
  70. Same parameters. (Figure; #trials = 10, error bars show standard error.)
  74. Uniform Random MAX-3SAT. #variables = 1000, #clauses = 4000. (Figure, built up over three slides: satisfied clauses, 3550 to 4000, vs. generations, 0 to 6000, in two panels, plus a third panel tracking unmutated loci, 0 to 900, vs. generation. #trials = 10; error bars show standard error.) A sketch of the fitness function follows below.
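For reference, a self-contained sketch of the fitness function behind this experiment: a uniform random 3-SAT instance at the slide's size, scored by the number of satisfied clauses. The generator's details (three distinct variables per clause, each negated by a fair coin flip) are the standard construction, assumed rather than taken from the talk.

```python
import random

def random_3sat(num_vars=1000, num_clauses=4000, seed=0):
    """Uniform random 3-SAT instance: each clause has 3 distinct
    variables, each negated with probability 1/2."""
    rng = random.Random(seed)
    return [[(v, rng.random() < 0.5)  # (variable index, is_negated)
             for v in rng.sample(range(num_vars), 3)]
            for _ in range(num_clauses)]

def satisfied_clauses(assignment, clauses):
    """MAX-3SAT fitness: number of satisfied clauses."""
    return sum(any(assignment[v] != neg for v, neg in clause)
               for clause in clauses)

clauses = random_3sat()
x = [random.randint(0, 1) for _ in range(1000)]
print(satisfied_clauses(x, clauses))  # a random assignment satisfies ~7/8 of 4000, i.e. ~3500
```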
  77. SK Spin Glasses System. #spins ℓ = 1000; fitness(σ) = Σ_{1 ≤ i < j ≤ ℓ} J_ij σ_i σ_j, with each J_ij drawn from N(0, 1) and each σ_i drawn from {−1, +1}. (Figure, built up over three slides: fitness, 0 to 2.5 × 10^4, vs. generations, 0 to 8000, in two panels, plus a third panel tracking unmutated loci, 0 to 1000, vs. generation. #trials = 10; error bars show standard error.) A sketch of this fitness function follows below.
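A NumPy sketch of the SK fitness above. A GA would encode each spin σ_i as a bit b_i via σ_i = 2·b_i − 1; that mapping, and the use of an upper-triangular coupling matrix to realize the i < j sum, are implementation choices of ours.

```python
import numpy as np

def sk_instance(n=1000, seed=0):
    """Couplings J_ij ~ N(0, 1) for i < j; zero on and below the diagonal."""
    J = np.random.default_rng(seed).standard_normal((n, n))
    return np.triu(J, k=1)

def sk_fitness(spins, J):
    """fitness(sigma) = sum_{i<j} J_ij * sigma_i * sigma_j, sigma_i in {-1,+1}."""
    return spins @ J @ spins

J = sk_instance()
bits = np.random.default_rng(1).integers(0, 2, size=1000)
sigma = 2 * bits - 1  # map a GA bit string to spins
print(sk_fitness(sigma, J))
```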
  80. Conclusion.
  81. Summary. Proposed the Hyperclimbing Hypothesis: weak assumptions about the distribution of fitness; based on a computational efficiency of GAs with UX; upfront proof of concept; testable predictions + validation on uniform random MAX-3SAT and the SK spin glasses problem.
  86. Corollaries. Implicit parallelism is real, but it is not your grandmother's implicit parallelism: the effects of coarse schema partitions are evaluated in parallel, NOT the average fitness of short schemata, and the defining length of a schema partition is immaterial. The new implicit parallelism is more powerful. Think in terms of hyperscapes, not landscapes.
  88. What Now? Flesh out the hypothesis: what does a deceptive hyperscape look like? More testable predictions + validation: are staggered coarse conditional effects indeed common? Generalization: explain optimization in other EAs. Outreach: identify and efficiently solve problems familiar to theoretical computer scientists. Exploitation: construct better representations; choose better parameters for existing EAs; construct better EAs (e.g., backtracking EAs).
  93. Questions? Follow me @evohackr.
