
Toward a Natural Genetic / Evolutionary Algorithm for Multiobjective Optimization

New Genetic or Evolutionary Algorithm for multiobjective optimization, that attempts to find tradeoff solutions and scales easily with increase in parameter space as well as objective space. Does not use complex niche calculation that is used in existing multiobjective genetic algorithms.



1. Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization. Hariharane Ramasamy, Evolutionary Systems Inc., Cupertino, CA. Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization – p. 1/58
2. Outline
1. Problem Description
2. Practical Examples
3. Classical Algorithms
4. Genetic Algorithms
5. Multiobjective Genetic Algorithms
6. Extended Genetic Algorithms
7. Results
8. Protein Folding—Lattice Model
9. Future Research
10. Conclusion
3. Problem Description
1. In most practical optimization problems, one deals with the simultaneous optimization of multiple objectives that may conflict with one another.
2. We seek efficient ways to optimize systems specified by a tuple of parameters (not necessarily numerical). Each tuple determines a system with fitnesses for the individual objectives; this maps the parameter space into the fitness space.
3. The best compromise solutions are called the tradeoff or nondominated solutions and form the Pareto front. The parameter set associated with the Pareto front is called the Pareto set.
4. An Example
Figure 1: Multiobjective problem example with two objectives.
The plot is an example of a biobjective problem (f1, f2) that depends on two parameters (x1, x2). Assuming minimization, the thick dark line represents the best tradeoff solutions and constitutes the Pareto front. In the figure, point O is dominated by all points in the interior of the quadrant OAB. The lines OA, OB, OC, and OD define the boundary points where f1 or f2 gets better or worse by moving either horizontally or vertically with respect to point O. However, the points inside the regions OAC and ODB neither dominate nor are dominated by the point O: each of them has either f1 or f2 optimal, but not both, with respect to point O.
5. Local and Global Pareto Front
A local Pareto front dominates every point in its neighborhood, and a global Pareto front dominates every point in the objective space. In the left plot, A and B are two disjoint objective regions. Assuming minimization of objectives, B contains the global Pareto front and A the local Pareto front. In the right plot, the curve AOBCD contains the entire objective space; OB is a local Pareto front, and the global Pareto front is the union of AO and CD.
6. Multiobjective Problem
Thus a multiobjective problem can be stated as:

    Minimize F(x) = (f1(x), f2(x), ..., fm(x))
    such that x ∈ S,

where x = (x1, x2, ..., xn), S is the feasible parameter space, and m is the number of objectives. We seek feasible configurations, i.e., tuples (x1, x2, ..., xn) that map into points (f1, f2, ..., fm) in objective space which cannot be improved w.r.t. any objective without worsening some other objective.
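The dominance relation behind this definition can be made concrete in a few lines of Python (an illustrative sketch, not code from the talk; function names are our own):

```python
from typing import Sequence

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: list) -> list:
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(pareto_front(pts))  # (3, 3) and (5, 5) are dominated by (2, 2)
```

The naive all-pairs filter shown here is O(n^2) in the population size; it is only meant to pin down the definition.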
7. Practical Examples
1. Protein folding problem
2. Multiple knapsack problem
8. Protein Folding Problem
Protein fold: the primary sequence of a protein (left), based on its composition of amino acids, folds into a unique three-dimensional structure (right) under certain physiological conditions. The folding is driven by multiple physicochemical properties, such as hydrogen bonds and protecting hydrophobes from water, making it a multiobjective problem.
9. Protein Folding—Lattice Model
Allowed Moves (left); Buried Hydrophobe Rule (right).
The amino acids in the primary sequence of a protein are connected to each other by peptide bonds, and the resulting chain can be represented on a rectangular lattice. Using a three-dimensional rectangular lattice, protein folding is simulated with the move rules defined by the left picture. The right picture defines the condition under which a side chain in the rectangular lattice is considered buried.
10. Multiple Knapsack Problem
Problems such as bin packing, cutting stock, and financial management can all be modeled as multiple knapsack problems, in which:
1. we are given a set of items with specific weights and profits, and multiple knapsacks with fixed capacity;
2. we have to fill up each knapsack, maximizing its profits and space utilization.
We present results for two variations of the multiple knapsack problem:
1. In one problem, when an item is selected, it is included in all knapsacks.
2. In the second, the selected item is included in only one knapsack.
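The profit objectives for the two variations might be evaluated as follows (a minimal sketch with hypothetical helper names; capacity constraints and weights are omitted for brevity, and `profit[k][i]` is assumed to be the profit of item i in knapsack k):

```python
def profits_shared(selected, profit):
    """Variation 1: a selected item goes into every knapsack.
    The k-th objective is the total profit of knapsack k."""
    return [sum(profit[k][i] for i in selected) for k in range(len(profit))]

def profits_exclusive(assignment, profit):
    """Variation 2: each selected item goes into exactly one knapsack.
    assignment maps item index -> knapsack index, or None if unselected."""
    totals = [0] * len(profit)
    for item, k in assignment.items():
        if k is not None:
            totals[k] += profit[k][item]
    return totals

profit = [[3, 1, 4], [2, 2, 2]]          # 2 knapsacks, 3 items
print(profits_shared({0, 2}, profit))     # items 0 and 2 in both knapsacks
print(profits_exclusive({0: 0, 1: None, 2: 1}, profit))
```

Each variation yields one profit objective per knapsack, which is what makes the problem multiobjective.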
11. Existing Methods
1. Traditional methods
2. Genetic algorithm methods
12. Summary of Traditional Methods

| Name | Advantages | Disadvantages |
|---|---|---|
| Weighted Methods | Simple; works well when the Pareto front is simple. | Computationally expensive; fails with complicated Pareto fronts. |
| ε-constraint Method | Works with complicated regions. | Success largely depends on the initial solution, which might be selected in an infeasible region. |
| Lexicographic Approach | Simple; optimizes objectives sequentially by predefined priority. | Only a limited number of Pareto-optimal points are found. |
| Normal Boundary Intersection (NBI) | Finds well-spread Pareto points. | Fails with complicated landscapes in objective space. |
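For instance, the weighted method in the first row reduces the objective vector to a single scalar; sweeping the weights and minimizing the scalar recovers different tradeoff points on a convex Pareto front, but misses nonconvex portions. A minimal illustration, assuming minimization (the function names are our own):

```python
def weighted_sum(objectives, weights):
    """Scalarize an objective vector with nonnegative weights."""
    return sum(w * f for w, f in zip(weights, objectives))

def weighted_min(points, weights):
    """Pick the candidate point minimizing the weighted sum."""
    return min(points, key=lambda p: weighted_sum(p, weights))

pts = [(1, 5), (2, 2), (4, 1)]
# Sweeping w recovers different points of this (convex) front.
for w in (0.1, 0.5, 0.9):
    print(w, weighted_min(pts, (w, 1 - w)))
```

Note that each weight setting yields only one point, so many optimization runs are needed to approximate the whole front, which is part of why the table calls the method computationally expensive.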
13. Genetic Algorithm
A genetic algorithm, motivated by natural evolution, repeatedly applies genetic operators to a pool of solutions called a population.

    generate initial population P randomly
    set new population Pn
    while desired convergence is not achieved in the population do
        perform crossover with probability pc
        if the fitness of the offspring is better than the parents' then
            add the offspring to Pn
        else
            decide with very low probability to include the offspring in Pn
        end if
        perform mutation with probability pm
        perform reproduction with probability pr
        if size_of(Pn) ≥ N then
            replace P with N members from Pn
            reduce the size of Pn by N
        end if
    end while
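One way the loop above might be realized in Python. The acceptance and replacement details (comparing a child against the worse parent, falling back to the better parent) are our assumptions where the pseudocode is silent; shown here maximizing the OneMax bit-counting function:

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=100,
                      pc=0.9, pm=0.02, p_accept_worse=0.05, seed=1):
    """Minimal GA following the slide's loop: offspring are kept only if
    they beat a parent (rarely accepted otherwise), then mutated.
    `fitness` is maximized over lists of bits."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = rng.sample(pop, 2)
            c1, c2 = p1[:], p2[:]
            if rng.random() < pc:                      # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1 = p1[:cut] + p2[cut:]
                c2 = p2[:cut] + p1[cut:]
            for child in (c1, c2):
                better = fitness(child) > min(fitness(p1), fitness(p2))
                if better or rng.random() < p_accept_worse:
                    kept = child                       # accept offspring
                else:
                    kept = max((p1, p2), key=fitness)  # reproduce a parent
                # bit-flip mutation with probability pm per bit
                kept = [b ^ (rng.random() < pm) for b in kept]
                new_pop.append(kept)
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

best = genetic_algorithm(sum)   # maximize the number of ones (OneMax)
print(sum(best))
```

This single-objective sketch is the baseline that the multiobjective variants on the following slides extend.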
14. Genetic Algorithm—Metrics
1. Diversity—Genetic algorithms often converge to a few dominant solutions and lose diversity. Any further optimization is then ineffective, since repeated application of the operators yields the few converged dominant solutions. Diversity is essential for multiobjective problems.
2. Elitism—Selecting individuals with a bias to create better individuals is called elitism; selecting the best parents to replace the less fit members drives the population to converge to a few best parents.
3. Scalability—The performance of the algorithm should not deteriorate with an increase in the number of objectives and parameters.
4. Exploration—The ability of an algorithm to find new solutions or reproduce lost solutions is called exploration.
5. Exploitation—The ability to retain the current best solutions in the population is called exploitation.
15. Niching
1. Goldberg and Richardson proposed the initial method to promote diversity within the population.
2. The method uses a distance function, called the sharing function, which calculates similarities among the population members using a parameter called the niche radius, in parameter or objective space.
3. Members in a crowded neighborhood get their fitness degraded, thus preventing their selection for the next generation.
4. The sharing function helps to keep the solutions diverse.
16. Niching
1. The niche radius is difficult to determine. Too large a radius will ignore good tradeoff solutions; too small a radius can admit points that are already dominated in the set.
2. When there are more than two objectives, the number of solutions on the Pareto front increases, along with the complexity of the niche calculation.
3. In practical problems with more than two objectives, the sharing function has met with little success.
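The Goldberg and Richardson sharing scheme is commonly written as sh(d) = 1 − (d/σ_share)^α for d < σ_share and 0 otherwise, with each member's fitness divided by its niche count. A sketch (the parameter names and the default α = 1 are conventions from the literature, not from the talk):

```python
import math

def shared_fitness(population, fitness, sigma_share, alpha=1.0):
    """Degrade the fitness of members in crowded neighborhoods.
    population: list of numeric vectors; fitness: list of raw fitnesses;
    sigma_share: the niche radius."""
    def sh(d):
        return 1.0 - (d / sigma_share) ** alpha if d < sigma_share else 0.0
    shared = []
    for i, xi in enumerate(population):
        # niche count includes the member itself (sh(0) == 1), so it is >= 1
        niche_count = sum(sh(math.dist(xi, xj)) for xj in population)
        shared.append(fitness[i] / niche_count)
    return shared

pop = [[0.0], [0.1], [5.0]]
print(shared_fitness(pop, [1.0, 1.0, 1.0], sigma_share=1.0))
```

The two crowded members near 0 get degraded fitness while the isolated member at 5 keeps its full fitness; note the O(n^2) distance computation, which is part of the scaling complaint on this slide.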
17. Multiobjective Genetic Algorithms

| Name | Advantages | Disadvantages |
|---|---|---|
| Pareto Archived Evolution Strategy | Gives a clear description of the class of problems in which genetic algorithms should excel. | Does not scale well with problem dimension. |
| Vector Evaluated Genetic Algorithm | Executes by population reshuffling using each objective function. | Loses good solutions and does not have a good spread of the Pareto-optimal set. |
| Weight-Based Genetic Algorithm | Similar to weighted methods; evolves the weights along with the optimization of the objective function. | Fails when the objective function is nonconvex. |
18. Multiobjective Genetic Algorithms

| Name | Advantages | Disadvantages |
|---|---|---|
| Niched Pareto Genetic Algorithm | Builds a small nondominated set along with the algorithm. | Success largely depends on the small nondominated subset. |
| Nondominated Sorting Genetic Algorithm | Niching is performed in the decision space. | The niching function is too complex and scales poorly as the number of objectives increases. |
| Strength Pareto Evolutionary Algorithm | Finds well-spread Pareto points by maintaining an external nondominated set of fixed size. | Clustering is performed to maintain the size of the external Pareto set; this clustering process scales poorly with the size. |
19. Extended Genetic Algorithms
The algorithms will be published in a paper; only results are presented here.
20. Results
We'll see that our versions outperformed a large number of existing algorithms on a battery of tests commonly used in the literature to evaluate optimization methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. effectively move towards the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increase in the number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. find more than one distinct point in the Pareto set that maps to the same point in the Pareto front?
8. have a framework for flexible extension?
22. Schaeffer-1
(Plot of the obtained Schaeffer-1 front, f1 vs f2.)
Table 1: Summary of Results—Schaeffer 1

| Name | Pop. size | Total Pareto front points | Duration |
|---|---|---|---|
| NSGA | 100 | | ≤ 30 sec |
| SPEA | 100 | 100 | ≤ 30 sec |
| MOEAD | 100 | 100 | ≤ 30 sec |
| NSGA | 1000 | 1000 | 2 min |
| SPEA | 1000 | 1000 | 8 min |
| MOEAD | 1000 | 300 | 11 sec |
| MOEAD | 10000 | 300 | 11 sec |
| SEGA | 100 | 2541 | ≤ 30 sec |
| SEGA | 100 | 18500+ | 3 min |
| PEGA | 100 | 608 | ≤ 30 sec |
| PEGA | 100 | 900+ | 3 min |
23. Schaeffer-2
(Plot of the obtained Schaeffer-2 front, f1 vs f2.)
Table 2: Summary of Results—Schaeffer 2

| Name | Pop. size | Total Pareto front points | Duration |
|---|---|---|---|
| NSGA | 100 | 100 | ≤ 30 sec |
| SPEA | 100 | 100 | ≤ 30 sec |
| MOEAD | 100 | 100 | ≤ 30 sec |
| SEGA | 100 | 3767 | ≤ 30 sec |
| PEGA | 100 x 2 | 409 | ≤ 30 sec |
| SEGA | 100 | 114000+ | 3 min |
| PEGA | 100 x 2 | 2300+ | 3 min |
24. Viennet
(Plots of the obtained Viennet front: f1 vs f2, and f3.)
Table 3: Viennet—Summary of Results

| Name | Unique PF points | Unique PS points |
|---|---|---|
| NSGA | 100 | 100 |
| SPEA | 1 | 1 |
| MOEAD | 40 | 65 |
| SEGA | 2394 | 2405 |
| PEGA | 3522 | 3605 |
25. Kalyanmoy DTLZ4—SEGA Population Plot
(3-D population plot, f1, f2, f3, for SEGA and PEGA.)
Table 4: Kalyanmoy Scalable Multiobjective Problem DTLZ4—Seven Objectives—Summary of Results

| Name | Unique Pareto front points | Unique Pareto set points |
|---|---|---|
| NSGA2 | 999 | 999 |
| SEGA | 744 | 773 |
| PEGA | 3728 | 4815 |
27. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 500 generations (scatter plot).
28. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 1000 generations (scatter plot).
29. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 1500 generations (scatter plot).
30. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 2000 generations (scatter plot).
31. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 2500 generations (scatter plot).
32. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 3000 generations (scatter plot).
33. Pareto front—Progress: Knapsack 750-3, f1 vs f2 profits after 3300 generations (scatter plot).
34. Pareto front: Knapsack 750-3, Pareto front f1 vs f2 after 3300 generations, overlaying generations 500 through 3300 (scatter plot).
35. Pareto front: Knapsack 750-3, Pareto front f1 vs f3 after 3300 generations, overlaying generations 500 through 3300 (scatter plot).
36. Pareto front: Knapsack 750-3, Pareto front f2 vs f3 after 3300 generations, overlaying generations 500 through 3300 (scatter plot).
38. Exploration and Exploitation: PEGA, f1 vs f2 population, colored by objective subset (F1, F2, F3, F1-F2, F1-F3, F2-F3, F1-F2-F3) (scatter plot).
39. Exploration and Exploitation: PEGA, f2 vs f3 population, colored by objective subset (F1, F2, F3, F1-F2, F1-F3, F2-F3, F1-F2-F3) (scatter plot).
40. Exploration and Exploitation: PEGA, f1 vs f3 population, colored by objective subset (F1, F2, F3, F1-F2, F1-F3, F2-F3, F1-F2-F3) (scatter plot).
41. Results
We'll see that our versions outperformed a large number of existing algorithms on a battery of tests commonly used in the literature to evaluate optimization methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. effectively move towards the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increase in the number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. have a framework for flexible extension?
42. Scaling with Objectives
Kalyanmoy scalable objective test function (DTLZ4), five objectives.
43. Scaling with Objectives: DTLZ4—f1 vs f2—NSGA (left), SEGA (middle), PEGA (right). Figure 2: Results of DTLZ4 with five objectives (scatter plots). Among the tests we did, only the NSGA results can be compared qualitatively with SEGA and PEGA. The graphs clearly show the superior performance of the EGAs.
44. Scaling with Objectives: DTLZ4—f3 vs f4—NSGA (left), SEGA (middle), PEGA (right). Figure 3: Results of DTLZ4 with five objectives (scatter plots).
45. Scaling with Objectives: DTLZ4—f1 vs f5—NSGA (left), SEGA (middle), PEGA (right). Figure 4: Results of DTLZ4 with five objectives (scatter plots).
46. Scaling with Objectives
Kalyanmoy scalable objective test function (DTLZ4), seven objectives.
47. Scaling with Objectives: DTLZ4—f1 vs f2—NSGA (left), SEGA (middle), PEGA (right). Figure 5: Results of DTLZ4 with seven objectives (scatter plots). Among the tests we did, only the NSGA results can be compared qualitatively with SEGA and PEGA. The graphs clearly show the superior performance of the EGAs.
48. Scaling with Objectives: DTLZ4—f3 vs f4—NSGA (left), SEGA (middle), PEGA (right). Figure 6: Results of DTLZ4 with seven objectives (scatter plots).
49. Scaling with Objectives: DTLZ4—f5 vs f6—NSGA (left), SEGA (middle), PEGA (right). Figure 7: Results of DTLZ4 with seven objectives (scatter plots).
50. Scaling with Objectives: DTLZ4—f1 vs f7—NSGA (left), SEGA (middle), PEGA (right). Figure 8: Results of DTLZ4 with seven objectives (scatter plots).
52. Scaling with Parameter Space
Kalyanmoy scalable objective test function (DTLZ4), five objectives.
53. Scaling with Parameter Space: DTLZ4—Pareto set (x1, x2, x3)—NSGA (left), SEGA (middle), PEGA (right). Figure 9: Results of DTLZ4 with five objectives (3-D Pareto-set plots). Among the tests we did, only the NSGA results can be compared qualitatively with SEGA and PEGA. The graphs clearly show the superior performance of the EGAs.
54. Scaling with Parameter Space
Kalyanmoy scalable objective test function (DTLZ4), seven objectives.
55. Scaling with Parameter Space: DTLZ4—Pareto set (x1, x2, x3)—NSGA (left), SEGA (middle), PEGA (right). Figure 10: Results of DTLZ4 with seven objectives (3-D Pareto-set plots). Among the tests we did, only the NSGA results can be compared qualitatively with SEGA and PEGA. The graphs clearly show the superior performance of the EGAs.
56. Multiple Knapsack Problem (Exclusive)
We present results for 750 items and 3 knapsacks.
57. Three Knapsacks, 750 Items—Profits
SEGA f1 vs f3 (left); PEGA f1 vs f3 (right). Figure 11: Profits, KP 750-3. The left and right plots show the profits-to-weights ratio obtained by the sequential and parallel extended genetic algorithms for f1 vs f3. The conflicting nature of the objective functions can easily be seen in the plots.
58. Three Knapsacks, 750 Items—Items
SEGA f1 vs f3 (left); PEGA f1 vs f3 (right). Figure 12: Items, KP 750-3. The plots show the items obtained by the sequential and parallel extended genetic algorithms for f1 vs f3. The conflicting nature of the objective functions can easily be seen in the plots.
59. Parameter Space—Exploration
Solutions that map to a similar sum (plot: total items included in all knapsacks, SEGA+PEGA, versus number of occurrences). Figure 13: Items, KP 750-3. The plot shows the number of solutions assigned to the three knapsacks that map to the same sum.
61. Scattered Pareto Set
Nondominated points / nondominated set: plots of f1 versus f2 and the corresponding Pareto-set points (x1, x2, x3) for each set.
63. Flexibility
The results presented so far were run with the same algorithm without altering any parameters. Niching methods require suitable parameter selection to reach the Pareto front, which is entirely absent here. The population size is not increased to obtain a larger Pareto set. This is a great advantage, as we cannot provide infinite storage to hold the population.
64. Protein Folding—Interesting Results
Helix (left); Beta Sheets (right). Figure 14: Two interesting results obtained by the extended genetic algorithms. Red and blue represent hydrophobic and hydrophilic amino acids. The two green lattice sites, connecting the two helices, represent glycine, which has four additional lattice movements; glycine is often found near sharp turns in proteins due to its small size. In the right result, every other member in the sequence is hydrophobic, and hence the algorithm produced a beta sheet that folds against itself to bury the hydrophobes.
68. 68. Conclusion Pseudo-parallelismis a feature in the genetic algorithm in which parameter space is explored simultaneously by different members in the population. Extended genetic algorithms successfully extended pseudo-parallelism not only in parameter space but also in objective space. Parallel and sequential extended genetic algorithms represent two different extensions of the genetic algorithm. Both the algorithms were applied for the ﬁrst time to the protein folding problem and were presented in several conferences in the year 1996. Extended genetic algorithms have better performance in exploring objective and parameter space than do existing multiobjecitve genetic algorithms. Duplication and transposon operators were introduced for the ﬁrst time in the genetic algorithm.The new algorithms leave much room for extension and improvements. A few keyenhancements are mentioned in Future research. Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization – p. 55/58
69. 69. Conclusion
Pseudo-parallelism is a feature of the genetic algorithm in which the parameter space is explored simultaneously by different members of the population. The extended genetic algorithms successfully extend pseudo-parallelism not only to the parameter space but also to the objective space. The parallel and sequential extended genetic algorithms represent two different extensions of the genetic algorithm. Both algorithms were applied for the first time to the protein folding problem and were presented at several conferences in 1996. The extended genetic algorithms explore the objective and parameter spaces better than existing multiobjective genetic algorithms do. Duplication and transposon operators were introduced into the genetic algorithm for the first time. The extended genetic algorithms also performed better with an increase in the number of objectives and variables than the multiobjective genetic algorithms did.
The new algorithms leave much room for extension and improvement. A few key enhancements are mentioned under Future Research.
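The slides introduce duplication and transposon operators without giving their definitions. The following is one plausible minimal sketch of such biologically inspired operators on a list-of-bits chromosome; the segment-selection scheme here is an assumption for illustration, not the authors' actual method (which the slides themselves note was one arbitrary choice among several):

```python
import random

def duplication(chrom, rng=random):
    """Gene duplication: copy a random segment and insert the copy
    immediately after the original, lengthening the chromosome."""
    i = rng.randrange(len(chrom))
    j = rng.randrange(i, len(chrom))
    return chrom[:j + 1] + chrom[i:j + 1] + chrom[j + 1:]

def transposon(chrom, rng=random):
    """Transposition: excise a random segment and reinsert it at a
    random position, preserving chromosome length and content."""
    i = rng.randrange(len(chrom))
    j = rng.randrange(i, len(chrom))
    segment = chrom[i:j + 1]
    rest = chrom[:i] + chrom[j + 1:]
    k = rng.randrange(len(rest) + 1)
    return rest[:k] + segment + rest[k:]
```

Note the invariants: duplication strictly lengthens the chromosome, while transposition rearranges it without changing its gene content.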
72. 72. Future Research
Natural Genetic Algorithm: described on the next slide; it combines the parallel and sequential extended genetic algorithms.
Coding: binary coding was used in most of the problems here. The choice of coding depends on the type of problem and the parameters associated with it.
Operators: the duplication and transposon operators used one arbitrary scheme; there are other ways these operators could be performed.
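On the coding point above: the standard binary coding the slides refer to maps a fixed-length bitstring to a real-valued parameter on an interval. A minimal sketch of the usual decoding step (the interval bounds and gene length here are illustrative, not taken from the slides):

```python
def decode(bits, lo, hi):
    """Map a binary gene (list of 0/1) to a real value in [lo, hi].
    All-zeros decodes to lo, all-ones to hi, with 2**len(bits) levels."""
    value = int("".join(str(b) for b in bits), 2)
    return lo + (hi - lo) * value / (2 ** len(bits) - 1)

# Example: a 4-bit gene over [0, 15] decodes to its integer value.
x = decode([1, 0, 0, 0], 0.0, 15.0)  # 8.0
```

The resolution is (hi - lo) / (2**n - 1) for an n-bit gene, which is one reason the appropriate coding depends on the problem: real-parameter problems may be better served by Gray coding or direct real-valued representations.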
73. 73. Future Research
[Diagram slide: Toward a Natural Genetic/Evolutionary Algorithm]
74. 74. Questions
Please email sundar_hariharane @ yahoo.com