# Multi-objective optimization and benchmark function results

Implemented the Strength Pareto Evolutionary Algorithm (SPEA2) and the Non-dominated Sorting Genetic Algorithm (NSGA-II) in MATLAB, under the guidance of Asst. Prof. Divya Kumar, MNNIT Allahabad. The two algorithms are used to solve multi-objective functions and were tested on all the benchmark functions.

Both algorithms were then applied to Portfolio Optimization, satisfying different types of constraints to derive the optimal portfolio.

Published in: Engineering
### Multi-objective optimization and benchmark function results

1. Multi-objective optimization using NSGA-II and SPEA2. Project in-charge: Mr. Divya Kumar. Team (CS07): (1) Piyush Agarwal, (2) Saquib Aftab, (3) Ravi Ratan, (4) Ravi Shankar, (5) Pradhumna Mainali. Multi-objective optimization 1/31
2. What we did: studied genetic algorithms for single-objective and multi-objective optimization problems; implemented NSGA-II and the Strength Pareto Evolutionary Algorithm (SPEA2) in MATLAB; tested SPEA2 on all benchmark functions; tested NSGA-II on all benchmark functions; compared the results.
3. What is an Evolutionary Algorithm? Evolutionary algorithms (EAs) are often well suited to optimization problems involving several, often conflicting, objectives. They typically generate sets of solutions, allowing an approximation of the entire Pareto front to be computed. SPEA2 and NSGA-II are two such evolutionary algorithms, here applied to multi-objective functions.
4. Life cycle of an EA. Initialization: initializing the population in the first generation, satisfying the bounds and constraints of the problem. Parent selection: selection of the fittest individuals for the mating pool. Recombination: forming new individuals from the mating pool; crossover and mutation are applied to the parents to produce new individuals. Survivor selection: the fittest individuals of parents and children combined are selected as the population for the next generation.
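The four phases above can be condensed into a generic loop. Below is a minimal Python sketch (the project itself was implemented in MATLAB); the toy single-objective problem, operator choices, and parameter values are illustrative assumptions, not the project's actual settings:

```python
import random

random.seed(0)  # deterministic run for the demo

def evolve(pop_size, n_gen, init, fitness, crossover, mutate):
    # Initialization: population satisfying the problem bounds
    population = [init() for _ in range(pop_size)]
    for _ in range(n_gen):
        # Parent selection: binary tournaments fill the mating pool
        pool = [min(random.sample(population, 2), key=fitness)
                for _ in range(pop_size)]
        # Recombination: crossover and mutation form new individuals
        offspring = [mutate(crossover(random.choice(pool), random.choice(pool)))
                     for _ in range(pop_size)]
        # Survivor selection: fittest of parents and children combined
        population = sorted(population + offspring, key=fitness)[:pop_size]
    return population

# Toy demo: minimize f(x) = x^2 over [-10, 10]
best = evolve(pop_size=30, n_gen=50,
              init=lambda: random.uniform(-10, 10),
              fitness=lambda x: x * x,
              crossover=lambda a, b: (a + b) / 2,   # arithmetic crossover
              mutate=lambda x: x + random.gauss(0, 0.1))[0]
```

With elitist survivor selection (fittest of parents plus children), the best individual never worsens between generations, which is why the loop converges toward the optimum.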
5. Life cycle of an EA (contd.). Figure 1: life cycle of an evolutionary algorithm.
6. NSGA-II algorithm. Input: (1) N (population size); (2) P (population); (3) Q (offspring); (4) T (maximum number of generations). Output: (1) A (non-dominated set).
7. NSGA-II algorithm (contd.). (1) Initialization: generate an initial population P. (2) Mating selection: perform binary tournament selection with replacement on P in order to fill the mating pool. (3) Variation: apply recombination and mutation operators to the mating pool and store the resulting offspring population in Q. (4) Non-dominated sort: perform a non-dominated sort of P and Q combined. (5) Front division: divide the combined population into fronts; front 0 is non-dominated. (6) New generation: select the new population from the fronts.
8. Fast non-dominated sorting. (1) Each individual i is compared with every other individual j. (2) $n_i$ is the count of individuals that dominate the i-th individual. (3) $S_i$ is the set of individuals that i dominates. (4) When $n_i = 0$, the individual is a best (non-dominated) solution and is assigned to the first front. (5) After obtaining the first front, for each individual j in $S_i$, $n_j$ is decremented by 1, and the next front is obtained as in Step 4.
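The procedure on this slide can be written out directly. A small Python sketch (assuming minimization, with `objs[i]` the objective vector of individual `i`; the project's MATLAB implementation is not shown on the slides):

```python
def fast_non_dominated_sort(objs):
    """Return the list of fronts (front 0 first) for a list of objective vectors."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in at least one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    n = len(objs)
    S = [[] for _ in range(n)]   # S[i]: individuals that i dominates
    count = [0] * n              # count[i] = n_i: how many individuals dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objs[i], objs[j]):
                S[i].append(j)
            elif dominates(objs[j], objs[i]):
                count[i] += 1
        if count[i] == 0:        # n_i = 0: non-dominated, goes to the first front
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in S[i]:       # peel off the current front, decrement n_j
                count[j] -= 1
                if count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]           # drop the trailing empty front
```

For example, `fast_non_dominated_sort([(1, 1), (2, 2), (1, 2), (3, 3)])` places `(1, 1)` alone in front 0, since it dominates the other three points.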
9. Generation of NSGA-II. Figure 2: one generation of the NSGA-II algorithm.
10. Evolution of individuals on the ZDT3 function.
11. SPEA2 algorithm. Input: (1) N (population size); (2) $\bar{N}$ (archive size); (3) T (maximum number of generations). Output: (1) A (non-dominated set).
12. SPEA2 algorithm (contd.). (1) Initialization: generate an initial population $P_0$ and create the empty archive $\bar{P}_0 = \emptyset$; set $t = 0$. (2) Fitness assignment: calculate the fitness values of individuals in $P_t$ and $\bar{P}_t$. (3) Environmental selection: copy all non-dominated individuals in $P_t$ and $\bar{P}_t$ to $\bar{P}_{t+1}$, keeping its size at $\bar{N}$. (4) Termination: if $t \ge T$ or another stopping criterion is satisfied, set A to the set of decision vectors in $\bar{P}_{t+1}$ and stop. (5) Mating selection: perform binary tournament selection with replacement on $\bar{P}_{t+1}$ in order to fill the mating pool. (6) Variation: apply recombination and mutation operators to the mating pool and set $P_{t+1}$ to the resulting population; increment the generation counter ($t = t + 1$) and go to Step 2.
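Step 2's fitness assignment starts from each individual's strength. A minimal Python sketch of the strength and raw-fitness computation (the full SPEA2 fitness also adds a density term based on the k-th nearest neighbour, omitted here for brevity; the project's MATLAB code is not shown on the slides):

```python
def spea2_raw_fitness(objs):
    """Raw fitness R(i): sum of strengths S(j) over all j that dominate i.
    R(i) = 0 means i is non-dominated; lower is better (minimization)."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    n = len(objs)
    # Strength S(i): number of individuals that i dominates
    strength = [sum(dominates(objs[i], objs[j]) for j in range(n)) for i in range(n)]
    # Raw fitness R(i): total strength of i's dominators
    return [sum(strength[j] for j in range(n) if dominates(objs[j], objs[i]))
            for i in range(n)]
```

Summing dominator strengths (rather than just counting dominators) grades dominated individuals more finely, which is one of SPEA2's refinements over SPEA.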
13. Evolution of individuals on the Viennet function.
14. Testing on benchmark functions. Both algorithms, NSGA-II and SPEA2, were tested on all benchmark functions. Benchmark functions may be convex or non-convex (unconstrained), or may have single or multiple constraints. For all benchmark tests, the red graph is the NSGA-II curve and the yellow graph is the SPEA2 curve; the x-axis represents the first objective function and the y-axis the second.
15. Schaffer function N. 1: minimize $f_1(x) = x^2$ and $f_2(x) = (x - 2)^2$, subject to $-A \le x \le A$ with $10 \le A \le 10^5$.
16. ZDT1: minimize $f_1(x) = x_1$ and $f_2(x) = g(x)\,h(f_1(x), g(x))$, where $g(x) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_i$ and $h(f_1, g) = 1 - \sqrt{f_1 / g}$, subject to $0 \le x_i \le 1$ for $1 \le i \le 30$.
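ZDT1 is straightforward to evaluate. A small Python sketch of the definitions above (an illustrative re-implementation; the project's MATLAB code is not shown on the slides):

```python
import math

def zdt1(x):
    """ZDT1: x is a vector of 30 reals in [0, 1]; returns (f1, f2)."""
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / 29        # g(x) = 1 + (9/29) * sum_{i=2}^{30} x_i
    h = 1 - math.sqrt(f1 / g)          # h = 1 - sqrt(f1/g)
    return f1, g * h
```

On the Pareto-optimal front, $x_i = 0$ for $i \ge 2$, so $g = 1$ and $f_2 = 1 - \sqrt{f_1}$.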
17. Schaffer function N. 2: minimize $f_1(x) = -x$ if $x \le 1$, $x - 2$ if $1 < x \le 3$, $4 - x$ if $3 < x \le 4$, $x - 4$ if $x > 4$; and $f_2(x) = (x - 5)^2$, subject to $-5 \le x \le 10$.
18. ZDT3: minimize $f_1(x) = x_1$ and $f_2(x) = g(x)\,h(f_1(x), g(x))$, where $g(x) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_i$ and $h(f_1, g) = 1 - \sqrt{f_1 / g} - (f_1 / g) \sin(10 \pi f_1)$, subject to $0 \le x_i \le 1$ for $1 \le i \le 30$.
19. ZDT2: minimize $f_1(x) = x_1$ and $f_2(x) = g(x)\,h(f_1(x), g(x))$, where $g(x) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_i$ and $h(f_1, g) = 1 - (f_1 / g)^2$, subject to $0 \le x_i \le 1$ for $1 \le i \le 30$.
20. Fonseca and Fleming function: minimize $f_1(x) = 1 - \exp\left(-\sum_{i=1}^{n} \left(x_i - \frac{1}{\sqrt{n}}\right)^2\right)$ and $f_2(x) = 1 - \exp\left(-\sum_{i=1}^{n} \left(x_i + \frac{1}{\sqrt{n}}\right)^2\right)$, subject to $-4 \le x_i \le 4$ for $1 \le i \le n$.
21. Kursawe function: minimize $f_1(x) = \sum_{i=1}^{2} \left[-10 \exp\left(-0.2 \sqrt{x_i^2 + x_{i+1}^2}\right)\right]$ and $f_2(x) = \sum_{i=1}^{3} \left[|x_i|^{0.8} + 5 \sin(x_i^3)\right]$, subject to $-5 \le x_i \le 5$ for $1 \le i \le 3$.
22. Poloni's two-objective function (POL): minimize $f_1(x, y) = 1 + (A_1 - B_1(x, y))^2 + (A_2 - B_2(x, y))^2$ and $f_2(x, y) = (x + 3)^2 + (y + 1)^2$, where $A_1 = 0.5 \sin 1 - 2 \cos 1 + \sin 2 - 1.5 \cos 2$, $A_2 = 1.5 \sin 1 - \cos 1 + 2 \sin 2 - 0.5 \cos 2$, $B_1(x, y) = 0.5 \sin x - 2 \cos x + \sin y - 1.5 \cos y$, $B_2(x, y) = 1.5 \sin x - \cos x + 2 \sin y - 0.5 \cos y$, subject to $-\pi \le x, y \le \pi$.
23. Viennet function: minimize $f_1(x, y) = 0.5(x^2 + y^2) + \sin(x^2 + y^2)$, $f_2(x, y) = \frac{(3x - 2y + 4)^2}{8} + \frac{(x - y + 1)^2}{27} + 15$, and $f_3(x, y) = \frac{1}{x^2 + y^2 + 1} - 1.1 \exp(-(x^2 + y^2))$, subject to $-3 \le x, y \le 3$.
24. Binh and Korn function: minimize $f_1(x, y) = 4x^2 + 4y^2$ and $f_2(x, y) = (x - 5)^2 + (y - 5)^2$, subject to $g_1(x, y) = (x - 5)^2 + y^2 \le 25$, $g_2(x, y) = (x - 8)^2 + (y + 3)^2 \ge 7.7$, $0 \le x \le 5$, $0 \le y \le 3$.
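Constrained benchmarks like Binh and Korn pair the objectives with a feasibility test. A small Python sketch (how the project's MATLAB code handled infeasible individuals is not stated on the slides, so returning a boolean flag is an assumption for illustration):

```python
def binh_korn(x, y):
    """Binh and Korn objectives plus a feasibility check for g1, g2 and the bounds."""
    f1 = 4 * x**2 + 4 * y**2
    f2 = (x - 5)**2 + (y - 5)**2
    feasible = ((x - 5)**2 + y**2 <= 25            # g1
                and (x - 8)**2 + (y + 3)**2 >= 7.7  # g2
                and 0 <= x <= 5 and 0 <= y <= 3)    # variable bounds
    return f1, f2, feasible
```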
25. Chankong and Haimes function: minimize $f_1(x, y) = 2 + (x - 2)^2 + (y - 1)^2$ and $f_2(x, y) = 9x - (y - 1)^2$, subject to $g_1(x, y) = x^2 + y^2 \le 225$, $g_2(x, y) = x - 3y + 10 \le 0$, $-20 \le x, y \le 20$.
26. Test function: minimize $f_1(x, y) = x^2 - y$ and $f_2(x, y) = -0.5x - y - 1$, subject to $g_1(x, y) = 6.5 - \frac{x}{6} - y \ge 0$, $g_2(x, y) = 7.5 - 0.5x - y \ge 0$, $g_3(x, y) = 30 - 5x - y \ge 0$, $-7 \le x, y \le 4$.
27. Osyczka and Kundu function: minimize $f_1(x) = -25(x_1 - 2)^2 - (x_2 - 2)^2 - (x_3 - 1)^2 - (x_4 - 4)^2 - (x_5 - 1)^2$ and $f_2(x) = \sum_{i=1}^{6} x_i^2$, subject to $g_1(x) = x_1 + x_2 - 2 \ge 0$, $g_2(x) = 6 - x_1 - x_2 \ge 0$, $g_3(x) = 2 - x_2 + x_1 \ge 0$, $g_4(x) = 2 - x_1 + 3x_2 \ge 0$, $g_5(x) = 4 - (x_3 - 3)^2 - x_4 \ge 0$, $g_6(x) = (x_5 - 3)^2 + x_6 - 4 \ge 0$, with bounds $0 \le x_1, x_2, x_6 \le 10$, $1 \le x_3, x_5 \le 5$, $0 \le x_4 \le 6$.
28. Constr-Ex problem: minimize $f_1(x, y) = x$ and $f_2(x, y) = \frac{1 + y}{x}$, subject to $g_1(x, y) = y + 9x \ge 6$, $g_2(x, y) = -y + 9x \ge 1$, $0.1 \le x \le 1$, $0 \le y \le 5$.
29. Work in progress. One real-life application of multi-objective optimization is Portfolio Optimization. In a portfolio problem with an asset universe of $n$ securities, let $x_i$ ($i = 1, 2, \ldots, n$) designate the proportion of initial capital to be allocated to security $i$. There are typically two conflicting goals: minimize risk, $\sum_{i=1}^{n} \sum_{j=1}^{n} x_i \sigma_{ij} x_j$, and maximize profit, $\sum_{i=1}^{n} r_i x_i$, where $r_i$ is the expected return of the $i$-th security and $\sigma_{ij}$ is the covariance between the $i$-th and $j$-th securities.
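The two portfolio objectives above can be evaluated directly from the weight vector. A minimal Python sketch (the weights, returns, and covariance values in the example are made-up illustrative numbers, not data from the project):

```python
def portfolio_objectives(x, r, sigma):
    """Risk = sum_i sum_j x_i * sigma_ij * x_j; profit = sum_i r_i * x_i.
    x: capital proportions, r: expected returns, sigma: covariance matrix."""
    n = len(x)
    risk = sum(x[i] * sigma[i][j] * x[j] for i in range(n) for j in range(n))
    profit = sum(r[i] * x[i] for i in range(n))
    return risk, profit

# Illustrative two-asset example (made-up numbers): equal weights,
# expected returns 10% and 20%, uncorrelated assets
risk, profit = portfolio_objectives([0.5, 0.5], [0.1, 0.2],
                                    [[0.04, 0.0], [0.0, 0.09]])
```

Since a multi-objective EA minimizes, the profit objective would in practice be negated before being passed to NSGA-II or SPEA2.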
30. Constraints of Portfolio Optimization: $\sum_{i=1}^{n} x_i = 1$; $\alpha \le x_i \le \beta$; $d_{\min} \le d \le d_{\max}$; the 5/10/40 rule; $0 \le \alpha \le \beta \le 1$; where $\alpha$ and $\beta$ are the minimum and maximum capital proportions to be allocated to a security, and $d_{\min}$ and $d_{\max}$ are the minimum and maximum numbers of non-zero securities ($d$) in the portfolio.
31. Conclusion. The project showed that multi-objective evolutionary algorithms can solve multi-objective functions satisfying given sets of constraints. A higher number of generations leads to better solutions, up to an upper bound at which all solutions tend to converge. Multi-objective optimization algorithms can tackle various real-life applications by converting them into sets of objective functions with constraints.