Cellular Genetic Algorithm with Density Dependence for Dynamic Optimization Problems

K. K. Swamy (1), V. Punnarao (2), J. S. N. Jyothi (3)
(1, 2) Sinhgad Institute of Technology and Science
(3) Usha Rama College of Engineering

International Journal of Engineering Research and Applications (IJERA), ISSN: 2248-9622, www.ijera.com, Vol. 2, Issue 5, September-October 2012, pp. 034-043

Abstract. For dynamic optimization problems, the aim of an effective optimization algorithm is both to find the optimal solutions and to track the optima over time. In this paper, we propose two kinds of cellular genetic algorithms inspired by the density dependence scheme in ecological systems for solving dynamic optimization problems. Two improved evolution rules are proposed to replace the rule of the regular cellular genetic algorithm; in them, null cells are regarded as the food of the individuals in the population, and the maximum number of living individuals in the cellular space is limited by their food. Moreover, in the second proposed rule, a competition scheme among the best individuals within the neighborhood of an individual is also introduced. The performance of the proposed cellular genetic algorithms is examined on three dynamic optimization problems with different change severities. The computational results indicate that the new algorithms are superior in both convergence and diversity.

Keywords: cellular genetic algorithm, dynamic optimization, density dependence scheme

1. Introduction
Most optimization problems in the real world are dynamic optimization problems (DOPs). Evolutionary algorithms (EAs) have been widely and successfully applied to static optimization problems (SOPs). In DOPs, however, the evaluation function, the design variables, and the constraints are not fixed. Hence, for DOPs the aim of an effective optimization algorithm is not only to find the optimal solution but also to track the optima over time. In recent years there has been growing interest in studying EAs for DOPs, and several approaches have been developed, such as increasing diversity after a change via hypermutation [1] or random immigrants [2], maintaining diversity throughout the run [3, 4], memory schemes [5, 6], and multi-population approaches [7, 8].

The cellular genetic algorithm (CGA) is a subclass of genetic algorithms (GAs); it is built through an organic combination of evolutionary computation and cellular automata. In a CGA, the population is arranged in a given grid, the evolution of each individual is restricted to its neighborhood, and each individual is only allowed to perform genetic operations with the individuals in its neighborhood. With this distributed arrangement, the CGA performs well at maintaining genetic diversity, which is important for finding and tracking the dynamic optimum in DOPs. Hence, the CGA is considered a significant and meaningful algorithm for solving DOPs.

The research on combining ideas from cellular automata with genetic algorithms began with Manderick and Spiessens's work [9]. Over the past decade or so, CGAs have proven effective for solving many kinds of optimization problems from both classical and real-world settings. Many improved CGAs have been proposed. Kirley [10] introduced a novel evolutionary algorithm named the cellular genetic algorithm with disturbances, inspired by the nature of spatial interactions in ecological systems. Simoncini et al. [11] presented an anisotropic selection scheme for the CGA, improving performance by enhancing diversity and controlling the selective pressure. Janson and Alba [12] proposed a hierarchical CGA, where the population structure was augmented with a hierarchy according to the fitness of individuals. Nebro et al. [13] introduced an external archive into the CGA to store the better solutions; the search experience contained in the archive was fed back into the algorithm through a replacement strategy. Ishibuchi et al. [14] proposed a new CGA with two neighborhood structures: one for global elitism, the other for local competition among neighbors.

Theoretical research on CGAs is also active. Giacobini et al. [15] presented a theoretical study of the selection pressure in asynchronous CGAs with different evolution rules. Alba et al. [16] presented a comparative study of several asynchronous policies for updating the population in CGAs. Zhang [17] studied the evolution rules of optimization algorithms with cellular automata from the perspective of the ability of life reproduction and the probability of survival.

In this paper we investigate an improved cellular genetic algorithm for solving DOPs. Inspired by the density dependence scheme found in nature, we propose new evolution rules. The paper is structured as follows. Section 2 introduces the regular CGA with an evolution rule. In Section 3, two density dependence schemes are introduced, a population control scheme is discussed, and cellular genetic algorithms with density dependence are proposed. Section 4 introduces the chosen DOPs and presents the results of the proposed algorithms. Section 5 concludes the paper.
2. Cellular Genetic Algorithm with Evolution Rules

2.1. Basic concepts
A cellular automaton can be denoted mathematically as A = (L^d, S, N_d, f), in which A is the cellular automaton; L^d is the cellular space; S is the set of cell states, where each cell has exactly one state, such as "living" or "dead"; N_d denotes the neighborhood of a cell, such as the Von Neumann type, the Moore type, or the extended Moore type; and f is the local transfer function, which defines the state of the center cell from the states of its neighbors and is also called the evolution rule. Fig. 1 shows the Moore type on a grid, in which the cell in the small black square is the center cell and the cells between the two squares form its neighborhood; grey means living and white means dead. The proposed algorithm in this paper uses this neighborhood type.

In a CGA, complex optimization problems can be solved by some simple rules. In order to simulate biological evolution more effectively, it is important to introduce an evolution rule for the "living" or "dead" state of a cell: in the next loop, the state of a cell depends on the states of its neighbors. The Game of Life evolution rule is a typical example; in its classical form, a living cell stays alive only when it has two or three living neighbors, and a dead cell becomes alive only when it has exactly three living neighbors.

2.2. Cellular genetic algorithm with an evolution rule
The pseudo-code of the cellular genetic algorithm with an evolution rule is shown in Fig. 2. In this algorithm, each living individual only interacts and performs genetic operations with the individuals in its neighborhood. The fitness values of the offspring are calculated, and if an offspring is better than the center individual, the old center individual is replaced in the next generation. After the genetic operations, the state of each individual is updated by the evolution rule.
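The pseudo-code of Fig. 2 is not reproduced in this extraction, so the following Python sketch is only a rough illustration of the loop described above: the population sits on a toroidal grid, each living individual mates only within its Moore neighborhood, the center individual is replaced only when the offspring is better, and the cell states are then updated with the classical Game-of-Life rule. All names, the toy fitness function, and the parameter values are illustrative assumptions, not the authors' exact procedure.

    # Minimal sketch (not the authors' Fig. 2): one generation of a grid-structured GA.
    import random

    L = 100        # chromosome length, as in the paper's test problems
    GRID = 10      # 10 x 10 grid -> population size 100

    def fitness(ind):
        return sum(ind)                       # toy One-Max-style fitness

    def moore_neighbors(r, c):
        """Coordinates of the 8 Moore neighbors on a toroidal grid."""
        return [((r + dr) % GRID, (c + dc) % GRID)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if not (dr == 0 and dc == 0)]

    # One individual (bit list) and one state ('living'/'dead') per cell.
    pop = [[[random.randint(0, 1) for _ in range(L)] for _ in range(GRID)]
           for _ in range(GRID)]
    state = [['living' if random.random() < 0.7 else 'dead' for _ in range(GRID)]
             for _ in range(GRID)]

    def cga_generation(pop, state, pc=0.7, pm=0.01):
        new_pop = [row[:] for row in pop]
        for r in range(GRID):
            for c in range(GRID):
                if state[r][c] != 'living':
                    continue
                living = [(nr, nc) for nr, nc in moore_neighbors(r, c)
                          if state[nr][nc] == 'living']
                if not living:
                    continue
                # Mate with the best living neighbor, then mutate the offspring.
                mr, mc = max(living, key=lambda p: fitness(pop[p[0]][p[1]]))
                if random.random() < pc:
                    child = [a if random.random() < 0.5 else b
                             for a, b in zip(pop[r][c], pop[mr][mc])]
                else:
                    child = pop[r][c][:]
                child = [1 - g if random.random() < pm else g for g in child]
                # Replace the center individual only if the offspring is better.
                if fitness(child) > fitness(pop[r][c]):
                    new_pop[r][c] = child
        # Game-of-Life-style state update (classical rule, used here as a stand-in).
        new_state = [row[:] for row in state]
        for r in range(GRID):
            for c in range(GRID):
                n = sum(state[nr][nc] == 'living' for nr, nc in moore_neighbors(r, c))
                if state[r][c] == 'living':
                    new_state[r][c] = 'living' if n in (2, 3) else 'dead'
                else:
                    new_state[r][c] = 'living' if n == 3 else 'dead'
        return new_pop, new_state

    pop, state = cga_generation(pop, state)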
3. Proposed Algorithms
Among the existing research, most evolution rules in CGAs are introduced directly from cellular automata. With these evolution rules, complex system behavior can be obtained from simple settings, but the interaction between individuals and the relationship between the evolution scheme and the group behavior of the individuals have been ignored.

In nature, the state ("living" or "dead") of an individual in a population depends on the structure of the living space, and the mortality and survival rates depend on the density of the population in the living environment. When the density is low, food is adequate for the population and the survival rate increases. When the density is high, food supplies are scarce, intraspecific competition becomes serious, and the mortality rate increases. Inspired by this phenomenon, two kinds of local density dependence schemes are introduced in this section, and cellular genetic algorithms with density dependence are proposed.

3.1. Local density dependence scheme within a neighborhood
The null cells in the grid space are considered the food of the individuals in the population; an individual occupying a cell means the food in that cell is consumed by the individual.

Definition 1: Living density. The ratio between the number of living individuals and the amount of food in a region is the living density of that region. In particular, the living density in the neighborhood of an individual x_i is called the local living density of the individual, denoted LD(x_i). Taking the Moore type as an example, the local living density of the center individual in Fig. 1 is 0.667.

Definition 2: Maximum local living density. In a living space, the food for the population is limited. Hence, the local density is bounded by the food supplied; this upper bound is the maximum local living density, denoted LDmax.

In the neighborhood of an individual, if the local living density exceeds LDmax, the surplus weakest individuals die or escape from the region because of the shortage of food. The density dependence scheme in the neighborhood of an individual is shown in Fig. 3 (LDmax is set to 0.8). As in Fig. 1, grey means living and white means dead. The region within the solid line is the current living space; the black cell in it marks the surplus weakest individual; the region within the dashed box is the escape area of the surplus individual.

Under the effect of density dependence, if there is more than one null cell in the escape area, as shown in Fig. 3a, the surplus individual escapes from the former region; on the contrary, if there is no null cell in the escape area, the surplus individual dies, as shown in Fig. 3b. Additionally, if a surplus individual has already escaped from a region once, it is not permitted to escape twice; in other words, it dies. The detailed algorithm of the local density scheme within a neighborhood is described in Fig. 4. After the density dependence operation, the structure of the population and the states of the cells corresponding to the individuals change. In the later sections, this operation is named density dependence I and denoted [P(t), S(t)] = Dependence-I(P(t), S(t)), in which S(t) means the states of all cells in the grid space.

Besides the local density dependence described above, intraspecific competition also includes competition among the best individuals within a living space. For genetic algorithms and similar algorithms, premature convergence is one of the main factors restricting optimization performance, and the fact that the best individual is replicated many times by the selection operation is the main reason for this behavior. Hence, we add an operation to the density dependence scheme to avoid two or more individuals with the same structure appearing in the neighborhood of an individual. The detailed setting is also described in Fig. 4. In the later sections, we call this operation density dependence II and denote it [P(t), S(t)] = Dependence-II(P(t), S(t)).

3.2. Population control scheme
The density dependence scheme acts among the individuals of the same generation. Because of the effect of density dependence, the number of living individuals after the operation is less than or equal to the number before the operation. In this section, a population control scheme is defined.

In nature, population growth is limited by the abundance of food and cannot grow without bound. In the grid space, the anticipated number of living individuals in the next generation is related to the number of null cells in the current generation. We define a formula that determines the number of living individuals in the next generation so that it cannot exceed the food supply, where N_t and N_{t+1} are the numbers of living individuals in generations t and t+1, N is the number of cells in the cellular space, and alpha is a rate controlling the maximum number of individuals that can be fed, alpha in [LDmax, 1]; in particular, alpha = 1 means the food in one cell can feed one individual. In the later sections, we denote this operation by |P(t+1)| = population-control(|P(t)|).

3.3. Cellular genetic algorithm with density dependence
We introduce the local density dependence scheme and the population control scheme into the cellular genetic algorithm and propose a new cellular genetic algorithm. The pseudo-code of the new algorithm is shown in Fig. 5.
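Fig. 4 and Fig. 5 are not reproduced in this extraction. The following Python sketch shows one way the density dependence I operation could work under Definitions 1 and 2: when the local living density around a living individual exceeds LDmax, the weakest living neighbor escapes to a free cell if one exists and it has not escaped before, otherwise it dies. The choice of escape area, the one-escape bookkeeping, and the population-control helper (which only enforces the food cap alpha * N, not the paper's exact growth formula) are simplifying assumptions; pop and state follow the same grid layout as the earlier sketch, and escaped is a set of cells whose occupants have already escaped once.

    # Sketch of density dependence I (not the authors' Fig. 4).
    import random

    GRID = 10
    LD_MAX = 0.8

    def fitness(ind):                      # toy fitness, stands in for the DOP fitness
        return sum(ind)

    def moore_neighbors(r, c):
        return [((r + dr) % GRID, (c + dc) % GRID)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if not (dr == 0 and dc == 0)]

    def local_living_density(state, r, c):
        """Definition 1 restricted to the Moore neighborhood of cell (r, c)."""
        nbrs = moore_neighbors(r, c)
        return sum(state[nr][nc] == 'living' for nr, nc in nbrs) / len(nbrs)

    def dependence_I(pop, state, escaped):
        """[P(t), S(t)] = Dependence-I(P(t), S(t)), sketched."""
        for r in range(GRID):
            for c in range(GRID):
                if state[r][c] != 'living':
                    continue
                if local_living_density(state, r, c) <= LD_MAX:
                    continue
                living = [(nr, nc) for nr, nc in moore_neighbors(r, c)
                          if state[nr][nc] == 'living']
                # The weakest living neighbor is the surplus individual.
                wr, wc = min(living, key=lambda p: fitness(pop[p[0]][p[1]]))
                # Escape area approximated by the free cells around the weakest one.
                nulls = [(nr, nc) for nr, nc in moore_neighbors(wr, wc)
                         if state[nr][nc] == 'dead']
                if nulls and (wr, wc) not in escaped:
                    er, ec = random.choice(nulls)      # escape to a free cell
                    pop[er][ec] = pop[wr][wc]
                    state[er][ec] = 'living'
                    escaped.add((er, ec))              # it may not escape twice
                state[wr][wc] = 'dead'                 # the old cell is vacated either way
        return pop, state

    def population_control(n_living, alpha, n_cells=GRID * GRID):
        """|P(t+1)| = population-control(|P(t)|): only the food cap alpha * N is enforced."""
        return min(n_living, int(alpha * n_cells))

Density dependence II would additionally remove duplicated structures from a neighborhood before this step; that variant is not sketched here.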
4. Experimental Results

4.1. Test problems
Yang's DOP generator [18] can construct random dynamic environments from any binary-encoded stationary function by an exclusive OR operator. For a static optimization problem f(x), an intermediate binary template T = [T_1, T_2, ..., T_N] is created by a designated method, where N is the number of changes. The DOP in the ith environment can then be expressed as f(x, i) = f(x XOR T_i), where XOR denotes the exclusive OR operator. The severity of the environmental changes is determined by the percentage of ones in the template T, denoted s; the frequency is controlled by the generation interval between two adjacent changes, denoted tau; and the complexity is affected by the periodic structure of T.

Three 100-bit binary functions, denoted One-Max, NK(25, 4), and Deceptive, are selected as base stationary functions to construct DOPs. Test DOPs are constructed from these stationary functions by Yang's DOP generator, where s is set to 0.1, 0.3, 0.5, and 0.9, tau is set to 25, and T is set to random, cyclical, and cyclical with noise [6]. The three DOPs are described as follows.

In the static One-Max problem, the fitness of an individual is the number of bits it shares with a given template B, where L = 100 is the length of the binary string. The dynamic One-Max problem is obtained by applying the DOP generator to this function; it has an optimum fitness of 100 in each environment.

The static NK(25, 4) problem consists of 25 contiguous 4-bit building blocks s_k, k = 1, ..., 25. As with the dynamic One-Max problem, its dynamic version has an optimum fitness of 100 in each environment.
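Before turning to the deceptive function, the following Python sketch illustrates the XOR-based DOP construction described above for the dynamic One-Max case. It follows the description of the generator rather than its reference implementation: the template-update policy shown here is the "random" case only, and all names and parameter values are illustrative assumptions.

    # Sketch of the XOR-based DOP generator with a One-Max base function.
    import random

    L = 100        # chromosome length
    TAU = 25       # generations between environmental changes
    S = 0.3        # change severity: fraction of mask bits flipped per change

    def one_max(x, template):
        """Static One-Max: number of bits of x matching the given template."""
        return sum(int(a == b) for a, b in zip(x, template))

    def new_mask(prev_mask, s):
        """Flip a random fraction s of the mask bits to create the next environment."""
        flip = set(random.sample(range(L), int(s * L)))
        return [b ^ 1 if i in flip else b for i, b in enumerate(prev_mask)]

    def dynamic_fitness(x, mask, template):
        """Evaluate f(x XOR T_i): XOR the individual with the current mask first."""
        shifted = [a ^ m for a, m in zip(x, mask)]
        return one_max(shifted, template)

    template = [1] * L                 # optimum of the base One-Max function
    mask = [0] * L                     # environment 0: no change yet
    x = [random.randint(0, 1) for _ in range(L)]
    for gen in range(1, 101):
        if gen % TAU == 0:             # environmental change every TAU generations
            mask = new_mask(mask, S)
        f = dynamic_fitness(x, mask, template)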
The static deceptive problem is a fully deceptive problem; it also consists of 25 contiguous 4-bit building blocks s_k, k = 1, ..., 25. Its fitness is the sum over the blocks of d(H(x_k, s_k)), where x_k is the sub-string of the individual x corresponding to s_k, H(x_k, s_k) denotes the Hamming distance between x_k and s_k, and d(.) is a mapping function whose values are 4, 0, 1, 2, 3 for Hamming distances 0, 1, 2, 3, 4, respectively. The dynamic deceptive problem is obtained in the same way and also has an optimum fitness of 100 in each environment.

4.2. Results and discussion
The three kinds of dynamic test problems are each optimized by four algorithms: the cellular genetic algorithm with density dependence I (CGA-DI), the cellular genetic algorithm with density dependence II (CGA-DII), the cellular genetic algorithm with an evolution rule (CGA-R), and an improved simple genetic algorithm (ISGA). Experiments were carried out to compare the performance of the algorithms on the dynamic test environments. For all algorithms, the parameters are set as follows: population size N = 100, maximum number of generations G = 500, uniform crossover with crossover rate Pc = 0.7, and discrete mutation with rate Pm = 0.01.

The experimental results on the convergence performance of CGA-DI, CGA-DII, CGA-R, and ISGA are summarized in Tables I, II, and III in the form average +/- standard error. The convergence metric Facc measures the overall convergence performance of an algorithm by averaging the best fitness reached over all runs and all environmental changes, Facc = (1/(n*K)) * sum_{i=1..n} sum_{j=1..K} F_ij, where n = 20 is the total number of runs, K = G/tau is the total number of environmental changes, and F_ij is the optimum found for the jth change in the ith run. Tables I, II, and III give the results for the random, cyclical, and cyclical-with-noise DOPs, respectively. Fig. 6 shows the diversity behavior of the algorithms in random dynamic environments: Fig. 6a shows the results on the dynamic One-Max problem, Fig. 6b on the dynamic NK(25, 4) problem, and Fig. 6c on the dynamic deceptive problem. The diversity metric measures the extent of diversity achieved among the individuals; it is the average, over the L loci, of the entropy of the allele distribution, Div = (1/L) * sum_{l=1..L} ( - sum_{k=1..s_c} P_lk * ln(P_lk) ), where L is the length of the chromosome, s_c is the cardinality of the genotypic alleles, and P_lk is the rate at which the kth genotypic allele appears at the lth locus over the N individuals of the population; the maximum of this metric is ln 2.
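The following Python sketch gives the fully deceptive trap block and the two metrics as reconstructed above. The NK(25, 4) block score is not sketched because its formula is not given in the extracted text, and all names are illustrative.

    # Sketch of the deceptive block fitness and the convergence/diversity metrics.
    import math

    D_MAP = {0: 4, 1: 0, 2: 1, 3: 2, 4: 3}         # d(H) for Hamming distance H

    def deceptive_fitness(x, blocks):
        """Sum of trap contributions over 25 contiguous 4-bit building blocks."""
        total = 0
        for k, s_k in enumerate(blocks):            # blocks: list of 25 4-bit targets
            x_k = x[4 * k: 4 * k + 4]
            h = sum(a != b for a, b in zip(x_k, s_k))
            total += D_MAP[h]
        return total                                # optimum: 25 * 4 = 100

    def convergence_metric(F):
        """Average best fitness over n runs and K changes; F[i][j] = best of change j, run i."""
        n, K = len(F), len(F[0])
        return sum(sum(row) for row in F) / (n * K)

    def diversity_metric(pop):
        """Mean per-locus allele entropy of a binary population (maximum ln 2)."""
        N, L = len(pop), len(pop[0])
        total = 0.0
        for l in range(L):
            p1 = sum(ind[l] for ind in pop) / N     # rate of allele '1' at locus l
            for p in (p1, 1.0 - p1):
                if p > 0:
                    total -= p * math.log(p)
        return total / L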
1) Convergence performance of the algorithms in dynamic environments
From Tables I, II, and III it is interesting to note that, among the four algorithms, CGA-DI obtains the best results on the dynamic One-Max problems for the different severities and complexities; CGA-DII performs well on the dynamic NK(25, 4) and deceptive problems; and ISGA is the worst. Moreover, all algorithms achieve their best performance on the dynamic One-Max problems and their worst performance on the dynamic NK(25, 4) problems. The detailed discussion is as follows.

On the random dynamic One-Max problem, CGA-DI has the best convergence performance; the relative errors of Facc with respect to the global optimal solution are below 0.5% for all four change severities. For CGA-DII, CGA-R, and ISGA, Facc decreases as s increases; the relative errors of Facc reach their minimums of 0.65%, 2.13%, and 16.15%, respectively, when s = 0.1, and their maximums of 4.74%, 20.12%, and 40.03%, respectively, when s = 0.9. On the random dynamic NK(25, 4) problem, for CGA-DI and CGA-DII, Facc decreases as s increases; the relative errors of Facc reach their minimums of 22.86% and 16.24% when s = 0.1 and their maximums of 40.29% and 19.79% when s = 0.9. For CGA-R and ISGA, Facc first decreases and then increases with s; the relative errors of Facc reach their minimums of 29.66% and 56.11% when s = 0.1 and their maximums of 58.98% and 75.77% when s = 0.5. On the random dynamic deceptive problem, for all algorithms Facc first decreases and then increases with s; the relative errors of Facc reach their minimums of 15.42%, 13.27%, 17.33%, and 34.44%, respectively, when s = 0.9, and their maximums of 9.12%, 16.37%, 28.44%, and 46.20%, respectively, when s = 0.5.

The trends of Facc for the cyclical and cyclical-with-noise DOPs are similar, and Facc is reduced by the introduction of noise. On the cyclical dynamic One-Max problem, for all algorithms Facc first decreases and then increases with s; the relative errors of Facc reach their minimums of 0.01%, 1.22%, 2.18%, and 6.43%, respectively, when s = 0.9, and their maximums of 0.18%, 4.59%, 12.55%, and 19.58%, respectively, when s = 0.5; with the introduction of noise, the largest percentage decreases are 0.07%, 0.20%, 1.26%, and 3.33%, respectively. On the cyclical dynamic NK(25, 4) problem, for all algorithms Facc first decreases and then increases with s; the relative errors of Facc reach their minimums of 19.61%, 13.98%, 15.05%, and 37.75%, respectively, when s = 0.9, and their maximums of 35.96%, 18.59%, 45.14%, and 51.57%, respectively, when s = 0.5; with the introduction of noise, the largest percentage decreases are 1.60%, 2.37%, 3.17%, and 3.01%, respectively. On the cyclical dynamic deceptive problem, for CGA-DII, CGA-R, and ISGA, Facc first decreases and then increases with s; the relative errors of Facc reach their minimums of 12.08%, 18.50%, and 23.93%, respectively, when s = 0.9, and their maximums of 14.04%, 25.64%, and 31.15%, respectively, when s = 0.5. For CGA-DI, Facc first increases and then decreases with s; the relative error of Facc reaches a minimum of 16.56% when s = 0.5 and a maximum of 18.21% when s = 0.9. With the introduction of noise, the largest percentage decreases are 1.14%, 2.40%, 2.40%, and 2.17%, respectively.
Fig. 6. Diversity behavior of the algorithms in random dynamic environments.

2) Diversity performance of the algorithms in dynamic environments
From Fig. 6 it is interesting to note that all the diversity curves change cyclically with the environmental changes; the diversity curve of CGA-DII is better than those of the other algorithms, and the diversity curve of ISGA is worse than those of the others. The detailed discussion is as follows.

On the random dynamic One-Max problems, the variation of the diversity curves within two adjacent changes and the diversity levels of the algorithms increase with s. For CGA-DI, the diversity curve is smooth and near 0.3 when s = 0.1, and changes sharply between 0.3 and 0.65 when s = 0.9; for CGA-DII, the diversity curve is smooth and near 0.4 when s = 0.1, and changes sharply between 0.4 and 0.65 when s = 0.9; for CGA-R, the diversity curve lies between 0.1 and 0.2 when s = 0.1, and changes cyclically between 0.2 and 0.35 when s = 0.9; for ISGA, the diversity curve lies below 0.1 for every value of s.

On the random dynamic NK(25, 4) problems, the trend of the diversity curves is similar to that on the One-Max problems. However, the variation within two adjacent changes is smoother and the diversity levels are lower than on the One-Max problems; for example, the diversity curve of CGA-DI changes between 0.3 and 0.5 when s = 0.9, and the diversity curve of CGA-R changes near 0.2 when s = 0.9.

On the random dynamic deceptive problems, the variation of the diversity curves first increases and then decreases with s. The behavior of the diversity curves when s = 0.1 is similar to that when s = 0.9, and the variation of the diversity curves is the most violent when s = 0.5.

5. Conclusion
In this study, two kinds of evolution rules with density dependence for the cellular genetic algorithm are discussed, and the corresponding cellular genetic algorithms with density dependence are proposed. Compared with the regular cellular genetic algorithm with an evolution rule, the new algorithms obtain superior convergence and diversity performance. According to the experiments carried out on the selected dynamic test problems, CGA-DI obtains the best results on the dynamic One-Max problems, and CGA-DII performs well on the dynamic NK(25, 4) and deceptive problems.

6. References
[1] R. W. Morrison, K. A. De Jong. Triggered hypermutation revisited. In Proc. of the 2000 Congress on Evolutionary Computation. California. 2000, pp. 1025-1032.
[2] J. J. Grefenstette. Genetic algorithms for changing environments. In Proc. of the 2nd Int. Conf. on Parallel Problem Solving from Nature. 1992, pp. 137-144.
[3] K. W. Yeom, J. H. Park. Biologically inspired evolutionary agent systems in dynamic environments. In Proc. of the 2006 Congress on Evolutionary Computation. Vancouver. 2006, pp. 386-390.
[4] M. Maury, J. Gouvea, F. R. Aluizio. Diversity-based model reference for genetic algorithms in dynamic environment. In Proc. of the 2007 Congress on Evolutionary Computation. Singapore. 2007, pp. 4639-4645.
[5] A. Simoes, E. Costa. Variable-size memory evolutionary algorithm to deal with dynamic environments. M. Giacobini et al., Eds. LNCS 4448. Springer-Verlag. 2007, pp. 617-626.
[6] S. Yang, X. Yao. Population-based incremental learning with associative memory for dynamic environments. In Proc. of the Congress on Evolutionary Computation. Hong Kong. 2008, pp. 1-20.
[7] J. Branke, T. Kaubler, C. Schmidt. A multi-population approach to dynamic optimization problems. In Adaptive Computing in Design and Manufacturing. Berlin. 2000, pp. 299-308.
[8] T. Blackwell, J. Branke. Multi-swarm optimization in dynamic environments. LNCS 3005. Springer-Verlag. 2007, pp. 489-500.
[9] B. Manderick, P. Spiessens. Fine-grained parallel genetic algorithms. J. D. Schaffer, Ed. In Proc. of the Third International Conference on Genetic Algorithms. San Mateo: Morgan Kaufmann. 1989, pp. 428-433.
[10] M. Kirley. A cellular genetic algorithm with disturbances: optimisation using dynamic spatial interactions. Journal of Heuristics. 2002, 8(3): 321-342.
[11] D. Simoncini, P. Collard, S. Vérel, M. Clergue. From cells to islands: a unified model of cellular parallel genetic algorithms. In Proc. of the 7th International Conference on Cellular Automata for Research and Industry. Perpignan, France. 2006, pp. 248-257.
[12] S. Janson, E. Alba, B. Dorronsoro, M. Middendorf. Hierarchical cellular genetic algorithm. In Proc. of the 6th European Conference on Evolutionary Computation in Combinatorial Optimization. Budapest, Hungary. 2006, pp. 111-122.
[13] A. Nebro, J. Durillo, F. Luna, B. Dorronsoro, E. Alba. MOCell: a cellular genetic algorithm for multiobjective optimization. International Journal of Intelligent Systems. 2009, 24(7): 726-746.
[14] H. Ishibuchi, N. Tsukamoto, Y. Nojima. Use of local ranking in cellular genetic algorithms with two neighborhood structures. In Proc. of the 7th International Conference on Simulated Evolution and Learning. Melbourne, Australia. 2008, pp. 309-318.
[15] M. Giacobini, E. Alba, M. Tomassini. Selection intensity in asynchronous cellular evolutionary algorithms. In Proc. of the 2003 International Conference on Genetic and Evolutionary Computation. Chicago. 2003, pp. 955-966.
[16] E. Alba, K. Doerner, B. Dorronsoro. Adapting the savings based ant system for non-stationary vehicle routing problems. In Proc. of the First Conference on Metaheuristics and Nature Inspired Computing. Hammamet, Tunisia. 2006.
[17] Y. Zhang, M. Li, Y. Lu. Study on evolution rules of optimization genetic algorithm with cellular automata.

K. Kotaiah Swamy received his M.Tech degree in Computer Science Engineering in 2009 from Acharya Nagarjuna University. He is a well-known author and an excellent teacher; in his 6 years of experience he has written many books, articles, and research papers, published around 2 international journal papers, and attended many international conferences. He is presently working as an Assistant Professor at Sinhgad Institute of Technology and Science.

V. Punnarao received his M.Tech degree in Computer Science Engineering in 2009 from Andhra University. He is a well-known author and an excellent teacher; in his 4 years of experience he has written many books and research papers, published 1 international journal paper, and attended many international conferences. He is presently working as an Assistant Professor at Sinhgad Institute of Technology and Science.

J. S. N. Jyothi received her M.Tech degree in Computer Science Engineering in 2009 from Sathyabama University. She is a well-known author and an excellent teacher; in her 4 years of experience she has written many books, articles, and research papers. She is presently working as an Assistant Professor at Usha Rama College of Engineering.
