International Journal on Computational Sciences & Applications (IJCSA) Vol. 2, No. 1, February 2012
DOI: 10.5121/ijcsa.2012.2108
A NON-REVISITING GENETIC ALGORITHM FOR
OPTIMIZING NUMERIC MULTI-DIMENSIONAL
FUNCTIONS
Saroj¹ and Devraj²

¹Department of Computer Engineering, Guru Jambheshwar University of Science and Technology, Hisar, Haryana
ratnoo.saroj@gmail.com

²Department of Computer Engineering, Guru Jambheshwar University of Science and Technology, Hisar, Haryana
devraj.kamboj@gmail.com
ABSTRACT
Genetic Algorithm (GA) is a robust and popular stochastic optimization algorithm for large and complex
search spaces. The major shortcomings of Genetic Algorithms are premature convergence and revisits to
individual solutions in the search space. In other words, a genetic algorithm is a revisiting algorithm, which
leads to duplicate function evaluations and a clear wastage of time and computational resources. In
this paper, a non-revisiting genetic algorithm with adaptive mutation is proposed for the domain of multi-
dimensional numeric function optimization. In this algorithm, whenever a revisit occurs, the revisited
search point is replaced with a mutated version of the best/random (chosen probabilistically) individual
from the GA population. Furthermore, the recommended approach does not use any extra memory
resources to avoid revisits. To analyze the influence of the method, the proposed non-revisiting algorithm is
evaluated on nine benchmark functions with two and four dimensions. The performance of the proposed
genetic algorithm is superior to that of a simple genetic algorithm, as confirmed by the experimental
results.
KEYWORDS
Function optimization, Genetic algorithm, Non-revisiting, Adaptive mutation.
1. INTRODUCTION
Developing new optimization techniques is an active area of research and Genetic Algorithm
(GA) is a relatively new stochastic optimization algorithm pioneered by Holland [1]. A GA is
capable of finding optimal solutions for complex problems in a wide spectrum of applications due
to its global nature. A GA is an iterative procedure that maintains a population of structures that
are candidate solutions to the specific problem under consideration. In each generation, each
individual is evaluated using a fitness function that measures the quality of solution provided by
the individual. Further, genetic operators such as reproduction, crossover and mutation are
applied to find the diverse candidate solutions. In fact, a GA mimics the natural principle of
survival of fittest. A fitness proportionate selection and GA operators ensures the better and better
International Journal on Computational Sciences & Applications (IJCSA) Vo2, No.1, February 2012
84
fit solutions to emerge in successive generations [2][3][4]. However, GAs are not without
limitations. Two of the main problems are: 1. premature convergence i.e. many a times a GA
converges to some local optimal solution. 2. redundant function evaluations. A simple genetic
algorithm do not memorizes the search points or solutions to the problem that it visits in its life
time and it revisits lots of search points generating duplicate solutions resulting into redundant
fitness computations. Here, a revisit to a search position x is defined as a re-evaluation of a
function of x which has been evaluated before. The problem of revisit is all the more severe
towards the end of a GA run. In many domains, the fitness evaluation is computationally very
expensive and lots of time is wasted in revisiting the parts of the search space and duplicate
function evaluations. The optimization is become more complex when the functions are having
multi-dimensions.
In this paper, we propose an improved GA with an adaptive mutation operator to avoid revisits and
redundant fitness evaluations to a large extent. This GA takes the elitist approach and retains the
best individual in every new population. A look-up for revisits is made only in the current
population along with the population of the previous generation. If any individual produced is found
to be a duplicate, it is replaced probabilistically with a mutated version of the best individual or of a
random individual. The mutation operator is adaptive in the sense that its power of exploration
decreases and its power of exploitation increases with the number of generations. The proposed
approach demonstrates that duplicate removal introduces a powerful diversity preservation
mechanism which not only results in better final-population solutions but also avoids premature
convergence. The results are presented for nine benchmark functions with multiple dimensions and
illustrate the effectiveness of duplicate removal through adaptive mutation. The results are
directly compared to a simple GA which does not remove duplicate solutions generated in its
successive generations.
The rest of the paper is organized as follows. Section II describes the related work. The proposed non-
revisiting genetic algorithm, which removes duplicate individuals through adaptive mutation, is
given in Section III. Experimental design and results are presented in Section IV. Section V
concludes the paper and points to the future scope of this work.
2. RELATED WORK
A number of implementations of genetic algorithms have been reported in the literature
[1][2][3][4][5]. Since the inception of genetic algorithms, a number of advanced GA approaches
have come up, improving the efficiency and efficacy of the results achieved in the domain of function
optimization along with various other domains. Several of these approaches address the
problem of premature convergence and the revisiting behaviour of genetic algorithms. One of these
advanced approaches is devising new GA operators that can be adapted to produce suitable
individuals by considering the state of the current population with respect to the global optimal
solution [6][7][8][9]. Srinivas and Patnaik (1994) employed a self-adaptive GA that achieved
global optima more often than a simple GA for several multimodal functions [8]. In their work,
they suggested the use of adaptive probabilities of crossover and mutation to maintain adequate
diversity and avoid premature convergence. Another significant contribution came from Herrera
and Lozano, who suggested heterogeneous gradual distributed real-coded genetic algorithms
(RCGAs) for the function optimization domain. They devised new fuzzy-connective-based crossovers
with varying degrees of exploration and exploitation, which proved very effective in avoiding
premature convergence and achieved some of the best reported results for several benchmark
problems of function optimization [9].
The other important problem with GAs that has been in the recent focus of the GA research
community is redundant function evaluations and revisits. Mauldin [10] was among the first
to enhance the performance of a genetic algorithm by eliminating duplicate genotypes during
a GA run. Mauldin used a uniqueness operator that allowed a new child x to be inserted into the
population only if x was farther than a Hamming-distance threshold from all existing population
genotypes. Davis [4] also showed that, for a comparable number of child evaluations, a
binary-coded GA that removes duplicates in the population has superior performance. Eshelman and
Schaffer [11] reconfirmed this observation by using selection-based innovations and new GA
operators that prevented revisits. The problem of duplicate function evaluations has also been
addressed by providing the GA with a short/long-term memory, i.e., the GA stores all the search points
visited and their corresponding fitness in some data structure. In such approaches, every time a
new search point is produced by the GA, before actually computing its fitness, the memory of the GA is
looked up and, if this search point exists, its fitness is not recomputed. If the new solution is not
in the memory, its fitness is computed and appended to the memory. Binary search trees, binary
partition trees, heaps, hash tables and cache memory have been used to supplement memory. Such
strategies have resulted in performance enhancement of GAs by eliminating revisits partially or
completely [12][13][14][15]. Recently, Yuen and Chow [16] used a novel binary space
partitioning tree to eliminate duplicate individuals. Saroj et al. used a heap structure to avoid
redundant fitness evaluations in the domain of rule mining; their approach proved effective
for large datasets [17]. Though all these duplicate removal methods reduce the run time of a GA,
particularly for applications where the objective function is computationally expensive, they
require huge data structures, and a significant amount of time is spent on memory look-ups. It is
not uncommon for a GA to run for thousands of generations with a population of hundreds of
individuals. If we assume a GA with 100 individuals and 5000 generations, we shall need a data
structure that can store 250,000 problem solutions, and that is when we assume half the individuals
produced are duplicates. The GA shall require 500,000 look-ups to avoid redundant fitness
computation. GAs are already considered slow compared to other optimization techniques, and
these approaches further slow down a GA's performance. Clearly, this method is successful only in
domains where the fitness computation time is significantly larger than the memory look-up
time, and it is not suitable at all for the domain of function optimization, where fitness evaluation is
relatively inexpensive. Therefore, we have adopted a genetic algorithm approach in this paper
that maintains diversity through adaptive mutation and avoids revisits to a large extent,
without any additional memory overheads.
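The memory-based scheme described above can be pictured with a small sketch. The following Python fragment is purely illustrative and is not from the paper (whose implementation is in MATLAB); the class and function names are our own. It shows how a fitness cache trades memory and look-up time for avoided re-evaluations, which is exactly the overhead the proposed approach sidesteps.

```python
# Minimal sketch of the memory-based (fitness-cache) approach described above.
# This is NOT the authors' method; it illustrates the overhead their NRGA avoids.
from typing import Callable, Dict, Tuple

class CachedEvaluator:
    """Caches fitness values of visited points so revisits skip re-evaluation."""

    def __init__(self, fitness_fn: Callable[[Tuple[float, ...]], float]):
        self.fitness_fn = fitness_fn
        self.memory: Dict[Tuple[float, ...], float] = {}  # grows with every distinct visit
        self.evaluations = 0
        self.lookups = 0

    def evaluate(self, x: Tuple[float, ...]) -> float:
        self.lookups += 1
        if x in self.memory:           # revisit: fitness returned from memory
            return self.memory[x]
        fitness = self.fitness_fn(x)   # first visit: pay the evaluation cost
        self.evaluations += 1
        self.memory[x] = fitness
        return fitness

if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    cache = CachedEvaluator(sphere)
    cache.evaluate((1.0, 2.0))
    cache.evaluate((1.0, 2.0))         # duplicate point: no second evaluation
    print(cache.evaluations, cache.lookups, len(cache.memory))  # 1 2 1
```

With 100 individuals over 5000 generations, such a cache would have to hold on the order of hundreds of thousands of entries and answer as many look-ups, which is the cost the NRGA proposed here avoids.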
3. PROPOSED NON-REVISITING GA WITH ADAPTIVE MUTATION
A non-revisiting algorithm is one that does not visit search points that have already been visited. The
improved GA with the non-revisiting strategy and adaptive mutation has to perform a few extra
steps compared with a simple GA. These steps are used to find duplicate individuals. If any duplicate
individual is found, it is mutated and reinserted into the current population. Duplicates are
looked for with respect to the current and the previous generation only. There is a special condition that
the best individual is preserved and not mutated. The flow chart and the algorithm for the proposed
GA are given in Fig. 1 and Fig. 2 respectively.
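To make the extra steps concrete, the following Python sketch (our own illustration; the paper's implementation is in MATLAB and the helper names are ours) shows the duplicate check against the current and previous populations and the probabilistic replacement, with the best individual exempted:

```python
# Illustrative sketch of the revisit check and probabilistic replacement
# (not the authors' MATLAB code; helper names are our own).
import random

def find_revisits(new_pop, old_pop, best):
    """Indices of individuals in new_pop that duplicate old_pop or earlier new_pop entries,
    excluding the preserved best individual."""
    seen = {tuple(ind) for ind in old_pop}
    flagged = []
    for i, ind in enumerate(new_pop):
        key = tuple(ind)
        if key == tuple(best):          # the best individual is never replaced
            seen.add(key)
            continue
        if key in seen:
            flagged.append(i)
        else:
            seen.add(key)
    return flagged

def replace_revisits(new_pop, best, flagged, mutate, p_best=0.5):
    """Replace each revisited individual with a mutated best or random individual (equally likely)."""
    for i in flagged:
        source = best if random.random() < p_best else random.choice(new_pop)
        new_pop[i] = mutate(source)
    return new_pop

if __name__ == "__main__":
    best = [1.0, 1.0]                                 # elite, copied into the new population
    old = [[0.0, 0.0], [1.0, 1.0]]
    new = [best, [0.0, 0.0], [2.0, 2.0], [2.0, 2.0]]  # index 1 revisits old-pop, index 3 duplicates index 2
    toy_mutate = lambda ind: [x + random.gauss(0.0, 0.1) for x in ind]
    flagged = find_revisits(new, old, best)           # -> [1, 3]; the elite at index 0 is exempt
    print(replace_revisits(new, best, flagged, toy_mutate))
```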
The mutation applied is Gaussian adaptive mutation. The Gaussian function generates a random
number around a zero mean. The formulae for the mutation are as follows:

mscale = mscale − mshrink × mscale × (current_generation / total_generations)    (1)

x(i) = best_x ± gaussian(mscale × rand)    (2)
The amount of mutation is controlled by two parameters, mscale and mshrink. Here, mscale
represents the variance of the mutation for the first generation and mshrink represents the amount by
which the mutation shrinks in successive generations. The mutation scale decreases as the number of
generations increases. It is clear from the above formulae that this kind of mutation is exploratory
during the initial generations and exploitative towards the final generations of the GA. We have kept
mscale equal to 1.0 and mshrink equal to 0.75.
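A minimal Python sketch of equations (1) and (2) follows. This is our own illustration, not the authors' MATLAB code; we interpret rand as a uniform random number in [0, 1) and gaussian(s) as a zero-mean normal deviate with scale s.

```python
# Minimal sketch of the adaptive Gaussian mutation of equations (1) and (2)
# (illustrative Python; the paper's implementation is in MATLAB).
import random

def adaptive_mutate(best_x, generation, total_generations, mscale=1.0, mshrink=0.75):
    """Mutate a copy of best_x with a Gaussian step whose scale shrinks over the generations."""
    # Equation (1): shrink the mutation scale as generations progress.
    scale = mscale - mshrink * mscale * (generation / total_generations)
    # Equation (2): perturb each gene around a zero mean using the shrunken scale.
    return [xi + random.gauss(0.0, scale * random.random()) for xi in best_x]

# Example: mutation is wide early in the run and narrow near the end.
print(adaptive_mutate([0.5, -1.2], generation=1, total_generations=100))
print(adaptive_mutate([0.5, -1.2], generation=99, total_generations=100))
```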
[Figure 1 flow chart: Start → Initialize population → Evaluate population → check stopping criteria (if met, output the best individual) → Selection → Crossover → Mutation → New population → Find revisits → Replace the revisited individuals in the new population with a probabilistically mutated best/random individual → re-evaluate.]
Figure 1. Step by Step Procedure for the Improved GA
The pseudocode of the proposed algorithm is given below.

1. Begin
2. Initial conditions: old-pop = [ ], new-pop = [ ], Stopping Flag SF = 0, Best Fit = 0, Visiting Flags VF[ ] = 0
3. Initialize the population in old-pop
4. Evaluate the fitness of old-pop and record the overall best individual up to the current generation
5. If (stopping criterion is met)
   5.1 SF = 1 // set stopping flag
   5.2 Output the best individual
   5.3 Stop
   Else
   5.4 Copy the best individual into new-pop
   5.5 Perform a GA step by applying the GA operators, i.e. selection, crossover and mutation
   5.6 Maintain the old population and store the newly generated individuals in new-pop
   5.7 For (i = 1 to pop-size)
           check revisits within new-pop and with respect to old-pop
   5.8        If (revisit && not best_individual)
   5.9            VF[i] = 1
   5.10 For (i = 1 to pop-size)
   5.11       If (VF[i] == 1)
                  new-pop[i] = mutated(new-pop[i])
   5.12 old-pop = new-pop
        Go to step 4

Figure 2. Algorithm for the improved GA with no revisits

The proposed non-revisiting GA with adaptive mutation has three key strengths:
1. It automatically ensures the maintenance of diversity and prevents premature convergence. Most of the individuals in a GA population are guaranteed to be different; by the nature of the proposed GA, it is impossible for a population to consist of only one kind of individual.
2. It might not completely eliminate revisits. However, it does not require a large data structure to store individuals for duplicate look-ups and only uses the previous and current populations, which are available anyway.
3. It probabilistically takes advantage of the best individuals and converges faster without suffering from premature convergence.

4. EXPERIMENTAL RESULTS

4.1. Test Function Set

The nine benchmark functions used to test the proposed GA in two and four dimensions are as follows.

1. Rastrigin's function
2. Sphere function
3. Generalized Rosenbrock function
4. Generalized Rastrigin function
5. Griewank’s function
6. Ackley function
7. Rotated Griewank’s function
8. Rotated Weierstrass’s function
9. Branin function
All these functions are shown in detail in Appendix I. The first four functions are unimodal
functions; the rest are multimodal functions designed with a considerable number of local
minima. The eighth and ninth functions are rotated multimodal functions. The improved GA is
implemented in MATLAB. A comparison is made between a simple GA and the proposed GA on
the basis of the mean and the best fitness over the generations for all nine numeric functions with two
and four dimensions. The best fitness is the minimum score of the population. Both GAs
stop when there is no improvement in the best score over the last fifty generations. The population
size is kept at 20 for all the functions, and the crossover and mutation rates are 0.6 and
0.01 respectively. The normal mutation rate is kept low because adaptive mutation to remove
duplicates is also applied. The best individual or a random individual is mutated with equal
likelihood (probability = 0.5) to replace the revisited points in the new population. We have used real
encoding, proportional scaling, remainder stochastic selection, one-point crossover and Gaussian
mutation. The results are averaged over 20 runs of the GA. A comparison of the proposed GA and the
simple GA on the basis of the best and mean fitness of the final populations for the nine numeric
functions, along with the percentage decrement in fitness value, is shown in Tables 1 and 2.
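For clarity, the percentage decrement columns in Tables 1 and 2 follow the usual relative-improvement formula. A quick check in Python (our own illustration, not code from the paper) reproduces the Rastrigin entry of Table 1:

```python
# Percentage decrement of NRGA fitness relative to SGA, as reported in Tables 1 and 2
# (our own illustration of the computation, not code from the paper).
def percentage_decrement(sga_fitness: float, nrga_fitness: float) -> float:
    return 100.0 * (sga_fitness - nrga_fitness) / sga_fitness

# Rastrigin's function, 2 dimensions, best fitness (Table 1): 1.361 (SGA) vs 0.323 (NRGA)
print(round(percentage_decrement(1.361, 0.323), 2))   # 76.27, matching the ~76.26 reported
```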
Table 1. Best fitness of the final populations of SGA and NRGA

| Benchmark Function | Best Fitness, 2D: SGA | Best Fitness, 2D: NRGA | % Decrement in Best Fitness (2D) | Best Fitness, 4D: SGA | Best Fitness, 4D: NRGA | % Decrement in Best Fitness (4D) |
|---|---|---|---|---|---|---|
| Rastrigin's function | 1.361 | 0.323 | 76.26 | 1.935 | 0.528 | 72.71 |
| Sphere function | 0.003 | 0.002 | 33.33 | 0.008 | 0.004 | 50.00 |
| Generalized Rastrigin's function | 2.064 | 0.054 | 97.38 | 4.469 | 2.156 | 51.75 |
| Griewank's function | 0.0020 | 0.0018 | 10.00 | 0.0056 | 0.0027 | 51.78 |
| Generalized Rosenbrock function | 0.954 | 0.086 | 90.98 | 2.512 | 0.247 | 90.16 |
| Ackley function | 2.186 | 0.349 | 84.03 | 4.868 | 1.347 | 72.32 |
| Rotated Griewank's function | 0.009 | 0.001 | 88.88 | 0.014 | 0.005 | 64.28 |
| Rotated Weierstrass's function | 2.484 | 0.046 | 98.14 | 3.857 | 0.765 | 80.16 |
| Branin function | 2.842 | 0.235 | 91.73 | 3.367 | 1.376 | 59.13 |
Table 2. Mean fitness of the final populations of SGA and NRGA

| Benchmark Function | Mean Fitness, 2D: SGA | Mean Fitness, 2D: NRGA | % Decrement in Mean Fitness (2D) | Mean Fitness, 4D: SGA | Mean Fitness, 4D: NRGA | % Decrement in Mean Fitness (4D) |
|---|---|---|---|---|---|---|
| Rastrigin's function | 19.72 | 12.39 | 37.17 | 19.96 | 13.48 | 32.46 |
| Sphere function | 1.248 | 1.000 | 19.87 | 2.276 | 1.874 | 17.66 |
| Generalized Rastrigin's function | 13.24 | 11.45 | 13.51 | 22.30 | 19.82 | 10.86 |
| Griewank's function | 1.214 | 1.048 | 13.67 | 2.365 | 2.163 | 08.54 |
| Generalized Rosenbrock function | 14.22 | 10.64 | 25.17 | 21.96 | 18.28 | 16.75 |
| Ackley function | 19.32 | 11.54 | 40.26 | 24.46 | 17.36 | 29.02 |
| Rotated Griewank's function | 1.104 | 0.514 | 53.44 | 1.874 | 1.265 | 32.49 |
| Rotated Weierstrass's function | 1.947 | 1.034 | 46.89 | 4.143 | 2.830 | 31.69 |
| Branin function | 14.12 | 12.05 | 14.66 | 18.43 | 16.36 | 11.23 |
Fig. 3. Performance comparison of NRGA and SGA for Rastrigin's function with two dimensions
Fig. 4. Performance comparison of NRGA and SGA for Rastrigin's function with four dimensions
Fig. 5. Performance comparison of NRGA and SGA for Generalized Rastrigin's function with two dimensions
Fig. 6. Performance comparison of NRGA and SGA for Generalized Rastrigin's function with four dimensions
Fig. 7. Performance comparison of NRGA and SGA for Generalized Rosenbrock function with two dimensions
Fig. 8. Performance comparison of NRGA and SGA for Generalized Rosenbrock function with four dimensions
Fig. 9. Performance comparison of NRGA and SGA for Ackley function with two dimensions
Fig. 10. Performance comparison of NRGA and SGA for Ackley function with four dimensions
Thus, our approach achieves higher accuracy for all nine benchmark multi-dimensional
numeric functions. Fig. 3 to Fig. 10 show the performance comparison of the non-revisiting GA with the
simple GA for the Rastrigin, Generalized Rastrigin, Generalized Rosenbrock and Ackley functions with two and
four dimensions, where the NRGA approach dominates significantly. It is quite clear from the results
that the performance of the non-revisiting GA with adaptive mutation is superior to that of the simple
GA.
5. CONCLUSION
In this paper, a novel non-revisiting GA with adaptive mutation is proposed and tested in the
domain of multi-dimensional numeric function optimization. Though the novel GA approach may not
completely eliminate revisits and redundant function evaluations, it assures adequate diversity
to evade the problem of premature convergence. The envisaged approach attains better accuracy
without much overhead in search time for duplicate individuals or in large data structures serving
as long-term memory for the GA.
The mechanism of probabilistic adaptive mutation provides the much-needed balance between
exploration and exploitation along with faster convergence to the optimum. It is exploratory in the
initial generations of the GA and exploitative towards the final generations. The greater the number of
generations of the GA, the smaller the change in the new individual that replaces a revisited search
point. The experimental results are very encouraging and show that the improved GA is clearly
superior to its conventional counterpart. The adaptation of the current approach to the domain of
rule mining in the field of knowledge discovery is underway.
REFERENCES
[1] Holland, J. H., (1975) Adaptation in Natural and Artificial Systems, Ann Arbor, Michigan Press
[2] Goldberg, D.E., (1989) Genetic Algorithms in Search, Optimization and Machine Learning. Addison-
Wesley, New York
[3] Michalewicz, Z., (1999) Genetic Algorithms + Data Structures = Evolution Programs, Springer-
Verlag, Berlin, Heidelberg, Germany
[4] Davis, L., (1991) Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York
[5] De Jong, K. A., (1975) An Analysis of the Behavior of a Class of Genetic Adaptive Systems, PhD
Thesis, University of Michigan, Ann Arbor, MI, USA
[6] Srinivasa, K. G., Venugopal, K. R. & Patnaik, L. M., (2007) “A Self Adaptive Migration Model
Genetic Algorithm for Data Mining Applications”, Information Sciences, 177, pp 4295—4313.
[7] Ono, I., Kita, H., & Kobayashi, S., (1999) “A Robust Real-Coded Genetic Algorithm using Unimodal
Normal Distribution Crossover Augmented by Uniform Crossover: Effects of Self-Adaptation of
Crossover Probabilities”, In Proceedings of the Genetic and Evolutionary Computation
Conference, vol. 1, pp. 496--503.
[8] Srinivas, M. & Patnaik, L.M., (1994) “Adaptive Probabilities of Crossover and Mutation in Genetic
Algorithms”, IEEE Transactions on Systems, Man and Cybernetics, pp 656—667.
[9] Herrera, F. & Lozano, M., (2000) “Gradual Distributed Real-Coded Genetic Algorithms”, IEEE
Transactions on Evolutionary Computation, 43--63
[10] Mauldin, M.L., (1984) “Maintaining Diversity in Genetic Search” In Proceeding National Conference
on Artificial Intelligence, pp. 247—250.
[11] Eshelman, L. & Schaffer, J., (1991) “Preventing Premature Convergence in Genetic Algorithms by
Preventing Incest”, In Proceedings of the Fourth International Conference on Genetic Algorithms.
Morgan Kaufmann, San Mateo, CA
[12] Povinelli, R.J. & Feng, X., (1999) “Improving Genetic Algorithms Performance by Hashing Fitness
Values” In Proceedings Artificial Neural Network, pp. 399--404.
[13] Kratica, J., (1999) “Improving the Performances of Genetic Algorithm by Caching” Computer
Artificial Intelligence, pp 271—283.
[14] Friedrich, T., Hebbinghaus N. & Neumann F., (2007) “Rigorous Analysis of Simple Diversity
Mechanisms” In Proc. Genetic Evolutionary Computation Conference, pp. 1219--1225.
[15] Ronald, S., (2011) “Duplicate Genotypes in a Genetic Algorithm” In Proc. IEEE Int. Conf.
Evolutionary Computation, vol. 131, pp. 793-798.
APPENDIX I: BENCHMARK FUNCTIONS

1. Rastrigin's function: f1(x) = Σ_{i=1}^{D} [xi² − 10cos(2πxi) + 10]
   where x ∈ [−5.12, 5.12]^D
   Min f1(x) = f1([0, 0, …, 0]) = 0

2. Sphere function [14]: f2(x) = Σ_{i=1}^{D} xi²
   where x ∈ [−5.12, 5.12]^D
   Min f2(x) = f2([0, 0, …, 0]) = 0

3. Generalized Rosenbrock function: f3(x) = Σ_{i=1}^{D−1} [100(x_{i+1} − xi²)² + (xi − 1)²]
   where x ∈ [−5.12, 5.12]^D
   Min f3(x) = f3([1, 1, …, 1]) = 0

4. Generalized Rastrigin function: f4(x) = Σ_{i=1}^{D} [xi² − 10cos(2πxi) + 10]
   where x ∈ [−5.12, 5.12]^D
   Min f4(x) = f4([0, 0, …, 0]) = 0

5. Griewank's function: f5(x) = (1/4000) Σ_{i=1}^{D} xi² − Π_{i=1}^{D} cos(xi/√i) + 1
   where x ∈ [−5.12, 5.12]^D
   Min f5(x) = f5([0, 0, …, 0])

6. Ackley function: f6(x) = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} xi²)) − exp((1/D) Σ_{i=1}^{D} cos(2πxi)) + 20 + e
   where x ∈ [−5.12, 5.12]^D
   Min f6(x) = f6([0, 0, …, 0]) = 0

7. Rotated Griewank's function: f7(x) = (1/4000) Σ_{i=1}^{D} zi² − Π_{i=1}^{D} cos(zi/√i) + 1
   where z = xM, x ∈ [−5.12, 5.12]^D
   Min f7(x) = f7([0, 0, …, 0])

8. Rotated Weierstrass's function [14]

9. Branin function: f9(x) = (x2 − (5.1/(4π²)) x1² + (5/π) x1 − 6)² + 10(1 − 1/(8π)) cos(x1) + 10
   where x ∈ [−5.12, 5.12]²
   Min f9(x) = f9([−3.142, 12.275]) = f9([3.142, 2.275])
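As a quick reference, here are minimal Python implementations of two of the benchmarks above. This is our own sketch, not code from the paper, whose experiments were run in MATLAB.

```python
# Minimal Python sketches of two of the benchmark functions above
# (our own illustration; the paper's experiments were run in MATLAB).
import math

def rastrigin(x):
    """f1/f4: sum of xi^2 - 10*cos(2*pi*xi) + 10; global minimum 0 at the origin."""
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def ackley(x):
    """f6: global minimum 0 at the origin."""
    d = len(x)
    term1 = -20.0 * math.exp(-0.2 * math.sqrt(sum(xi * xi for xi in x) / d))
    term2 = -math.exp(sum(math.cos(2.0 * math.pi * xi) for xi in x) / d)
    return term1 + term2 + 20.0 + math.e

print(rastrigin([0.0, 0.0, 0.0, 0.0]), ackley([0.0, 0.0, 0.0, 0.0]))  # both ~0.0
```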
Authors
Dr. Saroj is an Associate Professor in the Department of Computer Science &
Engineering, Guru Jambheshwar University of Science and Technology, Hisar, India.
She earned her doctorate degree in Computer Science in 2010 from the School of
Computer and Systems Sciences, Jawaharlal Nehru University, New Delhi. Her research
interests include Knowledge Discovery & Data Mining, Evolutionary Algorithms and
Machine Learning.

Devraj is an M.Tech. student in the Department of Computer Science & Engineering,
Guru Jambheshwar University of Science and Technology, Hisar, India.

More Related Content

What's hot

Biology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering OptimizationBiology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering Optimization
Xin-She Yang
 
Ijcatr04051005
Ijcatr04051005Ijcatr04051005
Ijcatr04051005
Editor IJCATR
 
Improving the effectiveness of information retrieval system using adaptive ge...
Improving the effectiveness of information retrieval system using adaptive ge...Improving the effectiveness of information retrieval system using adaptive ge...
Improving the effectiveness of information retrieval system using adaptive ge...
ijcsit
 
The potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delayThe potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delay
Pieter Rautenbach
 
Study of relevancy, diversity, and novelty in recommender systems
Study of relevancy, diversity, and novelty in recommender systemsStudy of relevancy, diversity, and novelty in recommender systems
Study of relevancy, diversity, and novelty in recommender systems
Chemseddine Berbague
 
I045046066
I045046066I045046066
I045046066
IJERA Editor
 
IRJET- Classification of Chemical Medicine or Drug using K Nearest Neighb...
IRJET-  	  Classification of Chemical Medicine or Drug using K Nearest Neighb...IRJET-  	  Classification of Chemical Medicine or Drug using K Nearest Neighb...
IRJET- Classification of Chemical Medicine or Drug using K Nearest Neighb...
IRJET Journal
 
Improvement of genetic algorithm using artificial bee colony
Improvement of genetic algorithm using artificial bee colonyImprovement of genetic algorithm using artificial bee colony
Improvement of genetic algorithm using artificial bee colony
journalBEEI
 
Conference slide
Conference slideConference slide
Conference slide
RaliyaAbubakar
 
Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...
Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...
Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...
IJERA Editor
 
Ontologies mining using association rules
Ontologies mining using association rulesOntologies mining using association rules
Ontologies mining using association rules
Chemseddine Berbague
 
DataMining_CA2-4
DataMining_CA2-4DataMining_CA2-4
DataMining_CA2-4
Aravind Kumar
 
Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...
Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...
Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...
ijseajournal
 
XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...
XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...
XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...
Andrea Barraza-Urbina
 
New Local Search Strategy in Artificial Bee Colony Algorithm
New Local Search Strategy in Artificial Bee Colony Algorithm New Local Search Strategy in Artificial Bee Colony Algorithm
New Local Search Strategy in Artificial Bee Colony Algorithm
Dr Sandeep Kumar Poonia
 
Eugm 2011 mehta - adaptive designs for phase 3 oncology trials
Eugm 2011   mehta - adaptive designs for phase 3 oncology trialsEugm 2011   mehta - adaptive designs for phase 3 oncology trials
Eugm 2011 mehta - adaptive designs for phase 3 oncology trials
Cytel USA
 
Cukoo srch
Cukoo srchCukoo srch
Cukoo srch
siet_pradeep18
 
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...
csandit
 

What's hot (18)

Biology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering OptimizationBiology-Derived Algorithms in Engineering Optimization
Biology-Derived Algorithms in Engineering Optimization
 
Ijcatr04051005
Ijcatr04051005Ijcatr04051005
Ijcatr04051005
 
Improving the effectiveness of information retrieval system using adaptive ge...
Improving the effectiveness of information retrieval system using adaptive ge...Improving the effectiveness of information retrieval system using adaptive ge...
Improving the effectiveness of information retrieval system using adaptive ge...
 
The potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delayThe potential role of ai in the minimisation and mitigation of project delay
The potential role of ai in the minimisation and mitigation of project delay
 
Study of relevancy, diversity, and novelty in recommender systems
Study of relevancy, diversity, and novelty in recommender systemsStudy of relevancy, diversity, and novelty in recommender systems
Study of relevancy, diversity, and novelty in recommender systems
 
I045046066
I045046066I045046066
I045046066
 
IRJET- Classification of Chemical Medicine or Drug using K Nearest Neighb...
IRJET-  	  Classification of Chemical Medicine or Drug using K Nearest Neighb...IRJET-  	  Classification of Chemical Medicine or Drug using K Nearest Neighb...
IRJET- Classification of Chemical Medicine or Drug using K Nearest Neighb...
 
Improvement of genetic algorithm using artificial bee colony
Improvement of genetic algorithm using artificial bee colonyImprovement of genetic algorithm using artificial bee colony
Improvement of genetic algorithm using artificial bee colony
 
Conference slide
Conference slideConference slide
Conference slide
 
Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...
Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...
Leave one out cross validated Hybrid Model of Genetic Algorithm and Naïve Bay...
 
Ontologies mining using association rules
Ontologies mining using association rulesOntologies mining using association rules
Ontologies mining using association rules
 
DataMining_CA2-4
DataMining_CA2-4DataMining_CA2-4
DataMining_CA2-4
 
Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...
Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...
Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential E...
 
XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...
XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...
XPLODIV: An Exploitation-Exploration Aware Diversification Approach for Recom...
 
New Local Search Strategy in Artificial Bee Colony Algorithm
New Local Search Strategy in Artificial Bee Colony Algorithm New Local Search Strategy in Artificial Bee Colony Algorithm
New Local Search Strategy in Artificial Bee Colony Algorithm
 
Eugm 2011 mehta - adaptive designs for phase 3 oncology trials
Eugm 2011   mehta - adaptive designs for phase 3 oncology trialsEugm 2011   mehta - adaptive designs for phase 3 oncology trials
Eugm 2011 mehta - adaptive designs for phase 3 oncology trials
 
Cukoo srch
Cukoo srchCukoo srch
Cukoo srch
 
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR...
 

Similar to A Non-Revisiting Genetic Algorithm for Optimizing Numeric Multi-Dimensional Functions

Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
ijscai
 
Predictive job scheduling in a connection limited system using parallel genet...
Predictive job scheduling in a connection limited system using parallel genet...Predictive job scheduling in a connection limited system using parallel genet...
Predictive job scheduling in a connection limited system using parallel genet...
Mumbai Academisc
 
Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...
Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...
Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...
IJCSIS Research Publications
 
Survey on evolutionary computation tech techniques and its application in dif...
Survey on evolutionary computation tech techniques and its application in dif...Survey on evolutionary computation tech techniques and its application in dif...
Survey on evolutionary computation tech techniques and its application in dif...
ijitjournal
 
Parallel evolutionary approach paper
Parallel evolutionary approach paperParallel evolutionary approach paper
Parallel evolutionary approach paper
Priti Punia
 
Memetic search in differential evolution algorithm
Memetic search in differential evolution algorithmMemetic search in differential evolution algorithm
Memetic search in differential evolution algorithm
Dr Sandeep Kumar Poonia
 
T01732115119
T01732115119T01732115119
T01732115119
IOSR Journals
 
Artificial Intelligence in Robot Path Planning
Artificial Intelligence in Robot Path PlanningArtificial Intelligence in Robot Path Planning
Artificial Intelligence in Robot Path Planning
iosrjce
 
IJCSI-2015-12-2-10138 (1) (2)
IJCSI-2015-12-2-10138 (1) (2)IJCSI-2015-12-2-10138 (1) (2)
IJCSI-2015-12-2-10138 (1) (2)
Dr Muhannad Al-Hasan
 
Nature Inspired Models And The Semantic Web
Nature Inspired Models And The Semantic WebNature Inspired Models And The Semantic Web
Nature Inspired Models And The Semantic Web
Stefan Ceriu
 
M017127578
M017127578M017127578
M017127578
IOSR Journals
 
Ce25481484
Ce25481484Ce25481484
Ce25481484
IJERA Editor
 
Parallel and distributed genetic algorithm with multiple objectives to impro...
Parallel and distributed genetic algorithm  with multiple objectives to impro...Parallel and distributed genetic algorithm  with multiple objectives to impro...
Parallel and distributed genetic algorithm with multiple objectives to impro...
khalil IBRAHIM
 
Multiple Criteria Decision Making for Hospital Location Allocation Based on I...
Multiple Criteria Decision Making for Hospital Location Allocation Based on I...Multiple Criteria Decision Making for Hospital Location Allocation Based on I...
Multiple Criteria Decision Making for Hospital Location Allocation Based on I...
IRJET Journal
 
Advanced Optimization Techniques
Advanced Optimization TechniquesAdvanced Optimization Techniques
Advanced Optimization Techniques
Valerie Felton
 
Fuzzy Genetic Algorithm
Fuzzy Genetic AlgorithmFuzzy Genetic Algorithm
Fuzzy Genetic Algorithm
Pintu Khan
 
Genetic algorithm fitness function
Genetic algorithm fitness functionGenetic algorithm fitness function
Genetic algorithm fitness function
Prof Ansari
 
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATABINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
ijejournal
 
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATABINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
acijjournal
 
Analysis of selection schemes for solving job shop scheduling problem using g...
Analysis of selection schemes for solving job shop scheduling problem using g...Analysis of selection schemes for solving job shop scheduling problem using g...
Analysis of selection schemes for solving job shop scheduling problem using g...
eSAT Journals
 

Similar to A Non-Revisiting Genetic Algorithm for Optimizing Numeric Multi-Dimensional Functions (20)

Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
Multi-Population Methods with Adaptive Mutation for Multi-Modal Optimization ...
 
Predictive job scheduling in a connection limited system using parallel genet...
Predictive job scheduling in a connection limited system using parallel genet...Predictive job scheduling in a connection limited system using parallel genet...
Predictive job scheduling in a connection limited system using parallel genet...
 
Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...
Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...
Parallel Evolutionary Algorithms for Feature Selection in High Dimensional Da...
 
Survey on evolutionary computation tech techniques and its application in dif...
Survey on evolutionary computation tech techniques and its application in dif...Survey on evolutionary computation tech techniques and its application in dif...
Survey on evolutionary computation tech techniques and its application in dif...
 
Parallel evolutionary approach paper
Parallel evolutionary approach paperParallel evolutionary approach paper
Parallel evolutionary approach paper
 
Memetic search in differential evolution algorithm
Memetic search in differential evolution algorithmMemetic search in differential evolution algorithm
Memetic search in differential evolution algorithm
 
T01732115119
T01732115119T01732115119
T01732115119
 
Artificial Intelligence in Robot Path Planning
Artificial Intelligence in Robot Path PlanningArtificial Intelligence in Robot Path Planning
Artificial Intelligence in Robot Path Planning
 
IJCSI-2015-12-2-10138 (1) (2)
IJCSI-2015-12-2-10138 (1) (2)IJCSI-2015-12-2-10138 (1) (2)
IJCSI-2015-12-2-10138 (1) (2)
 
Nature Inspired Models And The Semantic Web
Nature Inspired Models And The Semantic WebNature Inspired Models And The Semantic Web
Nature Inspired Models And The Semantic Web
 
M017127578
M017127578M017127578
M017127578
 
Ce25481484
Ce25481484Ce25481484
Ce25481484
 
Parallel and distributed genetic algorithm with multiple objectives to impro...
Parallel and distributed genetic algorithm  with multiple objectives to impro...Parallel and distributed genetic algorithm  with multiple objectives to impro...
Parallel and distributed genetic algorithm with multiple objectives to impro...
 
Multiple Criteria Decision Making for Hospital Location Allocation Based on I...
Multiple Criteria Decision Making for Hospital Location Allocation Based on I...Multiple Criteria Decision Making for Hospital Location Allocation Based on I...
Multiple Criteria Decision Making for Hospital Location Allocation Based on I...
 
Advanced Optimization Techniques
Advanced Optimization TechniquesAdvanced Optimization Techniques
Advanced Optimization Techniques
 
Fuzzy Genetic Algorithm
Fuzzy Genetic AlgorithmFuzzy Genetic Algorithm
Fuzzy Genetic Algorithm
 
Genetic algorithm fitness function
Genetic algorithm fitness functionGenetic algorithm fitness function
Genetic algorithm fitness function
 
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATABINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
 
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATABINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
BINARY SINE COSINE ALGORITHMS FOR FEATURE SELECTION FROM MEDICAL DATA
 
Analysis of selection schemes for solving job shop scheduling problem using g...
Analysis of selection schemes for solving job shop scheduling problem using g...Analysis of selection schemes for solving job shop scheduling problem using g...
Analysis of selection schemes for solving job shop scheduling problem using g...
 

Recently uploaded

Introduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - CybersecurityIntroduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - Cybersecurity
mikeeftimakis1
 
Serial Arm Control in Real Time Presentation
Serial Arm Control in Real Time PresentationSerial Arm Control in Real Time Presentation
Serial Arm Control in Real Time Presentation
tolgahangng
 
Infrastructure Challenges in Scaling RAG with Custom AI models
Infrastructure Challenges in Scaling RAG with Custom AI modelsInfrastructure Challenges in Scaling RAG with Custom AI models
Infrastructure Challenges in Scaling RAG with Custom AI models
Zilliz
 
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUHCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
panagenda
 
TrustArc Webinar - 2024 Global Privacy Survey
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc Webinar - 2024 Global Privacy Survey
TrustArc Webinar - 2024 Global Privacy Survey
TrustArc
 
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceAI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
IndexBug
 
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
Neo4j
 
Artificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopmentArtificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopment
Octavian Nadolu
 
Communications Mining Series - Zero to Hero - Session 1
Communications Mining Series - Zero to Hero - Session 1Communications Mining Series - Zero to Hero - Session 1
Communications Mining Series - Zero to Hero - Session 1
DianaGray10
 
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
SOFTTECHHUB
 
RESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for studentsRESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for students
KAMESHS29
 
Presentation of the OECD Artificial Intelligence Review of Germany
Presentation of the OECD Artificial Intelligence Review of GermanyPresentation of the OECD Artificial Intelligence Review of Germany
Presentation of the OECD Artificial Intelligence Review of Germany
innovationoecd
 
Essentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FMEEssentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FME
Safe Software
 
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Speck&Tech
 
Full-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalizationFull-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalization
Zilliz
 
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
Edge AI and Vision Alliance
 
Driving Business Innovation: Latest Generative AI Advancements & Success Story
Driving Business Innovation: Latest Generative AI Advancements & Success StoryDriving Business Innovation: Latest Generative AI Advancements & Success Story
Driving Business Innovation: Latest Generative AI Advancements & Success Story
Safe Software
 
Mind map of terminologies used in context of Generative AI
Mind map of terminologies used in context of Generative AIMind map of terminologies used in context of Generative AI
Mind map of terminologies used in context of Generative AI
Kumud Singh
 
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
SOFTTECHHUB
 
National Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practicesNational Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practices
Quotidiano Piemontese
 

Recently uploaded (20)

Introduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - CybersecurityIntroduction to CHERI technology - Cybersecurity
Introduction to CHERI technology - Cybersecurity
 
Serial Arm Control in Real Time Presentation
Serial Arm Control in Real Time PresentationSerial Arm Control in Real Time Presentation
Serial Arm Control in Real Time Presentation
 
Infrastructure Challenges in Scaling RAG with Custom AI models
Infrastructure Challenges in Scaling RAG with Custom AI modelsInfrastructure Challenges in Scaling RAG with Custom AI models
Infrastructure Challenges in Scaling RAG with Custom AI models
 
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUHCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
 
TrustArc Webinar - 2024 Global Privacy Survey
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc Webinar - 2024 Global Privacy Survey
TrustArc Webinar - 2024 Global Privacy Survey
 
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceAI 101: An Introduction to the Basics and Impact of Artificial Intelligence
AI 101: An Introduction to the Basics and Impact of Artificial Intelligence
 
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024GraphSummit Singapore | The Art of the  Possible with Graph - Q2 2024
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
 
Artificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopmentArtificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopment
 
Communications Mining Series - Zero to Hero - Session 1
Communications Mining Series - Zero to Hero - Session 1Communications Mining Series - Zero to Hero - Session 1
Communications Mining Series - Zero to Hero - Session 1
 
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...
 
RESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for studentsRESUME BUILDER APPLICATION Project for students
RESUME BUILDER APPLICATION Project for students
 
Presentation of the OECD Artificial Intelligence Review of Germany
Presentation of the OECD Artificial Intelligence Review of GermanyPresentation of the OECD Artificial Intelligence Review of Germany
Presentation of the OECD Artificial Intelligence Review of Germany
 
Essentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FMEEssentials of Automations: The Art of Triggers and Actions in FME
Essentials of Automations: The Art of Triggers and Actions in FME
 
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?
 
Full-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalizationFull-RAG: A modern architecture for hyper-personalization
Full-RAG: A modern architecture for hyper-personalization
 
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
“Building and Scaling AI Applications with the Nx AI Manager,” a Presentation...
 
Driving Business Innovation: Latest Generative AI Advancements & Success Story
Driving Business Innovation: Latest Generative AI Advancements & Success StoryDriving Business Innovation: Latest Generative AI Advancements & Success Story
Driving Business Innovation: Latest Generative AI Advancements & Success Story
 
Mind map of terminologies used in context of Generative AI
Mind map of terminologies used in context of Generative AIMind map of terminologies used in context of Generative AI
Mind map of terminologies used in context of Generative AI
 
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!
 
National Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practicesNational Security Agency - NSA mobile device best practices
National Security Agency - NSA mobile device best practices
 

A Non-Revisiting Genetic Algorithm for Optimizing Numeric Multi-Dimensional Functions

  • 1. International Journal on Computational Sciences & Applications (IJCSA) Vo2, No.1, February 2012 DOI : 10.5121/ijcsa.2012.2108 83 A NON-REVISITING GENETIC ALGORITHM FOR OPTIMIZING NUMERIC MULTI-DIMENSIONAL FUNCTIONS Saroj1 and Devraj2 1 Department of Computer Engineering, Jambheshwar University of Science and Technology, Hisar, Haryana ratnoo.saroj@gmail.com 2 Department of Computer Engineering, Jambheshwar University of Science and Technology, Hisar, Haryana devraj.kamboj@gmail.com ABSTRACT Genetic Algorithm (GA) is a robust and popular stochastic optimization algorithm for large and complex search spaces. The major shortcomings of Genetic Algorithms are premature convergence and revisits to individual solutions in the search space. In other words, Genetic algorithm is a revisiting algorithm that escorts to duplicate function evaluations which is a clear wastage of time and computational resources. In this paper, a non-revisiting genetic algorithm with adaptive mutation is proposed for the domain of Multi- Dimensional numeric function optimization. In this algorithm whenever a revisit occurs, the underlined search point is replaced with a mutated version of the best/random (chosen probabilistically) individual from the GA population. Furthermore, the recommended approach is not using any extra memory resources to avoid revisits. To analyze the influence of the method, the proposed non-revisiting algorithm is evaluated using nine benchmarks functions with two and four dimensions. The performance of the proposed genetic algorithm is superior as contrasted to simple genetic algorithm as confirmed by the experimental results. KEYWORDS Function optimization, Genetic algorithm, Non-revisiting, Adaptive mutation. 1. INTRODUCTION Developing new optimization techniques is an active area of research and Genetic Algorithm (GA) is a relatively new stochastic optimization algorithm pioneered by Holland [1]. A GA is capable of finding optimal solutions for complex problems in a wide spectrum of applications due to its global nature. A GA is an iterative procedure that maintains a population of structures that are candidate solutions to the specific problem under consideration. In each generation, each individual is evaluated using a fitness function that measures the quality of solution provided by the individual. Further, genetic operators such as reproduction, crossover and mutation are applied to find the diverse candidate solutions. In fact, a GA mimics the natural principle of survival of fittest. A fitness proportionate selection and GA operators ensures the better and better
  • 2. International Journal on Computational Sciences & Applications (IJCSA) Vo2, No.1, February 2012 84 fit solutions to emerge in successive generations [2][3][4]. However, GAs are not without limitations. Two of the main problems are: 1. premature convergence i.e. many a times a GA converges to some local optimal solution. 2. redundant function evaluations. A simple genetic algorithm do not memorizes the search points or solutions to the problem that it visits in its life time and it revisits lots of search points generating duplicate solutions resulting into redundant fitness computations. Here, a revisit to a search position x is defined as a re-evaluation of a function of x which has been evaluated before. The problem of revisit is all the more severe towards the end of a GA run. In many domains, the fitness evaluation is computationally very expensive and lots of time is wasted in revisiting the parts of the search space and duplicate function evaluations. The optimization is become more complex when the functions are having multi-dimensions. In this paper, we propose an improved GA with adaptive mutation operator to avoid revisits and redundant fitness evaluations to a large extent. This GA has the elitist approach and retains the best individual in every new population. A look up for revisits is made only in the current population along with the population of previous generation. If any individual produced is found duplicate, it is replaced probabilistically with a mutated version of the best individual or of a random individual. The mutation operator is adaptive in the sense that its power of exploration decreases and power of exploitation increases with the number of generations. The proposed approach demonstrates that the duplicate removal introduces a powerful diversity preservation mechanism which not only results in better final-population solutions but also avoids premature convergence. The results are presented for nine benchmark functions with multi-dimensions and illustrate the effectiveness of duplicate removal through adaptive mutation. The results are directly compared to a simple GA which is not removing duplicate solutions generated in its successive generations. Rest of the paper is organized as below. Section II describes the related work. The proposed non revisiting Genetic algorithm which removes duplicate individuals through adaptive mutation is given in section III. Experimental design and results are enlisted in section IV. Section V concludes the papers and points to the future scope of this work. 2. RELATED WORK A number of implementations of genetic algorithms have been reported in the literature [1][2][3][4][5]. Since the inception of genetic algorithms, a number of advanced GA approaches came up improving the efficiency and efficacy of the results achieved in the domain of function optimization along with various other domains. Several of these approaches addressed the problem of premature convergence and revisiting behaviour of genetic algorithms. One of these advance approaches is devising new GA operators that can be adapted to produce suitable individuals by considering the state of the current population with respect to global optimal solution[6][7][8][9]. Srinivas and Patnaik (1994) employed a self adapted GA that achieved global optima more often than a simple GA for several multimodal functions [8]. 
In their work, they suggested the use of adaptive probabilities of crossover and mutation to maintain adequate diversity and avoid premature convergence. Another significant contribution came from Herrera and Lozano, who suggested heterogeneous gradual distributed real-coded genetic algorithms (RCGAs) for the function optimization domain. They devised new fuzzy-connective-based crossovers with varying degrees of exploration and exploitation, which proved very effective in avoiding premature convergence and achieved some of the best reported results for several benchmark function optimization problems [9].
The other important problem with GAs that has recently been in the focus of the GA research community is redundant function evaluations caused by revisits. Mauldin [10] was among the first to enhance the performance of a genetic algorithm by eliminating duplicate genotypes during a GA run. Mauldin used a uniqueness operator that allowed a new child x to be inserted into the population only if x was farther than a Hamming-distance threshold from all existing population genotypes. Davis [4] also showed that, for a comparable number of child evaluations, a binary-coded GA that removes duplicates in the population has superior performance. Eshelman and Schaffer [11] reconfirmed this observation by using selection-based innovations and new GA operators that prevented revisits.

The problem of duplicate function evaluations has also been addressed by providing the GA with a short/long-term memory, i.e., the GA stores all the search points visited, together with their fitness, in some data structure. In such approaches, every time a new search point is produced, the memory is consulted before its fitness is computed; if the search point already exists, its fitness is not recomputed. If the new solution is not in the memory, its fitness is computed and appended to the memory. Binary search trees, binary partition trees, heaps, hash tables and cache memory have been used to supplement memory. Such strategies have enhanced GA performance by eliminating revisits partially or completely [12][13][14][15]. Recently, Yuen and Chow [16] used a novel binary space partitioning tree to eliminate duplicate individuals. Saroj et al. used a heap structure to avoid redundant fitness evaluations in the domain of rule mining, and their approach proved effective for large datasets [17].

Though all these duplicate-removal methods reduce the run time of a GA, particularly for applications where the objective function is computationally expensive, they require huge data structures, and a significant amount of time is spent on memory look-ups. It is not uncommon for a GA to run for thousands of generations with a population of hundreds of individuals. If we assume a GA with 100 individuals and 5000 generations, we would need a data structure that can store 250,000 problem solutions, and that is assuming half the individuals produced are duplicates; the GA would still require 500,000 look-ups to avoid redundant fitness computation. GAs are already considered slow compared with other optimization techniques, and these approaches slow them down further. Clearly, this method succeeds only in domains where the fitness computation time is significantly larger than the memory look-up time, and it is not suitable for the domain of function optimization, where fitness evaluation is relatively inexpensive. Therefore, in this paper we adopt a genetic algorithm that maintains diversity through adaptive mutation and avoids revisits to a large extent, without any additional memory overheads.

3. PROPOSED NON-REVISITING GA WITH ADAPTIVE MUTATION

A non-revisiting algorithm is one that does not visit search points it has already visited. The improved GA with the non-revisiting mechanism and adaptive mutation has to perform a few extra steps compared to a simple GA. These steps are used to find duplicate individuals.
If a duplicate individual is found, it is mutated and reinserted into the current population. Duplicates are looked up with respect to the current and the previous generation only. As a special condition, the best individual is preserved and never mutated. The flow chart and the algorithm for the proposed GA are given in Fig. 1 and Fig. 2 respectively.
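The revisit check itself can be sketched in a few lines. The following Python fragment is only an illustration of the idea described above, under assumptions made here: individuals are sequences of real coordinates, coordinates are rounded so that equality testing is exact, and the names find_revisits, old_pop and new_pop are placeholders rather than names taken from the paper.

def find_revisits(new_pop, old_pop, best_index, precision=6):
    """Flag individuals of new_pop that revisit a point from old_pop or from earlier in new_pop.
    Corresponds to the VF[] flags of the paper's pseudocode; the best individual is never flagged."""
    def key(ind):
        # Round coordinates so floating-point noise does not hide duplicates.
        return tuple(round(x, precision) for x in ind)

    seen = {key(ind) for ind in old_pop}   # search points of the previous generation
    flags = []
    for i, ind in enumerate(new_pop):
        k = key(ind)
        flags.append(k in seen and i != best_index)
        seen.add(k)                        # also catches duplicates inside the new population
    return flags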
Figure 1. Step-by-step procedure for the improved GA (flow chart: start; initialize and evaluate the population; apply selection, crossover and mutation to form the new population; find revisits and replace each revisited individual with a probabilistically mutated best/random individual; repeat until the stopping criterion is met, then output the best individual).

The mutation applied is Gaussian adaptive mutation; the Gaussian function generates a random number around a zero mean. The mutation is governed by the following formulae:

mscale = mscale − mshrink × mscale × (current_generation / total_generations)    (1)
x(i) = x_best ± gaussian(rand × mscale)    (2)

The amount of mutation is controlled by the two parameters mscale and mshrink. Here, mscale represents the variance of the mutation in the first generation, and mshrink represents the amount by which the mutation shrinks over successive generations. The mutation scale decreases as the number of generations increases. It is clear from the above formulae that this kind of mutation is exploratory during the initial generations of the GA and exploitative towards the final ones. We have kept mscale at 1.0 and mshrink at 0.75.
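Equations (1) and (2) amount to a Gaussian perturbation around a parent point whose scale shrinks as the run progresses. As a minimal sketch (not the authors' MATLAB code), and using the settings quoted above (mscale = 1.0, mshrink = 0.75), they could be implemented as follows; the function name adaptive_mutation, the use of random.gauss and the clamping to the search bounds are choices made for this sketch.

import random

def adaptive_mutation(parent, generation, total_generations,
                      mscale=1.0, mshrink=0.75, bounds=(-5.12, 5.12)):
    """Return a mutated copy of 'parent' using the shrinking Gaussian step of Eqs. (1)-(2)."""
    # Eq. (1): the mutation scale decays with the generation count.
    scale = mscale - mshrink * mscale * (generation / total_generations)
    lo, hi = bounds
    child = []
    for x in parent:
        # Eq. (2): add a zero-mean Gaussian step whose spread is rand * current scale.
        step = random.gauss(0.0, scale * random.random())
        child.append(min(hi, max(lo, x + step)))   # keep the point inside the search domain
    return child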
Figure 2. Algorithm for the improved GA with no revisits:

1. Begin
2. Initial conditions: old-pop = [], new-pop = []; stopping flag SF = 0; Best-Fit = 0; visiting flags VF[] = 0
3. Initialize the population in old-pop
4. Evaluate the fitness of old-pop and record the overall best individual found up to the current generation
5. If (stopping criterion is met)
   5.1 SF = 1 // set stopping flag
   5.2 Output the best individual
   5.3 Stop
   Else
   5.4 Copy the best individual into new-pop
   5.5 Perform a GA step by applying the GA operators, i.e. selection, crossover and mutation
   5.6 Keep the old population and store the newly generated individuals in new-pop
   5.7 For (i = 1 to pop-size): check for revisits within new-pop and with respect to old-pop
   5.8     If (revisit && not best individual)
   5.9         VF[i] = 1
   5.10 For (i = 1 to pop-size)
   5.11     If (VF[i] == 1) then new-pop[i] = mutate(new-pop[i])
   5.12 old-pop = new-pop; go to step 4

The proposed non-revisiting GA with adaptive mutation has three key strengths:
1. It automatically ensures the maintenance of diversity and prevents premature convergence. Most of the individuals in a GA population are guaranteed to be different; by the nature of the proposed GA, it is impossible for a population to consist of only one kind of individual.
2. It might not completely eliminate revisits; however, it does not require a large data structure to store individuals for duplicate look-ups and only uses the previous and current populations, which are available anyway.
3. It probabilistically exploits the best individual and converges faster without suffering from premature convergence.
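For illustration only, one generation of the above procedure could be wired together as below. The sketch reuses the find_revisits and adaptive_mutation fragments given earlier, assumes user-supplied select, crossover and mutate routines (with crossover returning a single child), and follows the textual description in which a revisited point is replaced by a mutated copy of the best or a random individual; it is an outline, not the authors' implementation.

import random

def nrga_generation(old_pop, fitness, generation, total_generations,
                    select, crossover, mutate, p_best=0.5):
    """One elitist NRGA step: breed, detect revisits, replace them (steps 5.4-5.12 above)."""
    scores = [fitness(ind) for ind in old_pop]
    best = min(range(len(old_pop)), key=scores.__getitem__)      # index of the best (minimisation)

    # Steps 5.4-5.6: elitism plus the ordinary GA operators.
    new_pop = [old_pop[best]]
    while len(new_pop) < len(old_pop):
        child = mutate(crossover(select(old_pop, scores), select(old_pop, scores)))
        new_pop.append(child)

    # Steps 5.7-5.11: replace every revisited point with a mutated best/random individual.
    for i, is_revisit in enumerate(find_revisits(new_pop, old_pop, best_index=0)):
        if is_revisit:
            parent = old_pop[best] if random.random() < p_best else random.choice(old_pop)
            new_pop[i] = adaptive_mutation(parent, generation, total_generations)
    return new_pop                                               # becomes old-pop of the next step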
4. EXPERIMENTAL RESULTS

4.1. Test Function Set

Nine benchmark functions, evaluated in two and four dimensions, are used to test the proposed GA:
1. Rastrigin's function
2. Sphere function
3. Generalized Rosenbrock function
4. Generalized Rastrigin function
5. Griewank's function
6. Ackley function
7. Rotated Griewank's function
8. Rotated Weierstrass's function
9. Branin function

All these functions are given in detail in Appendix I. The first four functions are unimodal; the remaining five are multimodal functions designed with a considerable number of local minima, of which the seventh and eighth are rotated multimodal functions.

The improved GA is implemented in MATLAB. A comparison is made between the simple GA (SGA) and the proposed non-revisiting GA (NRGA) on the basis of the mean and the best fitness over the generations for all nine numeric functions with two and four dimensions. The best fitness is the minimum score of the population. Both GAs stop when there is no improvement in the best score over the last fifty generations. The population size is kept at 20 for all the functions, and the crossover and mutation rates are 0.6 and 0.01 respectively. The normal mutation rate is kept low because the adaptive mutation used to remove duplicates is also applied. The best individual or a random individual is mutated with equal likelihood (probability = 0.5) to replace the revisited points in the new population. We use real encoding, proportional scaling, remainder stochastic selection, one-point crossover and Gaussian mutation. The results are averaged over 20 runs of the GA (an illustrative code sketch of this setup is given after the result figures below). A comparison of the best and mean fitness of the final populations of the proposed GA and the simple GA for the nine numeric functions, together with the percentage decrement in fitness value, is shown in Tables 1 and 2.

Table 1. Best fitness of the final populations of SGA and NRGA

Benchmark Function | SGA Best (2D) | NRGA Best (2D) | % Decr. (2D) | SGA Best (4D) | NRGA Best (4D) | % Decr. (4D)
Rastrigin's function | 1.361 | 0.323 | 76.26 | 1.935 | 0.528 | 72.71
Sphere function | 0.003 | 0.002 | 33.33 | 0.008 | 0.004 | 50.00
Generalized Rastrigin's function | 2.064 | 0.054 | 97.38 | 4.469 | 2.156 | 51.75
Griewank's function | 0.0020 | 0.0018 | 10.00 | 0.0056 | 0.0027 | 51.78
Generalized Rosenbrock function | 0.954 | 0.086 | 90.98 | 2.512 | 0.247 | 90.16
Ackley function | 2.186 | 0.349 | 84.03 | 4.868 | 1.347 | 72.32
Rotated Griewank's function | 0.009 | 0.001 | 88.88 | 0.014 | 0.005 | 64.28
Rotated Weierstrass's function | 2.484 | 0.046 | 98.14 | 3.857 | 0.765 | 80.16
Branin function | 2.842 | 0.235 | 91.73 | 3.367 | 1.376 | 59.13
Table 2. Mean fitness of the final populations of SGA and NRGA

Benchmark Function | SGA Mean (2D) | NRGA Mean (2D) | % Decr. (2D) | SGA Mean (4D) | NRGA Mean (4D) | % Decr. (4D)
Rastrigin's function | 19.72 | 12.39 | 37.17 | 19.96 | 13.48 | 32.46
Sphere function | 1.248 | 1.000 | 19.87 | 2.276 | 1.874 | 17.66
Generalized Rastrigin's function | 13.24 | 11.45 | 13.51 | 22.30 | 19.82 | 10.86
Griewank's function | 1.214 | 1.048 | 13.67 | 2.365 | 2.163 | 8.54
Generalized Rosenbrock function | 14.22 | 10.64 | 25.17 | 21.96 | 18.28 | 16.75
Ackley function | 19.32 | 11.54 | 40.26 | 24.46 | 17.36 | 29.02
Rotated Griewank's function | 1.104 | 0.514 | 53.44 | 1.874 | 1.265 | 32.49
Rotated Weierstrass's function | 1.947 | 1.034 | 46.89 | 4.143 | 2.830 | 31.69
Branin function | 14.12 | 12.05 | 14.66 | 18.43 | 16.36 | 11.23

Fig. 3. Performance comparison of NRGA and SGA for Rastrigin's function with two dimensions
Fig. 4. Performance comparison of NRGA and SGA for Rastrigin's function with four dimensions
Fig. 5. Performance comparison of NRGA and SGA for the Generalized Rastrigin's function with two dimensions
Fig. 6. Performance comparison of NRGA and SGA for the Generalized Rastrigin's function with four dimensions
Fig. 7. Performance comparison of NRGA and SGA for the Generalized Rosenbrock function with two dimensions
Fig. 8. Performance comparison of NRGA and SGA for the Generalized Rosenbrock function with four dimensions
Fig. 9. Performance comparison of NRGA and SGA for the Ackley function with two dimensions
Fig. 10. Performance comparison of NRGA and SGA for the Ackley function with four dimensions
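To make these settings concrete, the fragment below sketches two of the test problems (Sphere and Rastrigin, as defined in Appendix I) together with the run configuration quoted in the text. The run_nrga driver named in the commented-out loop is purely hypothetical, standing for whatever routine executes the non-revisiting GA; this is not the MATLAB code that produced Tables 1 and 2.

import math

def sphere(x):
    """f2 in Appendix I: sum of squares, global minimum 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """f1/f4 in Appendix I: highly multimodal, global minimum 0 at the origin."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

# Run settings reported in the text (illustrative wiring only).
settings = dict(pop_size=20, crossover_rate=0.6, mutation_rate=0.01,
                bounds=(-5.12, 5.12), stall_generations=50, runs=20)

# for dim in (2, 4):
#     results = [run_nrga(rastrigin, dim, **settings) for _ in range(settings["runs"])]
#     # best and mean fitness of the final populations would then be averaged over the 20 runs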
Fig. 3 to Fig. 10 show the performance comparison of the non-revisiting GA with the simple GA for the Rastrigin, Generalized Rastrigin, Generalized Rosenbrock and Ackley functions with two and four dimensions, where the NRGA approach dominates significantly. Thus, our approach achieves higher accuracy for all nine benchmark multi-dimensional numeric functions. It is quite clear from the results that the performance of the non-revisiting GA with adaptive mutation is superior to that of the simple GA.

5. CONCLUSION

In this paper, a novel non-revisiting GA with adaptive mutation is proposed and tested in the domain of multi-dimensional numeric function optimization. Though the novel GA approach may not completely eliminate revisits and redundant function evaluations, it ensures adequate diversity to evade the problem of premature convergence. The envisaged approach attains better accuracy without much overhead in search time for duplicate individuals and without large data structures serving as long-term memory for the GA. The mechanism of probabilistic adaptive mutation provides the much-required balance between exploration and exploitation along with faster convergence to the optimum: it is exploratory in the initial generations of the GA and exploitative towards the final ones. The greater the number of generations, the smaller the change in the new individual that replaces a revisited search point. The experimental results are very encouraging and show that the improved GA is clearly superior to its conventional counterpart. An adaptation of the current approach is underway for the domain of rule mining in the field of knowledge discovery.

REFERENCES

[1] Holland, J. H., (1975) Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, Michigan.
[2] Goldberg, D.E., (1989) Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York.
[3] Michalewicz, Z., (1999) Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, Berlin, Heidelberg, Germany.
[4] Davis, L., (1991) Handbook of Genetic Algorithms, Van Nostrand Reinhold, New York.
[5] De Jong, K. A., (1975) An Analysis of the Behavior of a Class of Genetic Adaptive Systems, PhD Thesis, University of Michigan, Ann Arbor, MI, USA.
[6] Srinivasa, K. G., Venugopal, K. R. & Patnaik, L. M., (2007) "A Self Adaptive Migration Model Genetic Algorithm for Data Mining Applications", Information Sciences, 177, pp. 4295-4313.
[7] Ono, I., Kita, H. & Kobayashi, S., (1999) "A Robust Real-Coded Genetic Algorithm using Unimodal Normal Distribution Crossover Augmented by Uniform Crossover: Effects of Self-Adaptation of Crossover Probabilities", In Proceedings of the Genetic and Evolutionary Computation Conference, vol. 1, pp. 496-503.
[8] Srinivas, M. & Patnaik, L.M., (1994) "Adaptive Probabilities of Crossover and Mutation in Genetic Algorithms", IEEE Transactions on Systems, Man and Cybernetics, pp. 656-667.
[9] Herrera, F. & Lozano, M., (2000) "Gradual Distributed Real-Coded Genetic Algorithms", IEEE Transactions on Evolutionary Computation, pp. 43-63.
[10] Mauldin, M.L., (1984) "Maintaining Diversity in Genetic Search", In Proceedings National Conference on Artificial Intelligence, pp. 247-250.
[11] Eshelman, L. & Schaffer, J., (1991) "Preventing Premature Convergence in Genetic Algorithms by Preventing Incest", In Proceedings of the Fourth International Conference on Genetic Algorithms, Morgan Kaufmann, San Mateo, CA.
[12] Povinelli, R.J. & Feng, X., (1999) "Improving Genetic Algorithms Performance by Hashing Fitness Values", In Proceedings Artificial Neural Networks in Engineering, pp. 399-404.
[13] Kratica, J., (1999) "Improving the Performances of Genetic Algorithm by Caching", Computers and Artificial Intelligence, pp. 271-283.
[14] Friedrich, T., Hebbinghaus, N. & Neumann, F., (2007) "Rigorous Analysis of Simple Diversity Mechanisms", In Proc. Genetic and Evolutionary Computation Conference, pp. 1219-1225.
[15] Ronald, S., (2011) "Duplicate Genotypes in a Genetic Algorithm", In Proc. IEEE Int. Conf. Evolutionary Computation, vol. 131, pp. 793-798.

APPENDIX I: BENCHMARK FUNCTIONS

1. Rastrigin's function: f1(x) = Σ_{i=1..D} [x_i² − 10 cos(2π x_i) + 10], where x ∈ [−5.12, 5.12]^D; min f1(x) = f1([0, 0, …, 0]) = 0
2. Sphere function [14]: f2(x) = Σ_{i=1..D} x_i², where x ∈ [−5.12, 5.12]^D; min f2(x) = f2([0, 0, …, 0]) = 0
3. Generalized Rosenbrock function: f3(x) = Σ_{i=1..D−1} [100(x_{i+1} − x_i²)² + (x_i − 1)²], where x ∈ [−5.12, 5.12]^D; min f3(x) = f3([1, 1, …, 1]) = 0
4. Generalized Rastrigin function: f4(x) = Σ_{i=1..D} [x_i² − 10 cos(2π x_i) + 10], where x ∈ [−5.12, 5.12]^D; min f4(x) = f4([0, 0, …, 0]) = 0
5. Griewank's function: f5(x) = (1/4000) Σ_{i=1..D} x_i² − Π_{i=1..D} cos(x_i/√i) + 1, where x ∈ [−5.12, 5.12]^D; min f5(x) = f5([0, 0, …, 0]) = 0
6. Ackley function: f6(x) = −20 exp(−0.2 √((1/D) Σ_{i=1..D} x_i²)) − exp((1/D) Σ_{i=1..D} cos(2π x_i)) + 20 + e, where x ∈ [−5.12, 5.12]^D; min f6(x) = f6([0, 0, …, 0]) = 0
7. Rotated Griewank's function: f7(x) = (1/4000) Σ_{i=1..D} z_i² − Π_{i=1..D} cos(z_i/√i) + 1, where z = xM and x ∈ [−5.12, 5.12]^D; min f7(x) = f7([0, 0, …, 0]) = 0
8. Rotated Weierstrass's function [14]: the Weierstrass function evaluated on the rotated variable z = xM (see [14] for the full definition), where x ∈ [−5.12, 5.12]^D
9. Branin function: f9(x) = (x2 − (5.1/(4π²)) x1² + (5/π) x1 − 6)² + 10(1 − 1/(8π)) cos(x1) + 10, where x ∈ [−5.12, 5.12]; min f9(x) = f9([−3.142, 12.275]) = f9([3.142, 2.275])

Authors

Dr. Saroj is an Associate Professor in the Department of Computer Science & Engineering, Guru Jambheshwar University of Science and Technology, Hisar, India. She earned her doctorate in Computer Science in 2010 from the School of Computer and Systems Sciences, Jawaharlal Nehru University, New Delhi. Her research interests include knowledge discovery & data mining, evolutionary algorithms and machine learning.

Devraj is an M.Tech. student in the Department of Computer Science & Engineering, Guru Jambheshwar University of Science and Technology, Hisar, India.