MODIFICATIONS OF PARTICLE SWARM OPTIMIZATION: PARAMETERS AND PERFORMANCE

JITENDRA SINGH BHADORIYA (1), School of Instrumentation, DAVV, Indore
AASHISH KUMAR BOHRE (2), Maulana Azad National Institute of Technology, Bhopal
Dr. GANGA AGNIHOTRI (3), Maulana Azad National Institute of Technology, Bhopal
Dr. MANISHA DUBEY (4), Maulana Azad National Institute of Technology, Bhopal

ABSTRACT: The importance of soft computing techniques is increasing day by day. In this paper we focus on particle swarm optimization (PSO), a landmark technique for engineering optimization problems. PSO continues to evolve, and today a large number of modified PSO methods exist. We deal with modified PSO variants that overcome the disadvantages of standard PSO; the study examines modifications of standard PSO aimed at eliminating certain disadvantages. Some new ideas for modifying PSO are also given, keeping in mind several parameters of the technique. PSO always converges very quickly towards the optimal positions, but its convergence may slow when it is near a minimum. We empirically study the performance of the particle swarm optimizer (PSO); four different modifications made for specific problems are presented.

I. INTRODUCTION

Rapid developments in computing-related disciplines and technologies have enabled the collection of unprecedented amounts of data at unprecedented rates from systems as diverse as the World Wide Web, the human body, the human genome, planetary environments (the Earth as well as other planets), and many industrial systems such as financial markets, aircraft engines, and power plants. Soft computing refers to algorithms that are able to cope with uncertainty and incomplete information and that are still capable of discovering approximate, good solutions to complex computational problems, doing so faster from a computational standpoint. These algorithms include neural nets, fuzzy logic, rough sets, Bayesian algorithms, and evolutionary computing (genetic algorithms and particle swarm optimization).
PSO is gaining popularity due to its good convergence rate; however, for some specific problems modifications of the original PSO have been made. We present some modified PSO algorithms that overcome the disadvantages of standard PSO.

II. PARTICLE SWARM OPTIMIZATION

Particle swarm optimization (PSO) is one of the modern heuristic algorithms, which can be used to solve nonlinear and non-continuous optimization problems [1]. It is a population-based search algorithm and searches in parallel using a group of particles, similar to other AI-based heuristic optimization techniques. The original PSO suggested by Kennedy and Eberhart is based on the analogy of a swarm of birds and a school of fish [2]. Each particle in PSO makes its decisions using its own experience and its neighbors' experiences. That is, particles approach the optimum through their present velocity, previous experience, and the best experience of their neighbors [3]. The main advantages of the PSO algorithm can be summarized as: simple concept, easy implementation, robustness of control parameters, and computational efficiency when compared with mathematical algorithms and other heuristic optimization techniques.

III. PSO ALGORITHM

The PSO algorithm is initialized with a population of random candidate solutions, conceptualized as particles. Each particle is assigned a randomized velocity and is iteratively moved through the problem space. It is attracted towards the location of the best fitness achieved so far by the particle itself and towards the location of the best fitness achieved so far across the whole population (global version of the algorithm). PSO belongs to the category of swarm intelligence methods. First proposed by Kennedy and Eberhart in 1995, it is a famous population-based search algorithm. In PSO, each individual (particle) flies through the problem space with a velocity. The speed and direction of the velocity are adjusted based on the particle's previous best experience (self-cognitive) and the historical best experience in its neighborhood (social influence). In this way the particle has a tendency to fly towards a promising area in the search space. Thus each individual flies through the search space with a velocity that is dynamically adjusted according to its own flying experience and its companions' flying experience.

[Figure 1: Flow chart for the Particle Swarm Optimization (PSO) algorithm]

PSO stores a population of potential solutions to solve problems, like evolutionary algorithms. The population is initialized with random particles; in each iteration of the algorithm, particles update their positions according to their personal experiences and their neighbors' best position. The velocity vector of each particle is responsible for updating the particle position, as given in eq. (2): the position of each particle in the next iteration is computed by adding its velocity vector to its current position.

Consider that we are searching in a d-dimensional search space. Let Xi = (Xi1, Xi2, ..., Xid) and Vi = (Vi1, Vi2, ..., Vid) be the i-th particle's position vector and velocity vector, respectively. The velocity of each particle is adjusted according to eq. (1):

v(i+1) = w*v(i) + c1*r1*(pbest - x(i)) + c2*r2*(gbest - x(i)) ... (1)
x(i+1) = x(i) + v(i+1) ... (2)

where d is the dimension, the superscript i indicates the iteration number, w is the inertia weight, r1 and r2 are two random vectors with values in the range [0, 1], and c1 and c2 are the cognitive and social scaling parameters, which are positive constants.
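As a concrete illustration, the update rules of eqs. (1)-(2) can be rendered as the following minimal Python sketch. The sphere objective, the initialization bounds [-5, 5], and the values w = 0.7, c1 = c2 = 1.5 are illustrative assumptions, not values prescribed by the paper.

```python
import random

def pso_sphere(dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize the sphere function f(x) = sum(x_j^2) with the update rules
    of eqs. (1)-(2): v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)."""
    rng = random.Random(seed)
    f = lambda x: sum(xj * xj for xj in x)
    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                 # personal best positions
    pbest_f = [f(x) for x in X]               # personal best fitness values
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best of the swarm
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = rng.random(), rng.random()   # random numbers in [0, 1]
                V[i][j] = (w * V[i][j]
                           + c1 * r1 * (pbest[i][j] - X[i][j])   # cognitive term
                           + c2 * r2 * (gbest[j] - X[i][j]))     # social term
                X[i][j] += V[i][j]                               # eq. (2)
            fx = f(X[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = X[i][:], fx
    return gbest, gbest_f
```

With these settings the swarm quickly concentrates around the origin, illustrating the fast convergence towards the global best that the text describes.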
IV. ADVANTAGES AND DISADVANTAGES OF PSO

PSO is considered one of the most powerful methods for solving non-smooth global optimization problems and has many key advantages:

- PSO is a derivative-free technique, just like other heuristic optimization techniques.
- PSO is easy in its concept and coding implementation compared to other heuristic optimization techniques.
- PSO is less sensitive to the nature of the objective function compared to conventional mathematical approaches and other heuristic methods.
- PSO has a limited number of parameters, including only the inertia weight factor and two acceleration coefficients, in comparison with other competing heuristic optimization methods. Also, the impact of the parameters on the solutions is considered to be smaller than in other heuristic algorithms [4].
- PSO seems to be somewhat less dependent on the set of initial points compared to other evolutionary methods, implying that the convergence algorithm is robust.
- PSO techniques can generate high-quality solutions within shorter calculation times and with more stable convergence characteristics than other stochastic methods [5].

The major drawback of PSO, like other heuristic optimization techniques, is that it somewhat lacks a solid mathematical foundation for analysis, which remains to be overcome in future development of the relevant theories. Also, it can have limitations for real-time applications such as 5-minute dispatch considering network constraints, since PSO is also a variant of stochastic optimization techniques, requiring a relatively longer computation time than mathematical approaches. However, it is believed that the PSO-based approach can be applied to off-line real-world economic dispatch (ED) problems such as day-ahead electricity markets, and that it has less negative impact on the solutions than other heuristic-based approaches. Nevertheless, it still has the problems of dependency on the initial point and parameters, difficulty in finding the optimal design parameters, and the stochastic character of the final outputs. PSO may lack global search ability at the end of a run due to the use of a linearly decreasing inertia weight, and it may fail to find the required optima when the problem to be solved is too complicated and complex [12].

V. MODIFIED METHODOLOGIES OF PSO

PSO has been modified to overcome the disadvantages of the conventional algorithm, to increase reliability, improve convergence, and avoid divergence. The changes have been made
in the factors or coefficients of the PSO equation, such as the weighting factor, the acceleration coefficients, and so on. Descriptions of some of the modified versions of PSO follow.

In MPSO, the authors propose a modified PSO algorithm to enhance the performance of PSO [6]. The proposed approach employs a novel local search technique, which helps to concentrate the particles of the current search space around the global best particle. A new local search technique creates trial particles in order to replace the worst particle in the population. In every generation, a trial particle X* is created and competes with the worst particle in the current population. The trial particle can be regarded as a neighbor of the global best particle. This is beneficial for generating high-quality candidate solutions and accelerating convergence. The main steps of the proposed approach are described as follows.

a. MPSO Algorithm

  Begin
    while NFC < MAX_NFC do
      for i = 1 to ps do
        Update the velocity of the i-th particle according to (1);
        Update the position of the i-th particle according to (2);
        Calculate the fitness value of the i-th particle;
        NFC++;
      end for
      Update pbest and gbest, if needed;
      Calculate the boundaries of the current search space;
      Create a random particle Y according to (3);
      Create a trial particle X* according to (5);
      Calculate the fitness value of X*;
      NFC++;
      if X* is better than Xw then
        Replace Xw with X*;
      end if
      Update pbest and gbest, if needed;
    end while
  End.
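The pseudocode can be fleshed out as the sketch below. Note that eqs. (3) and (5), which define the random particle Y and the trial particle X*, are not reproduced in this paper, so the choices made here (Y drawn uniformly from the current search-space bounds, X* mixing Y with gbest dimension by dimension) are placeholder assumptions, not the authors' formulas.

```python
import random

def mpso_sketch(f, dim, n_particles=20, max_nfc=4000, w=0.7, c1=1.5, c2=1.5, seed=2):
    """Sketch of the MPSO loop: standard PSO updates plus a trial particle X*
    that replaces the worst particle Xw when it is better. NFC counts the
    number of fitness evaluations, as in the pseudocode."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_f = [f(x) for x in X]
    nfc = n_particles
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    while nfc < max_nfc:
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][j] = (w * V[i][j] + c1 * r1 * (pbest[i][j] - X[i][j])
                           + c2 * r2 * (gbest[j] - X[i][j]))     # eq. (1)
                X[i][j] += V[i][j]                               # eq. (2)
            fx = f(X[i]); nfc += 1
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = X[i][:], fx
        # boundaries of the current search space
        lo = [min(x[j] for x in X) for j in range(dim)]
        hi = [max(x[j] for x in X) for j in range(dim)]
        Y = [rng.uniform(lo[j], hi[j]) for j in range(dim)]      # random particle Y (assumed rule)
        X_star = [Y[j] if rng.random() < 0.5 else gbest[j]       # trial particle X* (assumed rule)
                  for j in range(dim)]
        f_star = f(X_star); nfc += 1
        worst = max(range(n_particles), key=pbest_f.__getitem__)
        if f_star < pbest_f[worst]:                              # X* better than Xw: replace
            X[worst], V[worst] = X_star[:], [0.0] * dim
            pbest[worst], pbest_f[worst] = X_star[:], f_star
            if f_star < gbest_f:
                gbest, gbest_f = X_star[:], f_star
    return gbest, gbest_f
```

Because X* is sampled near gbest, the replacement step concentrates the swarm around the current best region, which is exactly the acceleration effect the text attributes to the trial particle.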
The authors of [7] introduce cloud model theory into the particle swarm optimization algorithm to improve its global search ability and to speed up convergence. In most research on PSO algorithms, a population of particles is uniformly generated with random positions, and random velocities are then assigned to each particle during initialization [13]. The general location of potential solutions in the search space may not be known in advance. In [7], the cloud model is adopted to initialize the particles' position and velocity vectors throughout the initialization range, which better reflects the flocking behavior of birds or the schooling of fish. The inertia weight is used to control the momentum of the particle by weighing the contribution of the previous velocity, basically controlling how much memory of the previous flight direction influences the new velocity, and balancing global and local search ability. A large inertia weight facilitates global exploration, while a small one tends to facilitate local exploration. It is crucial for finding the optimum solution efficiently that the inertia weight is changed nonlinearly and dynamically, giving a better balance between global and local search abilities during the search. To this end, the authors present a new method that uses the normal cloud to nonlinearly and dynamically adjust the inertia weight through the course of the run. In this mPSO algorithm, each particle of the swarm shares mutual information globally and benefits from the discoveries and previous experiences of all other colleagues during the search process.

When standard PSO optimizes complex multi-peak functions, all particles are affected by the historical global best position (gbest) and fly towards it, so the diversity of the population reduces quickly; moreover, gbest may be a local optimal solution, which leads all particles to converge quickly into a local optimum. After each iteration in [14], the particle with the worst fitness value in the first sub-swarm is exchanged with the particle with the best fitness value in the second sub-swarm.
If the particle with the best fitness value of the second sub-swarm falls within the optimal particle's neighborhood, the exchange causes particles to jump out of that neighborhood. Based on this analysis, a modified two sub-swarms particle swarm optimization is proposed, which divides the particle swarm into two groups of the same size [8]. The first swarm adopts the standard PSO model, while the second adopts the cognition-only model. During the search, when the two sub-swarms have evolved to steady states independently (the differences of the optimal fitness value between two successive checks are less than a predetermined threshold), a certain number of randomly extracted particles of the second sub-swarm are exchanged with the worst-fitness particles of the first sub-swarm, and the steps are repeated as above. This ensures the diversity of the population in the first group, while the second group seeks the optimal solution in the search process. When the two groups exchange particles, the swarms recover vitality; that is, extreme points are found in the seeking neighborhood and the diversity of the population is guaranteed.

Particle swarm optimization, in comparison with most optimization algorithms such as genetic algorithms, is simple and fast. But the basic form of multi-objective particle swarm optimization may not obtain the best Pareto front; some modifications can be applied to make it more efficient in finding the optimal Pareto front [9]. The main objective of every multi-objective optimization algorithm is to find the Pareto-optimal set, which balances the tradeoffs among the conflicting objectives. The concept of Pareto optimality was conceptualized by the Italian economist Vilfredo Pareto in his work, Manual of Political Economy, in 1906 [15]. The first step in any multi-objective optimization algorithm is to minimize the distance between solutions and the Pareto front; for this objective, appropriate fitness functions must be defined. The traditional way of assigning a fitness function is the aggregation-based method, where the fitness function is a weighted sum of the objective functions.
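Such an aggregation-based fitness can be sketched as follows; the objective values and weights in the usage comment are illustrative only.

```python
def weighted_sum_fitness(objectives, weights):
    """Aggregation-based fitness assignment: scalarize a vector of objective
    values into a single fitness value via a weighted sum. `objectives` and
    `weights` are equal-length sequences (minimization assumed)."""
    return sum(wt * obj for wt, obj in zip(weights, objectives))

# Example: two conflicting objectives, equal weights.
# weighted_sum_fitness([2.0, 4.0], [0.5, 0.5]) -> 3.0
```

Note how the result depends entirely on the chosen weights, which is the sensitivity to the aggregation of goals discussed next.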
However, this classic approach can be very sensitive to the precise aggregation of goals and tends to be ineffective and inefficient [15]. Some newer approaches assign fitness on the basis of Pareto dominance, where fitness is proportional to the dominance rank of solutions. For a better search of the area, and to avoid converging to a false Pareto front, a mutation operator is proposed whose effect decreases with the number of iterations; it is controlled with the parameter mutrate [16]. This algorithm presents good diversity in the Pareto front, but some points at the edge of the Pareto front are not found. In practice we may be interested in finding these points: for example, in some applications high absorption is very important even if the thickness is high. To obtain these edges of the Pareto front, a modification should be applied to this algorithm.

Another modification introduces a new dynamic neighborhood network for particle swarm optimization [10]. In the Club-based Particle Swarm Optimization (C-PSO) algorithm, each particle initially joins a default number of social groups (clubs). Each particle is affected by its own experience and by the experience of the best-performing member of the social groups to which it belongs. In the proposed Adaptive Membership C-PSO (AMC-PSO), a time-varying default membership is introduced. This modification enables the particles to explore the space based on their own experience in the first stage, and to intensify the connections of the social network in later stages to avoid premature convergence. The proposed dynamic neighborhood algorithm is compared with other PSO algorithms having both static and dynamic neighborhood topologies on a set of classic benchmark problems. Particle swarm optimizers are very sensitive to the shape of their social network, and the modified PSO variants above lack the ability to adapt their social network to the landscape of the problem they optimize; the proposed AMC-PSO algorithm overcomes this problem.

Finally, [11] presents a modification of the particle swarm optimization algorithm (PSO) intended to combat the problem of premature convergence observed in many applications of PSO.
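The Pareto-dominance test that underlies the dominance-rank fitness assignment mentioned above can be sketched in a few lines; the rank convention used here (number of dominating solutions, minimization assumed) is one common choice, not necessarily the exact one used in [16].

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (minimization): a is no worse
    than b in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def dominance_rank(pop):
    """Rank each solution by how many others dominate it.
    Rank 0 means non-dominated, i.e. on the current Pareto front."""
    return [sum(dominates(q, p) for q in pop) for p in pop]
```

For example, in the population [(1, 1), (2, 2), (0, 3)], the point (2, 2) is dominated only by (1, 1), while the other two points are mutually non-dominated.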
The proposed new algorithm moves all particles towards nearby particles of higher fitness, instead of attracting each particle towards just the best position discovered so far by any particle. This is accomplished by using the ratio of the relative fitness to the distance of other particles to determine the direction in which each component of the particle position needs to be changed. The resulting algorithm (FDR-PSO) is shown to perform significantly better than the original PSO algorithm and some of its variants on many different benchmark optimization problems. Empirical examination of the evolution of the particles demonstrates that convergence of the algorithm does not occur in an early phase of particle evolution, unlike PSO. Avoiding premature convergence allows FDR-PSO to continue searching for global optima in difficult multimodal optimization problems.

VI. CONCLUSION

Several modifications have been made to particle swarm optimization to enhance its abilities, but no single modified method eliminates all of the disadvantages at once. None of the modified PSO variants gives good results on all of the benchmark functions; rather, each modified PSO is efficient for a specific application. Modifications to PSO should aim to optimize problems universally, not only a specific problem. The modifications have been made on the basis of the weight factor, the acceleration coefficients, and the PSO equation itself; when modifying PSO, a sound mathematical analysis should be carried out before testing on the benchmark functions. Modifications of PSO could also be made using different particles and analyzing their social behavior: depending on the intelligence of the particles, such a modification would give good results on the benchmark functions and hence a good convergence rate.

REFERENCES

[1] K. Y. Lee and M. A. El-Sharkawi (Eds.), Modern Heuristic Optimization Techniques with Applications to Power Systems, IEEE Power Engineering Society (02TP160), 2002.
[2] J. Kennedy and R. C. Eberhart, "Particle swarm optimization," Proceedings of the IEEE International Conference on Neural Networks (ICNN'95), Vol. IV, pp. 1942-1948, Perth, Australia, 1995.
[3] J. Kennedy and R. C. Eberhart, Swarm Intelligence, San Francisco, CA: Morgan Kaufmann Publishers, 2001.
[4] R. C. Eberhart and Y. Shi, "Comparison between genetic algorithms and particle swarm optimization," Proc. IEEE Int. Conf. Evol. Comput., pp. 611-616, May 1998.
[5] Z. L. Gaing, "Particle swarm optimization to solving the economic dispatch considering the generator constraints," IEEE Trans. on Power Systems, Vol. 18, No. 3, pp. 1187-1195, Aug. 2003.
[6] Q. Yang and G. Lili, "Modifications of Particle Swarm Optimization for Global Optimization," 2010 3rd International Conference on Biomedical Engineering and Informatics (BMEI 2010).
[7] J. Wen, "A Modified Particle Swarm Optimizer Based on Cloud Model," Proceedings of the 2008 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, July 2-5, 2008, Xi'an, China.
[8] J. Zhao, "A Modified Two Sub-swarms Exchange Particle Swarm Optimization," 2010 International Conference on Intelligent Computation Technology and Automation, IEEE, 2010, DOI 10.1109/ICICTA.2010.718.
[9] S. Chamaani, "Modified Multi-objective Particle Swarm Optimization for Electromagnetic Absorber Design," 2007 Asia-Pacific Conference on Applied Electromagnetics Proceedings, Dec. 4-6, Melaka, Malaysia.
[10] Hassan M. Emara, "Adaptive Clubs-based Particle Swarm Optimization," 2009 American Control Conference, St. Louis, MO, USA, June 10-12, 2009.
[11] Thanmaya Peram, Kalyan Veeramachaneni and Chilukuri K. Mohan, "Fitness-Distance-Ratio Based Particle Swarm Optimization," IEEE, 2003.
[12] Yuhui Shi and Russell C. Eberhart, "Empirical Study of Particle Swarm Optimization," IEEE, 1999.
[13] R. Brits, A. P. Engelbrecht, and F. van den Bergh, "Locating multiple optima using particle swarm optimization," Applied Mathematics and Computation, no. 189, pp. 1859-1883, 2007.
[14] Shen Lin-cheng, Huo Xiao-hua, and Niu Yi-feng, "Survey of discrete particle swarm optimization algorithm," Systems Engineering and Electronics, 2008, 30(10): 1986-1994.
[15] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2005.
[16] C. A. Coello Coello, G. Toscano Pulido, and M. Salazar Lechuga, "Handling multiple objectives with particle swarm optimization," IEEE Trans. Evolutionary Computation, vol. 8, pp. 256-279, 2004.

BIOGRAPHIES

Jitendra Singh Bhadoriya was born in Distt. Bhopal, India, in 1989. He received the BE degree (2011) in electrical engineering from UIT-RGPV, Bhopal, and is currently an M.Tech (Instrumentation) scholar at the School of Instrumentation, Devi Ahilya University (DAVV), Indore, India. Email: JITENDRIY@INDIA.COM

Aashish Kumar Bohre was born in Distt. Hoshangabad, India, in 1984. He received the BE degree (2009) from UIT-RGPV, Bhopal, and the M.Tech degree (Power Systems) in 2011 from MANIT, Bhopal. He is currently a PhD scholar at MANIT, Bhopal, India. Email: aashish_bohre@yahoo.co.in
Dr. Ganga Agnihotri received the BE degree in Electrical Engineering from MACT, Bhopal (1972), and the ME degree (1974) and PhD degree (1989) from the University of Roorkee, India. Since 1976 she has been with Maulana Azad College of Technology, Bhopal, in various positions; currently she is a professor. Her research interests include power system analysis, power system optimization, and distribution operation.

Dr. Manisha Dubey was born in Jabalpur, India, on 15th December 1968. She received her B.E. (Electrical), M.Tech. (Power Systems) and Ph.D. (Electrical Engg.) in 1990, 1997 and 2006, respectively. She is working as a Professor at the Department of Electrical Engineering, National Institute of Technology, Bhopal, India. Her research interests include power systems, genetic algorithms, fuzzy logic systems, and the application of soft computing techniques in power system dynamics and control. Email: manishadubey6@gmail.com
