A presentation on Particle Swarm Optimization (PSO) with videos and animations to illustrate the concept. The slides cover the underlying idea, the algorithm, its applications, and a comparison of PSO with Genetic Algorithms (GA) and Differential Evolution (DE).
Particle swarm optimization
1.
2.
3. Group Member Details

    Name                      University Roll No.   College Roll No.
    Ananga Mohan Chatterjee   11500112046           142212
    Aniket Anand              11500112047           142213
    Madhuja Roy               11500112078           142244
    Mahesh Tibrewal           11500112079           142245
4.
5. What is Swarm Intelligence?
The term swarm is used to represent an aggregation of animals or insects which work collectively to accomplish their day-to-day tasks in an intelligent and efficient manner.
SI systems are typically made up of a population of simple agents interacting locally with one another and with their environment.
Natural examples of SI include ant colonies, bird flocking, animal herding, bacterial growth, and fish schooling.
6. Origin of Particle Swarm Optimization
Particle Swarm Optimization (PSO) is a population-based stochastic optimization technique developed by Dr. Russell C. Eberhart and Dr. James Kennedy in 1995, inspired by the social behaviour of bird flocking and fish schooling.
7. Origin of Particle Swarm Optimization (contd.)
Dr. Eberhart and Dr. Kennedy were inspired by the flocking and schooling patterns of birds and fish. They originally set out to develop computer software simulations of birds flocking around food sources, and later realized how well their algorithms worked on optimization problems.
8. Concept of Particle Swarm Optimization
PSO is an artificial intelligence (AI) technique that can be used to find approximate solutions to extremely difficult or intractable numeric maximization and minimization problems.
In PSO, a swarm of n individuals communicates, either directly or indirectly, information about promising search directions (gradients).
The algorithm is simple and easy to implement, with few parameters to adjust, mainly those governing the velocity.
9.
10. Parameters in PSO
• The population is initialized by assigning random positions (Xi) and velocities (Vi); potential solutions are then flown through hyperspace.
• Each particle keeps track of its "best" (highest-fitness) position in hyperspace; this is called pbest.
• At each time step, each particle stochastically accelerates towards its pbest and gbest (or lbest):
  o "pbest": the best position found so far by an individual particle.
  o "gbest": the best position found by the group as a whole.
  o "lbest": the best position found within a particle's neighbourhood.
11. Flowchart
The flowchart proceeds as follows:
1. Initialize particles.
2. Calculate the fitness value of each particle.
3. If a particle's current fitness value is better than its pBest, assign the current fitness as the new pBest; otherwise keep the previous pBest.
4. Assign the best particle's pBest value to gBest.
5. Calculate the velocity for each particle.
6. Use each particle's velocity value to update its position value.
7. If the target or the maximum number of epochs is reached, end; otherwise return to step 2.
(The slide illustrates the particles moving from their initial positions towards the target position.)
12. Mathematical Approach
Velocity and position update equations (reconstructed from the definitions below):

    Vi(t+1) = Vi(t) + C1*rand(.)*(Pbest,i - Xi(t)) + C2*Rand(.)*(Gbest - Xi(t))   ... (1)
    Xi(t+1) = Xi(t) + Vi(t+1)                                                     ... (2)

where:
Vi = [vi1, vi2, ..., vin] is the velocity of particle i.
Xi = [xi1, xi2, ..., xin] represents the position of particle i.
Pbest,i represents the best previous position of particle i (i.e., its local-best position or experience).
Gbest represents the best position among all particles in the population X = [X1, X2, ..., XN] (i.e., the global-best position).
Rand(.) and rand(.) are two random variables uniformly distributed in [0, 1].
C1 and C2 are positive numbers called acceleration coefficients, which guide each particle toward the individual-best and the swarm-best positions, respectively.
13. PSO Pseudo Code

    For each particle:
        Initialize particle
    Do:
        For each particle:
            Calculate fitness value
            If the fitness value is better than the best fitness value (pBest) in history:
                Set current value as the new pBest
        End
        For each particle:
            Find, in the particle's neighborhood, the particle with the best fitness
            Calculate particle velocity according to the velocity equation (1)
            Apply the velocity constriction
            Update particle position according to the position equation (2)
            Apply the position constriction
        End
    While maximum iterations or minimum error criteria is not attained
14. Modifications in PSO Structure
1. Selection of maximum velocity:
The velocities may become too high, so that the particles become uncontrolled and exceed the search space. Therefore, velocities are bounded to a maximum value Vmax, that is:

    if vij > Vmax then vij = Vmax; if vij < -Vmax then vij = -Vmax

2. Adding inertia weight:
A new PSO parameter w, named the inertia weight, is added in order to better control the scope of the search. Eq. (1) now becomes:

    Vi(t+1) = w*Vi(t) + C1*rand(.)*(Pbest,i - Xi(t)) + C2*Rand(.)*(Gbest - Xi(t))   ... (3)
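The pseudo code of slide 13, together with the Vmax bound and inertia weight above, can be sketched in Python. This is a minimal global-best sketch, not the deck's own program; the function name, bounds, and parameter values are illustrative assumptions.

```python
import random

def pso(f, dim, n_particles=12, iters=300, bounds=(-5.0, 5.0),
        w=0.729, c1=1.494, c2=1.494, seed=42):
    """Minimise f over [lo, hi]^dim with a basic global-best PSO:
    evaluate fitness, update each particle's pbest and the swarm's
    gbest, then apply velocity equation (3) and position equation (2)."""
    rng = random.Random(seed)
    lo, hi = bounds
    vmax = 0.2 * (hi - lo)  # velocity bound Vmax

    # Initialise random positions and velocities.
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[rng.uniform(-vmax, vmax) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_val = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Inertia-weight velocity update, clamped to [-Vmax, Vmax].
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))
                # Position update, clamped to the search space.
                x[i][d] = max(lo, min(hi, x[i][d] + v[i][d]))
            fit = f(x[i])
            if fit < pbest_val[i]:      # current fitness better than pBest
                pbest[i], pbest_val[i] = x[i][:], fit
                if fit < gbest_val:
                    gbest, gbest_val = x[i][:], fit
    return gbest, gbest_val

# Example: minimise the 2-D sphere function f(x) = x1^2 + x2^2 (optimum 0 at the origin).
best, best_val = pso(lambda p: sum(c * c for c in p), dim=2)
```

With a fixed seed the run is deterministic; on the sphere function the swarm contracts onto the origin well within the iteration budget.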
15. Modifications in PSO Structure (contd.)
3. Constriction factor:
If the algorithm runs without restraining the velocity, the system explodes after a few iterations. A constriction coefficient χ is therefore introduced in order to control the convergence properties. With the constriction factor, the PSO equation for computing the velocity is:

    Vi(t+1) = χ * [Vi(t) + C1*rand(.)*(Pbest,i - Xi(t)) + C2*Rand(.)*(Gbest - Xi(t))]   ... (4)
    where χ = 2 / |2 - C - sqrt(C^2 - 4C)| and C = C1 + C2 > 4.

Note that:
• if C = 5 then χ = 0.38, which causes a very pronounced damping effect;
• but if C is set to 4.1 then χ is 0.729, which works fine.
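The χ values quoted above can be checked numerically. A small helper, a sketch assuming Clerc's usual formula with C = C1 + C2:

```python
import math

def constriction(C):
    """Clerc's constriction coefficient chi, defined for C = c1 + c2 > 4."""
    return 2.0 / abs(2.0 - C - math.sqrt(C * C - 4.0 * C))

# C = 5   -> chi ~= 0.38  (very pronounced damping)
# C = 4.1 -> chi ~= 0.729 (the commonly used value)
```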
16. Population Topology
The pattern of connectedness between individuals is like a social network. The connection pattern controls how solutions can flow through the solution space. (The slide illustrates the PSO gbest and lbest topologies.)
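In the lbest topology, each particle's neighbourhood is commonly taken as a ring over particle indices. A minimal sketch; the function name and the neighbourhood radius k are illustrative assumptions:

```python
def ring_neighbors(i, n, k=1):
    """Indices of particle i's lbest ring neighbourhood: i itself plus
    the k particles on each side, wrapping around a swarm of size n."""
    return [(i + d) % n for d in range(-k, k + 1)]

# In a swarm of 12 particles, particle 0's neighbourhood is [11, 0, 1].
```

In gbest, by contrast, every particle is connected to every other, so information about the best solution spreads in a single step.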
17.
18. Effect of Re-Initialization
Among the three algorithms, PSO has the highest tendency to cluster rapidly, and the swarm may quickly become stagnant. To remedy this drawback, several sub-grouping approaches have been proposed to reduce the dominant influence of the global best particle. A much simpler and frequently used alternative is to simply keep the global best particle and regenerate all or part of the remaining particles. This has the effect of generating a new swarm with the global best as one of its particles; this process is called re-initialization.
In GA, the clustering is less obvious, but it is often found that the top part of the population may look similar, and re-initialization can likewise inject randomness into the population to improve its diversity.
In DE, the clustering is the least pronounced, and re-initialization has the least effect.
19. Effect of Local Search
In GA, the density of the solution space is lower, so it is often found that the GA operators cannot produce all potential solutions. A popular fix is the use of local search to see if a better solution can be found around the solutions produced by the GA operators. The local search process is often time-consuming, and applying it over the whole population could lead to a long solution time.
For PSO, the best particle has a dominant influence over the whole swarm, and a time-saving strategy is to apply local search only to the best particle; this can lead to solution improvement with a shorter solution time. This strategy was demonstrated to be highly effective for job shop scheduling in Pratchayaborirak and Kachitvichyanukul (2011).
The same strategy may not yield the same effect in DE, since the best particle does not have a dominant influence on the population of solutions.
20. Effect of Subgrouping
The use of sub-grouping of a homogeneous population to improve solution quality has been demonstrated in GA and PSO. Sub-grouping frees some groups of solutions from the influence of the dominant solutions, so that each group may search a different area of the solution space, improving the exploration aspect of the algorithms.
For DE, the best particle has little influence on the perturbation process, so it is rational to presume that sub-grouping with a homogeneous population may have limited effect on the solution quality of DE.
21. Qualitative Comparison of GA, PSO and DE

    Criterion                                            GA           PSO     DE
    Requires ranking of solutions                        Yes          No      No
    Influence of population size on solution time        Exponential  Linear  Linear
    Influence of best solution on population             Medium       Most    Less
    Average fitness cannot be worse                      False        False   True
    Tendency for premature convergence                   Medium       High    Low
    Continuity (density) of search space                 Less         More    More
    Ability to reach good solution without local search  Less         More    More
    Homogeneous sub-grouping improves convergence        Yes          Yes     N/A
22.
23. Neural Network (NN) Training using PSO
A complex function that accepts some numeric inputs and that generates
some numeric outputs.
The best way to get an idea of what training a neural network using PSO
is like is to take a look at a program that creates a neural network
predictor for a set of Iris flowers, where the goal is to predict the species
based on sepal length and width, and petal length and width.
The program uses an artificially small, 30-item subset of a famous 150-
item benchmark data set called Fisher’s “Iris Data”.
A 4-input, 6-hidden, 3-output neural network is instantiated.
A fully-connected 4-6-3 neural network will have (4 * 6) + (6 * 3) + (6 +
3) = 51 weights and bias values.
A swarm of 12 virtual particles attempts to find the set of neural
network weights and bias values within a maximum of 700 iterations.
After PSO training has completed, the 51 best weight and bias values
that were found are displayed. Using those weights and biases, the
network can be fed the training items to measure how accurately it
predicts each species.
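The training setup the slide describes can be sketched roughly as below. This is not the actual program: the three toy rows stand in for the 30-item Iris subset, and the inertia and acceleration coefficients are common defaults assumed here.

```python
import math, random

random.seed(1)

def forward(weights, x):
    """4-6-3 feed-forward pass; weights laid out as on the slide:
    24 input-hidden + 18 hidden-output + 6 hidden biases + 3 output biases = 51."""
    ih, ho = weights[0:24], weights[24:42]
    hb, ob = weights[42:48], weights[48:51]
    hidden = [math.tanh(sum(x[i] * ih[i * 6 + j] for i in range(4)) + hb[j])
              for j in range(6)]
    return [sum(hidden[j] * ho[j * 3 + k] for j in range(6)) + ob[k]
            for k in range(3)]

def mse(weights, data):
    """Mean squared error of the network over (features, one-hot target) pairs."""
    err = 0.0
    for x, target in data:
        out = forward(weights, x)
        err += sum((o - t) ** 2 for o, t in zip(out, target))
    return err / len(data)

# Toy stand-in for the Iris subset: 4 features, one-hot species label.
data = [([5.1, 3.5, 1.4, 0.2], [1, 0, 0]),
        ([7.0, 3.2, 4.7, 1.4], [0, 1, 0]),
        ([6.3, 3.3, 6.0, 2.5], [0, 0, 1])]

DIM, SWARM, ITERS = 51, 12, 700       # sizes from the slide
W, C1, C2 = 0.729, 1.49445, 1.49445   # assumed common PSO coefficients

pos = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
pbest_fit = [mse(p, data) for p in pos]
g = min(range(SWARM), key=lambda i: pbest_fit[i])
gbest, gbest_fit = pbest[g][:], pbest_fit[g]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            vel[i][d] = (W * vel[i][d]
                         + C1 * random.random() * (pbest[i][d] - pos[i][d])
                         + C2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        fit = mse(pos[i], data)
        if fit < pbest_fit[i]:
            pbest[i], pbest_fit[i] = pos[i][:], fit
            if fit < gbest_fit:
                gbest, gbest_fit = pos[i][:], fit
```

The particle position is simply the flat vector of all 51 weights and biases, and the fitness is the network's training error, which is what makes PSO a drop-in trainer here.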
24. Mobile Robot Navigation Using Particle Swarm
Optimization and Adaptive NN
Improved particle swarm optimization (PSO) is
used to optimize the path of a mobile robot
through an environment containing static
obstacles.
Relative to many optimization methods that
produce non-smooth paths, the PSO method can
generate smooth paths, which are preferable
for designing continuous control technologies
to realize path following with mobile robots.
To reduce the computational cost of
optimization, a stochastic PSO (S-PSO) with
high exploration ability is developed, so that
a swarm of small size can accomplish path
planning.
25. Hybridization of PSO with Other Evolutionary
Techniques
A popular research trend is to merge or combine PSO with other techniques, especially other evolutionary
computation techniques such as selection, crossover and mutation.
Some improved and extended PSO methods:
Improved PSO (IPSO): It uses a combination of chaotic sequences and conventional linearly decreasing inertia
weights and crossover operation to increase both exploration and exploitation capability of PSO.
Modified PSO (MPSO): This approach is a mechanism to cope with the equality and inequality constraints.
Furthermore, a dynamic search-space reduction strategy is employed to accelerate the optimization process.
New PSO (NPSO): In this method, each particle is modified to also remember its worst position. This
modification helps explore the search space more effectively.
Improved Coordinated Aggregation based PSO (ICA-PSO): In this algorithm each particle in the swarm retains
a memory of its best position ever encountered, and is attracted only by other particles with better
achievements than its own with the exception of the particle with the best achievement, which moves
randomly.
Hybrid PSO with Sequential Quadratic Programming (PSO-SQP): The SQP method seems to be the best
nonlinear programming method for constrained optimization. It outperforms every other nonlinear
programming method in terms of efficiency, accuracy, and percentage of successful solutions, over a large
number of test problems.
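As one concrete ingredient of the IPSO variant above, a linearly decreasing inertia weight can be sketched as follows; the w_max and w_min values are typical defaults assumed here, not specific to any one paper.

```python
def inertia(iter_no, max_iter, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight: a large weight early favors
    exploration, a small weight late favors exploitation.
    (w_max = 0.9 and w_min = 0.4 are commonly used assumed defaults.)"""
    return w_max - (w_max - w_min) * iter_no / max_iter

# inertia(0, 700)   -> 0.9 (w_max, full exploration at the start)
# inertia(700, 700) -> approximately 0.4 (w_min, at the final iteration)
```

IPSO as described above additionally perturbs this schedule with chaotic sequences and applies a crossover operation; the linear decay alone is the baseline it builds on.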
27. ◊ The process by which the PSO algorithm finds optimal values follows
the working of an animal society that has no leader.
◊ Particle swarm optimization consists of a swarm of particles, where
each particle represents a potential solution.
◊ Particles move through a multidimensional search space to find the
best position in that space (the best position may correspond to a
maximum or a minimum value).
◊ One constraint to keep in mind is that velocity should take an
optimum value: too low a velocity makes the search too slow, while too
high a velocity makes the method unstable.
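The points above can be illustrated with a minimal PSO sketch that includes the velocity constraint; the clamp fraction and coefficient values are illustrative assumptions.

```python
import random

random.seed(0)

def pso(fitness, dim, lo, hi, swarm=20, iters=100, v_max=None):
    """Minimal PSO (minimization) with velocity clamping, reflecting the
    slide's caution: too small a velocity slows the search, too large
    a velocity destabilizes it."""
    if v_max is None:
        v_max = 0.2 * (hi - lo)  # assumed clamp: 20% of the search range
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pfit = [fitness(p) for p in pos]
    g = min(range(swarm), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]
    w, c1, c2 = 0.729, 1.49445, 1.49445  # assumed common coefficients
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                v = (w * vel[i][d]
                     + c1 * random.random() * (pbest[i][d] - pos[i][d])
                     + c2 * random.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-v_max, min(v_max, v))  # clamp the velocity
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pfit[i]:
                pbest[i], pfit[i] = pos[i][:], f
                if f < gfit:
                    gbest, gfit = pos[i][:], f
    return gbest, gfit

# Minimize the 3-dimensional sphere function as a quick check.
best, best_fit = pso(lambda x: sum(v * v for v in x), dim=3, lo=-5.0, hi=5.0)
```

Note that there is no leader in the algorithmic sense: every particle follows the same update rule, and the global best emerges from the swarm rather than directing it by fiat.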
29. We would like to express our gratitude to all
the respected faculty members of our
department for providing us with this
opportunity of giving a presentation on a topic
which was interesting to research on. We
thank our seniors for their able guidance and
support in completing the presentation.
Finally, a word of thanks to all those who were
directly or indirectly involved in this
presentation.