In this Scilab tutorial we discuss the importance of multiobjective optimization and give an overview of the possible types of Pareto frontiers. We also show how to use the NSGA-II algorithm available in Scilab.
Multiobjective optimization and Genetic algorithms in Scilab
1. www.openeering.com
powered by
MULTIOBJECTIVE OPTIMIZATION AND GENETIC ALGORITHMS
In this Scilab tutorial we discuss the importance of multiobjective
optimization and give an overview of the possible types of Pareto frontiers.
We also show how to use the NSGA-II algorithm available in Scilab.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.
2. Multiobjective optimization with NSGA-II
www.openeering.com page 2/16
Step 1: Purpose of this tutorial
It is very uncommon for real-world industrial applications to involve
problems composed of only a single objective. Multiple, often conflicting,
objectives generally arise in most practical optimization problems.
Optimizing a problem means finding a set of decision variables which
satisfies the constraints and simultaneously optimizes a vector function.
The elements of this vector represent the objective functions of all the
decision makers. Such a vector optimization generally leads to a non-unique
solution of the problem.
For example, when selecting a vehicle that maximizes comfort and
minimizes cost, not a single car but a segment of cars may represent
the final optimal selections (see figure).
After a general introduction to multiobjective optimization, the final aim of
this tutorial is to introduce the reader to multiobjective optimization in
Scilab and particularly to the use of the NSGA-II algorithm.
(Example of car classification)
Step 2: Roadmap
In the first part of the tutorial we review some concepts of multiobjective
optimization; then we show how to use the NSGA-II algorithm in Scilab.
Steps 14 to 16 present some examples and exercises.
Step 17 shows how to call external (black-box) functions in Scilab.
Description                    Steps
Multiobjective optimization    3-5
NSGA-II                        6-13
Examples and exercises         14-16
Calling external functions     17
Conclusions and remarks        18-19
Step 3: Multiobjective scenario
Here we consider, without loss of generality, the minimization of two
equally important objectives, where no additional information about
the problem is available.
A solution of the problem can be described by a "decision vector" of
the form x = (x1, ..., xn) lying in the design space X. The evaluation of the
two objective functions on x produces a solution y = (f1(x), f2(x)) in the
objective space Y, i.e. f is a vector map of the form f: X -> Y.
Comparing two solutions x1 and x2 requires the definition of a dominance
criterion. In modern multiobjective optimization the Pareto criterion is the
most widely used. This criterion states:
an objective vector y1 is said to dominate another objective
vector y2 if no component of y1 is greater than the
corresponding component of y2 and at least one component is
strictly smaller;
accordingly, the solution x1 dominates x2 if f(x1) dominates
f(x2);
all non-dominated solutions, i.e. solutions not dominated by any
other, are the optimal solutions of the problem. The set of these
solutions is named the Pareto set, while its image in the objective space
is named the Pareto front.
A generic multiobjective optimization solver searches for non-dominated
solutions that correspond to trade-offs between all the objectives.
The utopia (or ideal) point corresponds to the minimum of each objective
taken separately and is typically not a feasible point.
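The Pareto dominance test above translates directly into code. The following is a minimal sketch in Python (for illustration only; the tutorial itself uses Scilab), with hypothetical helper names `dominates` and `non_dominated`, assuming minimization of every objective:

```python
def dominates(y1, y2):
    """True if objective vector y1 dominates y2 (minimization):
    y1 is nowhere worse than y2 and strictly better in at least one component."""
    return (all(a <= b for a, b in zip(y1, y2))
            and any(a < b for a, b in zip(y1, y2)))

def non_dominated(points):
    """Return the points not dominated by any other point: the Pareto front."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example: four candidate solutions in a two-objective space.
pts = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(non_dominated(pts))  # -> [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```

Here (3.0, 3.0) is removed because (2.0, 2.0) is at least as good in both objectives and strictly better in both; the three remaining points are mutually non-dominated trade-offs.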
Step 4: Type of Pareto fronts
The computation of the Pareto front can be a very difficult task. Many
obstacles can make the problem complex: a non-continuous design space,
high dimensionality, and clustered solutions. Moreover, just as local
optima can trap algorithms in single-objective problems, local Pareto
fronts can cause poor convergence of multiobjective optimization
approaches.
Two very typical Pareto fronts can arise when solving multiobjective
optimization problems:
Convex front
This is the most interesting case for decision makers. When the Pareto
front has this shape, the decision makers can negotiate, each fighting for
their own objective, and can more easily agree on a trade-off point. In
this situation, the trade-off is better than a linear combination of the
original objectives. This means, practically, that if a decision maker
gives up a percentage of his target, say 20%, another decision maker may
obtain an improvement of more than 20% on his own target.
Non-convex front
This is the opposite of the previous situation: negotiation between
decision makers is harder. Here, a decision maker has to give up more
than 20% of his goal to give a 20% advantage to another decision maker.
The final solution depends more on the influence of the decision makers
than on a "democratic" negotiation.
Discontinuous fronts are also common and more complex to analyze, but any
piece of a discontinuous front can be reduced to the two previous cases.
(Convex front)
(Non-convex front)
Step 5: Evolutionary algorithms
Many algorithms are based on a stochastic search approach, for example
evolutionary algorithms, simulated annealing and genetic algorithms.
The idea behind this kind of algorithm is the following:
1. Define a memory that contains the current solutions;
2. Define a selection module that determines which of the
previously found solutions should be kept in memory. Two types of
selection are available:
- Mating selection, which consists of a fitness-based selection
phase where promising solutions are picked for variation;
- Environmental selection, which determines which of the stored
solutions are kept in memory.
3. Define a variation module that takes a set of solutions and
systematically, or randomly, modifies these solutions to generate
potentially better solutions, using specific operators such as:
- Crossover, which produces new individuals by combining
the information of two or more parents;
- Mutation, which randomly alters individuals with a low
probability.
When we consider an evolutionary algorithm, by analogy with natural
evolution, we call the solutions individuals and the set of individuals the
population. The fitness function is a particular objective function that
characterizes the problem by measuring how close a given solution is to
the target, taking into account all the problem constraints.
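The three modules above can be put together in a minimal evolutionary loop. The following sketch (Python rather than Scilab; all names and parameter values are illustrative) minimizes f(x) = x^2 on [-1, 1]:

```python
# Sketch (Python, not Scilab) of the generic evolutionary loop:
# memory (population), mating/environmental selection and variation.
import random

random.seed(0)

def fitness(x):
    return x * x            # objective to minimize (lower is better)

# Memory of current solutions (the population)
pop = [random.uniform(-1, 1) for _ in range(20)]

for generation in range(50):
    # Mating selection: binary tournaments pick promising parents
    parents = [min(random.sample(pop, 2), key=fitness) for _ in range(20)]
    # Variation: crossover (averaging) and mutation (small random noise)
    children = []
    for a, b in zip(parents[::2], parents[1::2]):
        child = 0.5 * (a + b)                 # crossover of two parents
        if random.random() < 0.1:             # mutation, low probability
            child += random.gauss(0, 0.1)
        children.append(child)
    # Environmental selection: keep the best solutions in memory
    pop = sorted(pop + children, key=fitness)[:20]

best = min(pop, key=fitness)
```

Because the environmental selection here is elitist (the best solutions always survive), the best fitness never worsens from one generation to the next.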
Step 6: NSGA-II
NSGA-II is the second version of the well-known “Non-dominated Sorting
Genetic Algorithm”, based on the work of Prof. Kalyanmoy Deb, for
solving non-convex and non-smooth single and multiobjective
optimization problems.
Its main features are:
A non-dominated sorting procedure, where all the individuals are
sorted according to their level of non-domination;
It implements elitism, storing all non-dominated solutions and
hence enhancing the convergence properties;
It adopts an automatic mechanism based on the crowding
distance in order to guarantee diversity and spread of solutions;
Constraints are handled using a modified definition of
dominance, without the use of penalty functions.
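The crowding-distance mechanism mentioned above can be sketched as follows (Python rather than Scilab; names and data are illustrative): for each solution of a front, sum over the objectives the normalized gap between its two neighbors, giving extreme points an infinite distance so that they are always preserved.

```python
# Sketch (Python, not Scilab) of NSGA-II's crowding distance.

def crowding_distance(front):
    """front: list of objective vectors sharing the same non-domination rank."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")  # keep extremes
        span = front[order[-1]][k] - front[order[0]][k]
        if span == 0:
            continue
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j+1]][k] - front[order[j-1]][k]) / span
    return dist

# Clustered points get a small distance and are dropped first when the
# next generation has to be truncated.
front = [(0.0, 1.0), (0.2, 0.8), (0.5, 0.3), (0.6, 0.25), (1.0, 0.0)]
d = crowding_distance(front)
```

In this example the two close points (0.5, 0.3) and (0.6, 0.25) get the smallest finite distances, which is how the algorithm keeps the front well spread.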
The Scilab function that implements the NSGA-II algorithm is:
optim_nsga2
which is directly available with Scilab installation.
Scilab syntax:
[pop_opt,fobj_pop_opt,pop_init,fobj_pop_init] =
optim_nsga2(ga_f,pop_size,nb_generation,p_mut,p_cross,Log,param)
Input arguments:
ga_f: the function to be optimized;
pop_size: the size of the population of individuals;
nb_generation: the number of generations to be computed;
p_mut: the mutation probability;
p_cross: the crossover probability;
Log: if %T, information messages are displayed during the run of
the genetic algorithm;
param: a list of parameters:
- 'codage_func': the function which will perform the coding
and decoding of individuals;
- 'init_func': the function which will perform the initialization of
the population;
- 'crossover_func': the function which will perform the
crossover between two individuals;
- 'mutation_func': the function which will perform the
mutation of one individual;
- 'selection_func': the function which will perform the
selection of individuals at the end of a generation;
- 'nb_couples': the number of couples which will be selected
so as to perform the crossover and mutation;
- 'pressure': the selection pressure (the efficiency of the worst individual).
Output Parameters:
pop_opt: the population of optimal individuals;
fobj_pop_opt: the set of objective function values associated to
pop_opt;
pop_init: the initial population of individuals;
fobj_pop_init: the set of objective function values associated to
pop_init (optional).
Step 7: Problem ZDT1
The ZDT1 problem consists of solving the following multiobjective
optimization problem:
min f1(x), f2(x)   subject to 0 <= x_i <= 1, i = 1, ..., n,
where the objective functions are
f1(x) = x_1
and
f2(x) = g(x) h(x),  with  g(x) = 1 + 9/(n-1) * sum_{i=2..n} x_i
and  h = 1 - sqrt(f1/g).
On the left we report the optimal Pareto front, defined by
f2 = 1 - sqrt(f1),  0 <= f1 <= 1,
which is obtained for g = 1, i.e. x_i = 0 for i = 2, ..., n.
This function has a continuous optimal Pareto front. Moving along the
frontier from left to right, we improve the value of f2 while making the
objective function f1 worse.
(Optimal Pareto front for n=2)
(Pareto Set Points for n=2)
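As a quick check of the ZDT1 definition (sketched in Python rather than Scilab), a point with x_2 = ... = x_n = 0 gives g = 1 and therefore lies on the analytical front f2 = 1 - sqrt(f1):

```python
# Quick check (Python, not Scilab) of the ZDT1 objectives.
import math

def zdt1(x):
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (len(x) - 1)
    h = 1 - math.sqrt(f1 / g)
    return f1, g * h

f1, f2 = zdt1([0.25, 0.0])   # on the Pareto set: g = 1
```

Here f1 = 0.25 and f2 = 1 - sqrt(0.25) = 0.5, which matches the front formula; any point with g > 1 gives a larger f2.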
Step 8: Creating the ZDT1 function
In this step, we create the "zdt1" function and its boundary functions.
Please note that the zdt1 function returns a row vector of
objective function evaluations.
The boundary functions are parameterized by the problem dimension.
// ZDT1 multiobjective function
function f=zdt1(x)
f1 = x(1);
g = 1 + 9 * sum(x(2:$)) / (length(x)-1);
h = 1 - sqrt(f1 ./ g);
f = [f1, g.*h];
endfunction
// Min boundary function
function Res=min_bd_zdt1(n)
Res = zeros(n,1);
endfunction
// Max boundary function
function Res=max_bd_zdt1(n)
Res = ones(n,1);
endfunction
Step 9: Set NSGA-II parameters
In this step, we set algorithm parameters and problem dimension.
// Problem dimension
dim = 2;
// Example of use of the genetic algorithm
funcname = 'zdt1';
PopSize = 500;
Proba_cross = 0.7;
Proba_mut = 0.1;
NbGen = 10;
NbCouples = 110;
Log = %T;
pressure = 0.1;
Step 10: Set NSGA-II main functions
Here, we set the NSGA-II main functions. In our case, since the problem
is continuous, we use the default NSGA-II functions.
For more complex optimization problems, the user can
easily change the NSGA-II operators. For example, the user may define
their own "mutation_func" function, describing a mutation operation that
fits the problem at hand. The same can be done for the
"crossover_func" function or for the internal coding function "codage_func".
// Setting parameters of optim_nsga2 function
ga_params = init_param();
// Parameters to adapt to the shape of the optimization problem
ga_params = add_param(ga_params,'minbound',min_bd_zdt1(dim));
ga_params = add_param(ga_params,'maxbound',max_bd_zdt1(dim));
ga_params = add_param(ga_params,'dimension',dim);
ga_params = add_param(ga_params,'beta',0);
ga_params = add_param(ga_params,'delta',0.1);
// Parameters to fine tune the genetic algorithm.
// All these parameters are optional for continuous optimization;
// change them only if you need to adapt the GA to a special problem.
ga_params = add_param(ga_params,'init_func',init_ga_default);
ga_params = add_param(ga_params,'crossover_func',crossover_ga_default);
ga_params = add_param(ga_params,'mutation_func',mutation_ga_default);
ga_params = add_param(ga_params,'codage_func',coding_ga_identity);
ga_params = add_param(ga_params,'nb_couples',NbCouples);
ga_params = add_param(ga_params,'pressure',pressure);
// Define a function shortcut
deff('y=fobjs(x)','y = zdt1(x);');
Step 11: Performing optimization
The multiobjective optimization is performed using the command
"optim_nsga2". Other methods are available, see for example:
optim_moga: multiobjective genetic algorithm;
optim_ga: A flexible genetic algorithm;
optim_nsga: A multiobjective Niched Sharing Genetic Algorithm.
// Performing optimization
printf("Performing optimization:");
[pop_opt, fobj_pop_opt, pop_init, fobj_pop_init] = ..
    optim_nsga2(fobjs, PopSize, NbGen, Proba_mut, Proba_cross, Log, ga_params);
Step 12: Plot the Pareto front
The Pareto front is obtained using the "pareto_filter" Scilab
command, which automatically extracts the non-dominated solutions from a
set of multiobjective, multidimensional solutions.
In order to plot the “population” it is necessary to convert the list
"pop_pareto" into a vector using the command “list2vec“.
// Compute Pareto front and filter
[f_pareto, pop_pareto] = pareto_filter(fobj_pop_opt, pop_opt);
// Optimal front function definition
f1_opt = linspace(0,1);
f2_opt = 1 - sqrt(f1_opt);
// Plot solution: Pareto front
scf(1);
// Plotting final population
plot(fobj_pop_opt(:,1),fobj_pop_opt(:,2),'g.');
// Plotting Pareto population
plot(f_pareto(:,1),f_pareto(:,2),'k.');
plot(f1_opt, f2_opt, 'k-');
title("Pareto front","fontsize",3);
xlabel("$f_1$","fontsize",4);
ylabel("$f_2$","fontsize",4);
legend(['Final pop.','Pareto pop.','Pareto front.']);
// Transform list to vector for plotting Pareto set
npop = length(pop_opt);
pop_opt = matrix(list2vec(pop_opt),dim,npop)';
nfpop = length(pop_pareto);
pop_pareto = matrix(list2vec(pop_pareto),dim,nfpop)';
// Plot the Pareto set
scf(2);
// Plotting final population
plot(pop_opt(:,1),pop_opt(:,2),'g.');
// Plotting Pareto population
plot(pop_pareto(:,1),pop_pareto(:,2),'k.');
title("Pareto Set","fontsize",3);
xlabel("$x_1$","fontsize",4);
ylabel("$x_2$","fontsize",4);
legend(['Final pop.','Pareto pop.']);
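The filtering performed above by the "pareto_filter" command can be sketched as follows (Python rather than Scilab; names and data are illustrative): keep only the (objective, design) pairs that are not dominated by any other pair.

```python
# Sketch (Python, not Scilab) of non-dominated filtering over paired
# objective values and designs.

def pareto_filter(fobj, pop):
    def dominated(i):
        return any(
            all(a <= b for a, b in zip(fobj[j], fobj[i])) and
            any(a < b for a, b in zip(fobj[j], fobj[i]))
            for j in range(len(fobj)) if j != i)
    keep = [i for i in range(len(fobj)) if not dominated(i)]
    return [fobj[i] for i in keep], [pop[i] for i in keep]

fobj = [(0.1, 0.9), (0.5, 0.5), (0.6, 0.6), (0.9, 0.1)]
pop = ["a", "b", "c", "d"]          # designs matching the objectives
f_pareto, pop_pareto = pareto_filter(fobj, pop)
# (0.6, 0.6) is dominated by (0.5, 0.5), so design "c" is filtered out
```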
Step 13: Some results
Here, we report some results obtained by running NSGA-II on the ZDT1
problem while changing the number of generations (parameter “NbGen“).
NbGen takes values in the set {5, 10, 15, 20}.
(NbGen = 5) (NbGen = 10)
(NbGen = 15) (NbGen = 20)
Step 14: Exercise #1: ZDT2 problem
Solve the ZDT2 problem, which consists of the following
multiobjective optimization problem:
min f1(x), f2(x)   subject to 0 <= x_i <= 1, i = 1, ..., n,
where the two objective functions are
f1(x) = x_1
and
f2(x) = g(x) h(x),  with  g(x) = 1 + 9/(n-1) * sum_{i=2..n} x_i
and  h = 1 - (f1/g)^2.
On the left we report the optimal Pareto front, defined by
f2 = 1 - f1^2,  0 <= f1 <= 1  (g = 1).
This function presents a continuous non-convex optimal Pareto front.
(Optimal Pareto front for n=2)
(Pareto Set Points for n=2)
Step 15: Exercise #2: ZDT3 problem
Solve the ZDT3 problem, which consists of the following
multiobjective optimization problem:
min f1(x), f2(x)   subject to 0 <= x_i <= 1, i = 1, ..., n,
where the two objective functions are
f1(x) = x_1
and
f2(x) = g(x) h(x),  with  g(x) = 1 + 9/(n-1) * sum_{i=2..n} x_i.
On the left we report the optimal Pareto front, defined by
f2 = 1 - sqrt(f1) - f1 * sin(10*pi*f1)  (g = 1),
with h defined as
h = 1 - sqrt(f1/g) - (f1/g) * sin(10*pi*f1).
This function has a discontinuous optimal Pareto front.
(Optimal Pareto front)
(Pareto Set Points for n=2)
Step 16: Exercise #3
Modify the ZDT1 program in order to plot the population at each step.
Hints:
Create a global variable named “currPop“ in which to save the
population at each time step;
Create an initialization function for optim_nsga2 named
“init_ga_previous“ that loads the previously computed population at
each step, except at the initial step, where the original init function
must be called;
Add a plot at each time step;
If you want to produce a video or an animated PNG, save each plot
using the command “xs2png“ (an animated PNG image can be
created, for example, using the program JapngEditor).
// Create a global variable
global currPop;
// Init function that returns the previously saved population
function Pop_init=init_ga_previous(popsize, param)
    global currPop;
    Pop_init = currPop;
endfunction
// Performing optimization, one generation at a time
for i=1:NbGen
    // Call optim_nsga2 with a local number of generations equal to 1
    if i>1
        // Change the init function to init_ga_previous
    end
    // Save population
    currPop = pop_opt;
    // Filter and plot the data
end
Step 17: Calling external functions
Scilab, with its input/output functions, enables coupling with any
external function or tool (even CAD, CAE or CFD codes) that can be
called through external commands.
This can be done in a very easy way as shown in several examples in
“Made with Scilab”:
http://www.openeering.com/made_with_scilab
Scilab can even deal with parallel computing. This represents an
enormous advantage when combining optimization together with
engineering simulations. This approach can speed-up time-consuming
optimization problems that are very typical in industrial applications.
Moreover, if simulation time is still too demanding, Scilab can take
advantage of several meta-modeling techniques such as Kriging, DACE,
neural networks, etc.
If you are interested in integrating your simulation code into Scilab for
solving a specific optimization problem, please do not hesitate to contact
the Openeering team.
The most useful commands to integrate Scilab with your external code are:
dos — shell (cmd) command execution (Windows only);
unix — shell (sh) command execution;
unix_g — shell (sh) command execution, output redirected to a
variable;
unix_s — shell (sh) command execution, no output;
unix_w — shell (sh) command execution, output redirected to the Scilab
window;
unix_x — shell (sh) command execution, output redirected to a
window;
host — Unix or DOS command execution.
// Windows-only example
[s,w] = dos('dir');
// General syntax to run a simulation code with options on Windows systems:
//[s, w] = dos('mysimulation.exe /option')
// Run dir or ls according to the operating system
if getos() == 'Windows' then
    unix_w("dir "+'""'+WSCI+"modules"+'""');
else
    unix_w("ls $SCI/modules");
end
Step 18: Concluding remarks and References
In this Scilab tutorial we have shown how to use the NSGA-II algorithm
within Scilab.
On the right-hand column you may find a list of interesting references for
further studies.
1. Scilab web page: www.scilab.org;
2. Openeering: www.openeering.com;
3. ZDT1, ZDT2 and ZDT3 documentation:
http://www.tik.ee.ethz.ch/sop/download/supplementary/testproblems/;
4. JapngEditor: https://www.reto-hoehener.ch/japng/.
Step 19: Software content
To report bugs or suggest improvements please contact the Openeering
team.
www.openeering.com.
Thank you for your attention,
Manolo Venturin and Silvia Poles
----------------
NSGA 2 IN SCILAB
----------------
--------------
Main directory
--------------
example_steps.sce : Exercise 1
example_ZDT1.sce : Example for the ZDT1 function
example_ZDT2.sce : Example for the ZDT2 function
example_ZDT3.sce : Example for the ZDT3 function
front_plots.sce : Plots Pareto fronts and sets
license.txt : The license file