MATLAB Optimization
Toolbox
Presented by
Chin Pei
February 28, 2003
Presentation Outline
 Introduction
 Function Optimization
 Optimization Toolbox
 Routines / Algorithms available
 Minimization Problems
 Unconstrained
 Constrained
 Example
 The Algorithm Description
 Multiobjective Optimization
 Optimal PID Control Example
Function Optimization
 Optimization concerns the minimization
or maximization of functions
 Standard Optimization Problem
Introduction | Unconstrained Minimization | Constrained Minimization | Multiobjective Optimization | Conclusion

min_x f(x)

Subject to:
Inequality constraints: g_j(x) ≤ 0
Equality constraints: h_i(x) = 0
Side constraints: x_k^L ≤ x_k ≤ x_k^U
Function Optimization
f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem, we minimize the function.
Maximization is equivalent to minimizing the negative of the objective function.
x is a column vector of design variables, which affect the performance of the system.
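As a cross-language illustration (a Python/SciPy sketch, not part of the original MATLAB material): maximizing f is the same as minimizing -f.

```python
from scipy.optimize import minimize_scalar

# Maximize f(x) = -(x - 2)**2 + 3 by minimizing its negative.
f = lambda x: -(x - 2)**2 + 3
res = minimize_scalar(lambda x: -f(x))

print(round(res.x, 4))     # maximizer, near 2
print(round(-res.fun, 4))  # maximum value, near 3
```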
Function Optimization
Constraints – limitations on the design space. They can be linear or nonlinear, explicit or implicit functions:
Inequality constraints: g_j(x) ≤ 0
Equality constraints: h_i(x) = 0
Side constraints: x_k^L ≤ x_k ≤ x_k^U
Most algorithms require inequality constraints in the "less than or equal to" form!!!
Optimization Toolbox
 Is a collection of functions that extend the capability of
MATLAB. The toolbox includes routines for:
 Unconstrained optimization
 Constrained nonlinear optimization, including goal
attainment problems, minimax problems, and semi-
infinite minimization problems
 Quadratic and linear programming
 Nonlinear least squares and curve fitting
 Solving nonlinear systems of equations
 Constrained linear least squares
 Specialized algorithms for large scale problems
Minimization Algorithm
Minimization Algorithm (Cont.)
Equation Solving Algorithms
Least-Squares Algorithms
Implementing Opt. Toolbox
 Most of these optimization routines require
the definition of an M-file containing the
function, f, to be minimized.
 Maximization is achieved by supplying the
routines with –f.
 Optimization options passed to the routines
change optimization parameters.
 Default optimization parameters can be
changed through an options structure.
Unconstrained Minimization
 Consider the problem of finding a set of values x = [x1 x2]^T that solves

min_x f(x) = e^(x1) * (4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
Steps
 Create an M-file that returns the
function value (Objective Function)
 Call it objfun.m
 Then, invoke the unconstrained
minimization routine
 Use fminunc
Step 1 – Obj. Function
function f = objfun(x)
f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);

Here x = [x1 x2]^T and f returns the objective function value.
Step 2 – Invoke Routine
x0 = [-1,1];                              % starting guess
options = optimset('LargeScale','off');   % optimization parameter settings
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

The input arguments are the objective function, the starting guess, and the options structure; the output arguments are described in the following slides.
Results
xmin =
0.5000 -1.0000
feval =
1.3028e-010
exitflag =
1
output =
iterations: 7
funcCount: 40
stepsize: 1
firstorderopt: 8.1998e-004
algorithm: 'medium-scale: Quasi-Newton line search'
xmin: minimum point of the design variables
feval: objective function value at the minimum
exitflag: indicates whether the algorithm converged; if exitflag > 0, a local minimum was found
output: some other information about the run
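For readers without MATLAB, the same unconstrained problem can be sketched in Python with SciPy; `scipy.optimize.minimize` with BFGS plays the role of fminunc's medium-scale quasi-Newton mode (this is a translation for illustration, not the toolbox itself).

```python
import numpy as np
from scipy.optimize import minimize

def objfun(x):
    # exp(x1)*(4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1), same as the M-file
    return np.exp(x[0]) * (4*x[0]**2 + 2*x[1]**2 + 4*x[0]*x[1] + 2*x[1] + 1)

# BFGS is a quasi-Newton method, analogous to fminunc's medium-scale mode.
res = minimize(objfun, x0=[-1, 1], method='BFGS')
print(np.round(res.x, 4))  # close to [0.5, -1.0], matching the slide's xmin
print(res.fun)             # close to 0, matching feval
```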
More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
 fun: The objective function to be minimized.
 x0: The initial guess; a vector whose size is the number of design variables.
 options: Optimization parameter settings. (More after a few slides)
 P1,P2,…: Additional parameters passed to the objective function.
Ref. Manual: Pg. 5-5 to 5-9
More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
 xmin: Vector of the minimum (optimal) point. Its size is the number of design variables.
 feval: The objective function value at the optimal point.
 exitflag: A value showing whether the optimization routine terminated successfully. (converged if >0)
 output: A structure giving more details about the optimization.
 grad: The gradient value at the optimal point.
 hessian: The Hessian value at the optimal point.
Ref. Manual: Pg. 5-5 to 5-9
Options Setting – optimset
options = optimset('param1',value1,'param2',value2,…)

 The routines in the Optimization Toolbox have a set of default optimization parameters.
 However, the toolbox allows you to alter some of those parameters, for example: the tolerances, the step size, user-supplied gradient or Hessian values, the maximum number of iterations, etc.
 There is also a list of features available, for example: displaying the values at each iteration, comparing user-supplied gradients or Hessians, etc.
 You can also choose the algorithm you wish to use.
Ref. Manual: Pg. 5-10 to 5-14
Options Setting (Cont.)
options = optimset('param1',value1,'param2',value2,…)

 Type help optimset in the command window and a list of available option settings will be displayed.
 How to read it? For example:
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
 Here LargeScale is the parameter (param1), on/off are its possible values (value1), and the default is marked with { }.
Options Setting (Cont.)
LargeScale - Use large-scale algorithm if possible [ {on} | off ]

 Since the default is on, to turn it off we just type:
options = optimset('LargeScale','off')
 and pass options to the input of fminunc.
Useful Option Settings
 Display - Level of display [ off | iter |
notify | final ]
 MaxIter - Maximum number of iterations
allowed [ positive integer ]
 TolCon - Termination tolerance on the
constraint violation [ positive scalar ]
 TolFun - Termination tolerance on the
function value [ positive scalar ]
 TolX - Termination tolerance on X [ positive
scalar ]
Ref. Manual: Pg. 5-10 to 5-14
Highly recommended to use!!!
fminunc and fminsearch
 fminunc uses algorithms based on gradient and Hessian information.
 Two modes:
 Large-Scale: interior-reflective Newton
 Medium-Scale: quasi-Newton (BFGS)
 Not preferred in solving highly
discontinuous functions.
 This function may only give local solutions.
fminunc and fminsearch
 fminsearch is generally less efficient than
fminunc for problems of order greater than
two. However, when the problem is highly
discontinuous, fminsearch may be more
robust.
 This is a direct search method that does not
use numerical or analytic gradients as in
fminunc.
 This function may only give local solutions.
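The direct-search idea behind fminsearch (Nelder-Mead simplex) can be sketched with SciPy on a nonsmooth objective, where a gradient-based routine may struggle; this is an illustrative analogue, not the toolbox function.

```python
from scipy.optimize import minimize

# Nonsmooth objective: |x1 - 1| + |x2 + 2|, not differentiable at its minimum (1, -2).
f = lambda x: abs(x[0] - 1) + abs(x[1] + 2)

# Nelder-Mead uses only function values -- no numerical or analytic gradients.
res = minimize(f, x0=[0, 0], method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-8})
print([round(v, 4) for v in res.x])  # near [1, -2]
```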
Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] =
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

lambda: vector of Lagrange multipliers at the optimal point
Example
min_x f(x) = -x1*x2*x3

Subject to:
Nonlinear inequality: 2*x1^2 + x2 ≤ 0
Linear inequalities: -x1 - 2*x2 - 2*x3 ≤ 0 and x1 + 2*x2 + 2*x3 ≤ 72
Side constraints: 0 ≤ x1, x2, x3 ≤ 30

In matrix form A*x ≤ B:
A = [-1 -2 -2; 1 2 2], B = [0; 72]
LB = [0; 0; 0], UB = [30; 30; 30]

function f = myfun(x)
f=-x(1)*x(2)*x(3);
Example (Cont.)
For the nonlinear constraint 2*x1^2 + x2 ≤ 0:
Create a function called nonlcon which returns the two constraint vectors [C,Ceq]

function [C,Ceq]=nonlcon(x)
C=2*x(1)^2+x(2);
Ceq=[];

Remember to return a null matrix if the constraint type does not apply.
Example (Cont.)
x0=[10;10;10];        % initial guess (3 design variables)
A=[-1 -2 -2;1 2 2];
B=[0 72]';
LB = [0 0 0]';
UB = [30 30 30]';
[x,feval]=fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)

CAREFUL with the argument sequence!!!
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

Here A and B encode the two linear inequalities and LB/UB the side constraints; Aeq and Beq are empty because there are no equality constraints.
Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
Active Constraints:
2
9
x =
0.00050378663220
0.00000000000000
30.00000000000000
feval =
-4.657237250542452e-035

The nine constraints are numbered in the sequence A,B,Aeq,Beq,LB,UB,C,Ceq:
Const. 1-2: linear inequalities -x1 - 2*x2 - 2*x3 ≤ 0 and x1 + 2*x2 + 2*x3 ≤ 72
Const. 3-8: side constraints 0 ≤ x1, x2, x3 ≤ 30
Const. 9: nonlinear inequality 2*x1^2 + x2 ≤ 0
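As a sanity check on this result, the same constrained problem can be sketched in Python with SciPy's SLSQP solver standing in for fmincon (an illustrative translation; note SciPy writes inequality constraints as g(x) ≥ 0, the opposite sign convention from the toolbox).

```python
from scipy.optimize import minimize

obj = lambda x: -x[0] * x[1] * x[2]  # same objective as myfun

cons = [
    # linear inequalities, rewritten as g(x) >= 0 per SciPy's convention:
    {'type': 'ineq', 'fun': lambda x: x[0] + 2*x[1] + 2*x[2]},         # -x1-2x2-2x3 <= 0
    {'type': 'ineq', 'fun': lambda x: 72 - (x[0] + 2*x[1] + 2*x[2])},  # x1+2x2+2x3 <= 72
    # nonlinear constraint 2*x1^2 + x2 <= 0:
    {'type': 'ineq', 'fun': lambda x: -(2*x[0]**2 + x[1])},
]
bounds = [(0, 30)] * 3  # side constraints

res = minimize(obj, x0=[10, 10, 10], bounds=bounds, constraints=cons, method='SLSQP')
# With x2 >= 0 and 2*x1^2 + x2 <= 0, feasibility forces x1 = x2 = 0,
# so the attainable objective value is 0 -- consistent with MATLAB's feval ~ 0.
print(round(res.fun, 6))
```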
Multiobjective Optimization
 Previous examples involved problems with a single objective function.
 Now let us look at solving a problem with multiple objectives using lsqnonlin.
 The example designs an optimal PID controller for a plant.
Simulink Example
Goal: Optimize the control parameters in Simulink model optsim.mdl
in order to minimize the error between the output and input.
Plant description:
• Third order under-damped with actuator limits.
• Actuation limits are a saturation limit and a slew rate limit.
• Saturation limit cuts off input: +/- 2 units
• Slew rate limit: 0.8 unit/sec
Simulink Example (Cont.)
Initial PID Controller Design
Solving Methodology
 Design variables are the gains in PID
controller (KP, KI and KD) .
 Objective function is the error between
the output and input.
Solving Methodology (Cont.)
 Let pid = [Kp Ki Kd]T
 Let the step input also be unity.
 F = yout - 1
 Construct a function tracklsq for the objective function.
Objective Function
function F = tracklsq(pid,a1,a2)
Kp = pid(1);
Ki = pid(2);
Kd = pid(3);
% Compute function value
opt = simset('solver','ode5','SrcWorkspace','Current');
[tout,xout,yout] = sim('optsim',[0 100],opt);
F = yout-1;
The sim command retrieves the simulation data from Simulink.
The idea is to perform nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1. So there are 101 objective functions (residuals) to minimize.
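The residual-vector idea behind lsqnonlin can be sketched with SciPy's `least_squares`. Since the Simulink plant is not available here, a hypothetical first-order step response stands in for the simulation output; the point is only that the solver minimizes a vector of 101 residuals, one per time step.

```python
import numpy as np
from scipy.optimize import least_squares

t = np.arange(0, 101)  # 101 time samples, as on the slide

# Hypothetical stand-in for the Simulink model: a first-order step
# response y(t) = 1 - exp(-k*t), generated with a "true" k of 0.5.
yout = 1 - np.exp(-0.5 * t)

def residuals(k):
    # One residual per time step: model output minus observed output.
    return (1 - np.exp(-k[0] * t)) - yout

# least_squares minimizes the sum of squared residuals, like lsqnonlin.
res = least_squares(residuals, x0=[0.1])
print(round(res.x[0], 3))  # recovers k = 0.5
```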
The lsqnonlin
[X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN]
= LSQNONLIN(FUN,X0,LB,UB,OPTIONS,P1,P2,..)
Invoking the Routine
clear all
optsim;               % open the Simulink model optsim.mdl
pid0 = [0.63 0.0504 1.9688];
a1 = 3; a2 = 43;
options = optimset('LargeScale','off','Display','iter','TolX',0.001,'TolFun',0.001);
pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)
Kp = pid(1); Ki = pid(2); Kd = pid(3);
Results
Optimal gains
Results (Cont.)
Initial Design
Optimization Process
Optimal Controller Result
Conclusion
 Easy to use! But we do not know what is happening behind the routine. Therefore, it is still important to understand the limitations of each routine.
 Basic steps:
 Recognize the class of optimization problem
 Define the design variables
 Create the objective function
 Recognize the constraints
 Choose an initial guess
 Invoke the suitable routine
 Analyze the results (they might not make sense)
Thank You!
Questions & Suggestions?

(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
 
Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...
 
HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 
Sachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
Sachpazis Costas: Geotechnical Engineering: A student's Perspective IntroductionSachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
Sachpazis Costas: Geotechnical Engineering: A student's Perspective Introduction
 
Biology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptxBiology for Computer Engineers Course Handout.pptx
Biology for Computer Engineers Course Handout.pptx
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girls
 
Artificial-Intelligence-in-Electronics (K).pptx
Artificial-Intelligence-in-Electronics (K).pptxArtificial-Intelligence-in-Electronics (K).pptx
Artificial-Intelligence-in-Electronics (K).pptx
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 

Optimization toolbox presentation

  • 2. Presentation Outline
    - Introduction
    - Function Optimization
    - Optimization Toolbox
    - Routines / Algorithms available
    - Minimization Problems
      - Unconstrained
      - Constrained
    - Example
    - The Algorithm Description
    - Multiobjective Optimization
    - Optimal PID Control Example
  • 3. Function Optimization
    Optimization concerns the minimization or maximization of functions.
    Standard optimization problem:

        min_x f(x)
        subject to:  g_j(x) <= 0            (inequality constraints)
                     h_i(x) = 0             (equality constraints)
                     x_k^L <= x_k <= x_k^U  (side constraints)

  • 4. Function Optimization
    f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem we minimize the function; maximization is equivalent to minimizing the negative of the objective function.
    x is a column vector of design variables, x = [x1 x2 ...]^T, which affect the performance of the system.

  • 5. Function Optimization
    Constraints limit the design space. They can be linear or nonlinear, explicit or implicit functions:

        g_j(x) <= 0            (inequality constraints -- most algorithms require this "less than or equal" form)
        h_i(x) = 0             (equality constraints)
        x_k^L <= x_k <= x_k^U  (side constraints)
  • 6. Optimization Toolbox
    A collection of functions that extends the capability of MATLAB. The toolbox includes routines for:
    - Unconstrained optimization
    - Constrained nonlinear optimization, including goal attainment, minimax, and semi-infinite minimization problems
    - Quadratic and linear programming
    - Nonlinear least squares and curve fitting
    - Solving nonlinear systems of equations
    - Constrained linear least squares
    - Specialized algorithms for large-scale problems
  • 7. Minimization Algorithms (summary table of routines shown on slide)
  • 8. Minimization Algorithms, cont. (summary table of routines shown on slide)
  • 9. Equation Solving Algorithms (summary table of routines shown on slide)
  • 10. Least-Squares Algorithms (summary table of routines shown on slide)
  • 11. Implementing the Optimization Toolbox
    - Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized.
    - Maximization is achieved by supplying the routines with -f.
    - Optimization options passed to the routines change the optimization parameters.
    - Default optimization parameters can be changed through an options structure.
  • 12. Unconstrained Minimization
    Consider the problem of finding the vector x = [x1 x2]^T that solves

        min_x f(x) = exp(x1) * (4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
  • 13. Steps
    - Create an M-file that returns the function value (the objective function); call it objfun.m.
    - Then invoke the unconstrained minimization routine: fminunc.
  • 14. Step 1 -- Objective Function

        function f = objfun(x)
        % Objective function for x = [x1 x2]^T
        f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
  • 15. Step 2 -- Invoke Routine

        x0 = [-1,1];                             % starting guess
        options = optimset('LargeScale','off');  % optimization parameter settings
        [xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

    Output arguments are on the left of the assignment; input arguments on the right.
  • 16. Results

        xmin =
            0.5000   -1.0000
        feval =
            1.3028e-010
        exitflag =
            1
        output =
               iterations: 7
                funcCount: 40
                 stepsize: 1
            firstorderopt: 8.1998e-004
                algorithm: 'medium-scale: Quasi-Newton line search'

    xmin is the minimum point of the design variables; feval is the objective function value there. exitflag tells whether the algorithm converged: if exitflag > 0, a local minimum was found. output gives some other information.
  • 17. More on fminunc -- Input

        [xmin,feval,exitflag,output,grad,hessian] =
            fminunc(fun,x0,options,P1,P2,...)

    - fun: the objective function.
    - x0: the initial guess; a vector whose size is the number of design variables.
    - options: sets some of the optimization parameters (more in a few slides).
    - P1,P2,...: additional parameters passed to the objective function.
    (Ref. manual: pp. 5-5 to 5-9)

  • 18. More on fminunc -- Output
    - xmin: vector of the minimum (optimal) point; its size is the number of design variables.
    - feval: the objective function value at the optimal point.
    - exitflag: shows whether the optimization routine terminated successfully (converged if > 0).
    - output: structure giving more details about the optimization.
    - grad: the gradient value at the optimal point.
    - hessian: the Hessian value at the optimal point.
    (Ref. manual: pp. 5-5 to 5-9)
  • 19. Options Setting -- optimset

        options = optimset('param1',value1,'param2',value2,...)

    - The routines in the Optimization Toolbox have a set of default optimization parameters.
    - However, the toolbox allows you to alter some of those parameters, for example: the tolerances, the step size, the gradient or Hessian values, the maximum number of iterations, etc.
    - There is also a list of features available, for example: displaying the values at each iteration, comparing user-supplied gradients or Hessians, etc.
    - You can also choose the algorithm you wish to use.
    (Ref. manual: pp. 5-10 to 5-14)

  • 20. Options Setting (cont.)
    - Type help optimset in the command window to display the list of available option settings.
    - How to read an entry, for example:

          LargeScale - Use large-scale algorithm if possible [ {on} | off ]

      The parameter is LargeScale; the possible values are on and off, and the default is the one in { }.

  • 21. Options Setting (cont.)
    Since the default of LargeScale is on, to turn it off we just type

        options = optimset('LargeScale','off')

    and pass options to the input of fminunc.
  • 22. Useful Option Settings (highly recommended)
    - Display - Level of display [ off | iter | notify | final ]
    - MaxIter - Maximum number of iterations allowed [ positive integer ]
    - TolCon - Termination tolerance on the constraint violation [ positive scalar ]
    - TolFun - Termination tolerance on the function value [ positive scalar ]
    - TolX - Termination tolerance on X [ positive scalar ]
    (Ref. manual: pp. 5-10 to 5-14)
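A quick sketch of how these settings might be combined with the earlier objfun example; the particular values below are illustrative choices, not toolbox defaults:

```matlab
% Show progress at every iteration, cap the iteration count,
% and tighten the termination tolerances (values are illustrative).
options = optimset('Display','iter', ...
                   'MaxIter',200, ...
                   'TolFun',1e-8, ...
                   'TolX',1e-8);
[xmin,feval] = fminunc('objfun',[-1,1],options);
```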
  • 23. fminunc and fminsearch
    - fminunc uses algorithms with gradient and Hessian information. Two modes:
      - Large-scale: interior-reflective Newton
      - Medium-scale: quasi-Newton (BFGS)
    - Not preferred for solving highly discontinuous functions.
    - This function may only give local solutions.

  • 24. fminunc and fminsearch
    - fminsearch is generally less efficient than fminunc for problems of order greater than two. However, when the problem is highly discontinuous, fminsearch may be more robust.
    - It is a direct search method that does not use numerical or analytic gradients as fminunc does.
    - This function may only give local solutions.
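As a sketch, the objfun.m from slide 14 can be handed to fminsearch in exactly the same way as to fminunc:

```matlab
% Direct search on the same objective; no gradient information is used.
x0 = [-1,1];
[xmin,feval,exitflag] = fminsearch(@objfun,x0);
% For this smooth problem, xmin should agree with the fminunc result
% of roughly [0.5 -1.0] from slide 16.
```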
  • 25. Constrained Minimization

        [xmin,feval,exitflag,output,lambda,grad,hessian] =
            fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,...)

    lambda is the vector of Lagrange multipliers at the optimal point.
  • 26. Example

        min_x f(x) = -x1*x2*x3

    subject to:
        2*x1^2 + x2 <= 0
        -x1 - 2*x2 - 2*x3 <= 0
        x1 + 2*x2 + 2*x3 <= 72
        0 <= x1, x2, x3 <= 30

    The linear constraints and bounds give

        A = [-1 -2 -2; 1 2 2],  B = [0; 72]
        LB = [0; 0; 0],  UB = [30; 30; 30]

    and the objective function is

        function f = myfun(x)
        f = -x(1)*x(2)*x(3);
  • 27. Example (cont.)
    For 2*x1^2 + x2 <= 0, create a function nonlcon which returns the two constraint vectors [C,Ceq]:

        function [C,Ceq] = nonlcon(x)
        C = 2*x(1)^2 + x(2);
        Ceq = [];   % remember to return an empty matrix if the constraint does not apply
  • 28. Example (cont.)

        x0 = [10;10;10];          % initial guess (3 design variables)
        A = [-1 -2 -2; 1 2 2];
        B = [0 72]';
        LB = [0 0 0]';
        UB = [30 30 30]';
        [x,feval] = fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)

    Careful with the argument sequence:

        fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,...)
  • 29. Example (cont.)

        Warning: Large-scale (trust region) method does not currently solve
        this type of problem, switching to medium-scale (line search).
        > In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
          In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
        Optimization terminated successfully:
         Magnitude of directional derivative in search direction less than
         2*options.TolFun and maximum constraint violation is less than
         options.TolCon
        Active Constraints:
             2
             9
        x =
           0.00050378663220
           0.00000000000000
          30.00000000000000
        feval =
          -4.657237250542452e-035

    The constraints are numbered in the sequence A, B, Aeq, Beq, LB, UB, C, Ceq: constraints 1-2 are the linear inequalities, 3-5 the lower bounds, 6-8 the upper bounds, and 9 is the nonlinear constraint 2*x1^2 + x2 <= 0.
  • 30. Multiobjective Optimization
    - Previous examples involved problems with a single objective function.
    - Now let us look at solving a multiobjective problem with lsqnonlin.
    - The example designs an optimal PID controller for a plant.
  • 31. Simulink Example
    Goal: optimize the control parameters in the Simulink model optsim.mdl in order to minimize the error between the output and the input.
    Plant description:
    - Third-order, under-damped, with actuator limits.
    - The actuator limits are a saturation limit and a slew-rate limit.
    - The saturation limit cuts off the input at +/- 2 units.
    - The slew-rate limit is 0.8 unit/sec.
  • 32. Simulink Example (cont.) -- initial PID controller design (block diagram shown on slide)
  • 33. Solving Methodology
    - The design variables are the gains in the PID controller (KP, KI and KD).
    - The objective function is the error between the output and the input.

  • 34. Solving Methodology (cont.)
    - Let pid = [Kp Ki Kd]^T.
    - Let the step input be unity, so F = yout - 1.
    - Construct a function tracklsq for the objective function.
  • 35. Objective Function

        function F = tracklsq(pid,a1,a2)
        Kp = pid(1);
        Ki = pid(2);
        Kd = pid(3);
        % Compute function value
        opt = simset('solver','ode5','SrcWorkspace','Current');
        [tout,xout,yout] = sim('optsim',[0 100],opt);  % simulation data from Simulink
        F = yout - 1;

    The idea is to perform a nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1, so there are 101 objective functions to minimize.
  • 36. The lsqnonlin Routine

        [X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN] =
            lsqnonlin(FUN,X0,LB,UB,OPTIONS,P1,P2,...)
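Before the Simulink-based objective, the routine itself can be sketched on a small standalone curve-fitting problem. The data, model, and function names below (demo_lsqnonlin, expresid) are invented for illustration; note that P1,P2,... (here t,y) are passed through to the residual function, matching the signature above:

```matlab
function demo_lsqnonlin
% Recover b in y = b(1)*exp(b(2)*t) from synthetic samples
t = (0:9)';
y = 2*exp(-0.5*t);            % noise-free synthetic data
b0 = [1 -1];                  % initial guess
[b,resnorm] = lsqnonlin(@expresid,b0,[],[],[],t,y)

function F = expresid(b,t,y)
F = b(1)*exp(b(2)*t) - y;     % one residual per data point
```

With noise-free data the fit should converge to approximately b = [2 -0.5].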
  • 37. Invoking the Routine

        clear all
        optsim;                           % load the Simulink model
        pid0 = [0.63 0.0504 1.9688];      % initial gains
        a1 = 3; a2 = 43;
        options = optimset('LargeScale','off','Display','iter', ...
                           'TolX',0.001,'TolFun',0.001);
        pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)
        Kp = pid(1);
        Ki = pid(2);
        Kd = pid(3);
  • 38. Results -- optimal gains (iteration output shown on slide)
  • 39. Results (cont.) -- initial design, optimization process, and optimal controller response (plots shown on slide)
  • 40. Conclusion
    - Easy to use! But we do not know what is happening behind a routine, so it is still important to understand the limitations of each one.
    - Basic steps:
      - Recognize the class of optimization problem
      - Define the design variables
      - Create the objective function
      - Recognize the constraints
      - Start with an initial guess
      - Invoke a suitable routine
      - Analyze the results (they might not make sense)
  • 41. Thank You! Questions & Suggestions?