2. Presentation Outline
Introduction
Function Optimization
Optimization Toolbox
Routines / Algorithms available
Minimization Problems
Unconstrained
Constrained
Example
The Algorithm Description
Multiobjective Optimization
Optimal PID Control Example
3. Function Optimization
Optimization concerns the minimization
or maximization of functions
Standard Optimization Problem
Introduction | Unconstrained Minimization | Constrained Minimization | Multiobjective Optimization | Conclusion
min_x f(x)
Subject to:
  g_j(x) ≤ 0   (Inequality Constraints)
  h_i(x) = 0   (Equality Constraints)
  x_k^L ≤ x_k ≤ x_k^U   (Side Constraints)
4. Function Optimization
f(x) is the objective function, which measures and evaluates the performance of a system.
In a standard problem, we are minimizing the function.
Maximization is equivalent to minimizing the negative of the objective function.
x is a column vector of design variables, which can affect the performance of the system.
5. Function Optimization
Constraints – Limitation to the design space.
Can be linear or nonlinear, explicit or implicit functions
g_j(x) ≤ 0   (Inequality Constraints)
h_i(x) = 0   (Equality Constraints)
x_k^L ≤ x_k ≤ x_k^U   (Side Constraints)
Note: most algorithms require inequality constraints in "less than or equal to" form!
6. Optimization Toolbox
The Optimization Toolbox is a collection of functions that extends the capability of
MATLAB. The toolbox includes routines for:
Unconstrained optimization
Constrained nonlinear optimization, including goal
attainment problems, minimax problems, and semi-
infinite minimization problems
Quadratic and linear programming
Nonlinear least squares and curve fitting
Solving nonlinear systems of equations
Constrained linear least squares
Specialized algorithms for large scale problems
11. Implementing Opt. Toolbox
Most of these optimization routines require
the definition of an M-file containing the
function, f, to be minimized.
Maximization is achieved by supplying the
routines with –f.
Optimization options passed to the routines change the optimization parameters.
Default optimization parameters can be overridden through an options structure.
12. Unconstrained Minimization
Consider the problem of finding a set of
values [x1 x2]^T
that solves
min_x f(x) = e^(x1) (4x1² + 2x2² + 4x1x2 + 2x2 + 1)
where x = [x1 x2]^T
13. Steps
Create an M-file that returns the
function value (Objective Function)
Call it objfun.m
Then, invoke the unconstrained
minimization routine
Use fminunc
14. Step 1 – Obj. Function
function f = objfun(x)
f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);
Objective function, where x = [x1 x2]^T
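With objfun.m in place, step 2 is to invoke the unconstrained minimization routine. A minimal sketch; the starting guess x0 is an assumption, not taken from the slides:

```matlab
% Step 2 - invoke the unconstrained minimization routine
x0 = [-1, 1];                            % initial guess (assumed)
options = optimset('LargeScale','off');  % use the medium-scale algorithm
[xmin, feval, exitflag, output] = fminunc(@objfun, x0, options)
```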
16. Results
xmin =
0.5000 -1.0000
feval =
1.3028e-010
exitflag =
1
output =
iterations: 7
funcCount: 40
stepsize: 1
firstorderopt: 8.1998e-004
algorithm: 'medium-scale: Quasi-Newton line search'
Minimum point of design variables
Objective function value
exitflag indicates whether the algorithm converged.
If exitflag > 0, a local minimum was found
Some other information
17. More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
fun: The objective function to be minimized (e.g., a function handle).
x0: The initial guess; a vector whose size is the number of design variables.
options: Sets some of the optimization parameters. (More after a few
slides)
P1,P2,…: Additional parameters passed to the objective function.
Ref. Manual: Pg. 5-5 to 5-9
18. More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
xmin: Vector of the minimum point (optimal point). The size is the
number of design variables.
feval: The objective function value at the optimal point.
exitflag: A value showing whether the optimization routine
terminated successfully. (converged if >0)
output: This structure gives more details about the optimization.
grad: The gradient value at the optimal point.
hessian: The Hessian value at the optimal point.
Ref. Manual: Pg. 5-5 to 5-9
19. Options Setting – optimset
Options = optimset('param1',value1,'param2',value2,…)
The routines in the Optimization Toolbox have a set of default optimization
parameters.
However, the toolbox allows you to alter some of those parameters,
for example: the tolerances, the step size, the gradient or Hessian
values, the maximum number of iterations, etc.
There is also a list of features available, for example: displaying the
values at each iteration, comparing user-supplied gradients or
Hessians, etc.
You can also choose the algorithm you wish to use.
Ref. Manual: Pg. 5-10 to 5-14
20. Options Setting (Cont.)
Options = optimset('param1',value1,'param2',value2,…)
Type help optimset in command window, a list of options setting available will be displayed.
How to read? For example:
LargeScale - Use large-scale algorithm if possible [ {on} | off ]
Here LargeScale is the parameter (param1), on/off are its possible
values (value1), and the default is the one shown in { }.
21. Options Setting (Cont.)
LargeScale - Use large-scale algorithm if
possible [ {on} | off ]
Since the default is on, if we would like to turn it off, we just type:
Options = optimset('LargeScale','off')
and pass Options to the input of fminunc.
22. Useful Option Settings
Display - Level of display [ off | iter |
notify | final ]
MaxIter - Maximum number of iterations
allowed [ positive integer ]
TolCon - Termination tolerance on the
constraint violation [ positive scalar ]
TolFun - Termination tolerance on the
function value [ positive scalar ]
TolX - Termination tolerance on X [ positive
scalar ]
Ref. Manual: Pg. 5-10 to 5-14
Highly recommended!
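As a sketch of how several of these settings combine in one optimset call (the specific values are illustrative assumptions, not recommendations):

```matlab
% Illustrative option values only (assumed for the example)
options = optimset('Display','iter', ...  % show values at each iteration
                   'MaxIter',200, ...     % cap the number of iterations
                   'TolFun',1e-8, ...     % termination tolerance on f
                   'TolX',1e-8);          % termination tolerance on x
```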
23. fminunc and fminsearch
fminunc uses algorithms that exploit gradient
and Hessian information.
Two modes:
Large-Scale: interior-reflective Newton
Medium-Scale: quasi-Newton (BFGS)
It is not preferred for solving highly
discontinuous functions.
This function may only give local solutions.
24. fminunc and fminsearch
fminsearch is generally less efficient than
fminunc for problems of order greater than
two. However, when the problem is highly
discontinuous, fminsearch may be more
robust.
This is a direct search method that does not
use numerical or analytic gradients as in
fminunc.
This function may only give local solutions.
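A minimal sketch of calling fminsearch on the same objective, reusing objfun.m from the earlier example (the starting guess is an assumption):

```matlab
x0 = [-1, 1];                             % initial guess (assumed)
[xmin, feval] = fminsearch(@objfun, x0)   % derivative-free simplex search
```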
25. Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] =
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)
lambda: Vector of Lagrange multipliers at the optimal point.
26. Example
min_x f(x) = -x1·x2·x3
Subject to:
  2x1² + x2 ≤ 0
  -x1 - 2x2 - 2x3 ≤ 0
  x1 + 2x2 + 2x3 ≤ 72
  0 ≤ x1, x2, x3 ≤ 30
In matrix form for fmincon:
  A = [-1 -2 -2; 1 2 2],  B = [0; 72]
  LB = [0; 0; 0],  UB = [30; 30; 30]
function f = myfun(x)
f=-x(1)*x(2)*x(3);
27. Example (Cont.)
For the nonlinear constraint 2x1² + x2 ≤ 0,
create a function called nonlcon which returns the two constraint vectors [C,Ceq]:
function [C,Ceq]=nonlcon(x)
C=2*x(1)^2+x(2);
Ceq=[];
Remember to return an empty
matrix if the constraint does
not apply.
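Putting the pieces together, a sketch of the fmincon call for this example; myfun and nonlcon are the M-files defined above, and the initial guess x0 is an assumption:

```matlab
x0 = [10; 10; 10];                        % initial guess (assumed)
A  = [-1 -2 -2; 1 2 2];  B = [0; 72];     % linear inequality constraints
LB = [0; 0; 0];  UB = [30; 30; 30];       % side constraints
[xmin, feval, exitflag] = ...
    fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon)
```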
29. Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to
medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
  In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum
constraint violation is less than options.TolCon
Active Constraints:
2
9
x =
0.00050378663220
0.00000000000000
30.00000000000000
feval =
-4.657237250542452e-035
Constraint numbering follows the sequence: A, B, Aeq, Beq, LB, UB, C, Ceq
Const. 1-2: linear inequalities  -x1 - 2x2 - 2x3 ≤ 0  and  x1 + 2x2 + 2x3 ≤ 72
Const. 3-8: side constraints  0 ≤ x1 ≤ 30,  0 ≤ x2 ≤ 30,  0 ≤ x3 ≤ 30
Const. 9: nonlinear inequality  2x1² + x2 ≤ 0
30. Multiobjective Optimization
Previous examples involved problems
with a single objective function.
Now let us look at solving a problem with
multiple objective functions using lsqnonlin.
The example designs an optimal
PID controller for a plant.
31. Simulink Example
Goal: Optimize the control parameters in Simulink model optsim.mdl
in order to minimize the error between the output and input.
Plant description:
• Third order under-damped with actuator limits.
• Actuation limits are a saturation limit and a slew rate limit.
• Saturation limit cuts off input: +/- 2 units
• Slew rate limit: 0.8 unit/sec
33. Solving Methodology
Design variables are the gains in PID
controller (KP, KI and KD) .
Objective function is the error between
the output and input.
34. Solving Methodology (Cont.)
Let pid = [Kp Ki Kd]^T
Assume also that the step input is unity, so the error is
F = yout - 1
Construct a function tracklsq for
objective function.
35. Objective Function
function F = tracklsq(pid,a1,a2)
Kp = pid(1);
Ki = pid(2);
Kd = pid(3);
% Compute function value
opt = simset('solver','ode5','SrcWorkspace','Current');
[tout,xout,yout] = sim('optsim',[0 100],opt);
F = yout-1;
Getting the simulation
data from Simulink
The idea is to perform nonlinear least-squares
minimization of the errors from time 0 to 100
at a time step of 1.
So there are 101 objective functions to minimize.
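The nonlinear least-squares routine can then be invoked on tracklsq. A sketch; the initial gains pid0 and the plant parameters a1, a2 are assumptions for illustration:

```matlab
pid0 = [0.63 0.0504 1.9688];   % initial gains Kp, Ki, Kd (assumed)
a1 = 3; a2 = 43;               % plant parameters (assumed)
options = optimset('LargeScale','off','Display','iter');
pid = lsqnonlin(@tracklsq, pid0, [], [], options, a1, a2)
```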
39. Results (Cont.)
Initial Design
Optimization Process
Optimal Controller Result
40. Conclusion
Easy to use! But we do not know what is happening
behind the routine, so it is still important to
understand the limitations of each routine.
Basic steps:
Recognize the class of optimization problem
Define the design variables
Create objective function
Recognize the constraints
Start an initial guess
Invoke suitable routine
Analyze the results (it might not make sense)