Multivariable Optimization
Multivariable Functions
□ Functions depend on more than one variable.
□ In a multivariable function, the gradient of the function is not a scalar quantity; instead, it is a vector quantity.
□ The objective function is a function of N variables represented by x1, x2, . . . , xN.
□ The gradient vector at any point x(t) is represented by ∇f(x(t)), which is an N-dimensional vector given as follows:
  ∇f(x(t)) = (∂f/∂x1, ∂f/∂x2, . . . , ∂f/∂xN)T, evaluated at x(t).
Gradient of a multivariable function
□ Geometrically, the gradient vector is normal to the tangent plane at the point x*.
□ Also, it points in the direction of maximum increase of the function.
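Since the gradient-based methods later in this deck need ∇f(x), here is a minimal sketch of how the gradient vector can be estimated numerically by central differences; the step size h and the test function are illustrative assumptions, not part of the original slides.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    # Central-difference estimate of the N-dimensional gradient of f at x.
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return grad

# Example: f(x1, x2) = x1^2 + x2^2 has gradient (2*x1, 2*x2); at (1, 2) this is about (2, 4).
print(numerical_gradient(lambda x: x[0]**2 + x[1]**2, [1.0, 2.0]))
```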
Function of Two Variables (contour)
Unidirectional Search
□ Many multivariable optimization techniques use successive unidirectional searches to find the minimum point along a particular search direction.
□ A unidirectional search is a search performed by comparing function values only along a specified direction.
□ A unidirectional search is performed from a point x(t) and in a specified direction s(t).
□ Any arbitrary point on that line can be expressed as follows:
  x(α) = x(t) + α s(t)
□ The parameter α is a scalar quantity, specifying a relative measure of the distance of the point x(α) from x(t) along the direction s(t).
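The following minimal sketch shows one way such a unidirectional search can be carried out: the multivariable function is reduced to a single-variable function of α along the line x(α) = x(t) + α s(t), which is then minimized by a golden-section search. The bracket [a, b] and the test function are assumptions for illustration only.

```python
import numpy as np

def unidirectional_search(f, x, s, a=0.0, b=1.0, tol=1e-5):
    # Minimize phi(alpha) = f(x + alpha*s) over alpha in [a, b] by golden-section
    # search; the bracket [a, b] is assumed to contain the minimum along s.
    phi = lambda alpha: f(np.asarray(x) + alpha * np.asarray(s))
    gr = (np.sqrt(5.0) - 1.0) / 2.0                 # golden-ratio factor, about 0.618
    c, d = b - gr * (b - a), a + gr * (b - a)
    while abs(b - a) > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - gr * (b - a)
        else:
            a, c = c, d
            d = a + gr * (b - a)
    alpha = 0.5 * (a + b)
    return alpha, np.asarray(x) + alpha * np.asarray(s)

# Example: search from x = (2, 2) along s = (-1, -1) on f(x) = x1^2 + x2^2;
# alpha comes out near 2 and x_new near (0, 0).
alpha, x_new = unidirectional_search(lambda x: x[0]**2 + x[1]**2, [2.0, 2.0], [-1.0, -1.0], 0.0, 4.0)
print(alpha, x_new)
```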
Direct Search Methods
□ Use function values only.
□ In a single-variable function optimization, there are only two search directions in which a point can be modified: the positive x-direction or the negative x-direction.
□ In multivariable function optimization, each variable can be modified either in the positive or in the negative direction, thereby totalling 2^N different ways.
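A short sketch of how these 2^N sign combinations can be enumerated; they are exactly the hypercube corner perturbations used by Box's method in the next section. The value N = 3 and the half-step are illustrative assumptions.

```python
from itertools import product

# Choosing a positive or a negative change for every one of the N variables
# gives 2**N distinct sign combinations.
N, half_step = 3, 0.5
directions = list(product([-half_step, +half_step], repeat=N))
print(len(directions))   # 8, i.e. 2**3
```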
Box’s Evolutionary
Optimization Method
□ Developed by G. E. P. Box in 1957
□ The algorithm requires (2^N + 1) points, of which 2^N are corner points of an N-dimensional hypercube centred on the other point.
□ All (2^N + 1) function values are compared and the best point is identified.
□ In the next iteration, another hypercube is formed around this best point.
□ If at any iteration an improved point is not found, the size of the hypercube is reduced.
□ This process continues until the hypercube becomes very small.
Algorithm for Box's Evolutionary Optimization Method
⚫ Step 1 Choose an initial point x(0) and size reduction parameters Δi for all design variables, i = 1, 2, . . . , N. Choose a termination parameter ϵ. Set x(0) as the current best point.
⚫ Step 2 If ∥Δ∥ < ϵ, Terminate; Else create 2^N points by adding and subtracting Δi/2 from each variable at the point x(0).
⚫ Step 3 Compute function values at all (2^N + 1) points. Find the point having the minimum function value and designate it the minimum point.
⚫ Step 4 If the minimum point is x(0) itself, reduce the size parameters Δi = Δi/2 and go to Step 2; Else set x(0) to the minimum point and go to Step 2.
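Below is a minimal sketch of these steps. The halving of Δi and the requirement of a strict improvement before re-centring are assumptions where the slide text is truncated; the quadratic objective and starting values are illustrative only.

```python
import numpy as np
from itertools import product

def box_evolutionary(f, x0, delta, eps=1e-6):
    # Sketch of Box's evolutionary optimization as described above.
    x0 = np.asarray(x0, dtype=float)
    delta = np.asarray(delta, dtype=float)
    while np.linalg.norm(delta) >= eps:                      # Step 2 termination test
        corners = [x0 + np.array(signs) * delta / 2.0        # the 2**N corner points
                   for signs in product([-1.0, 1.0], repeat=x0.size)]
        best = min(corners, key=f)                           # Step 3: best corner point
        if f(best) < f(x0):
            x0 = best                                        # Step 4: re-centre on the best point
        else:
            delta = delta / 2.0                              # Step 4: shrink the hypercube
    return x0

# Illustrative objective (not from the slides): a quadratic with minimum at (3, 2)
f = lambda x: (x[0] - 3.0)**2 + (x[1] - 2.0)**2
print(box_evolutionary(f, x0=[0.0, 0.0], delta=[2.0, 2.0]))
```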
⚫ In the above algorithm, x(0) is always set as the current best point.
⚫ Thus, at the end of the simulation, x(0) becomes the obtained optimum point.
⚫ It is evident from the algorithm that at most 2^N functions are evaluated at each iteration.
⚫ Thus, the required number of function evaluations increases exponentially with N.
Box’s Evolutionary
Optimization Method
Problem
⚫ It is interesting to note that although the
minimum point is found, the algorithm
does not terminate at this step.
⚫ Since the current point is the minimum, no other point can be found better than x(0) = (3, 2)T.
⚫ Therefore, in subsequent iterations the value of the size parameter will continue to decrease (according to Step 4 of the algorithm).
⚫ When the value ∥ Δ ∥ becomes smaller
than ϵ, the algorithm terminates.
Simplex Search Method
□ The number of points in the initial simplex is much smaller than in Box's evolutionary optimization method.
□ This reduces the number of function evaluations required in each iteration.
□ For N variables, only (N + 1) points are used in the initial simplex.
□ It is important that the points chosen for the initial simplex do not form a zero-volume N-dimensional hypercube.
□ Thus, in a function with two variables, the three chosen points in the simplex should not lie along a line.
□ At each iteration, the worst point in the
simplex is found first.
□ Then, a new simplex is formed from the
old simplex by fixed rules that steer the
search away from the worst point in the
simplex.
□ The extent of steering depends on the relative function values of the simplex.
□ Four different situations may arise depending on the function values.
Simplex Search Method
□ This algorithm was originally proposed by Spendley et al. (1962) and later modified by Nelder and Mead (1965).
□ At first, the centroid (xc) of all the points except the worst point is determined.
□ Thereafter, the worst point in the simplex is
reflected about the centroid and a new point xr is
found.
□ If the function value at this point is better than the best point in the simplex, the reflection is considered to have taken the simplex to a good region.
□ Thus, an expansion along the direction from the centroid to the reflected point is made.
Simplex Search Method
□ If the function value at the reflected point is worse than the worst point in the simplex, the reflection is considered to have taken the simplex to a bad region in the search space.
□ Thus, a contraction in the direction from the centroid to the reflected point is made.
□ The amount of contraction is controlled by a factor β (a negative value of β is used).
□ If the function value at the reflected point is better than the worst point in the simplex, a contraction is made with a positive β value.
□ The default scenario is to accept the reflected point itself. The obtained new point replaces the worst point in the simplex, and the algorithm continues with the new simplex.
Simplex Search Method
Algorithm Simplex Search Method
⚫ Step 1 Choose γ > 1, β ∈ (0, 1), and a termination parameter ϵ. Create an initial simplex.
⚫ Step 2 Find xh (the worst point), xl (the best point), and xg (the next-to-worst point). Calculate the centroid xc of all points excluding xh.
⚫ Step 3 Calculate the reflected point xr = 2xc − xh. Set xnew = xr.
□ If f(xr) < f(xl), set xnew = (1 + γ)xc − γxh (expansion);
□ Else if f(xr) ≥ f(xh), set xnew = (1 − β)xc + βxh (contraction);
□ Else if f(xg) < f(xr) < f(xh), set xnew = (1 + β)xc − βxh (contraction).
□ Calculate f(xnew) and replace xh by xnew.
⚫ Step 4 If the spread of the function values about f(xc), i.e. {Σi [f(xi) − f(xc)]² / (N + 1)}^(1/2), is ≤ ϵ, Terminate; Else go to Step 2.
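A sketch of these step rules is given below. The termination test follows the spread-of-function-values criterion in Step 4; the default values γ = 2, β = 0.5, the iteration cap, and the quadratic test function are illustrative assumptions.

```python
import numpy as np

def simplex_search(f, simplex, gamma=2.0, beta=0.5, eps=1e-3, max_iter=500):
    # Sketch of the simplex step rules listed above.
    simplex = np.asarray(simplex, dtype=float)       # (N + 1) points, one per row
    for _ in range(max_iter):
        fvals = np.array([f(p) for p in simplex])
        h, l = np.argmax(fvals), np.argmin(fvals)    # worst (xh) and best (xl) points
        g = np.argsort(fvals)[-2]                    # next-to-worst point (xg)
        xh, xl, xg = simplex[h], simplex[l], simplex[g]
        xc = (simplex.sum(axis=0) - xh) / (len(simplex) - 1)   # centroid excluding xh
        if np.sqrt(np.mean((fvals - f(xc)) ** 2)) <= eps:      # Step 4 termination
            break
        xr = 2.0 * xc - xh                           # Step 3: reflection
        if f(xr) < f(xl):
            xnew = (1.0 + gamma) * xc - gamma * xh   # expansion
        elif f(xr) >= f(xh):
            xnew = (1.0 - beta) * xc + beta * xh     # contraction towards xh
        elif f(xg) < f(xr):
            xnew = (1.0 + beta) * xc - beta * xh     # contraction beyond xc
        else:
            xnew = xr                                # default: accept the reflected point
        simplex[h] = xnew                            # replace the worst point
    return simplex[np.argmin([f(p) for p in simplex])]

# Illustrative run: three non-collinear starting points for a two-variable quadratic
f = lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2
print(simplex_search(f, [[0.0, 0.0], [1.5, 0.0], [0.0, 1.5]]))
```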
Hooke-Jeeves Pattern Search
Method
⚫ The pattern search method works by creating a set of search directions iteratively. The created search directions should completely span the search space.
⚫ In an N-dimensional problem, this requires at least N linearly independent search directions.
⚫ In the Hooke-Jeeves method, a combination of
exploratory moves and pattern moves is made
iteratively.
⚫ An exploratory move is performed in the vicinity
of the current point systematically to find the best
point around the current point.
⚫ Thereafter, two such points are used to make a
pattern move.
1. Algorithm of Exploratory
move
Assume that the current solution (the base point) is denoted by xc. Assume also that the variable xi is perturbed by Δi. Set i = 1 and x = xc.
□ Step 1 Calculate f = f(x), f+ = f(xi + Δi), and f− = f(xi − Δi).
□ Step 2 Find fmin = min(f, f+, f−). Set x to the point corresponding to fmin.
□ Step 3 Is i = N? If no, set i = i + 1 and go to Step 1; Else x is the result and go to Step 4.
□ Step 4 If x ≠ xc, success; Else failure.
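A minimal sketch of this exploratory move follows: each variable is perturbed by ±Δi in turn and the best of the three candidate points is kept. The quadratic test function and increments in the example are illustrative assumptions.

```python
import numpy as np

def exploratory_move(f, xc, delta):
    # Sketch of the exploratory move described above.
    x = np.asarray(xc, dtype=float).copy()
    for i in range(x.size):
        plus, minus = x.copy(), x.copy()
        plus[i] += delta[i]                  # positive perturbation of variable i
        minus[i] -= delta[i]                 # negative perturbation of variable i
        x = min([x, plus, minus], key=f)     # keep the point giving f_min (Step 2)
    success = not np.allclose(x, np.asarray(xc, dtype=float))   # Step 4
    return x, success

# Example (illustrative): one exploratory move around (0, 0) with increments (0.5, 0.5)
f = lambda x: (x[0] - 3.0)**2 + (x[1] + 1.0)**2
print(exploratory_move(f, [0.0, 0.0], [0.5, 0.5]))
```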
Exploratory move
□ In the exploratory move, the current point is perturbed in positive and negative directions along each variable one at a time, and the best point is recorded.
□ The current point is changed to the best point at the end of each variable perturbation.
□ If the point found at the end of all variable perturbations is different from the original point, the exploratory move is a success; otherwise, the exploratory move is a failure.
□ In either case, the best point is considered to be the outcome of the exploratory move.
2. Pattern move
⚫ A new point is found by jumping from the current best point x(k) along a direction connecting the previous best point x(k−1) and the current base point x(k), as follows:
  xp(k+1) = x(k) + (x(k) − x(k−1))
⚫ The Hooke-Jeeves method consists of an iterative application of an exploratory move in the locality of the current point and a subsequent jump using the pattern move.
⚫ If the pattern move does not take the solution
to a better region, the pattern move is not
accepted and the extent of the exploratory
search is reduced.
⚫ Step 1 Choose a starting point x(0), variable increments Δi (i = 1, 2, . . . , N), a step reduction factor α > 1, and a termination parameter ϵ. Set k = 0.
⚫ Step 2 Perform an exploratory move with x(k) as the base point. Say x is the outcome of the exploratory move. If the exploratory move is a success, set x(k+1) = x and go to Step 4; Else go to Step 3.
⚫ Step 3 Is ∥Δ∥ < ϵ? If yes, Terminate; Else set Δi = Δi/α for i = 1, 2, . . . , N and go to Step 2.
⚫ Step 4 Set k = k + 1 and perform the pattern move: xp(k+1) = x(k) + (x(k) − x(k−1)).
⚫ Step 5 Perform another exploratory move using xp(k+1) as the base point. Let the result be x(k+1).
⚫ Step 6 Is f(x(k+1)) < f(x(k))? If yes, go to Step 4; Else go to Step 3.
Algorithm: Hooke-Jeeves Pattern Search Method
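A sketch of the combined iteration following the steps above, assuming the exploratory_move sketch from the previous section is in scope; the iteration cap, the value of α, and the test function are illustrative assumptions.

```python
import numpy as np

def hooke_jeeves(f, x0, delta, alpha=2.0, eps=1e-6, max_iter=1000):
    # Sketch of the Hooke-Jeeves steps listed above (exploratory + pattern moves).
    delta = np.asarray(delta, dtype=float)
    x_prev = x_curr = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x, success = exploratory_move(f, x_curr, delta)       # Step 2
        if success:
            x_prev, x_curr = x_curr, x
            while True:
                xp = x_curr + (x_curr - x_prev)               # Step 4: pattern move
                x_new, _ = exploratory_move(f, xp, delta)     # Step 5
                if f(x_new) < f(x_curr):                      # Step 6: keep jumping
                    x_prev, x_curr = x_curr, x_new
                else:
                    break
        if np.linalg.norm(delta) < eps:                       # Step 3: termination
            return x_curr
        delta = delta / alpha                                 # Step 3: reduce increments
    return x_curr

# Illustrative run on a quadratic, starting from (0, 0) with increments (0.5, 0.5)
f = lambda x: (x[0] - 3.0)**2 + (x[1] + 1.0)**2
print(hooke_jeeves(f, [0.0, 0.0], [0.5, 0.5]))
```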
Gradient-based Methods
□ The direct search methods described above require many function evaluations to converge to the minimum point.
□ Gradient-based methods, discussed next, exploit the derivative information of the function and are usually faster search methods.
□ Where the derivative information is easily available, gradient-based methods are very efficient.
Search Direction
□ The first derivative ∇f(x(t)) at any point x(t) represents the direction of maximum increase of the function value.
Search Direction
□ To find a point with a smaller function value, we should ideally search opposite to the first-derivative direction, that is, along the −∇f(x(t)) direction.
□ More generally, any search direction s(t) that leads to points with a smaller function value than at the current point x(t) is useful. Thus, a search direction s(t) that satisfies the following relation is a descent direction.
Descent direction
A search direction s(t) is a descent direction at point x(t) if the condition ∇f(x(t)) · s(t) ≤ 0 is satisfied in the vicinity of the point x(t).
The new point is then found by moving along the search direction: x(k+1) = x(k) + α·s(k)
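A small numeric illustration of the descent condition, using the negative gradient as the candidate direction; the gradient value below is an assumed example.

```python
import numpy as np

# Check of the descent condition grad_f(x) . s <= 0:
# with s = -grad_f(x), the dot product is always non-positive.
grad = np.array([4.0, -2.0])   # assumed gradient value at some point x(t)
s = -grad                      # candidate search direction
print(np.dot(grad, s))         # -20.0 <= 0, so s is a descent direction
```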
Cauchy’s (steepest descent)
Method
□ The steepest descent method uses the gradient vector at each point as the search direction for each iteration.
□ The search direction used in Cauchy's method is the negative of the gradient at any particular point x(t):
  s(k) = −∇f(x(k)).
□ Since this direction gives the maximum descent in function values, it is also known as the steepest descent method.
□ At every iteration, the derivative is computed at the current point and a unidirectional search is performed along the negative of this derivative direction to find the minimum point along that direction.
□ The minimum point becomes the current point and the search is continued from this point.
□ The algorithm continues until a point having a small enough gradient vector is found. This algorithm guarantees improvement in the function value at every iteration.
Cauchy’s (steepest descent)
Method
□ Step 1 Choose a maximum number of iterations M to be performed, an initial point x(0), two termination parameters ϵ1, ϵ2, and set k = 0.
□ Step 2 Calculate ∇f(x(k)), the first derivative at the point x(k).
□ Step 3 If ∥∇f(x(k))∥ ≤ ϵ1, Terminate; Else if k ≥ M, Terminate; Else go to Step 4.
□ Step 4 Perform a unidirectional search to find α(k) using ϵ2 such that f(x(k+1)) = f(x(k) − α(k)∇f(x(k))) is minimum. One criterion for termination of this search is when |∇f(x(k+1)) · ∇f(x(k))| ≤ ϵ2.
□ Step 5 Is ∥x(k+1) − x(k)∥ / ∥x(k)∥ ≤ ϵ1? If yes, Terminate; Else set k = k + 1 and go to Step 2.
Algorithm: Cauchy's (steepest descent) Method
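A sketch of these steps follows, reusing the unidirectional_search golden-section sketch from the earlier section for the line search along −∇f(x(k)); the bracket for α, the test function, and its analytical gradient are illustrative assumptions.

```python
import numpy as np

def cauchy_steepest_descent(f, grad_f, x0, max_iter=100, eps1=1e-6, eps2=1e-6):
    # Sketch of Cauchy's (steepest descent) method following the steps above.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):                                   # Step 3: k >= M check
        g = grad_f(x)                                           # Step 2
        if np.linalg.norm(g) <= eps1:                           # Step 3: small gradient
            break
        # Step 4: unidirectional search along -grad_f(x), assumed bracket alpha in [0, 1]
        _, x_new = unidirectional_search(f, x, -g, 0.0, 1.0, tol=eps2)
        if np.linalg.norm(x_new - x) / max(np.linalg.norm(x), 1e-12) <= eps1:
            return x_new                                        # Step 5
        x = x_new
    return x

# Illustrative quadratic with a known analytical gradient
f = lambda x: (x[0] - 1.0)**2 + 2.0 * (x[1] + 1.0)**2
grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 1.0)])
print(cauchy_steepest_descent(f, grad_f, [5.0, 5.0]))
```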