R. Zakhama
Structural Optimisation course
Chapter 2: Mathematical
Programming
Part 1: Unconstrained Optimisation
I. Introduction
I.1 General problem formulation
  min f(X)
  s.t. g(X) ≤ 0
       h(X) = 0
       X ∈ χ ⊆ Rⁿ

The different constraint types can be converted into one another, e.g.:

  h(X) = 0  ⇔  h(X) ≤ 0 and −h(X) ≤ 0
  g(X) ≤ 0  ⇔  g(X) + s = 0 with s ≥ 0 (slack variable)
I.2 Classification of optimisation problems
• Linear programming:
  min Cᵀ X
  s.t. A X ≤ b
       X ≥ 0
  Algorithms: Simplex (Dantzig 1947), Interior point (Karmarkar 1984)
• Nonlinear programming:
  min f(X)
  s.t. g(X) ≤ 0
       h(X) = 0
       X ∈ χ ⊆ Rⁿ
  where f, g, h : Rⁿ → R are C¹ or C² functions.
II. Unconstrained problems
Unconstrained optimisation problem:  min f(X), X ∈ χ
Let χ ⊆ Rⁿ, f : χ → R, and Bε(X*) = {X ∈ Rⁿ : ‖X − X*‖ ≤ ε}, where ε > 0.
Question: under what conditions on f and χ does the function f attain its minimum in the set χ?

Definition
X* ∈ χ is said to be a global minimum of f over χ if
  f(X*) ≤ f(X) for all X ∈ χ.

Definition
X* ∈ χ is said to be a local minimum of f over χ if
  f(X*) ≤ f(X) for all X ∈ χ ∩ Bε(X*).
II. Unconstrained problems

FigII.1: Global and local minima (sketch of f(x) showing a global minimum, a local minimum and a saddle point).
II. Unconstrained problems
• Every global minimum is also a local minimum.
• It may not be possible to identify a global minimum by finding all local minima.
• A function can have a local minimum at some x* and yet no global minimum.
FigII.1: No global minimum.
II. Unconstrained problems
II.1 Necessary and Sufficient Conditions for Optimality
What are necessary and sufficient conditions for a local minimum?
• The conditions that must be satisfied at the optimum point are called necessary conditions:
  if a point does not satisfy the necessary conditions, it cannot be optimal;
  however, not every point that satisfies the necessary conditions is optimal.
• Points satisfying the necessary conditions are called candidate optimum points.
• If a candidate optimum point satisfies the sufficient condition, then it is indeed optimal.
• If the sufficiency conditions cannot be used, or are not satisfied, we may not be able to
  draw any conclusions about the optimality of the candidate point.
II.1 Necessary and Sufficient Conditions for Optimality
• First order necessary condition for a function of Single Variable:
• Let x* be the minimum point, and investigate its neighbourhood (i.e.,
points x at a small distance d from x*).
• At a minimum, Δf = f(x) − f(x*) ≥ 0.
• Based on a first-order Taylor series expansion:
  Δf = f(x) − f(x*) = f′(x*) d ≥ 0
• Since d is small and can arbitrarily take either sign, the inequality can only hold if

  f′(x*) = 0   ← NECESSARY CONDITION
II.1 Necessary and Sufficient Conditions for Optimality
• Second order sufficient condition for a function of Single
Variable:
• Based on a second-order Taylor series expansion:
  Δf = f(x) − f(x*) = f′(x*) d + ½ f″(x*) d² ≥ 0
• If the necessary condition f′(x*) = 0 is satisfied, then
  Δf = ½ f″(x*) d² ≥ 0
• Since d² is always positive regardless of the sign of d:

  f″(x*) > 0   ← SUFFICIENT CONDITION
II.1 Necessary and Sufficient Conditions for Optimality
• Optimality Conditions for Functions of Several Variables:
• The necessary condition: g(x*) = ∇f(x*) = 0
• The sufficient condition: the Hessian matrix H(x*) is positive definite
II.1 Necessary and Sufficient Conditions for Optimality
• Stationary point nature summary (sign of yᵀH y for y ≠ 0, i.e. of the eigenvalues λᵢ of H):

  Definiteness of H         yᵀH y     Nature of x*
  Positive definite         > 0       Minimum
  Positive semi-definite    ≥ 0       Valley
  Indefinite                ≷ 0       Saddle point
  Negative semi-definite    ≤ 0       Ridge
  Negative definite         < 0       Maximum
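The summary above can be checked numerically (our own illustration, not part of the slides; the function name is ours), by classifying a stationary point from the eigenvalues of the symmetric Hessian:

```python
import numpy as np

def classify_stationary_point(H, tol=1e-10):
    """Classify a stationary point from the eigenvalues of the symmetric Hessian H."""
    lam = np.linalg.eigvalsh(H)            # eigenvalues in ascending order
    if lam[0] > tol:
        return "minimum"                   # positive definite
    if lam[-1] < -tol:
        return "maximum"                   # negative definite
    if lam[0] < -tol and lam[-1] > tol:
        return "saddle point"              # indefinite
    if lam[-1] > tol:
        return "valley"                    # positive semi-definite (singular)
    if lam[0] < -tol:
        return "ridge"                     # negative semi-definite (singular)
    return "degenerate"                    # H ~ 0: higher-order information needed
```

For example, `classify_stationary_point(np.diag([2.0, 2.0]))` returns `"minimum"`, while `np.diag([2.0, -2.0])` gives `"saddle point"`.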
II.2 Example 1
• f(x) = (x − 2)²: f′(2) = 0, f″(2) = 2 > 0 ⇒ x* = 2 is a minimum.
• f(x) = −x²: f′(0) = 0, f″(0) = −2 < 0 ⇒ x* = 0 is a maximum.
FigII.2: Example 1.
II.2 Example 2
• Consider the problem:
  min_{x∈R} (3x − 7)²
• We first find the stationary points (which satisfy f′(x) = 0):
  f′(x) = 6(3x − 7) = 0  ⇒  x* = 7/3
• f″(7/3) = 18 > 0 ⇒ x* is a strict local minimum.
• Stationary points are found by solving a nonlinear equation:
  g(x) ≡ f′(x) = 0
• Finding the real roots of g(x) may not always be easy:
  − Consider the problem of minimising f(x) = x² + eˣ
  − g(x) = 2x + eˣ
  − We need an algorithm to find x which satisfies g(x) = 0.
II.3 One-Dimensional Minimisation
 Introduction
• Most n-dimensional search algorithms use one-dimensional minimisation to
determine the minimum along a specified direction:
  x = x^q + α d^q
where x^q is the current point, d^q is the direction vector at that point, and α
is the variable which determines how far one needs to move along d^q to reach
the minimum.
• The function to be minimised, f(x), can now be expressed in terms of the single
variable α:
  minimise f(x) = f(x^q + α d^q) = f(α), with the minimum at α = α*
• The different categories of methods are:
  • Zeroth-order methods: use f only
  • First-order methods: use f and the gradient of f
  • Second-order methods: use f, the gradient of f and the Hessian of f
II.3 One-Dimensional Minimisation
 Unimodal functions
• Let φ : R→ R
• Consider the problem:  min_{x∈R} φ(x)
• Let x* be the minimum point of φ(x), with x* ∈ [a,b].

Definition
The function φ is said to be unimodal on [a,b] if, for a ≤ x1 < x2 ≤ b:
  φ(x1) < φ(x2) ⇒ x* < x2,
  φ(x2) < φ(x1) ⇒ x* > x1.
II.3 One-Dimensional Minimisation
 Dichotomous Search (zeroth order)
• f is unimodal in the initial interval of uncertainty [a,b].
• Place λ and µ symmetrically, each at distance δ from the mid-point of [a,b].
FigII.3: Dichotomous search (λ and µ placed around the midpoint of [a,b]).
II.3 One-Dimensional Minimisation
 Dichotomous Search (zeroth order)
• Input: initial interval of uncertainty [a,b].
1. Initialisation: k = 0, ak = a, bk = b, δ > 0, ε > 0 (final length of the uncertainty interval).
2. λk = (ak + bk)/2 − δ ,  µk = (ak + bk)/2 + δ
3. If f(λk) ≥ f(µk), let ak+1 = λk, bk+1 = bk; go to step 5.
4. Otherwise (f(λk) < f(µk)), let ak+1 = ak, bk+1 = µk.
5. k = k + 1
6. If (bk − ak) ≤ ε, exit.
7. Else go to step 2.
• Output: x* = (ak + bk)/2
II.3 One-Dimensional Minimisation
 Dichotomous Search (zeroth order)
Consider:
  min_x f(x) = ¼x⁴ − (5/3)x³ − 6x² + 19x − 7
x* = −2.5652, f(x*) = −56.2626

  k     ak        bk        bk − ak
  0     −4        0         4
  1     −4        −1.98     2.02
  2     −3.0001   −1.98     1.0201
  3     −3.0001   −2.4849   0.5152
  …     …         …         …
  10    −2.5669   −2.5626   0.0043
  …     …         …         …
  20    −2.5652   −2.5652   4.65e−6
  23    −2.5652   −2.5652   5.99e−7
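A run like the one tabulated above can be reproduced in a few lines (our own sketch; the tolerances δ and ε are arbitrary choices), using a quartic test function whose minimum on [−4, 0] lies at x* ≈ −2.5652:

```python
def dichotomous_search(f, a, b, delta=1e-4, eps=1e-3):
    """Dichotomous search for the minimum of a unimodal f on [a, b]."""
    while b - a > eps:
        mid = 0.5 * (a + b)
        lam, mu = mid - delta, mid + delta
        if f(lam) >= f(mu):
            a = lam           # the minimum lies in [lam, b]
        else:
            b = mu            # the minimum lies in [a, mu]
    return 0.5 * (a + b)

f = lambda x: 0.25 * x**4 - (5.0 / 3.0) * x**3 - 6.0 * x**2 + 19.0 * x - 7.0
x_star = dichotomous_search(f, -4.0, 0.0)
```

Each pass keeps the minimum bracketed and roughly halves the interval, so the loop terminates as long as ε > 2δ.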
II.3 One-Dimensional Minimisation
 Golden Section Search (zeroth order)
• Input: initial interval of uncertainty [a,b].
1. Initialisation: k = 0, ak = a, bk = b, ε > 0, τ = 1 − 1/s = (3 − √5)/2,
   where s = (1 + √5)/2 is the golden section ratio.
2. λk = ak + τ(bk − ak) ,  µk = bk − τ(bk − ak)
3. If f(λk) ≥ f(µk), let ak+1 = λk, bk+1 = bk, λk+1 = µk,
   µk+1 = bk+1 − τ(bk+1 − ak+1); go to step 5.
4. Otherwise (f(λk) < f(µk)), let ak+1 = ak, bk+1 = µk, µk+1 = λk,
   λk+1 = ak+1 + τ(bk+1 − ak+1).
5. k = k + 1
6. If (bk − ak) ≤ ε, exit.
7. Else go to step 3.
• Output: x* = (ak + bk)/2
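A minimal sketch of the golden section search (our own illustration; the test function f(x) = (3x − 7)², with minimum at x = 7/3, is our choice):

```python
import math

def golden_section_search(f, a, b, eps=1e-6):
    """Golden section search for the minimum of a unimodal f on [a, b]."""
    tau = (3.0 - math.sqrt(5.0)) / 2.0      # tau = 1 - 1/s, s the golden ratio
    lam = a + tau * (b - a)
    mu = b - tau * (b - a)
    f_lam, f_mu = f(lam), f(mu)
    while b - a > eps:
        if f_lam >= f_mu:
            a, lam, f_lam = lam, mu, f_mu   # reuse mu as the new lam
            mu = b - tau * (b - a)
            f_mu = f(mu)
        else:
            b, mu, f_mu = mu, lam, f_lam    # reuse lam as the new mu
            lam = a + tau * (b - a)
            f_lam = f(lam)
    return 0.5 * (a + b)

x_star = golden_section_search(lambda x: (3.0 * x - 7.0) ** 2, 0.0, 4.0)
```

Unlike the dichotomous search, only one new function evaluation is needed per iteration, because one interior point is reused.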
II.3 One-Dimensional Minimisation
 Bisection Search (first order)
• f ∈ C1.
• f is unimodal in the initial interval of uncertainty [a,b].
Idea: compute f′(c), where c is the midpoint of [a,b].
• If f′(c) = 0, then c is a minimum point.
• If f′(c) > 0, then [a,c] is the new interval of uncertainty.
• If f′(c) < 0, then [c,b] is the new interval of uncertainty.
FigII.4: Bisection search.
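A minimal sketch of the bisection search (our own illustration), applied to f(x) = (x − 2)² from Example 1 through its derivative f′(x) = 2(x − 2):

```python
def bisection_minimise(fprime, a, b, eps=1e-8):
    """Minimise a unimodal C1 function on [a, b] by bisection on its derivative."""
    while b - a > eps:
        c = 0.5 * (a + b)
        dc = fprime(c)
        if dc == 0.0:
            return c          # c is a stationary (minimum) point
        if dc > 0.0:
            b = c             # the minimum lies in [a, c]
        else:
            a = c             # the minimum lies in [c, b]
    return 0.5 * (a + b)

x_star = bisection_minimise(lambda x: 2.0 * (x - 2.0), 0.0, 5.0)
```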
II.3 One-Dimensional Minimisation
 Newton Method (second order)
• An iterative technique to find a root of a function.
• Problem: find an approximate root of the function
  f(x) = x² − 2
• An iteration of Newton's method on f(x) = x² − 2: the tangent at xk crosses the x-axis at xk+1, which is closer to the root x*.
FigII.5: Newton method.
II.3 One-Dimensional Minimisation
 Newton Method (second order)
• Consider the problem to minimise f(x), x ∈ R.
• f ∈ C2.
• The problem of minimising the function f(x) is equivalent to obtaining the
root of the nonlinear equation
  q(x) ≡ f′(x) = 0
• Based on a Taylor series expansion of f′ about xk:
  qk+1(x) ≡ f′(x) ≈ f′(xk) + f″(xk)(x − xk)
• Setting qk+1(x) to zero at x = xk+1 (qk+1 vanishes):
  xk+1 = xk − f′(xk) / f″(xk)
II.3 One-Dimensional Minimisation
 Newton Method (second order)
• Consider the problem to minimise f(x), f ∈ C2.
• We need to find the roots of
  q(x) ≡ f′(x) = 0
1. Initialisation: choose an initial point x0, ε > 0, and set k = 0.
2. xk+1 = xk − f′(xk) / f″(xk)
3. k = k + 1
4. If |q(xk)| < ε, exit.
5. Else go to step 2.
II.3 One-Dimensional Minimisation
 Newton Method (second order)
• Best convergence of all the methods presented, unless it diverges.
FigII.6: Newton method convergence (convergent and divergent cases).
II.4 Multi-Dimensional Minimisation
 Introduction
• Consider the problem to minimise f(x):
  min_{x∈Rⁿ} f(x)
• Problem:
  min_{x∈R²} x1² + x2² − 5x1 − 5x2
FigII.7: Contour and surface plots of f.
II.4 Multi-Dimensional Minimisation
 Introduction
• Problem:
  min_{x∈R²} x1² + x2² − 5x1 − 5x2
• g(x) = ∇f(x) = (2x1 − 5, 2x2 − 5)ᵀ
• Necessary condition: g(x) = 0 at x* = (2.5, 2.5)ᵀ
• Sufficient condition:
  H(x*) = | 2  0 |
          | 0  2 |
  is positive definite
• ⇒ x* is a strict local minimum
II.4 Multi-Dimensional Minimisation
 Introduction
Definition
Let x ∈ Rⁿ. If there exists a direction d ∈ Rⁿ and δ > 0 such that
  f(x + αd) < f(x) for all α ∈ (0, δ),
then d is said to be a descent direction of f at x.

Result
Let f ∈ C¹ and x ∈ Rⁿ. Let g(x) = ∇f(x). If
  g(x)ᵀ d < 0,
then d is a descent direction of f at x.
II.4 Multi-Dimensional Minimisation
 Introduction
FigII.8: Descent direction. The figure shows the level set S = {x : f(x) = f(x0)}, the region {x : f(x) < f(x0)}, the gradient g(x0), and the first-order approximation of S at x0, {x : g(x0)ᵀ(x − x0) = 0}; descent directions d at x0 point into the region where f decreases.
II.4 Multi-Dimensional Minimisation
 Introduction
(1) Initialise: x0, k = 0
(2) while the stopping condition is not satisfied at xk
  (a) Find xk+1 such that f(xk+1) < f(xk)
  (b) k = k + 1
endwhile
Output: x* = xk, a local minimum of f(x).
• How to find xk+1 in step 2(a) of the algorithm?
• Which stopping condition can be used?
• Does the algorithm converge? If yes, how fast does it converge?
• Do the convergence and its speed depend on x0?
II.4 Multi-Dimensional Minimisation
 Stopping conditions
• Theoretical stopping condition for a minimisation problem:
  g(xk) = 0 and H(xk) is positive semidefinite
Practical stopping conditions:
• ‖g(xk)‖ ≤ ε
• ‖g(xk)‖ ≤ ε (1 + |f(xk)|)
• |f(xk+1) − f(xk)| ≤ ε |f(xk)|
II.4 Multi-Dimensional Minimisation
 Speed of convergence
We say the convergence is:
• Linear, if there exists a real τ ∈ ]0,1[ such that for all k ≥ 1 we have:
  ‖xk+1 − x*‖ ≤ τ ‖xk − x*‖
• Superlinear, if:
  ‖xk+1 − x*‖ / ‖xk − x*‖ → 0
• Quadratic, if there exists a constant C > 0 such that for all k ≥ 1 we have:
  ‖xk+1 − x*‖ ≤ C ‖xk − x*‖²
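A quick numerical illustration of these definitions (our own sketch, not from the slides): Newton's method applied to the root of f(x) = x² − 2 converges quadratically to √2, with the error ratio e(k+1)/e(k)² approaching 1/(2√2) ≈ 0.354:

```python
import math

# Newton iteration for the root of f(x) = x**2 - 2 (root x* = sqrt(2)):
# x_{k+1} = x_k - f(x_k)/f'(x_k) = (x_k + 2/x_k)/2
x, x_star = 2.0, math.sqrt(2.0)
errors = []
for _ in range(4):
    errors.append(abs(x - x_star))   # error before each update
    x = 0.5 * (x + 2.0 / x)

# quadratic convergence: e_{k+1} <= C * e_k**2 for a bounded constant C
ratios = [errors[k + 1] / errors[k] ** 2 for k in range(len(errors) - 1)]
```

The error sequence drops from about 0.59 to about 2e-6 in three steps, and the ratios stay bounded (near 0.25-0.36), matching the quadratic-rate definition.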
II.4 Multi-Dimensional Minimisation
 Steepest descent method (first order)
• The basic principle is to minimise the n-dimensional function by a series of 1D
line minimisations:
  xk+1 = xk + αk dk
• The steepest descent method chooses dk along the negative gradient:
  dk = −∇f(xk)
• The step size αk is chosen to minimise f(xk + α dk):
  αk = arg min_{α≥0} f(xk + α dk)
II.4 Multi-Dimensional Minimisation
 Steepest descent method (first order)
(1) Initialise: x0, ε > 0 and k = 0
(2) while ‖g(xk)‖ > ε
  (a) dk = −g(xk)
  (b) Find αk along dk such that αk = arg min_{α≥0} f(xk + α dk)
  (c) xk+1 = xk + αk dk
  (d) k = k + 1
endwhile
Output: x* = xk, a stationary point of f(x).
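A minimal sketch of the algorithm (our own illustration), using a backtracking Armijo line search in step (b) instead of an exact 1D minimisation, applied to the quadratic example f(x) = x1² + x2² − 5x1 − 5x2 seen earlier:

```python
def steepest_descent(f, grad, x0, eps=1e-8, max_iter=1000):
    """Steepest descent with a backtracking (Armijo) line search."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if max(abs(gi) for gi in g) < eps:
            break                             # gradient ~ 0: stationary point
        d = [-gi for gi in g]                 # steepest descent direction
        alpha, fx = 1.0, f(x)
        gTd = sum(gi * di for gi, di in zip(g, d))
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * gTd:
            alpha *= 0.5                      # backtrack until sufficient decrease
        x = [xi + alpha * di for xi, di in zip(x, d)]
    return x

f = lambda x: x[0] ** 2 + x[1] ** 2 - 5.0 * x[0] - 5.0 * x[1]
grad = lambda x: [2.0 * x[0] - 5.0, 2.0 * x[1] - 5.0]
x_star = steepest_descent(f, grad, [0.0, 0.0])   # minimum at (2.5, 2.5)
```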
II.4 Multi-Dimensional Minimisation
 Steepest descent method (first order)
• The gradient is everywhere perpendicular to the contour lines.
• After each line minimisation the new gradient is always orthogonal to the
previous step direction (true of any line minimisation).
• Consequently, the iterates tend to zig-zag down the valley in a very inefficient
manner.
FigII.9: Steepest descent method.
II.4 Multi-Dimensional Minimisation
 Conjugate gradient method (first order)
• This method is used for quadratic functions of the form:
  f(x) = ½ xᵀH x + bᵀx + c,  x ∈ Rⁿ (n design variables)
  where H (n×n) is a symmetric positive definite matrix, b ∈ Rⁿ and c ∈ R.
• This method converges in at most n iterations.
• Each dk is chosen to be conjugate to all previous search directions with
respect to the Hessian H:

Definition
Let H ∈ Rⁿˣⁿ be a symmetric matrix. The vectors {d0, d1, …, dn−1} are
said to be H-conjugate if they are linearly independent and
  diᵀ H dj = 0 for all i ≠ j.
II.4 Multi-Dimensional Minimisation
 Conjugate gradient method (first order)
(1) Initialise: x0, ε > 0, d0 = −g0 (with g0 = Hx0 + b) and k = 0
(2) while ‖gk‖ > ε
  (a) αk = −(gkᵀ dk) / (dkᵀ H dk)
  (b) xk+1 = xk + αk dk
  (c) gk+1 = H xk+1 + b
  (d) βk = (gk+1ᵀ H dk) / (dkᵀ H dk)
  (e) dk+1 = −gk+1 + βk dk
  (f) k = k + 1
endwhile
Output: x* = xk, a global minimum of f(x).
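A minimal sketch of the algorithm for a small quadratic (our own illustration; H, b and the starting point are arbitrary test data, and the function name is ours):

```python
import numpy as np

def conjugate_gradient(H, b, x0, eps=1e-10):
    """Linear CG for min (1/2) x^T H x + b^T x, with H symmetric positive definite."""
    x = np.asarray(x0, dtype=float)
    g = H @ x + b                          # gradient of the quadratic
    d = -g
    for _ in range(len(b)):                # converges in at most n iterations
        if np.linalg.norm(g) < eps:
            break
        Hd = H @ d
        alpha = -(g @ d) / (d @ Hd)        # exact line search along d
        x = x + alpha * d
        g = H @ x + b
        beta = (g @ Hd) / (d @ Hd)         # make the next direction H-conjugate
        d = -g + beta * d
    return x

H = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([-1.0, -2.0])
x_star = conjugate_gradient(H, b, [0.0, 0.0])   # solves H x = -b in two steps
```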
II.4 Multi-Dimensional Minimisation
 Fletcher-Reeves’ Conjugate gradient method (first order)
• For non-quadratic function, we know that any function can be approximated
locally by a quadratic function using a second order Taylor expansion.
• The Fletcher-Reeves method is an adaptation of the conjugate gradient
method for any non-quadratic function. The differences are:
- Find αk using a 1D search algorithm.
- Calculate βk as:
  βk = (gk+1ᵀ gk+1) / (gkᵀ gk)
II.4 Multi-Dimensional Minimisation
 Fletcher-Reeves’ Conjugate gradient method (first order)
(1) Initialise: x0, ε > 0, d0 = −g0 and k = 0
(2) while ‖gk‖ > ε
  (a) αk = arg min_{α≥0} f(xk + α dk)
  (b) xk+1 = xk + αk dk
  (c) Compute gk+1
  (d) βk = (gk+1ᵀ gk+1) / (gkᵀ gk)
  (e) dk+1 = −gk+1 + βk dk
  (f) k = k + 1
endwhile
Output: x* = xk, a stationary point of f(x).
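A sketch of the method (our own illustration). For simplicity, step (a) is an exact line search computed via a Hessian-vector product, which is valid for the quadratic test function used here; a general implementation would use a 1D search such as the golden section method:

```python
def fletcher_reeves(grad, hess_vec, x0, n_iter):
    """Fletcher-Reeves CG with an exact (quadratic) line search for alpha."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(n_iter):
        Hd = hess_vec(d)
        alpha = -sum(gi * di for gi, di in zip(g, d)) \
                / sum(di * hdi for di, hdi in zip(d, Hd))
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Fletcher-Reeves formula: beta_k = (g_{k+1}^T g_{k+1}) / (g_k^T g_k)
        beta = sum(gi * gi for gi in g_new) / sum(gi * gi for gi in g)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# test function (our choice): f(x1, x2) = (x1 - 1)**2 + 10*(x2 + 2)**2, minimum at (1, -2)
grad = lambda x: [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]
hess_vec = lambda d: [2.0 * d[0], 20.0 * d[1]]
x_star = fletcher_reeves(grad, hess_vec, [0.0, 0.0], n_iter=2)
```

On a 2D quadratic with exact line search the method reduces to linear CG, so two iterations reach the minimum.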
II.4 Multi-Dimensional Minimisation
 Newton method (second order)
• Concept:
– Construct local quadratic approximation
– Minimise approximation
– Repeat
• Local approximation: second-order Taylor series of f at xk:
  f(x) ≈ q(x) = f(xk) + g(xk)ᵀ(x − xk) + ½ (x − xk)ᵀ H(xk) (x − xk)
• First-order necessary condition ∇q(x) = 0 gives:
  xk+1 = xk − Hk⁻¹ gk   (assuming Hk is invertible)
• Classical Newton method, in the form xk+1 = xk + αk dk:
  − Newton direction: dk = −Hk⁻¹ gk
  − Step length: αk = 1
• Is dk a descent direction?
II.4 Multi-Dimensional Minimisation
 Newton method (second order)
• If f(x) is quadratic, then the solution is found in one step.
• The method has quadratic convergence (as in the 1D case).
• Risk of divergence if the function f(x) is non-convex.
• To avoid divergence → line search:
  − Newton direction: dk = −Hk⁻¹ gk
  − Update: xk+1 = xk + αk dk
• If H = I, then this reduces to steepest descent.
FigII.10: Line search.
II.4 Multi-Dimensional Minimisation
 Newton method (second order)
(1) Initialise: x0, ε > 0 and k = 0
(2) while ‖gk‖ > ε
  (a) dk = −(Hk)⁻¹ gk
  (b) αk = arg min_{α≥0} f(xk + α dk)
  (c) xk+1 = xk + αk dk
  (d) k = k + 1
endwhile
Output: x* = xk, a stationary point of f(x).
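A sketch of the algorithm (our own illustration), applied to Rosenbrock's function; the backtracking line search in step (b) and the steepest-descent safeguard are our additions, not part of the slides:

```python
import numpy as np

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0] ** 2)])

def hess(x):
    return np.array([[-400.0 * (x[1] - 3.0 * x[0] ** 2) + 2.0, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

def newton_line_search(x0, eps=1e-8, max_iter=100):
    """Newton's method with a backtracking (Armijo) line search on the Newton direction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < eps:
            break
        d = np.linalg.solve(hess(x), -g)   # Newton direction d = -H^{-1} g
        if g @ d >= 0.0:
            d = -g                         # safeguard if d is not a descent direction
        alpha, fx = 1.0, rosenbrock(x)
        while rosenbrock(x + alpha * d) > fx + 1e-4 * alpha * (g @ d):
            alpha *= 0.5                   # backtrack until sufficient decrease
        x = x + alpha * d
    return x

x_star = newton_line_search([-1.2, 1.0])   # converges to (1, 1)
```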
II.4 Multi-Dimensional Minimisation
 Comparison
• Problem: minimisation of Rosenbrock’s function
  min_{x∈R²} 100 (x2 − x1²)² + (1 − x1)²
• g(x) = ∇f(x) = ( −400 x1 (x2 − x1²) − 2(1 − x1) ,  200 (x2 − x1²) )ᵀ
• Necessary condition: g(x) = 0 at x* = (1, 1)ᵀ
• Sufficient condition:
  H(x*) = |  802  −400 |
          | −400   200 |
  is positive definite
• ⇒ x* is a strict local minimum
Minimum at [1, 1]T
II.4 Multi-Dimensional Minimisation
 Comparison
FigII.11: Rosenbrock’s function.
II.4 Multi-Dimensional Minimisation
 Comparison
FigII.12: Comparison of steepest descent (SD), Fletcher-Reeves conjugate gradient (FRCG) and Newton iterates on Rosenbrock’s function.
III. Problems
III.1 Problem 1
Consider the constrained minimisation problem
Minimise F = x²
Subject to
  g1 = 1 − x ≤ 0
  g2 = x − 2 ≤ 0
This represents the minimisation of x² in the interval 1 ≤ x ≤ 2.
1 – Plot the functions F, g1 and g2 on the interval 0 ≤ x ≤ 3.
2 – We can convert this to an equivalent unconstrained problem by using penalty
parameters. To do this we minimise the following unconstrained function:

  F̃ = F − R (1/g1 + 1/g2)

Plot F̃ on the interval 0 ≤ x ≤ 3 for values of R = 5.0 and R = 0.5.
3 – For a value of R = 5.0 in part (2), and beginning with xl = 1.001 and xu = 1.999,
perform several iterations of the golden section method. Reduce the interval of
uncertainty to less than 0.1 in order to minimise the unconstrained function F̃.
III. Problems
III.2 Problem 2
Given the function
  F = x1⁴ − 2x1²x2 + x1² + x2² − 2x1 + 5

1 – Calculate {∇F} and [H] at the points {X1} = {0,0}, {X2} = {1,1} and {X3} = {2,2}.
Check whether [H] is positive definite at each of these points. Are the optimality conditions
for a relative minimum satisfied at any of these points?
2 – Apply three iterations of the method of steepest descent to this problem, using {X1} as
the initial point.
Sources
Presentation based on material from:
• Haftka & Gürdal (1994): Elements of Structural Optimization
• van Keulen: TUD Optimization course
• Etman (2006): EM course Engineering Optimization
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
Call Girls Pimpri Chinchwad Call Me 7737669865 Budget Friendly No Advance Boo...
 
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 BookingVIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
VIP Call Girls Ankleshwar 7001035870 Whatsapp Number, 24/07 Booking
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 
UNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its PerformanceUNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its Performance
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdf
 

Chapter 2-1.pdf

  • 7. II.1 Necessary and Sufficient Conditions for Optimality
    What are necessary and sufficient conditions for a local minimum?
    • Conditions that must be satisfied at an optimum point are called necessary conditions. If a point does not satisfy a necessary condition, it cannot be optimal; however, not every point that satisfies the necessary conditions is optimal.
    • Points satisfying the necessary conditions are called candidate optimum points.
    • If a candidate optimum point also satisfies the sufficient condition, then it is indeed an optimum.
    • If the sufficiency conditions cannot be used, or are not satisfied, we may not be able to draw any conclusions about the optimality of the candidate point.
  • 8. II.1 Necessary and Sufficient Conditions for Optimality
    • First order necessary condition for a function of a single variable:
    • Let x* be the minimum point, and investigate its neighbourhood (i.e., points x at a small distance d from x*):
        Δf = f(x) − f(x*) ≥ 0
    • Based on a first order Taylor series expansion:
        Δf = f'(x*) d ≥ 0
    • Since d is small and can arbitrarily take either sign, the NECESSARY CONDITION is
        f'(x*) = 0
  • 9. II.1 Necessary and Sufficient Conditions for Optimality
    • Second order sufficient condition for a function of a single variable:
    • Based on a second order Taylor series expansion:
        Δf = f'(x*) d + (1/2) f''(x*) d² ≥ 0
    • If the necessary condition is satisfied, then
        Δf = (1/2) f''(x*) d² ≥ 0
    • Since d² is always positive regardless of the sign of d, the SUFFICIENT CONDITION is
        f''(x*) > 0
  • 10. II.1 Necessary and Sufficient Conditions for Optimality
    • Optimality conditions for functions of several variables:
    • The necessary condition: the gradient vanishes at x*,
        g(x*) = ∇f(x*) = 0
    • The sufficient condition: the Hessian H(x*) is positive definite.
  • 11. II.1 Necessary and Sufficient Conditions for Optimality
    • Stationary point nature summary:

      Definiteness of H        yᵀHy (λᵢ)          Nature of x*
      Positive definite        > 0                Minimum
      Positive semi-definite   ≥ 0                Valley
      Indefinite               ≠ 0, either sign   Saddle point
      Negative semi-definite   ≤ 0                Ridge
      Negative definite        < 0                Maximum
  • 12. II.2 Example 1
    f(x) = (x − 2)²: f'(2) = 0 and f''(2) = 2 > 0, so x = 2 is a minimum.
    f(x) = −x²: f'(0) = 0 and f''(0) = −2 < 0, so x = 0 is a maximum.
    FigII.2.: Example 1 (plots of f(x) = (x − 2)² and f(x) = −x²)
  • 13. II.2 Example 2
    • Consider the problem:  min over x ∈ R of (3x − 7)²
    • We first find the stationary points (which satisfy f'(x) = 0):
        f'(x) = 0  ⇒  6(3x − 7) = 0  ⇒  x* = 7/3
    • f''(7/3) = 18 > 0  ⇒  x* is a strict local minimum.
    • In general, stationary points are found by solving a nonlinear equation:
        g(x) ≡ f'(x) = 0
    • Finding the real roots of g(x) may not always be easy:
      - Consider the problem of minimising f(x) = x² + eˣ
      - g(x) = 2x + eˣ
      - We need an algorithm to find the x which satisfies g(x) = 0.
  • 14. II.3 One-Dimensional Minimisation
    ▪ Introduction
    • Most n-dimensional search algorithms use one-dimensional minimisation to determine the minimum along a specified direction:
        x = x_q + α d_q
      where x_q is the current point, d_q is the direction vector at that point, and α is the variable which determines how far to move along d_q to reach the minimum.
    • The function to be minimised, f(x), can now be expressed in terms of the variable α:
        minimise f(x) = f(x_q + α d_q) = f(α), with the minimum at α = α*
    • The different categories of methods are:
      • Zeroth order methods: use f only
      • First order methods: use f and the gradient of f
      • Second order methods: use f, the gradient of f and the Hessian of f
  • 15. II.3 One-Dimensional Minimisation
    ▪ Unimodal functions
    • Let φ: R → R and consider the problem min over x ∈ R of φ(x).
    • Let x* be the minimum point of φ(x), with x* ∈ [a,b].
    Definition
    The function φ is said to be unimodal on [a,b] if, for a ≤ x1 < x2 ≤ b:
        φ(x1) > φ(x2) for x2 < x*,  and  φ(x2) > φ(x1) for x1 > x*.
  • 16. II.3 One-Dimensional Minimisation
    ▪ Dichotomous Search (zeroth order)
    • f is unimodal in the initial interval of uncertainty [a,b].
    • Place λ and µ symmetrically, each at distance δ from the mid-point of [a,b].
    FigII.3.: Dichotomous search
  • 17. II.3 One-Dimensional Minimisation
    ▪ Dichotomous Search (zeroth order)
    • Input: initial interval of uncertainty [a,b].
    1. Initialisation: k = 0, a_k = a, b_k = b, δ > 0, ε (final length of the uncertainty interval).
    2. λ_k = (a_k + b_k)/2 − δ,  µ_k = (a_k + b_k)/2 + δ
    3. If f(λ_k) ≥ f(µ_k), let a_{k+1} = λ_k, b_{k+1} = b_k; go to step 5.
    4. Otherwise, if f(λ_k) < f(µ_k), let a_{k+1} = a_k, b_{k+1} = µ_k.
    5. k = k + 1
    6. If (b_k − a_k) ≤ ε, exit.
    7. Else go to step 2.
    • Output: x* = (a_k + b_k)/2
  • 18. II.3 One-Dimensional Minimisation
    ▪ Dichotomous Search (zeroth order)
    Consider:  min (1/4)x⁴ − (5/3)x³ − 6x² + 19x − 7
    x* = −2.5652, f(x*) = −56.2626

      k    a_k       b_k       b_k − a_k
      0    −4        0         4
      1    −4        −1.98     2.02
      2    −3.0001   −1.98     1.0201
      3    −3.0001   −2.4849   0.5152
      …    …         …         …
      10   −2.5669   −2.5626   0.0043
      …    …         …         …
      20   −2.5652   −2.5652   4.65e-6
      …    …         …         …
      23   −2.5652   −2.5652   5.99e-7
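As a concrete illustration, the dichotomous search above can be sketched in Python. This is a minimal sketch under my own naming; shrinking δ along with the interval (so the loop can reach an arbitrarily small ε) is my own addition, and the test function is the slide's example:

```python
def dichotomous_search(f, a, b, delta=0.01, eps=1e-6):
    """Dichotomous search on a unimodal f over the interval [a, b]."""
    while (b - a) > eps:
        mid = (a + b) / 2
        lam, mu = mid - delta, mid + delta
        if f(lam) >= f(mu):
            a = lam                      # minimum lies in [lam, b]
        else:
            b = mu                       # minimum lies in [a, mu]
        delta = min(delta, (b - a) / 4)  # keep delta well inside the interval
    return (a + b) / 2

# Slide example: f(x) = x^4/4 - 5x^3/3 - 6x^2 + 19x - 7, unimodal on [-4, 0]
f = lambda x: x**4 / 4 - 5 * x**3 / 3 - 6 * x**2 + 19 * x - 7
x_star = dichotomous_search(f, -4.0, 0.0)   # approx. -2.5652
```

With the slide's data this reproduces the tabulated minimum to within the requested interval length.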
  • 19. II.3 One-Dimensional Minimisation
    ▪ Golden Section Search (zeroth order)
    • Input: initial interval of uncertainty [a,b].
    1. Initialisation: k = 0, a_k = a, b_k = b, ε > 0, τ = 1 − 1/s = (3 − √5)/2, where s = (1 + √5)/2 is the golden section ratio.
    2. λ_k = a_k + τ(b_k − a_k),  µ_k = b_k − τ(b_k − a_k)
    3. If f(λ_k) ≥ f(µ_k), let a_{k+1} = λ_k, b_{k+1} = b_k, λ_{k+1} = µ_k, µ_{k+1} = b_{k+1} − τ(b_{k+1} − a_{k+1}); go to step 5.
    4. Otherwise, if f(λ_k) < f(µ_k), let a_{k+1} = a_k, b_{k+1} = µ_k, µ_{k+1} = λ_k, λ_{k+1} = a_{k+1} + τ(b_{k+1} − a_{k+1}).
    5. k = k + 1
    6. If (b_k − a_k) ≤ ε, exit.
    7. Else go to step 3.
    • Output: x* = (a_k + b_k)/2
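A Python sketch of the golden section loop (names are my own). Note that, unlike the dichotomous search, only one new function evaluation is needed per iteration, because either λ or µ is reused:

```python
import math

def golden_section(f, a, b, eps=1e-6):
    """Golden section search on a unimodal f over [a, b]."""
    tau = (3 - math.sqrt(5)) / 2            # tau = 1 - 1/s, s the golden ratio
    lam, mu = a + tau * (b - a), b - tau * (b - a)
    f_lam, f_mu = f(lam), f(mu)
    while (b - a) > eps:
        if f_lam >= f_mu:                   # minimum lies in [lam, b]
            a, lam, f_lam = lam, mu, f_mu
            mu = b - tau * (b - a)
            f_mu = f(mu)
        else:                               # minimum lies in [a, mu]
            b, mu, f_mu = mu, lam, f_lam
            lam = a + tau * (b - a)
            f_lam = f(lam)
    return (a + b) / 2

x_star = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)   # approx. 2.0
```

The reuse works because τ satisfies τ² − 3τ + 1 = 0, so the surviving interior point always lands exactly on one of the two new trial points.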
  • 20. II.3 One-Dimensional Minimisation
    ▪ Bisection Search (first order)
    • f ∈ C¹.
    • f is unimodal in the initial interval of uncertainty [a,b].
    Idea: compute f'(c), where c is the midpoint of [a,b]:
    • if f'(c) = 0, then c is a minimum point;
    • if f'(c) > 0, then [a,c] is the new interval of uncertainty;
    • if f'(c) < 0, then [c,b] is the new interval of uncertainty.
    FigII.4.: Bisection search
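The bisection idea maps directly to code. A minimal sketch (my own names), applied to f(x) = x² + eˣ from Example 2, whose derivative is f'(x) = 2x + eˣ:

```python
import math

def bisection_search(df, a, b, eps=1e-6):
    """Bisection on the derivative df of a unimodal f in C^1 over [a, b]."""
    while (b - a) > eps:
        c = (a + b) / 2
        slope = df(c)
        if slope == 0.0:
            return c                 # c is a stationary (minimum) point
        elif slope > 0.0:
            b = c                    # minimum lies in [a, c]
        else:
            a = c                    # minimum lies in [c, b]
    return (a + b) / 2

# f(x) = x^2 + e^x  ->  f'(x) = 2x + e^x, which changes sign on [-2, 0]
x_star = bisection_search(lambda x: 2 * x + math.exp(x), -2.0, 0.0)
```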
  • 21. II.3 One-Dimensional Minimisation
    ▪ Newton Method (second order)
    • An iterative technique to find a root of a function.
    • Problem: find an approximate root of the function f(x) = x² − 2.
    • FigII.5. illustrates one iteration of Newton's method on f(x) = x² − 2, moving from x_k to x_{k+1}, towards x*.
    FigII.5.: Newton method
  • 22. II.3 One-Dimensional Minimisation
    ▪ Newton Method (second order)
    • Consider the problem of minimising f(x), x ∈ R, with f ∈ C².
    • Minimising the function f(x) is equivalent to finding a root of the nonlinear equation
        q(x) ≡ f'(x) = 0
    • Based on a Taylor series expansion,
        q_{k+1}(x) ≡ f'(x_{k+1}) = f'(x_k) + f''(x_k)(x_{k+1} − x_k)
    • Setting q_{k+1}(x) to zero gives
        x_{k+1} = x_k − f'(x_k) / f''(x_k)
  • 23. II.3 One-Dimensional Minimisation
    ▪ Newton Method (second order)
    • Consider the problem of minimising f(x), f ∈ C².
    • We need to find the roots of q(x) ≡ f'(x) = 0.
    1. Initialisation: choose an initial point x_0 and ε > 0, and set k = 0.
    2. x_{k+1} = x_k − f'(x_k) / f''(x_k)
    3. k = k + 1
    4. If |q(x_k)| < ε, exit.
    5. Else go to step 2.
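The Newton iteration above in Python, again on f(x) = x² + eˣ (a minimal sketch; the names and the max_iter safeguard are my own):

```python
import math

def newton_1d(df, d2f, x0, eps=1e-8, max_iter=50):
    """Newton's method on q(x) = f'(x) = 0."""
    x = x0
    for _ in range(max_iter):
        if abs(df(x)) < eps:         # |q(x_k)| small enough: stop
            break
        x = x - df(x) / d2f(x)       # x_{k+1} = x_k - f'(x_k)/f''(x_k)
    return x

# f(x) = x^2 + e^x:  f'(x) = 2x + e^x,  f''(x) = 2 + e^x
x_star = newton_1d(lambda x: 2 * x + math.exp(x),
                   lambda x: 2 + math.exp(x),
                   x0=0.0)
```

Compared with bisection, far fewer iterations are needed here, at the cost of requiring second derivatives.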
  • 24. II.3 One-Dimensional Minimisation
    ▪ Newton Method (second order)
    • Best convergence of all the methods, unless it diverges.
    FigII.6.: Newton method convergence (a converging and a diverging sequence of iterates on f')
  • 25. II.4 Multi-Dimensional Minimisation
    ▪ Introduction
    • Consider the problem:  min over x ∈ Rⁿ of f(x)
    • Problem:  min x1² + x2² − 5x1 − 5x2
    FigII.7.: Contours plot
  • 26. II.4 Multi-Dimensional Minimisation
    ▪ Introduction
    • Problem:  min x1² + x2² − 5x1 − 5x2
    • g(x) = ∇f(x) = (2x1 − 5, 2x2 − 5)ᵀ
    • Necessary condition: g(x) = 0 at x* = (2.5, 2.5)ᵀ
    • Sufficient condition:
        H(x*) = [2 0; 0 2] is positive definite
    • x* is a strict local minimum
  • 27. II.4 Multi-Dimensional Minimisation
    ▪ Introduction
    Definition
    Let x ∈ Rⁿ. If there exists a direction d ∈ Rⁿ and δ > 0 such that f(x + αd) < f(x) for all α ∈ (0,δ), then d is said to be a descent direction of f at x.
    Result
    Let f ∈ C¹, x ∈ Rⁿ and g(x) = ∇f(x). If gᵀ(x) d < 0, then d is a descent direction of f at x.
  • 28. II.4 Multi-Dimensional Minimisation
    ▪ Introduction
    FigII.8.: Descent direction, showing the level sets S = {x : f(x) = f(x₀)} and {x : f(x) < f(x₀)}, the gradient g(x₀) at x₀, the first order approximation of S at x₀, {x : g(x₀)ᵀ(x − x₀) = 0}, and descent directions d.
  • 29. II.4 Multi-Dimensional Minimisation
    ▪ Introduction
    (1) Initialise: x_0, k = 0
    (2) While the stopping condition is not satisfied at x_k:
        (a) Find x_{k+1} such that f(x_{k+1}) < f(x_k)
        (b) k = k + 1
    Output: x* = x_k, a local minimum of f(x).
    • How do we find x_{k+1} in step 2(a) of the algorithm?
    • Which stopping condition can be used?
    • Does the algorithm converge? If yes, how fast does it converge?
    • Do the convergence and its speed depend on x_0?
  • 30. II.4 Multi-Dimensional Minimisation
    ▪ Stopping conditions
    • Stopping condition for a minimisation problem:
        g(x_k) = 0 and H(x_k) is positive semidefinite
    • Practical stopping conditions:
        ‖g(x_k)‖ ≤ ε
        ‖g(x_k)‖ ≤ ε (1 + |f(x_k)|)
        |f(x_{k+1}) − f(x_k)| ≤ ε |f(x_k)|
  • 31. II.4 Multi-Dimensional Minimisation
    ▪ Speed of convergence
    We say the convergence is:
    • Linear, if there exists a real τ ∈ ]0,1[ such that for all k ≥ 1:
        ‖x_{k+1} − x*‖ ≤ τ ‖x_k − x*‖
    • Superlinear, if
        ‖x_{k+1} − x*‖ / ‖x_k − x*‖ → 0
    • Quadratic, if there exists a constant C > 0 such that for all k ≥ 1:
        ‖x_{k+1} − x*‖ ≤ C ‖x_k − x*‖²
  • 32. II.4 Multi-Dimensional Minimisation
    ▪ Steepest descent method (first order)
    • The basic principle is to minimise the n-dimensional function by a series of 1D line minimisations:
        x_{k+1} = x_k + α_k d_k
    • The steepest descent method chooses d_k along the negative gradient:
        d_k = −∇f(x_k)
    • The step size α_k is chosen to minimise f(x_k + α d_k):
        α_k = argmin over α ≥ 0 of f(x_k + α d_k)
  • 33. II.4 Multi-Dimensional Minimisation
    ▪ Steepest descent method (first order)
    (1) Initialise: x_0, ε > 0 and k = 0
    (2) While ‖g(x_k)‖ > ε:
        (a) d_k = −g(x_k)
        (b) Find α_k along d_k such that α_k = argmin over α ≥ 0 of f(x_k + α d_k)
        (c) x_{k+1} = x_k + α_k d_k
        (d) k = k + 1
    Output: x* = x_k, a stationary point of f(x).
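For a quadratic f(x) = ½xᵀHx + bᵀx, the exact line search in step (b) has the closed form α_k = g_kᵀg_k / (g_kᵀHg_k), which follows from setting df/dα = 0 along d_k = −g_k. A minimal Python sketch of the loop under that assumption, applied to the introduction's example f = x1² + x2² − 5x1 − 5x2 (names are my own):

```python
def steepest_descent_quadratic(H, b, x0, eps=1e-8, max_iter=1000):
    """Steepest descent on f(x) = 0.5 x^T H x + b^T x with exact line search."""
    n = len(x0)
    x = list(x0)
    for _ in range(max_iter):
        g = [sum(H[i][j] * x[j] for j in range(n)) + b[i] for i in range(n)]
        if sum(gi * gi for gi in g) ** 0.5 <= eps:
            break                                    # ||g(x_k)|| <= eps: stop
        Hg = [sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        alpha = sum(gi * gi for gi in g) / sum(g[i] * Hg[i] for i in range(n))
        x = [x[i] - alpha * g[i] for i in range(n)]  # step along d = -g
    return x

# f = x1^2 + x2^2 - 5x1 - 5x2:  H = [[2, 0], [0, 2]],  b = (-5, -5)
x_star = steepest_descent_quadratic([[2.0, 0.0], [0.0, 2.0]],
                                    [-5.0, -5.0], [0.0, 0.0])
# converges to (2.5, 2.5)
```

Because this particular H is a multiple of the identity, the method reaches the minimum in a single step; on elongated contours it would zig-zag as described on the next slide.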
  • 34. II.4 Multi-Dimensional Minimisation
    ▪ Steepest descent method (first order)
    • The gradient is everywhere perpendicular to the contour lines.
    • After each line minimisation the new gradient is always orthogonal to the previous step direction (true of any line minimisation).
    • Consequently, the iterates tend to zig-zag down the valley in a very inefficient manner.
    FigII.9.: Steepest descent method
  • 35. II.4 Multi-Dimensional Minimisation
    ▪ Conjugate gradient method (first order)
    • This method is designed for quadratic functions of the form
        f(x) = (1/2) xᵀHx + bᵀx + c,  with n design variables,
      where H (n×n) is a symmetric positive definite matrix, b ∈ Rⁿ and c ∈ R.
    • This method converges in n iterations.
    • Each d_k is chosen to be conjugate to all previous search directions with respect to the Hessian H.
    Definition
    Let H ∈ R^{n×n} be a symmetric matrix. The vectors {d_0, d_1, …, d_{n−1}} are said to be H-conjugate if they are linearly independent and
        d_iᵀ H d_j = 0 for all i ≠ j.
  • 36. II.4 Multi-Dimensional Minimisation
    ▪ Conjugate gradient method (first order)
    (1) Initialise: x_0, ε > 0, d_0 = −g_0 and k = 0
    (2) While ‖g_k‖ > ε:
        (a) α_k = −g_kᵀ d_k / (d_kᵀ H d_k)
        (b) x_{k+1} = x_k + α_k d_k
        (c) g_{k+1} = H x_{k+1} + b
        (d) β_k = g_{k+1}ᵀ H d_k / (d_kᵀ H d_k)
        (e) d_{k+1} = −g_{k+1} + β_k d_k
        (f) k = k + 1
    Output: x* = x_k, a global minimum of f(x).
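A Python sketch of the conjugate gradient loop for a quadratic (names are my own; the 2×2 example system at the bottom is also my own, chosen so that the minimiser solves Hx = −b):

```python
def conjugate_gradient(H, b, x0, eps=1e-10):
    """CG for f(x) = 0.5 x^T H x + b^T x, H symmetric positive definite."""
    n = len(x0)
    x = list(x0)
    mv = lambda v: [sum(H[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    g = [gi + bi for gi, bi in zip(mv(x), b)]        # g_0 = H x_0 + b
    d = [-gi for gi in g]                            # d_0 = -g_0
    for _ in range(n):                               # converges in <= n steps
        if dot(g, g) ** 0.5 <= eps:
            break
        Hd = mv(d)
        alpha = -dot(g, d) / dot(d, Hd)              # exact step along d_k
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g = [gi + bi for gi, bi in zip(mv(x), b)]    # g_{k+1} = H x_{k+1} + b
        beta = dot(g, Hd) / dot(d, Hd)               # H-conjugacy coefficient
        d = [-gi + beta * di for gi, di in zip(g, d)]
    return x

x_star = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [-1.0, -2.0], [0.0, 0.0])
# minimiser solves H x = -b, i.e. x* = (1/11, 7/11)
```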
  • 37. II.4 Multi-Dimensional Minimisation
    ▪ Fletcher-Reeves' Conjugate gradient method (first order)
    • For a non-quadratic function, recall that any function can be approximated locally by a quadratic using a second order Taylor expansion.
    • The Fletcher-Reeves method adapts the conjugate gradient method to an arbitrary non-quadratic function. The differences are:
      - Find α_k using a 1D search algorithm.
      - Calculate β_k as:
          β_k = g_{k+1}ᵀ g_{k+1} / (g_kᵀ g_k)
  • 38. II.4 Multi-Dimensional Minimisation
    ▪ Fletcher-Reeves' Conjugate gradient method (first order)
    (1) Initialise: x_0, ε > 0, d_0 = −g_0 and k = 0
    (2) While ‖g_k‖ > ε:
        (a) α_k = argmin over α ≥ 0 of f(x_k + α d_k)
        (b) x_{k+1} = x_k + α_k d_k
        (c) Compute g_{k+1}
        (d) β_k = g_{k+1}ᵀ g_{k+1} / (g_kᵀ g_k)
        (e) d_{k+1} = −g_{k+1} + β_k d_k
        (f) k = k + 1
    Output: x* = x_k, a stationary point of f(x).
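A Fletcher-Reeves sketch in Python, using a golden section line search for step (a). Everything here is a minimal illustration under my own naming; the non-quadratic convex test function (minimum at the origin) and the fixed search bracket [0, 1] are my own choices:

```python
import math

def golden(phi, a, b, eps=1e-7):
    """Golden section line search for the step length alpha."""
    tau = (3 - math.sqrt(5)) / 2
    while (b - a) > eps:
        lam, mu = a + tau * (b - a), b - tau * (b - a)
        if phi(lam) >= phi(mu):
            a = lam
        else:
            b = mu
    return (a + b) / 2

def fletcher_reeves(f, grad, x0, eps=1e-6, max_iter=200):
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                            # d_0 = -g_0
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 <= eps:
            break
        phi = lambda a: f([xi + a * di for xi, di in zip(x, d)])
        alpha = golden(phi, 0.0, 1.0)                # 1D search for alpha_k
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)         # Fletcher-Reeves beta_k
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

f = lambda x: x[0] ** 2 + 3 * x[1] ** 2 + 0.1 * x[0] ** 4
grad = lambda x: [2 * x[0] + 0.4 * x[0] ** 3, 6 * x[1]]
x_star = fletcher_reeves(f, grad, [1.0, 1.0])
```

In practice the line-search bracket would be chosen adaptively rather than fixed to [0, 1].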
  • 39. II.4 Multi-Dimensional Minimisation
    ▪ Newton method (second order)
    • Concept:
      – Construct a local quadratic approximation
      – Minimise the approximation
      – Repeat
    • Local approximation: second order Taylor series of f at x_k:
        f(x) ≈ q(x) = f(x_k) + g(x_k)ᵀ(x − x_k) + (1/2)(x − x_k)ᵀ H(x_k)(x − x_k)
    • First order necessary condition: ∇q(x) = 0 gives
        x_{k+1} = x_k − H_k⁻¹ g(x_k)  (assuming H_k is invertible)
    • x_{k+1} is of the form x_{k+1} = x_k + α_k d_k.
    • Classical Newton method:
      - Newton direction: d_k = −H_k⁻¹ g(x_k)
      - Step length: α_k = 1
    • Is d_k a descent direction?
  • 40. II.4 Multi-Dimensional Minimisation
    ▪ Newton method (second order)
    • If f(x) is quadratic, then the solution is found in one step.
    • The method has quadratic convergence (as in the 1D case).
    • Risk of divergence if the function f(x) is non-convex.
    • To avoid divergence → line search:
      - Newton direction: d_k = −H_k⁻¹ g(x_k)
      - Update: x_{k+1} = x_k + α_k d_k
    • If H = I, then this reduces to steepest descent.
    FigII.10.: Line search
  • 41. II.4 Multi-Dimensional Minimisation
    ▪ Newton method (second order)
    (1) Initialise: x_0, ε > 0 and k = 0
    (2) While ‖g_k‖ > ε:
        (a) d_k = −H(x_k)⁻¹ g_k
        (b) α_k = argmin over α ≥ 0 of f(x_k + α d_k)
        (c) x_{k+1} = x_k + α_k d_k
        (d) k = k + 1
    Output: x* = x_k, a stationary point of f(x).
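A classical Newton sketch in two dimensions with α_k = 1 (the line-search variant above would insert a 1D search before the update). The convex test function f = e^{x1} + x1² + x2², with minimum near (−0.3517, 0), and all names are my own:

```python
import math

def newton_2d(grad, hess, x0, eps=1e-10, max_iter=50):
    """Classical Newton's method (alpha_k = 1) in two dimensions."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if (g[0] ** 2 + g[1] ** 2) ** 0.5 <= eps:
            break
        H = hess(x)
        det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
        # Newton direction d = -H^{-1} g, with the 2x2 inverse written out
        d0 = -(H[1][1] * g[0] - H[0][1] * g[1]) / det
        d1 = -(H[0][0] * g[1] - H[1][0] * g[0]) / det
        x = [x[0] + d0, x[1] + d1]       # step length alpha_k = 1
    return x

# f(x) = exp(x1) + x1^2 + x2^2 (convex, so the pure Newton step is safe)
grad = lambda x: [math.exp(x[0]) + 2 * x[0], 2 * x[1]]
hess = lambda x: [[math.exp(x[0]) + 2, 0.0], [0.0, 2.0]]
x_star = newton_2d(grad, hess, [1.0, 1.0])
```

On a non-convex function the same loop can diverge, which is exactly why the slides add the line search.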
  • 42. II.4 Multi-Dimensional Minimisation
    ▪ Comparison
    • Problem: minimisation of Rosenbrock's function:
        min 100 (x2 − x1²)² + (1 − x1)²
    • g(x) = ∇f(x) = ( −400 x1 (x2 − x1²) − 2(1 − x1),  200 (x2 − x1²) )ᵀ
    • Necessary condition: g(x) = 0 at x* = (1, 1)ᵀ
    • Sufficient condition:
        H(x*) = [802 −400; −400 200] is positive definite
    • x* is a strict local minimum
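The optimality check at x* = (1, 1)ᵀ is easy to verify numerically; a minimal sketch (helper names are my own, and positive definiteness of the 2×2 Hessian is tested via Sylvester's criterion):

```python
def rosenbrock_grad(x1, x2):
    """Gradient of f = 100 (x2 - x1^2)^2 + (1 - x1)^2."""
    return (-400 * x1 * (x2 - x1 ** 2) - 2 * (1 - x1),
            200 * (x2 - x1 ** 2))

def is_positive_definite_2x2(H):
    """Sylvester's criterion: both leading principal minors positive."""
    return H[0][0] > 0 and H[0][0] * H[1][1] - H[0][1] * H[1][0] > 0

g = rosenbrock_grad(1.0, 1.0)                  # vanishes at the candidate x*
H_star = [[802.0, -400.0], [-400.0, 200.0]]    # Hessian evaluated at (1, 1)
pd = is_positive_definite_2x2(H_star)          # True: strict local minimum
```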
  • 43. II.4 Multi-Dimensional Minimisation
    ▪ Comparison
    Minimum at (1, 1)ᵀ
    FigII.11.: Rosenbrock's function
  • 44. II.4 Multi-Dimensional Minimisation
    ▪ Comparison
    FigII.12.: Comparison of the iterates of steepest descent (SD), Fletcher-Reeves conjugate gradient (FRCG) and Newton's method
  • 45. III. Problems
    III.1 Problem 1
    Consider the constrained minimisation problem:
        Minimise F = x²
        Subject to g1 = 1 − x ≤ 0
                   g2 = x − 2 ≤ 0
    This represents the minimisation of x² in the interval 1 ≤ x ≤ 2.
    1 – Plot the functions F, g1 and g2 on the interval 0 ≤ x ≤ 3.
    2 – We can convert this to an equivalent unconstrained problem by using penalty parameters. To do this we minimise the following unconstrained function:
        F̃ = F − R (1/g1 + 1/g2)
    Plot F̃ on the interval 0 ≤ x ≤ 3 for values of R = 5.0 and R = 0.5.
    3 – For the value R = 5.0 in part (2), and beginning with x_l = 1.001 and x_u = 1.999, perform several iterations of the golden section method. Reduce the interval of uncertainty to less than 0.1 in order to minimise the unconstrained function F̃.
  • 46. III. Problems
    III.2 Problem 2
    Given the function
        F = x1⁴ − 2x1²x2 + x1² + x2² − 2x1 + 5
    1 – Calculate {∇F} and [H] at the points {X1} = {0,0}, {X2} = {1,1} and {X3} = {2,2}. Check whether [H] is positive definite at each of these points. Are the optimality conditions for a relative minimum satisfied at any of these points?
    2 – Apply three iterations of the method of steepest descent to this problem, using {X1} as the initial point.
  • 47. Sources
    Presentation based on material from:
    • Haftka & Gürdal (1994): Elements of Structural Optimization
    • van Keulen: TUD Optimization course
    • Etman (2006): EM course Engineering Optimization