R. Zakhama
Structural Optimisation course
Chapter 2: Mathematical
Programming
Part 2: Constrained Optimisation
I. Introduction
 Feasible point: a point that satisfies all the constraints.
 Feasible region: the set X of design-variable values that satisfy all the constraints:
X = {x ∈ Rn / gj(x) ≤ 0, j = 1, ..., ng and hk(x) = 0, k = 1, ..., ne}
(P) can be written as:

    (P)   min  f(x)
          s.t. gj(x) ≤ 0,  j = 1...ng
               hk(x) = 0,  k = 1...ne
    where x = (x1, x2, ..., xn)T

or, compactly:

          min  f(x)
          s.t. x ∈ X

 Remarks:
- Generally, the solution x* lies on the boundary of the feasible region (if not, the constraints are useless and the problem is ill-posed).
- From an algorithmic point of view, one of the difficulties is that we do not know a priori which constraints will be active at the optimum.
The idea is to convert the problem into an unconstrained optimisation.
II. Feasible and Descent Directions
• Let F(x) = set of feasible directions at x ∈ X.
• Let D(x) = set of descent directions at x ∈ X.

Definition
A vector d ∈ Rn, d ≠ 0, is said to be a feasible direction at x ∈ X if there exists δ1 > 0 such that x + αd ∈ X for all α ∈ (0, δ1).

Definition
A vector d ∈ Rn, d ≠ 0, is said to be a descent direction at x ∈ Rn if there exists δ2 > 0 such that f(x + αd) < f(x) for all α ∈ (0, δ2).
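For differentiable f these definitions have a convenient first-order test: if ∇f(x)T d < 0, then d is a descent direction. A minimal numerical sketch (plain NumPy, added here as an illustration; the example function f is an assumption, not from the slides):

```python
import numpy as np

def is_descent_direction(grad_f, d):
    """First-order sufficient test: grad_f(x)^T d < 0 implies f decreases along d."""
    return float(np.dot(grad_f, d)) < 0

# Example: f(x) = x1^2 + x2^2 at x = (1, 1), so grad f(x) = (2, 2)
grad = np.array([2.0, 2.0])
print(is_descent_direction(grad, np.array([-1.0, 0.0])))  # True: moves against the gradient
print(is_descent_direction(grad, np.array([1.0, 1.0])))   # False: ascent direction
```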
FigII.1. : Feasible directions (directions s at points of the feasible region bounded by g1 and g2 in the (x1, x2) plane)
Theorem
Let X be a nonempty set in Rn and x* ∈ X be a local minimum of f over X. Then
    F(x*) ∩ D(x*) = ∅

Proof.
Let x* ∈ X be a local minimum.
By contradiction, assume that ∃ a nonzero d ∈ F(x*) ∩ D(x*).
∴ ∃ δ1 > 0 ∋ x* + αd ∈ X ∀ α ∈ (0, δ1), and
  ∃ δ2 > 0 ∋ f(x* + αd) < f(x*) ∀ α ∈ (0, δ2).
Hence, ∃ x ∈ B(x*, α) ∩ X ∋ f(x) < f(x*), for all α ∈ (0, min(δ1, δ2)), which contradicts the local minimality of x*. ∎
• x* ∈ X is a local minimum ⇒ F(x*) ∩ D(x*) = ∅
• Consider any x ∈ X and assume f ∈ C2.
• Let d ∈ Rn such that ∇f(x)T d < 0. Then f(x + αd) < f(x) for α small enough ⇒ d is a descent direction ⇒ d ∈ D(x).
• Let D̃(x) = {d : ∇f(x)T d < 0} ⊆ D(x)
• x* ∈ X is a local minimum ⇒ F(x*) ∩ D̃(x*) = ∅
• If F(x*) = Rn (every direction in Rn is locally feasible), x* ∈ X is a local minimum ⇒ {d : ∇f(x*)T d < 0} = ∅ ⇒ ∇f(x*) = 0
• Can we characterise F(x*) algebraically for a constrained optimisation problem?
• Assume f, gj ∈ C2, j = 1...ng
• X = {x ∈ Rn : gj(x) ≤ 0, j = 1...ng}, i.e. the problem
      min  f(x)
      s.t. gj(x) ≤ 0,  j = 1...ng
• Active constraints: A(x) = {j : gj(x) = 0}

Lemma
For any x ∈ X,  F̃(x) = {d : ∇gj(x)T d < 0, j ∈ A(x)} ⊆ F(x)

• x* ∈ X is a local minimum ⇒ F̃(x*) ∩ D̃(x*) = ∅
• This is only a necessary condition for a local minimum.
• It cannot be directly used for equality constrained problems.
II. Optimality conditions
II.1 The K.K.T. necessary conditions (Karush, Kuhn, Tucker)
 Equality constraints
• Consider the following optimisation problem:
      (P)  min  f(x)
           s.t. hk(x) = 0,  k = 1...ne
• We introduce the Lagrangian function:
      L(x, λ) = f(x) + Σk=1..ne λk hk(x),   where the λk are the Lagrange multipliers
• The necessary condition for the local minimiser x* is
      ∇L(x*, λ*) = 0  ⇔  ∇x L = 0 and ∇λ L = 0,
and it must be a feasible point (i.e. the constraints are satisfied).
• These are the Karush-Kuhn-Tucker conditions.
 Equality constraints
• These conditions, however, apply only at a regular point, that is, at a point where the gradients of the constraints are linearly independent.

KKT necessary conditions (First order)
If x* ∈ X is a local minimum and a regular point, then there exists a unique vector λ* = (λ1*, ..., λne*)T such that
      ∇x f(x*) + Σk=1..ne λk* ∇x hk(x*) = 0   (optimality)
      hk(x*) = 0,  ∀ k = 1...ne               (feasibility)
      λk* ∈ R,     ∀ k = 1...ne
 Equality constraints
• Example 1
      min(x1,x2)  x1 + 2x2
      s.t.  h(x) = x1² + x2² − 1 = 0

      L = x1 + 2x2 + λ(x1² + x2² − 1)

KKT:
      ∇L(x*, λ*) = 0  ⇔   ∂L/∂x1 = 1 + 2λx1 = 0
                          ∂L/∂x2 = 2 + 2λx2 = 0
                          ∂L/∂λ  = h(x) = x1² + x2² − 1 = 0
      ⇒  x1 = −1/(2λ),  x2 = −1/λ   ⇒   λ = ± √5/2

Two solutions satisfy the KKT conditions, but only one is a minimum: these conditions are necessary but not sufficient for a local minimum.
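The two candidates can be checked numerically. A minimal NumPy sketch (an illustration added here, not part of the original slides) verifies that both satisfy the KKT system and compares their objective values:

```python
import numpy as np

# Example 1: min x1 + 2*x2  s.t.  h(x) = x1^2 + x2^2 - 1 = 0
f = lambda x: x[0] + 2 * x[1]
h = lambda x: x[0] ** 2 + x[1] ** 2 - 1

candidates = []
for lam in (np.sqrt(5) / 2, -np.sqrt(5) / 2):
    x = np.array([-1 / (2 * lam), -1 / lam])            # from the stationarity equations
    grad_L = np.array([1 + 2 * lam * x[0], 2 + 2 * lam * x[1]])
    assert abs(h(x)) < 1e-12 and np.allclose(grad_L, 0)  # both are KKT points
    candidates.append((lam, f(x)))

best = min(candidates, key=lambda t: t[1])
print(best)   # lam = +sqrt(5)/2 gives the minimum, f = -sqrt(5)
```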
FigII.2. : Example 1
 Equality constraints
• Geometrical interpretation
• For a single equality constraint, the Lagrange optimality condition has a simple geometrical interpretation:
      ∂f/∂x + λ ∂h/∂x = 0   ⇒   ∂f/∂x // ∂h/∂x
Meaning: the gradients ∇f and ∇h are parallel ⇒ the tangents are parallel ⇒ h is tangent to the isolines of f.

FigII.3. : Geometrical interpretation 1
 Equality constraints
• Geometrical interpretation

FigII.4. : Geometrical interpretation 2 (four configurations of ∇f and ∇h along h: maximum, minimum, minimum, no extremum)
 Inequality constraints
• Consider the following optimisation problem:
      (P)  min  f(x)
           s.t. gj(x) ≤ 0,  j = 1...ng,   with f, gj ∈ C2
• To be able to apply the Lagrange multiplier method, we first transform the inequality constraints into equality constraints by adding slack variables sj:
      gj(x) + sj² = 0,  j = 1...ng,   where sj is a slack variable
• We can now form a Lagrangian function:
      L = f(x) + Σj=1..ng µj (gj(x) + sj²),   where the µj are the Lagrange multipliers
• The necessary condition for the local minimiser x* is
      ∇L(x*, s*, µ*) = 0  ⇔  ∇x L = 0, ∇s L = 0, ∇µ L = 0,
and it must be a feasible point (i.e. the constraints are satisfied).
• These are the Karush-Kuhn-Tucker conditions.
 Inequality constraints
• The KKT conditions can be written as:
      ∂L/∂xi = ∂f/∂xi + Σj=1..ng µj ∂gj/∂xi = 0,   i = 1,...,n
      ∂L/∂µj = gj(x) + sj² = 0,                    j = 1,...,ng
      ∂L/∂sj = 2 µj sj = 0,                        j = 1,...,ng
• What is the sign of µ?
We have
      ∇x f = − Σj=1..ng µj ∇x gj   ⇒   dT ∇x f = − Σj=1..ng µj dT ∇x gj   (multiplying by dT)
Feasible direction:  dT ∇x gj ≤ 0,  j = 1,...,ng
Descent direction:   dT ∇x f ≤ 0
Since x* is a local minimum, F̃(x*) ∩ D̃(x*) = ∅: every feasible d (dT ∇x gj ≤ 0 for the active constraints) must satisfy dT ∇x f ≥ 0, which forces
      µj ≥ 0
 Inequality constraints
KKT necessary conditions (First order)
If x* is a local minimum of problem (P) and a regular point, then there exists a unique vector µ* = (µ1*, ..., µng*)T such that
      ∇x f(x*) + Σj=1..ng µj* ∇x gj(x*) = 0     (optimality)
      gj(x*) + (sj*)² = 0,  ∀ j = 1...ng         (feasibility)
      µj* sj* = 0  and  µj* ≥ 0,  ∀ j = 1...ng   (complementarity)
 Inequality constraints
• Graphical interpretation of the KKT conditions
• Feasible direction:  ∇gj(x)T d ≤ 0,  j = 1...ng
• Descent direction:   ∇f(x)T d ≤ 0
• x* is a local optimum: −∇x f(x*) forms an obtuse angle (> 90°) with each feasible direction (we cannot decrease f without leaving the feasible domain).
• The negative gradient (descent direction) lies in the cone spanned by the positive constraint gradients.
• No descent direction exists within the cone of feasible directions.
• −∇x f(x*) is a linear combination, with coefficients µj > 0, of the gradients ∇x gj(x*) of the constraints gj active at x*.

FigII.5. : Geometrical interpretation 3 (feasible cone at the intersection of g1 and g2, with ∇g1, ∇g2, ∇f and −∇f in the (x1, x2) plane)
 Inequality constraints
• Graphical interpretation of the KKT necessary conditions (isolines f3 > f2 > f1, feasible region g(x) ≤ 0, boundary g(x) = 0):
  - no active constraint at x*
  - x* is not a minimiser: μ < 0
  - x* is a minimiser: μ > 0

FigII.6. : Geometrical interpretation 4
 Equality and Inequality constraints
      (P)  min  f(x)
           s.t. gj(x) ≤ 0,  j = 1...ng
                hk(x) = 0,  k = 1...ne
      where x = (x1, x2, ..., xn)T

The Lagrangian function:
      L = f(x) + Σk=1..ne λk hk(x) + Σj=1..ng µj (gj(x) + sj²)

KKT necessary conditions (First order)
If x* is a local minimum of problem (P) and a regular point, then there exist a unique vector λ* = (λ1*, ..., λne*)T and a unique vector µ* = (µ1*, ..., µng*)T such that
      ∇x f(x*) + Σk=1..ne λk* ∇x hk(x*) + Σj=1..ng µj* ∇x gj(x*) = 0   (optimality)
      hk(x*) = 0, ∀ k = 1...ne;   gj(x*) + (sj*)² = 0, ∀ j = 1...ng     (feasibility)
      µj* sj* = 0 and µj* ≥ 0, ∀ j = 1...ng;   λk* ∈ R, ∀ k = 1...ne    (complementarity)
II.2 KKT Sufficient optimality conditions

KKT sufficient conditions (Second order)
If there exist x* ∈ X, λ* ∈ Rne, µ* ∈ (R+)ng and s* ∈ Rng such that
  - the KKT necessary conditions are satisfied, and
  - for all d ≠ 0 such that
        ∇hk(x*)T d = 0,  k = 1,...,ne
        ∇gj(x*)T d = 0,  j ∈ A(x*) with µj* > 0
        ∇gj(x*)T d ≤ 0,  j ∈ A(x*) with µj* = 0
    we have  dT ∇x² L(x*, λ*, µ*, s*) d > 0,
then x* is a strict local minimum of (P).
 Example 2
• Problem:
      min  f(x) = x1² + x2
      s.t. g1(x) = x1² + x2² − 9 ≤ 0
           g2(x) = x1 + x2 − 1 ≤ 0
• Write the KKT necessary conditions and find a solution of this problem by discussing the different cases. Check whether that solution is a local minimum.
 Example 2
• f, g1, g2 are differentiable.
• The Lagrangian function can be written as:
      L = f(x) + µ1 (g1(x) + s1²) + µ2 (g2(x) + s2²)
• The KKT conditions are:  ∇L(x, µ, s) = 0, with µ1, µ2 ≥ 0 and s1, s2 ∈ R:
      (1)  2x1 + 2µ1x1 + µ2 = 0   and   1 + 2µ1x2 + µ2 = 0
      (2)  x1² + x2² − 9 + s1² = 0
      (3)  x1 + x2 − 1 + s2² = 0
      (4)  µ1 s1 = 0
      (5)  µ2 s2 = 0
      (6)  µ1, µ2 ≥ 0,  s1, s2 ∈ R
 Example 2
• To solve this optimisation problem we study the different possible cases (this is not a general method for solving an optimisation problem).
• If µ1 = µ2 = 0, (4) and (5) are satisfied but not (1).
• If µ1 ≠ 0 and µ2 = 0, (1) ⇒ 2x1 + 2µ1x1 = 0:
    if x1 ≠ 0 ⇒ µ1 = −1, which does not satisfy (6)  ⇒  x1 = 0
    (4) ⇒ s1 = 0 since µ1 ≠ 0 ⇒ (2) simplifies to x2² − 9 = 0 ⇒ x2 = ± 3
    if x2 = 3,  (1) ⇒ 1 + 6µ1 = 0, which does not satisfy (6)
    if x2 = −3, (1) ⇒ 1 − 6µ1 = 0 ⇒ µ1 = 1/6
• The resolution of the KKT necessary conditions gives
      x1* = 0,  x2* = −3,  µ1* = 1/6,  µ2* = 0
• ∇x²L = | 7/3   0  |
         |  0   1/3 |
  is positive definite ⇒ x* is a local minimum
(KKT sufficient conditions)
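The solution just derived can be double-checked numerically. The NumPy sketch below (an illustration, using the data f = x1² + x2, g1 = x1² + x2² − 9, g2 = x1 + x2 − 1 from Example 2) verifies optimality, feasibility, complementarity and the positive definiteness of the Hessian at x* = (0, −3):

```python
import numpy as np

# Example 2 data: f = x1^2 + x2, g1 = x1^2 + x2^2 - 9, g2 = x1 + x2 - 1
x = np.array([0.0, -3.0])            # candidate x*
mu = np.array([1.0 / 6.0, 0.0])      # candidate mu*

grad_f = np.array([2 * x[0], 1.0])
grad_g = np.array([[2 * x[0], 2 * x[1]],   # grad g1
                   [1.0, 1.0]])            # grad g2
g = np.array([x[0] ** 2 + x[1] ** 2 - 9, x[0] + x[1] - 1])

assert np.allclose(grad_f + mu @ grad_g, 0)   # optimality
assert np.all(g <= 0)                          # feasibility
assert np.allclose(mu * g, 0)                  # complementarity

H = np.diag([2 + 2 * mu[0], 2 * mu[0]])        # Hessian of L in x: diag(7/3, 1/3)
assert np.all(np.linalg.eigvalsh(H) > 0)       # positive definite -> strict local minimum
print(H)
```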
FigII.7. : Example 2
III. Constrained optimisation methods
III.1 Feasible directions method
• The method is intended for problems with general inequality constraints.
• The iterates move along the boundary of the feasible region.
• The method relies on determining a search direction and performing a 1-D minimisation along that direction:
      xk+1 = xk + αk dk
• We seek a direction that reduces the objective function without violating the constraints for a finite move along the direction.
• A direction which reduces the objective function is called a usable direction:
      ∇f(xk)T dk ≤ 0
• A direction which does not cause violation of the active constraints upon a move is a feasible direction:
      ∇gj(xk)T dk ≤ 0,  j ∈ A(xk)
• A direction satisfying both conditions is a usable feasible direction.
III.1 Feasible directions method
• Zoutendijk’s basic idea is, at each iteration, to determine a vector dk that is both a feasible direction and a descent direction. This is accomplished numerically by finding a normalised direction vector dk and a scalar parameter θ < 0 such that:
      (L)  min(d,θ)  θ
           s.t. ∇f(xk)T d ≤ θ
                ∇gj(xk)T d ≤ θ,  j ∈ A(xk)
                −1 ≤ di ≤ 1,  i = 1,...,n
• This subproblem is linear and can be solved using the simplex algorithm.
• If θ < 0, we have found a usable feasible direction.
• If θ = 0, it can be shown that the KKT conditions are satisfied.
III.1 Feasible directions method
(a) Initialise: x0 and k = 0
(b) Solve the linear problem (L) to find dk and θ
(c) If θ < 0,
    • Find the maximum step ᾱ such that xk + α dk remains in the feasible domain:
          ᾱ = min{α : gj(xk + α dk) = 0, j = 1,...,ng and α > 0};  if no such α > 0 exists, set ᾱ = ∞
    • Find αk along dk such that
          αk = argmin α∈[0, ᾱ]  f(xk + α dk)
    • xk+1 = xk + αk dk
    • k = k+1, go to (b)
(d) Else if θ ≥ 0, the iteration terminates.
Output: x* = xk, a stationary point.
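Step (b) can be sketched as a small linear program. The snippet below uses scipy.optimize.linprog (a tool choice made here, not prescribed by the slides) at the starting point x = (7, 4) of Problem 3, where the objective is X1 + X2 and the single constraint is inactive; those problem data are taken from this chapter's Problem 3:

```python
import numpy as np
from scipy.optimize import linprog

# Direction-finding subproblem (L) at x = (7, 4) for Problem 3:
# f = X1 + X2 so grad f = (1, 1); g = (X1-5)^2 + (X2-5)^2 - 9 is inactive at x.
grad_f = np.array([1.0, 1.0])
active_grads = []                        # gradients of the active constraints (none here)

n = grad_f.size
c = np.zeros(n + 1)
c[-1] = 1.0                              # unknowns z = (d1, ..., dn, theta); minimise theta
rows = [np.append(grad_f, -1.0)] + [np.append(gg, -1.0) for gg in active_grads]
A_ub = np.asarray(rows)                  # encodes grad^T d - theta <= 0
b_ub = np.zeros(len(rows))
bounds = [(-1, 1)] * n + [(None, None)]  # -1 <= d_i <= 1, theta free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
d, theta = res.x[:n], res.x[-1]
print(d, theta)    # theta < 0: d is a usable feasible direction
```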
III.2 Penalty function methods
• Consider the problem:
      min  f(x)   s.t. x ∈ X,   where X ⊆ Rn
• Idea:
  • Transform the constrained problem into an unconstrained one.
  • Solve a sequence of unconstrained optimisation problems.
• Define a function
      ψ(x) = 0 if x ∈ X,   +∞ if x ∉ X
• Solve an equivalent unconstrained problem:
      min  f(x) + ψ(x)
• The function ψ(x) is not practical to use directly.
• Replace ψ(x) by a sequence of continuous non-negative functions that approach ψ(x).
• Penalty functions ψ:
  - Exterior penalty functions.
  - Interior penalty functions (or barrier functions).
 Exterior penalty functions
• Constrained problem:
      min x∈Rn  f(x)
      s.t. gj(x) ≤ 0,  j = 1...ng
           hl(x) = 0,  l = 1...ne
• Define:
      P(x) = Σj=1..ng [max{0, gj(x)}]² + Σl=1..ne [hl(x)]²
• Unconstrained exterior penalty transformation:
      min x∈Rn  q(x, r) = f(x) + r P(x)
• Solve for an increasing sequence {rk} such that:
      r1 < r2 < ... < rk,   rk → +∞ as k → +∞
• Let
      x(rk) = argmin x∈Rn q(x, rk)
• Ideally,
      {x(rk)} → x*  as  rk → +∞
FigIII.1. : Inequality constrained problem in 1-D (q(x, rk) for increasing rk)
 Exterior penalty functions
(1) Initialise: x0, r0, ε > 0 and k = 0
(2) while  | q(xk, rk) − f(xk) | > ε
    (a) xk+1 = argmin x∈Rn q(x, rk)
    (b) rk+1 = c rk,  c > 1 (for example, c = 10)
    (c) k = k+1
    endwhile
Output: x* = xk
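The loop above can be sketched with scipy.optimize.minimize as the inner unconstrained solver. The example reuses Problem 3 of this chapter (objective X1 + X2, disc constraint of radius 3 centred at (5, 5)); the solver choice and the parameter values r0 = 0.1, c = 10 are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# Exterior penalty on Problem 3: min X1 + X2  s.t. (X1-5)^2 + (X2-5)^2 - 9 <= 0
f = lambda x: x[0] + x[1]
g = lambda x: (x[0] - 5) ** 2 + (x[1] - 5) ** 2 - 9

def q(x, r):                               # q(x, r) = f(x) + r * P(x)
    return f(x) + r * max(0.0, g(x)) ** 2  # P(x) = max{0, g(x)}^2

x = np.array([7.0, 4.0])
r, c, eps = 0.1, 10.0, 1e-8
while True:
    x = minimize(lambda y: q(y, r), x).x   # (a) inner unconstrained minimisation
    if abs(q(x, r) - f(x)) <= eps:         # stop when the penalty term is negligible
        break
    r *= c                                 # (b) increase the penalty parameter
print(x)   # tends to the constrained optimum (5 - 3/sqrt(2), 5 - 3/sqrt(2))
```

The iterates approach the optimum from outside the feasible region, which is why this is called an exterior method.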
 Interior penalty functions
• Typically applicable to inequality constrained problems:
      min x∈Rn  f(x)
      s.t. gj(x) ≤ 0,  j = 1...ng
• Some interior penalty (barrier) functions:
      B(x) = − Σj=1..ng 1/gj(x)    or    B(x) = − Σj=1..ng ln(−gj(x))
• Unconstrained interior penalty transformation:
      min x∈Rn  q(x, r) = f(x) + (1/r) B(x)
• Solve for an increasing sequence {rk}.
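A log-barrier version of the same idea, again on Problem 3's data, can be sketched as follows. This is only an illustration: Nelder-Mead is used because the barrier is +∞ outside the feasible region, and the restart pulled towards the strictly feasible centre is an implementation choice, not part of the slides:

```python
import numpy as np
from scipy.optimize import minimize

# Interior (log-barrier) penalty on Problem 3, starting strictly inside the disc.
f = lambda x: x[0] + x[1]
g = lambda x: (x[0] - 5) ** 2 + (x[1] - 5) ** 2 - 9

def q(x, r):                         # q(x, r) = f(x) + (1/r) * B(x), B(x) = -ln(-g(x))
    if g(x) >= 0:
        return np.inf                # barrier: +infinity outside the feasible region
    return f(x) + (1.0 / r) * (-np.log(-g(x)))

centre = np.array([5.0, 5.0])        # strictly feasible point (g = -9)
x = centre
for r in (10.0, 100.0, 1000.0):
    x0 = 0.5 * (x + centre)          # restart pulled inside so the simplex stays feasible
    x = minimize(lambda y: q(y, r), x0, method="Nelder-Mead").x
print(x)   # approaches (5 - 3/sqrt(2), 5 - 3/sqrt(2)) from inside
```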
FigIII.2. : Inequality constrained problem in 1-D (q(x, rk) for decreasing 1/rk)
III.3 Augmented Lagrangian (multiplier) methods
• Consider the problem (equality constraints):
      min x∈Rn  f(x)
      s.t. hl(x) = 0,  l = 1...ne
• The Lagrangian function:
      L(x, λ) = f(x) + Σl=1..ne λl hl(x)
• The KKT necessary conditions:
      ∇x L = ∇x f + Σl=1..ne λl* ∇x hl = 0,   hl(x) = 0
• Augmented Lagrangian function:
      T(x, λ, r) = f(x) + Σl=1..ne λl hl(x) + r Σl=1..ne [hl(x)]²
• Note that the optimal values of the λl are not known.
• If all λl = 0, the method is identical to the exterior penalty function method.
• If, on the other hand, we have the optimal values λl*, then min T gives the answer for any finite value of r.
• First-order KKT conditions for the augmented Lagrangian function:
      ∇x T = ∇x f + Σl=1..ne (λl + 2 r hl) ∇x hl = 0
• The exact KKT conditions for the considered problem:
      ∇x L = ∇x f + Σl=1..ne λl* ∇x hl = 0
• Comparing the two last equations, we expect that:
      λl + 2 r hl → λl*
• Hestenes (Hestenes, 1969) suggested the following update for λl:
      λl(k+1) = λl(k) + 2 r hl(x(k))
• It is easy to extend the augmented Lagrangian to inequality constraints (g(x) ≤ 0), since we can write:
      g(x) + s² = 0
(1) Input: r, ε > 0
(2) Initialise: x0, λ0 and k = 0
(3) while  | T(xk, λk, r) − f(xk) | > ε
    (a) xk+1 = argmin x∈Rn T(x, λk, r)
    (b) λk+1 = λk + 2 r h(xk+1)
    (c) k = k+1
    endwhile
Output: x* = xk
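The method of multipliers can be sketched on Example 1 of this chapter (min x1 + 2x2 s.t. x1² + x2² = 1), whose exact solution x* = (−1/√5, −2/√5), λ* = √5/2 is known from the KKT analysis; the fixed r = 10, the starting point and the inner solver (scipy.optimize.minimize) are illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize

# Method of multipliers on Example 1: min x1 + 2*x2  s.t.  h(x) = x1^2 + x2^2 - 1 = 0
f = lambda x: x[0] + 2 * x[1]
h = lambda x: x[0] ** 2 + x[1] ** 2 - 1

def T(x, lam, r):                    # augmented Lagrangian T(x, lam, r)
    return f(x) + lam * h(x) + r * h(x) ** 2

x, lam, r = np.array([-0.5, -0.5]), 0.0, 10.0   # start in the basin of the minimum
for _ in range(20):
    x = minimize(lambda y: T(y, lam, r), x).x   # (a) minimise T for fixed lam
    lam += 2 * r * h(x)                         # (b) Hestenes multiplier update
print(x, lam)   # x -> (-1/sqrt(5), -2/sqrt(5)), lam -> sqrt(5)/2
```

Note that, unlike the pure exterior penalty method, r stays finite here; the multiplier update does the work of enforcing h(x) = 0.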
IV. Problems
IV.1 Problem 1
In this exercise we reconsider the optimisation problem of a two-bar truss structure for minimum weight (figure IV.1).
We want to find the transversal sections X1 and X2 such that the structure weight Z is minimised.
The bars must:
- resist the static loads;
- keep the displacement δy of node A below a limit δ0.
L and L/cosα : bar lengths
E : Young’s modulus
X1 and X2 : transversal bar sections
δx, δy : displacements of node A
F : load applied at A
ρ : density
Z : total mass
σ0 and −σ0 : elasticity limits in tension and in compression, respectively.

FigIV.1. : Problem 1
The problem formulation can be written as:
      (P)  min(X1,X2)  Z = ρ ( L X1 + (L/cosα) X2 )
           s.t.  gi(X1, X2) ≤ 0,  i = 1,...,5  (stress and displacement constraints)
We take:
      δ0 = σ0 L / E    (1)
1- From the expression of equation (1), what does the displacement limit δ0 correspond to physically?
2- We consider the following changes of variables:
      Y1 = F / (σ0 X1)   and   Y2 = F / (σ0 X2)
We pose
      c = ρ L F / σ0
Write the new formulation (P’) of the optimisation problem (P) with respect to Y1 and Y2.
3- Show that if the constraints g1, g2 and g3 are satisfied, then the constraints g4 and g5 are satisfied too and can be eliminated (we say that g1, g2 and g3 dominate g4 and g5).
4- Write the KKT necessary conditions for the new problem (P’).
5- Knowing that only the constraint g3 is active at the optimum, solve the new problem (P’) and deduce the optimal section values X1 and X2 as well as the values of the Lagrange multipliers.
IV.2 Problem 2
Consider the following problem:
      (P)  min  f(x1, x2)
           s.t. six inequality constraints g1(x1, x2), ..., g6(x1, x2)
                x1, x2 > 0
where s and h are positive parameters entering f and the gi.
1- Write the KKT necessary conditions for the given problem.
2- Knowing that only the constraints g1 and g5 are active at the optimum, compute the optimum values x1, x2 and the values of the Lagrange multipliers (assuming that h = 1200 and s = 100).
3- Compute the values of the objective and the Lagrangian function at the optimum.
IV.3 Problem 3
Given the problem:
      min(X1,X2)  Z = X1 + X2
      s.t.  g ≡ (X1 − 5)² + (X2 − 5)² − 9 ≤ 0
1- Solve by the exterior penalty function for r1 = 0.1, r2 = 1.0. Show graphically q(X, r) as a function of X = X1 = X2 for these values of r.
2- Solve by the interior penalty function for r1 = 10.0, r2 = 100.0. Show graphically q(X, r) as a function of X = X1 = X2 for these values of r.
3- Solve by the method of feasible directions. Calculate two iteration cycles, starting at the initial point {X1}T = {7, 4}. Show graphically the directions of move in the space of X1 and X2.
Sources
Presentation based on material from:
• Haftka & Gürdal (1994): Elements of Structural Optimization
• van Keulen: TUD Optimization course
• Etman (2006): EM course Engineering Optimization

More Related Content

Similar to Chapter 2-2.pdf

NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learning
zukun
 
Integration techniques
Integration techniquesIntegration techniques
Integration techniques
Krishna Gali
 
slides_low_rank_matrix_optim_farhad
slides_low_rank_matrix_optim_farhadslides_low_rank_matrix_optim_farhad
slides_low_rank_matrix_optim_farhad
Farhad Gholami
 

Similar to Chapter 2-2.pdf (20)

Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
Minimax optimal alternating minimization \\ for kernel nonparametric tensor l...
 
NIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learningNIPS2010: optimization algorithms in machine learning
NIPS2010: optimization algorithms in machine learning
 
Calculus of variations
Calculus of variationsCalculus of variations
Calculus of variations
 
Random Matrix Theory and Machine Learning - Part 3
Random Matrix Theory and Machine Learning - Part 3Random Matrix Theory and Machine Learning - Part 3
Random Matrix Theory and Machine Learning - Part 3
 
Metodo gauss_newton.pdf
Metodo gauss_newton.pdfMetodo gauss_newton.pdf
Metodo gauss_newton.pdf
 
Main
MainMain
Main
 
The Multivariate Gaussian Probability Distribution
The Multivariate Gaussian Probability DistributionThe Multivariate Gaussian Probability Distribution
The Multivariate Gaussian Probability Distribution
 
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
 
ma112011id535
ma112011id535ma112011id535
ma112011id535
 
An Approach For Solving Nonlinear Programming Problems
An Approach For Solving Nonlinear Programming ProblemsAn Approach For Solving Nonlinear Programming Problems
An Approach For Solving Nonlinear Programming Problems
 
Numerical analysis m2 l4slides
Numerical analysis  m2 l4slidesNumerical analysis  m2 l4slides
Numerical analysis m2 l4slides
 
Integration techniques
Integration techniquesIntegration techniques
Integration techniques
 
Subgradient Methods for Huge-Scale Optimization Problems - Юрий Нестеров, Cat...
Subgradient Methods for Huge-Scale Optimization Problems - Юрий Нестеров, Cat...Subgradient Methods for Huge-Scale Optimization Problems - Юрий Нестеров, Cat...
Subgradient Methods for Huge-Scale Optimization Problems - Юрий Нестеров, Cat...
 
slides_low_rank_matrix_optim_farhad
slides_low_rank_matrix_optim_farhadslides_low_rank_matrix_optim_farhad
slides_low_rank_matrix_optim_farhad
 
(α ψ)- Construction with q- function for coupled fixed point
(α   ψ)-  Construction with q- function for coupled fixed point(α   ψ)-  Construction with q- function for coupled fixed point
(α ψ)- Construction with q- function for coupled fixed point
 
Convex Optimization Modelling with CVXOPT
Convex Optimization Modelling with CVXOPTConvex Optimization Modelling with CVXOPT
Convex Optimization Modelling with CVXOPT
 
RT
RTRT
RT
 
Berans qm overview
Berans qm overviewBerans qm overview
Berans qm overview
 
Seismic data processing lecture 3
Seismic data processing lecture 3Seismic data processing lecture 3
Seismic data processing lecture 3
 
Solvability of Fractionl q -Difference Equations of Order 2   3 Involving ...
Solvability of Fractionl q -Difference Equations of Order 2   3 Involving ...Solvability of Fractionl q -Difference Equations of Order 2   3 Involving ...
Solvability of Fractionl q -Difference Equations of Order 2   3 Involving ...
 

Recently uploaded

Call Girls In Bangalore ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Bangalore ☎ 7737669865 🥵 Book Your One night StandCall Girls In Bangalore ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Bangalore ☎ 7737669865 🥵 Book Your One night Stand
amitlee9823
 
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak HamilCara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Kandungan 087776558899
 
AKTU Computer Networks notes --- Unit 3.pdf
AKTU Computer Networks notes ---  Unit 3.pdfAKTU Computer Networks notes ---  Unit 3.pdf
AKTU Computer Networks notes --- Unit 3.pdf
ankushspencer015
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377877756
dollysharma2066
 
Top Rated Call Girls In chittoor 📱 {7001035870} VIP Escorts chittoor
Top Rated Call Girls In chittoor 📱 {7001035870} VIP Escorts chittoorTop Rated Call Girls In chittoor 📱 {7001035870} VIP Escorts chittoor
Top Rated Call Girls In chittoor 📱 {7001035870} VIP Escorts chittoor
dharasingh5698
 

Recently uploaded (20)

chapter 5.pptx: drainage and irrigation engineering
chapter 5.pptx: drainage and irrigation engineeringchapter 5.pptx: drainage and irrigation engineering
chapter 5.pptx: drainage and irrigation engineering
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghly
 
(INDIRA) Call Girl Bhosari Call Now 8617697112 Bhosari Escorts 24x7
(INDIRA) Call Girl Bhosari Call Now 8617697112 Bhosari Escorts 24x7(INDIRA) Call Girl Bhosari Call Now 8617697112 Bhosari Escorts 24x7
(INDIRA) Call Girl Bhosari Call Now 8617697112 Bhosari Escorts 24x7
 
Call Girls In Bangalore ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Bangalore ☎ 7737669865 🥵 Book Your One night StandCall Girls In Bangalore ☎ 7737669865 🥵 Book Your One night Stand
Call Girls In Bangalore ☎ 7737669865 🥵 Book Your One night Stand
 
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak HamilCara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil

Chapter 2-2.pdf

  • 1. R. Zakhama, Structural Optimisation course. Chapter 2: Mathematical Programming. Part 2: Constrained Optimisation.
  • 2. I. Introduction
  - Feasible point: a point that satisfies all the constraints.
  - Feasible region: the set X of values of the design variables that satisfy all the constraints:
    X = {x ∈ Rn / gj(x) ≤ 0, j = 1, ..., ng and hk(x) = 0, k = 1, ..., ne}
  - The problem
    (P): min f(x) s.t. gj(x) ≤ 0, j = 1...ng and hk(x) = 0, k = 1...ne, where x = [x1, x2, ..., xn]T,
    can be written as: min f(x) s.t. x ∈ X.
  - Remarks:
    - Generally, the solution x* is on the boundary of the feasible region (if not, the constraints are useless: ill-posed problem).
    - From an algorithmic point of view, one of the difficulties is that we do not know a priori which constraints will be active at the optimum.
    - The idea is to convert the problem to an unconstrained optimisation.
  • 3. II. Feasible and Descent Directions
  Problem: min f(x) s.t. x ∈ X.
  - Let F(x) = set of feasible directions at x ∈ X.
  - Let D(x) = set of descent directions at x ∈ X.
  Definition: A vector d ∈ Rn, d ≠ 0, is said to be a feasible direction at x ∈ Rn if there exists δ1 > 0 such that x + αd ∈ X for all α ∈ (0, δ1).
  Definition: A vector d ∈ Rn, d ≠ 0, is said to be a descent direction at x ∈ Rn if there exists δ2 > 0 such that f(x + αd) < f(x) for all α ∈ (0, δ2).
  • 4. II. Feasible and Descent Directions
  FigII.1: Feasible directions (constraint boundaries g1 and g2 in the (x1, x2) plane, with feasible directions s drawn at boundary points).
  • 5. II. Feasible and Descent Directions
  Theorem: Let X be a nonempty set in Rn and x* ∈ X be a local minimum of f over X. Then F(x*) ∩ D(x*) = ∅.
  Proof: Let x* ∈ X be a local minimum. By contradiction, assume that there exists a nonzero d ∈ F(x*) ∩ D(x*). Then there exists δ1 > 0 such that x* + αd ∈ X for all α ∈ (0, δ1), and there exists δ2 > 0 such that f(x* + αd) < f(x*) for all α ∈ (0, δ2). Hence there exists x ∈ B(x*, α) ∩ X with f(x) < f(x*) for all α ∈ (0, min(δ1, δ2)), which contradicts the local minimality of x*.
  • 6. II. Feasible and Descent Directions
  - x* ∈ X is a local minimum ⇒ F(x*) ∩ D(x*) = ∅.
  - Consider any x ∈ X and assume f ∈ C2. Let d ∈ Rn be such that ∇f(x)T d < 0. Then f(x + αd) < f(x) for small α ⇒ d is a descent direction ⇒ d ∈ D(x).
  - Let D̃(x) = {d : ∇f(x)T d < 0} ⊆ D(x).
  - x* ∈ X is a local minimum ⇒ F(x*) ∩ D̃(x*) = ∅.
  - If F(x*) = Rn (every direction in Rn is locally feasible), then x* ∈ X is a local minimum ⇒ {d : ∇f(x*)T d < 0} = ∅ ⇒ ∇f(x*) = 0.
  - Can we characterise F(x*) algebraically for a constrained optimisation problem?
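The algebraic test ∇f(x)T d < 0 for a descent direction can be checked numerically. A minimal sketch in Python (the quadratic function, the point and the direction are illustrative choices, not from the course notes):

```python
# Illustration of the characterisation d in D~(x): grad f(x)^T d < 0 means
# that a small step along d decreases f. Here f(x) = x1^2 + 3*x2^2 at x = (1, 1).
def f(x1, x2):
    return x1**2 + 3 * x2**2

def grad_f(x1, x2):
    return (2 * x1, 6 * x2)

x = (1.0, 1.0)
d = (-1.0, 0.0)                                       # candidate direction
slope = sum(g * di for g, di in zip(grad_f(*x), d))   # grad f(x)^T d = -2 < 0
alpha = 1e-3                                          # small step
decrease = f(x[0] + alpha * d[0], x[1] + alpha * d[1]) - f(*x)
print(slope, decrease)                                # slope < 0 and f decreased
```

A direction with a positive slope would instead increase f for small steps, so it cannot belong to D̃(x).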
  • 7. II. Feasible and Descent Directions
  Problem: min f(x) s.t. gj(x) ≤ 0, j = 1...ng.
  - Assume f, gj ∈ C2, j = 1...ng.
  - X = {x ∈ Rn : gj(x) ≤ 0, j = 1...ng}.
  - Active constraints: A(x) = {j : gj(x) = 0}.
  Lemma: For any x ∈ X, F̃(x) = {d : ∇gj(x)T d < 0, j ∈ A(x)} ⊆ F(x).
  - x* ∈ X is a local minimum ⇒ F̃(x*) ∩ D̃(x*) = ∅.
  - This is only a necessary condition for a local minimum.
  - It cannot be directly used for equality constrained problems.
  • 8. II. Optimality conditions — II.1 The KKT necessary conditions (Karush, Kuhn, Tucker): Equality constraints
  - Let us consider the following optimisation problem: (P): min f(x) s.t. hk(x) = 0, k = 1...ne.
  - We introduce the Lagrangian function: L = f(x) + Σk=1..ne λk hk(x), where the λk are the Lagrange multipliers.
  - The necessary condition for the local minimiser is ∇L(x*, λ*) = 0, i.e. ∇xL = 0 and ∇λL = 0, and it must be a feasible point (i.e. the constraints are satisfied).
  - These are the Karush-Kuhn-Tucker conditions.
  • 9. II.1 The KKT necessary conditions: Equality constraints
  - These conditions, however, apply only at a regular point, that is, at a point where the gradients of the constraints are linearly independent.
  KKT necessary conditions (first order): If x* ∈ X is a local minimum and a regular point, then there exists a unique vector λ* = (λ1*, ..., λne*)T such that:
    ∇x f(x*) + Σk=1..ne λk* ∇x hk(x*) = 0   (optimality)
    hk(x*) = 0, ∀ k = 1...ne   (feasibility)
    λk* ∈ R, ∀ k = 1...ne   (multipliers unrestricted in sign)
  • 10. II.1 The KKT necessary conditions: Equality constraints — Example 1
    min x1 + 2x2 s.t. h(x) = x1² + x2² − 1 = 0
    L = x1 + 2x2 + λ(x1² + x2² − 1)
  KKT: ∇L = 0 ⇒
    ∂L/∂x1 = 1 + 2λx1 = 0 ⇒ x1 = −1/(2λ)
    ∂L/∂x2 = 2 + 2λx2 = 0 ⇒ x2 = −1/λ
    h(x) = 0 ⇒ (1/(2λ))² + (1/λ)² = 1 ⇒ λ = ±√5/2
  Two solutions satisfy the KKT conditions, but only one is a minimum: these conditions are necessary but not sufficient to be a local minimum.
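The two KKT candidates of Example 1 can be checked numerically. A small sketch in plain Python (an illustrative check, not part of the original slides) that evaluates both candidates λ = ±√5/2 and keeps the one with the smaller objective:

```python
import math

# Example 1: min x1 + 2*x2  s.t.  h(x) = x1^2 + x2^2 - 1 = 0
# Stationarity gives x1 = -1/(2*lam), x2 = -1/lam; the constraint then
# gives lam = +sqrt(5)/2 or lam = -sqrt(5)/2.
f = lambda x1, x2: x1 + 2 * x2
h = lambda x1, x2: x1**2 + x2**2 - 1

candidates = []
for lam in (math.sqrt(5) / 2, -math.sqrt(5) / 2):
    x1, x2 = -1 / (2 * lam), -1 / lam
    assert abs(h(x1, x2)) < 1e-12          # both candidates are feasible
    candidates.append((f(x1, x2), x1, x2, lam))

# KKT is necessary, not sufficient: one candidate is the minimum (f = -sqrt(5)),
# the other is the maximum (f = +sqrt(5)).
fmin, x1s, x2s, lams = min(candidates)
print(fmin, x1s, x2s, lams)
```

Only λ = +√5/2 yields the minimiser x* = (−1/√5, −2/√5); the other stationary point is the constrained maximum, which is exactly the point the case discussion discards.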
  • 11. II.1 The KKT necessary conditions: Equality constraints — Example 1
  FigII.2: Example 1 (isolines of f and the constraint circle h(x) = 0).
  • 12. II.1 The KKT necessary conditions: Equality constraints — Geometrical interpretation
  - For a single equality constraint there is a simple geometrical interpretation of the Lagrange optimality condition:
    ∂f/∂x + λ ∂h/∂x = 0, meaning ∂f/∂x // ∂h/∂x.
  - Gradients parallel ⇒ tangents parallel ⇒ h tangent to the isolines of f.
  FigII.3: Geometrical interpretation 1 (isolines of f and the curve h in the (x1, x2) plane, with ∇f and ∇h parallel at the optimum).
  • 13. II.1 The KKT necessary conditions: Equality constraints — Geometrical interpretation
  FigII.4: Geometrical interpretation 2 (four configurations of the isolines of f, the curve h and the gradients ∇f and ∇h: maximum, minimum, minimum, no extremum).
  • 14. II.1 The KKT necessary conditions: Inequality constraints
  - Let us consider the following optimisation problem: (P): min f(x) s.t. gj(x) ≤ 0, j = 1...ng, with f, gj ∈ C2.
  - To be able to apply the Lagrange multiplier method, we first transform the inequality constraints into equality constraints by adding slack variables: gj(x) + sj² = 0, where sj is a slack variable, j = 1...ng.
  - We can now form a Lagrangian function: L = f(x) + Σj=1..ng μj (gj(x) + sj²), where the μj are the Lagrange multipliers.
  - The necessary condition for the local minimiser is ∇L(x*, μ*, s*) = 0, i.e. ∇xL = 0, ∇μL = 0, ∇sL = 0, and it must be a feasible point (i.e. the constraints are satisfied).
  - These are the Karush-Kuhn-Tucker conditions.
  • 15. II.1 The KKT necessary conditions: Inequality constraints
  - The KKT conditions can be written as:
    ∂L/∂xi = ∂f/∂xi + Σj=1..ng μj ∂gj/∂xi = 0, i = 1,...,n
    ∂L/∂μj = gj + sj² = 0, j = 1,...,ng
    ∂L/∂sj = 2 μj sj = 0, j = 1,...,ng
  - What is the sign of μ?
    We have ∇x f = −Σj μj ∇x gj.
    Multiplying by dT: dT ∇x f = −Σj μj dT ∇x gj.
    Feasible direction: ∃ d / dT ∇x gj ≤ 0 ⇒ −dT ∇x gj ≥ 0, j = 1,...,ng.
    Descent direction: dT ∇x f ≤ 0.
    Since x* is a local minimum, F̃(x*) ∩ D̃(x*) = ∅ ⇒ μj ≥ 0.
  • 16. II.1 The KKT necessary conditions: Inequality constraints
  KKT necessary conditions (first order): If x* is a local minimum of problem (P) and a regular point, then there exists a unique vector μ* = (μ1*, ..., μng*)T such that:
    ∇x f(x*) + Σj=1..ng μj* ∇x gj(x*) = 0   (optimality)
    gj(x*) + sj*² = 0, ∀ j = 1...ng   (feasibility)
    μj* sj* = 0, ∀ j = 1...ng and μj* ≥ 0, ∀ j = 1...ng   (complementarity)
  • 17. II.1 The KKT necessary conditions: Inequality constraints — Graphical interpretation of the KKT conditions
  FigII.5: Geometrical interpretation 3 (constraints g1 and g2 in the (x1, x2) plane, gradients ∇g1, ∇g2, ∇f, −∇f, −∇g1, −∇g2 and the feasible cone).
  - Feasible direction: ∇gj(x)T d ≤ 0, j = 1...ng.
  - Descent direction: ∇f(x)T d ≤ 0.
  - x* is a local optimum: −∇x f(x*) forms an obtuse angle (> 90°) with each feasible direction (we cannot decrease f without violating the feasible domain).
  - The negative gradient (descent direction) lies in the cone spanned by the positive constraint gradients.
  - No descent direction exists within the cone of feasible directions.
  - −∇x f(x*) is a linear combination, with coefficients μj > 0, of the ∇x gj(x*), for any gj active at x*.
  • 18. II.1 The KKT necessary conditions: Inequality constraints — Graphical interpretation of the KKT necessary conditions
  FigII.6: Geometrical interpretation 4 (three cases, each with isolines f3 > f2 > f1, the boundary g(x) = 0 and the feasible region g(x) ≤ 0):
  - no active constraints at x*;
  - x* is not a minimiser, μ < 0 (−∇g(x*) shown);
  - x* is a minimiser, μ > 0 (−∇g(x*) shown).
  • 19. II.1 The KKT necessary conditions: Equality and inequality constraints
  Problem (P): min f(x) s.t. gj(x) ≤ 0, j = 1...ng and hk(x) = 0, k = 1...ne, where x = [x1, x2, ..., xn]T.
  KKT necessary conditions (first order): If x* is a local minimum of problem (P) and a regular point, then there exists a unique vector λ* = (λ1*, ..., λne*)T and a unique vector μ* = (μ1*, ..., μng*)T such that:
    ∇x f(x*) + Σk=1..ne λk* ∇x hk(x*) + Σj=1..ng μj* ∇x gj(x*) = 0   (optimality)
    hk(x*) = 0, ∀ k = 1,...,ne and gj(x*) + sj*² = 0, ∀ j = 1,...,ng   (feasibility)
    λk* ∈ R, ∀ k = 1,...,ne; μj* sj* = 0 and μj* ≥ 0, ∀ j = 1,...,ng   (complementarity)
  • 20. II.2 KKT sufficient optimality conditions
  KKT sufficient conditions (second order): If there exist x* ∈ X, λ* ∈ Rne, μ* ∈ R+^ng and s* ∈ Rng such that the KKT necessary conditions are satisfied, and for all d ≠ 0 such that
    ∇hk(x*)T d = 0, k = 1,...,ne,
    ∇gj(x*)T d = 0 for j ∈ A(x*) with μj* > 0,
    ∇gj(x*)T d ≤ 0 for j ∈ A(x*) ⊆ {1,...,ng} with μj* = 0,
  we have dT ∇x²L(x*, λ*, μ*, s*) d > 0, then x* is a strict local minimum of (P).
  • 21. II.2 KKT sufficient optimality conditions — Example 2
  Problem:
    min f(x) = x1² + x2
    s.t. g1(x) = x1² + x2² − 9 ≤ 0
         g2(x) = x1 + x2 − 1 ≤ 0
  Write the KKT necessary conditions and find a solution of this problem by discussing the different cases. Check whether that solution is a local minimum.
  • 22. II.2 KKT sufficient optimality conditions — Example 2
  - f, g1, g2 are differentiable.
  - The Lagrangian function can be written as: L = f(x) + μ1 (g1(x) + s1²) + μ2 (g2(x) + s2²).
  - The KKT conditions are:
    (1) ∇xL = 0: [2x1; 1] + μ1 [2x1; 2x2] + μ2 [1; 1] = [0; 0]
    (2) x1² + x2² − 9 + s1² = 0
    (3) x1 + x2 − 1 + s2² = 0
    (4) 2 μ1 s1 = 0
    (5) 2 μ2 s2 = 0
    (6) μ1, μ2 ≥ 0 and s1, s2 ∈ R
  • 23. II.2 KKT sufficient optimality conditions — Example 2
  To solve this optimisation problem, we have to study the different possible cases (this is not a general method to solve an optimisation problem):
  - If μ1 = μ2 = 0: (4) and (5) are satisfied but not (1).
  - If μ1 ≠ 0 and μ2 = 0:
    (1) ⇒ 2x1 + 2μ1x1 = 0; if x1 ≠ 0 then μ1 = −1, which does not satisfy (6) ⇒ x1 = 0.
    (4) ⇒ s1 = 0 since μ1 ≠ 0 ⇒ (2) simplifies to x2² − 9 = 0 ⇒ x2 = ±3.
    If x2 = 3, (1) ⇒ 1 + 6μ1 = 0, which does not satisfy (6).
    If x2 = −3, (1) ⇒ 1 − 6μ1 = 0 ⇒ μ1 = 1/6.
  - The resolution of the KKT necessary conditions gives x1* = 0, x2* = −3, μ1* = 1/6, μ2* = 0.
  - ∇x²L = [[7/3, 0], [0, 1/3]] is positive definite ⇒ x* is a local minimum (KKT sufficient conditions).
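The KKT point found above can be verified numerically. A sketch using NumPy (assumed available; the check itself is not part of the original slides) confirming stationarity, complementarity and the positive definiteness of ∇x²L at x* = (0, −3):

```python
import numpy as np

# Example 2: f = x1^2 + x2, g1 = x1^2 + x2^2 - 9 <= 0, g2 = x1 + x2 - 1 <= 0.
# Candidate from the case analysis: x* = (0, -3), mu* = (1/6, 0).
x = np.array([0.0, -3.0])
mu = np.array([1.0 / 6.0, 0.0])

grad_f = np.array([2 * x[0], 1.0])
grad_g = np.array([[2 * x[0], 2 * x[1]],   # grad g1
                   [1.0, 1.0]])            # grad g2
g = np.array([x[0]**2 + x[1]**2 - 9, x[0] + x[1] - 1])

stationarity = grad_f + grad_g.T @ mu      # should be the zero vector
complementarity = mu * g                   # mu_j * g_j = 0 for each j

# Second-order check: Hessian of the Lagrangian with respect to x
hess_L = np.array([[2 + 2 * mu[0], 0.0],
                   [0.0, 2 * mu[0]]])      # = diag(7/3, 1/3)
eigvals = np.linalg.eigvalsh(hess_L)       # all positive => positive definite
print(stationarity, complementarity, eigvals)
```

Note that g2 is inactive at x* (g2 = −4 < 0), so complementarity forces μ2* = 0, exactly as in the case discussion.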
  • 24. II.2 KKT sufficient optimality conditions — Example 2
  FigII.7: Example 2 (feasible region X and isolines of f in the (x1, x2) plane).
  • 25. III. Constrained optimisation methods — III.1 Feasible directions method
  - The method is intended for problems with general inequality constraints.
  - The iterative solutions move along the boundary.
  - The method relies on determining a search direction and performing a 1-D minimisation along the specified direction: xk+1 = xk + αk dk.
  - We seek a direction that will reduce the objective function without violating the constraints for a finite move along the direction:
    - a direction which reduces the objective function is called a usable direction: ∇f(xk)T dk ≤ 0;
    - a direction which does not cause violation of the active constraints upon a move is a feasible direction: ∇gj(xk)T dk ≤ 0, j ∈ A(xk);
    - a direction satisfying both conditions is a usable feasible direction.
  • 26. III.1 Feasible directions method
  - Zoutendijk's basic idea is, at each iteration, to determine a vector dk that is both a feasible direction and a descent direction. This is accomplished numerically by finding a normalised direction vector dk and a scalar parameter θ < 0 such that:
    (L): min θ
         s.t. ∇f(xk)T dk ≤ θ
              ∇gj(xk)T dk ≤ θ, j ∈ A(xk)
              −1 ≤ di ≤ 1, i = 1,...,n
  - This subproblem is linear and can be solved using the simplex algorithm.
  - If θ < 0, we have found a usable feasible direction.
  - If θ = 0, it can be shown that the KKT conditions are satisfied.
  • 27. III.1 Feasible directions method
  (a) Initialise: x0 and k = 0.
  (b) Solve the linear problem (L) to find dk and θ.
  (c) If θ < 0:
    - Find the maximum step ᾱ such that xk + αdk remains in the feasible domain: ᾱ = min{α : gj(xk + αdk) = 0, j = 1,...,ng, and α > 0}; if no α > 0 exists, set ᾱ = ∞.
    - Find αk along dk such that αk = argmin f(xk + αdk), α ∈ [0, ᾱ].
    - xk+1 = xk + αk dk; k = k + 1; go to (b).
  (d) Else if θ ≥ 0, the iteration terminates. Output: x* = xk, a stationary point.
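The direction-finding subproblem (L) is a small linear program in the variables (d, θ). A sketch of step (b), assuming SciPy's `linprog` is available; the gradient data at the end is an illustrative example, not taken from the course:

```python
import numpy as np
from scipy.optimize import linprog

def usable_feasible_direction(grad_f, active_grads):
    """Solve (L): min theta  s.t.  grad_f^T d <= theta,
    grad_gj^T d <= theta for active j, and -1 <= d_i <= 1."""
    n = len(grad_f)
    # Decision variables z = (d_1, ..., d_n, theta); objective: minimise theta.
    c = np.zeros(n + 1)
    c[-1] = 1.0
    rows = [np.append(grad_f, -1.0)]            # grad_f^T d - theta <= 0
    for gg in active_grads:
        rows.append(np.append(gg, -1.0))        # grad_gj^T d - theta <= 0
    bounds = [(-1, 1)] * n + [(None, None)]     # box on d, theta free
    res = linprog(c, A_ub=np.array(rows), b_ub=np.zeros(len(rows)), bounds=bounds)
    return res.x[:n], res.x[-1]

# Illustrative data: f(x) = x1 + x2 with the single active constraint g = -x1 <= 0.
d, theta = usable_feasible_direction(np.array([1.0, 1.0]), [np.array([-1.0, 0.0])])
print(d, theta)
```

Here θ < 0, so d is a usable feasible direction; θ = 0 would instead indicate that the KKT conditions hold at xk.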
  • 28. III. Constrained optimisation methods — III.2 Penalty function methods
  - Consider the problem: min f(x) s.t. x ∈ X, where X ⊂ Rn.
  - Idea: transform the constrained problem into an unconstrained one, and solve a sequence of unconstrained optimisation problems.
  - Define a function ψ(x) = 0 if x ∈ X, +∞ if x ∉ X.
  - Solve an equivalent unconstrained problem: min f(x) + ψ(x).
  • 29. III.2 Penalty function methods
  - The function ψ(x) is not a practical approach.
  - Replace ψ(x) by a sequence of continuous non-negative functions that approach ψ(x).
  - Penalty functions ψ:
    - exterior penalty functions;
    - interior penalty functions (or barrier functions).
  • 30. III.2 Penalty function methods — Exterior penalty functions
  - Constrained problem: min f(x), x ∈ Rn, s.t. gj(x) ≤ 0, j = 1...ng and hl(x) = 0, l = 1...ne.
  - Define: P(x) = Σj=1..ng max{0, gj(x)}² + Σl=1..ne hl(x)².
  - Unconstrained exterior penalty transformation: min q(x, r) = f(x) + r P(x), x ∈ Rn.
  - Solve for an increasing sequence {rk} such that r1 < r2 < ... < +∞, lim rk = +∞.
  - Let xk = argmin q(x, rk).
  - Ideally, {xk} → x* as {rk} → +∞.
  • 31. III.2 Penalty function methods — Exterior penalty functions
  FigIII.1: Inequality constrained problem in 1-D (curves of q(x, rk) for increasing rk).
  • 32. III.2 Penalty function methods — Exterior penalty functions
  (1) Initialise: x0, r0, ε > 0 and k = 0.
  (2) While |q(xk, rk) − f(xk)| > ε:
    (a) xk+1 = argmin q(x, rk), x ∈ Rn
    (b) rk+1 = c rk with c > 1 (for example c = 10)
    (c) k = k + 1
  Output: x* = xk.
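The loop above can be sketched on a toy one-dimensional problem (illustrative, not from the slides), assuming SciPy's `minimize_scalar` for the inner unconstrained minimisation:

```python
from scipy.optimize import minimize_scalar

# Toy problem: min x^2  s.t.  g(x) = 1 - x <= 0, whose solution is x* = 1.
# Exterior penalty: q(x, r) = f(x) + r * max(0, g(x))^2; the unconstrained
# minimisers x_k = r/(1 + r) approach x* from the infeasible side as r grows.
f = lambda x: x**2
g = lambda x: 1.0 - x
q = lambda x, r: f(x) + r * max(0.0, g(x))**2

r, c = 1.0, 10.0          # r_{k+1} = c * r_k with c > 1
x = 0.0
for _ in range(8):        # r runs through 1, 10, ..., 1e7
    x = minimize_scalar(lambda t: q(t, r)).x
    r *= c
print(x)                  # close to 1, approached from x < 1
```

Each iterate is slightly infeasible (x < 1); feasibility is only reached in the limit, which is characteristic of the exterior approach.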
  • 33. III.2 Penalty function methods — Interior penalty functions
  - Typically applicable to inequality constrained problems: min f(x), x ∈ Rn, s.t. gj(x) ≤ 0, j = 1...ng.
  - Some interior penalty (barrier) functions: B(x) = −Σj=1..ng 1/gj(x) or B(x) = −Σj=1..ng ln(−gj(x)).
  - Unconstrained interior penalty transformation: min q(x, r) = f(x) + (1/r) B(x), x ∈ Rn.
  - Solve for an increasing sequence {rk}.
  • 34. III.2 Penalty function methods — Interior penalty functions
  FigIII.2: Inequality constrained problem in 1-D (curves of q(x, rk) for increasing rk).
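A barrier-function sketch on a toy one-dimensional problem, min x² s.t. g(x) = 1 − x ≤ 0 (illustrative, not from the slides; assumes SciPy's `minimize_scalar`). With B(x) = −1/g(x), the iterates stay strictly feasible and approach the optimum from inside:

```python
from scipy.optimize import minimize_scalar

# Toy problem: min x^2  s.t.  g(x) = 1 - x <= 0, solution x* = 1.
# Interior penalty: q(x, r) = f(x) + (1/r) * B(x) with B(x) = -1/g(x),
# which is finite only in the interior of the feasible domain (here x > 1).
f = lambda x: x**2
g = lambda x: 1.0 - x
q = lambda x, r: f(x) - (1.0 / r) / g(x)   # -1/g > 0 where g < 0 (strictly feasible)

for r in (10.0, 1e2, 1e3, 1e4, 1e5, 1e6):
    # minimise over the feasible interior; 'bounded' keeps the iterate at x > 1
    x = minimize_scalar(lambda t: q(t, r), method='bounded',
                        bounds=(1.0 + 1e-9, 3.0)).x
print(x)                                    # decreases towards 1 from above
```

In contrast with the exterior penalty, every iterate here is feasible, which is why barrier methods need a strictly feasible starting point.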
  • 35. III.3 Augmented Lagrangian (multiplier) methods
  - Consider the problem (equality constraints): min f(x), x ∈ Rn, s.t. hl(x) = 0, l = 1...ne.
  - The Lagrangian function: L(x, λ) = f(x) + Σl=1..ne λl hl(x).
  - The KKT necessary conditions: ∇xL = ∇x f + Σl=1..ne λl* ∇x hl = 0 and hl(x) = 0.
  - Augmented Lagrangian function: T(x, λ, r) = f(x) + Σl=1..ne λl hl(x) + r Σl=1..ne hl(x)².
  - Note that the optimal values of λl are not known.
  - If all λl = 0, then the method is identical to the exterior penalty function method.
  - If, on the other hand, we have the optimal values λl*, then min T is the answer for any finite value of r.
  • 36. III.3 Augmented Lagrangian (multiplier) methods
  - First order KKT conditions for the augmented Lagrangian function: ∇xT = ∇x f + Σl=1..ne (λl + 2r hl) ∇x hl = 0.
  - The exact KKT conditions for the considered problem: ∇xL = ∇x f + Σl=1..ne λl* ∇x hl = 0.
  - Comparing the two last equations, we expect that λl + 2r hl → λl*.
  - Hestenes (Hestenes, 1969) suggested using the following update for λl: λl^(k+1) = λl^k + 2r hl(x^k).
  - It is easy to extend the augmented Lagrangian to the inequality constraints (g(x) ≤ 0), since we can write g(x) + s² = 0.
  • 37. III.3 Augmented Lagrangian (multiplier) methods
  (1) Input: r, ε > 0.
  (2) Initialise: x0 and k = 0.
  (3) While |T(xk, λk, r) − f(xk)| > ε:
    (a) xk+1 = argmin T(x, λk, r), x ∈ Rn
    (b) λk+1 = λk + 2r h(xk)
    (c) k = k + 1
  Output: x* = xk.
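The multiplier iteration can be sketched on a small equality-constrained example (illustrative, not from the slides; assumes SciPy's `minimize` for step (a)):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: min x1 + x2  s.t.  h(x) = x1^2 + x2^2 - 2 = 0,
# whose solution is x* = (-1, -1) with multiplier lam* = 1/2.
f = lambda x: x[0] + x[1]
h = lambda x: x[0]**2 + x[1]**2 - 2.0

def T(x, lam, r):                     # augmented Lagrangian
    return f(x) + lam * h(x) + r * h(x)**2

r, lam = 10.0, 0.0                    # lam = 0 would be a plain exterior penalty
x = np.array([-0.5, -0.5])            # start in the basin of the minimiser
for _ in range(20):
    x = minimize(lambda z: T(z, lam, r), x).x   # (a) inner minimisation
    lam = lam + 2.0 * r * h(x)                  # (b) Hestenes multiplier update
print(x, lam)
```

Because λ is updated from the constraint violation rather than r being driven to infinity, a fixed moderate r suffices — which is the practical advantage over the pure exterior penalty noted on the previous slides.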
  • 38. IV. Problems — IV.1 Problem 1
  In this exercise we reconsider the optimisation problem of a truss structure with two bars for a minimum weight (figure IV.1). We want to find the transversal sections X1 and X2 such that the structure weight Z is minimised. The bars must resist under static loads, and the displacement δy of node A must not exceed a limit δ0.
  L and L/cos α: bar lengths
  E: Young modulus
  X1 and X2: transversal bar sections
  δx, δy: displacements of node A
  F: load applied at A
  ρ: density
  Z: total mass
  σ0 and −σ0: elasticity limits in tension and in compression, respectively.
  FigIV.1: Problem 1 (two-bar truss).
  • 39. IV.1 Problem 1
  The problem formulation can be written as:
    (P): min Z = ρ (L X1 + (L/cos α) X2), over X1, X2,
    s.t. the stress constraints on the bars and the displacement constraint on δy.
  We take δ0 = σ0 L / E. (1)
  • 40. IV.1 Problem 1
  1- From the expression of equation (1), what does the displacement limit δ0 correspond to physically?
  2- We consider the following variable replacements: Y1 = 2F/(σ0 X1) and Y2 = 3F/(σ0 X2). We pose c = ρLF/σ0. Write the new formulation (P') of the optimisation problem (P) with respect to Y1 and Y2.
  3- Demonstrate that if the constraints g1, g2 and g3 are satisfied, then the constraints g4 and g5 are satisfied too and can be eliminated (we say g1, g2 and g3 dominate g4 and g5).
  4- Write the necessary KKT conditions for the new problem (P').
  5- Knowing that only the constraint g3 is active at the optimum, solve the new problem (P') and deduce the optimal section values X1 and X2 as well as the Lagrange multiplier values.
  • 41. IV.2 Problem 2
  Let us consider the following problem, where s and h are positive values:
    min f(x)
    s.t. g1(x) ≥ 0, g2(x) ≤ 0, g3(x) ≤ 0, g4(x) ≥ 0, g5(x) ≤ 0, g6(x) ≤ 0, x1 > 0, x2 > 0.
  1- Write the KKT necessary conditions for the given problem.
  2- Knowing that only the constraints g1 and g5 are active at the optimum, compute the optimum values x1, x2 and the Lagrange multiplier values (assuming that h = 1200 and s = 100).
  3- Compute the values of the objective and the Lagrangian function at the optimum.
  • 42. IV.3 Problem 3
  Given the problem:
    min Z = X1 + X2, over X1, X2,
    s.t. g ≡ (X1 − 5)² + (X2 − 5)² − 9 ≤ 0
  1- Solve by the exterior penalty function for r1 = 0.1, r2 = 1.0. Show graphically q(X, r) as a function of X = X1 = X2 for these values of r.
  2- Solve by the interior penalty function for r1 = 10.0, r2 = 100.0. Show graphically q(X, r) as a function of X = X1 = X2 for these values of r.
  3- Solve by the method of feasible directions. Calculate two iteration cycles, starting at the initial point {X1}T = {7, 4}. Show graphically the directions of the moves in the space of X1 and X2.
  • 43. Sources
  Presentation based on material from:
  - Haftka & Gürdal (1994): Elements of Structural Optimization
  - van Keulen: TUD Optimization course
  - Etman (2006): EM course Engineering Optimization