1. Introduction to the Theory of Optimization
Dimitrios Papadopoulos
Delta Pi Systems
Thessaloniki, Greece
2. Optimization Tree
Optimization
  Continuous
    Unconstrained: Nonlinear Equations, Nonlinear Least Squares, Global Optimization, Nondifferentiable Optimization
    Constrained: Linear Programming, Nonlinearly Constrained, Bound Constrained
  Discrete: Integer Programming, Network Programming
  Stochastic Programming
3. Taylor’s Theorem
◮ Suppose that f : R^n → R is continuously differentiable and that p ∈ R^n.
Then we have that
  f(x + p) = f(x) + ∇f(x + tp)^T p,                               (1)
for some t ∈ (0, 1). Moreover, if f is twice continuously differentiable,
we have that
  ∇f(x + p) = ∇f(x) + ∫₀¹ ∇²f(x + tp) p dt,                       (2)
and that
  f(x + p) = f(x) + ∇f(x)^T p + ½ p^T ∇²f(x + tp) p,              (3)
for some t ∈ (0, 1).
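The second-order expansion (3) can be sanity-checked numerically: evaluating the quadratic model with the Hessian at x (rather than at x + tp) already matches f(x + p) up to O(||p||³). A minimal sketch, assuming NumPy is available; the function f below is a made-up example, not from the slides:

```python
import numpy as np

# Made-up smooth test function: f(x) = exp(x0) + x1^2
def f(x):
    return np.exp(x[0]) + x[1] ** 2

def grad_f(x):
    return np.array([np.exp(x[0]), 2 * x[1]])

def hess_f(x):
    return np.array([[np.exp(x[0]), 0.0], [0.0, 2.0]])

x = np.array([0.5, -1.0])
p = np.array([1e-3, 2e-3])

# Quadratic Taylor model of f around x, evaluated for the step p
model = f(x) + grad_f(x) @ p + 0.5 * p @ hess_f(x) @ p
error = abs(f(x + p) - model)
print(error)  # of order ||p||^3, far smaller than ||p||^2
```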
4. Positive Definiteness
◮ A symmetric matrix A ∈ R^{n×n} is positive definite if there is a positive
constant α such that
  x^T A x ≥ α ||x||², for all x ∈ R^n.                            (4)
◮ A symmetric matrix A ∈ R^{n×n} is positive semidefinite if
  x^T A x ≥ 0, for all x ∈ R^n.                                   (5)
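For a symmetric matrix, condition (4) is equivalent to all eigenvalues being positive, and the largest admissible constant α is the smallest eigenvalue. A small NumPy sketch (the matrices are made-up examples):

```python
import numpy as np

# Symmetric A is positive definite iff its smallest eigenvalue is > 0;
# that eigenvalue is the largest alpha satisfying (4).
def is_positive_definite(A):
    return bool(np.linalg.eigvalsh(A).min() > 0)

A = np.array([[2.0, -1.0], [-1.0, 2.0]])  # eigenvalues 1 and 3
B = np.array([[1.0, 2.0], [2.0, 1.0]])    # eigenvalues -1 and 3: indefinite
print(is_positive_definite(A))  # True
print(is_positive_definite(B))  # False
```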
5. Convexity
◮ S ⊆ R^n is a convex set if the straight line segment connecting any two
points in S lies entirely inside S. Formally, for any two points x ∈ S and
y ∈ S, we have
  αx + (1 − α)y ∈ S, for all α ∈ [0, 1].                          (6)
◮ f is a convex function if its domain is a convex set and if for any two
points x and y in this domain, the graph of f lies below the straight line
connecting (x, f(x)) to (y, f(y)) in the space R^{n+1}. That is, we have
  f(αx + (1 − α)y) ≤ αf(x) + (1 − α)f(y), for all α ∈ [0, 1].     (7)
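Inequality (7) can be spot-checked numerically for a known convex function. A sketch using f(x) = ||x||₂², a standard convex example (not from the slides), with a small tolerance for floating-point rounding:

```python
import numpy as np

# Spot-check inequality (7) for the convex function f(x) = ||x||_2^2
# at randomly sampled point pairs and interpolation weights.
rng = np.random.default_rng(0)
f = lambda x: float(x @ x)

ok = True
for _ in range(1000):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    a = rng.uniform()
    ok &= f(a * x + (1 - a) * y) <= a * f(x) + (1 - a) * f(y) + 1e-12
print(ok)  # True: the chord lies above the graph
```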
6. Local and Global Minima
◮ A point x∗ is a global minimizer if f(x∗) ≤ f(x) for all x ∈ R^n.
◮ A point x∗ is a local minimizer if there is a neighborhood N of x∗ such that
f(x∗) ≤ f(x) for all x ∈ N.
◮ A point x∗ is a strict local minimizer (also called a strong local minimizer)
if there is a neighborhood N of x∗ such that f(x∗) < f(x) for all x ∈ N
with x ≠ x∗.
◮ A point x∗ is an isolated local minimizer if there is a neighborhood N of x∗
such that x∗ is the only local minimizer in N.
7. First-Order Necessary Conditions
Theorem
If x∗ is a local minimizer and f is continuously differentiable in an open
neighborhood of x∗, then ∇f(x∗) = 0.
◮ x∗ is a stationary point if ∇f(x∗) = 0.
◮ Any local minimizer must be a stationary point.
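The condition is necessary but not sufficient: a stationary point need not be a minimizer. A sketch with the classic saddle example f(x, y) = x² − y² (not from the slides):

```python
import numpy as np

# For f(x, y) = x^2 - y^2, the gradient (2x, -2y) vanishes at the origin,
# yet the origin is a saddle point, not a local minimizer.
def grad(v):
    x, y = v
    return np.array([2 * x, -2 * y])

f = lambda v: v[0] ** 2 - v[1] ** 2
origin = np.array([0.0, 0.0])

print(np.allclose(grad(origin), 0.0))          # True: stationary point
print(f(np.array([0.0, 0.1])) < f(origin))     # True: f decreases along the y-axis
```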
8. Second-Order Necessary Conditions
Theorem
If x∗ is a local minimizer of f and ∇2 f is continuous in an open neighborhood
of x∗ , then ∇f (x∗ ) = 0 and ∇2 f (x∗ ) is positive semidefinite.
9. Second-Order Sufficient Conditions
Theorem
Suppose that ∇2 f is continuous in an open neighborhood of x∗ and that
∇f (x∗ ) = 0 and ∇2 f (x∗ ) is positive definite. Then x∗ is a strict local
minimizer of f .
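The two sufficient conditions can be verified computationally at a candidate point. A sketch for the made-up function f(x) = x0² + x0 x1 + x1², whose only stationary point is the origin:

```python
import numpy as np

# Certify x* = 0 as a strict local minimizer of f(x) = x0^2 + x0*x1 + x1^2:
# check that the gradient vanishes and the Hessian is positive definite.
grad_at_xstar = np.array([0.0, 0.0])        # grad f = (2x0 + x1, x0 + 2x1), zero at origin
H = np.array([[2.0, 1.0], [1.0, 2.0]])      # constant Hessian of f, eigenvalues 1 and 3

stationary = np.allclose(grad_at_xstar, 0.0)
pos_def = np.linalg.eigvalsh(H).min() > 0
print(stationary and pos_def)  # True: second-order sufficient conditions hold
```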
10. Uniqueness of minimum for convex functions
Theorem
When f is convex, any local minimizer x∗ is a global minimizer of f . If in
addition f is differentiable, then any stationary point x∗ is a global minimizer of f.
11. Linear Least-Squares
◮ Model: y(ti; x1, …, xn) = ai,1 x1 + … + ai,n xn, i = 1, …, m.
◮ Minimization problem:
  Σ_{i=1}^m (y(ti; x1, …, xn) − bi)² = Σ_{i=1}^m (ai,1 x1 + … + ai,n xn − bi)² → min,   (8)
or in matrix form:
◮ Given A = (aij) ∈ R^{m×n} and b ∈ R^m, find x∗ ∈ R^n such that
  ||Ax∗ − b||₂ = min_{x ∈ R^n} ||Ax − b||₂.                       (9)
◮ x∗ ∈ R^n solves (9) if and only if x∗ solves the system of normal equations
  A^T A x = A^T b.                                                (10)
The system of normal equations always has at least one solution. It has exactly
one if rank(A) = n.
◮ If A ∈ R^{m×n} has full column rank n, then the matrix A^T A ∈ R^{n×n} is
symmetric positive definite.
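The equivalence of (9) and (10) can be checked numerically: solving the normal equations and using a library least-squares solver give the same minimizer when A has full column rank. A sketch on a small random system (the data are made up):

```python
import numpy as np

# Solve the least-squares problem (9) two ways and compare.
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # tall matrix, full column rank with probability 1
b = rng.standard_normal(8)

# Normal equations (10): A^T A x = A^T b; A^T A is symmetric positive definite here.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Library solver (QR/SVD based, numerically preferable when A is ill-conditioned)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_normal, x_lstsq))  # True
```

Forming AᵀA squares the condition number of A, which is why production solvers favor QR or SVD factorizations over the normal equations.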
12. Contact us
Delta Pi Systems
Optimization and Control of Processes and Systems
Thessaloniki, Greece
http://www.delta-pi-systems.eu