Non Linear Optimization: Finance Applications
Thilanka Munasinghe
Department of Mathematics
West Virginia University
February 24, 2015
Thilanka Munasinghe (WVU) Optimization Methods in Finance 1 / 27
Overview
1 Volatility estimation and ARCH and GARCH models
2 Line Search Methods
3 Newton’s Method
4 Steepest Descent Method
5 Golden Section Search Method
6 Conjugate Gradient Method
Volatility in Finance
Volatility
Volatility is a term used to describe how much the security prices,
market indices, interest rates, etc., move up and down around their
mean.
Volatility is measured by the standard deviation of the random variable
that represents the financial quantity we are interested in.
Most investors prefer low volatility to high volatility and therefore expect
to be rewarded with higher long-term returns for holding higher
volatility securities.
Volatility in Finance
Volatility estimations
Many financial computations require volatility estimates.
Most volatility estimation techniques can be classified as either a
historical or an implied method.
One either uses historical time series to infer patterns and estimate the
volatility using a statistical technique
or considers the known prices of related securities such as options that
may reveal the market sentiment on the volatility of the security in
question.
GARCH models exemplify the first approach, while the implied
volatilities calculated from the Black, Scholes and Merton (BSM)
formulas are the best known examples of the second approach.
ARCH and GARCH models
The basics
Let Y be a stochastic process indexed by the natural numbers.
Its value at time t, Y_t, is an n-dimensional vector of random variables.
The behavior of these random variables is modeled as
Y_t = Σ_{i=1}^{m} Φ_i · Y_{t−i} + ε_t.
Here m is a positive integer representing the number of periods we
look back in our model, and ε_t satisfies
E[ε_t | ε_1, ..., ε_{t−1}] = 0.
ARCH and GARCH models
The model for the univariate case
We set:
h_t = E[ε_t² | ε_1, ..., ε_{t−1}].
Then we model the conditional time dependence as follows:
h_t = c + Σ_{i=1}^{q} α_i · ε_{t−i}² + Σ_{j=1}^{p} β_j · h_{t−j}.
This model is called GARCH(p,q).
The ARCH model corresponds to choosing p = 0.
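To make the recursion concrete, here is a minimal simulation sketch (not from the slides): it draws residuals ε_t from a GARCH(1,1) process, with the function name `simulate_garch11` and the choice to start h_1 at the unconditional variance being my own assumptions.

```python
import math
import random

def simulate_garch11(c, alpha, beta, T, seed=0):
    """Simulate T residuals eps_t from a GARCH(1,1):
    h_t = c + alpha * eps_{t-1}**2 + beta * h_{t-1},  eps_t = sqrt(h_t) * z_t,
    with z_t standard normal. h_1 is started at the unconditional variance
    c / (1 - alpha - beta), which assumes the stationarity condition
    alpha + beta < 1 (see the theorem below on slide 9)."""
    rng = random.Random(seed)
    h = c / (1.0 - alpha - beta)
    eps = []
    for _ in range(T):
        e = math.sqrt(h) * rng.gauss(0.0, 1.0)
        eps.append(e)
        h = c + alpha * e * e + beta * h  # conditional variance recursion
    return eps
```

With c = 0.1, α = 0.1, β = 0.8 the unconditional variance is c/(1 − α − β) = 1, so a long simulated series should have sample variance near 1.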
ARCH and GARCH models
The optimization problem in the univariate case
After choosing p and q, the objective is to determine the optimal
parameters Φi, αi, βj.
Usually, this is achieved via maximum likelihood estimation.
If we assume that Yt has a normal distribution conditional on the
historical observations, the log-likelihood function can be written as
follows:
−(T/2) · ln(2π) − (1/2) · Σ_{t=1}^{T} ln h_t − (1/2) · Σ_{t=1}^{T} ε_t² / h_t.
ARCH and GARCH models
The optimization problem in the univariate case
max −(T/2) · ln(2π) − (1/2) · Σ_{t=1}^{T} ln h_t − (1/2) · Σ_{t=1}^{T} ε_t² / h_t
subject to
Y_t = Σ_{i=1}^{m} Φ_i · Y_{t−i} + ε_t
E[ε_t | ε_1, ..., ε_{t−1}] = 0
h_t = E[ε_t² | ε_1, ..., ε_{t−1}] ≥ 0, for all t.
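The objective above can be evaluated directly for given residuals. The sketch below assumes a GARCH(1,1) and initializes h_1 at the sample variance of the residuals; both the function name and that initialization are my choices, not from the slides.

```python
import math

def garch11_loglik(params, eps):
    """Gaussian log-likelihood of residuals eps under a GARCH(1,1) model.
    params = (c, alpha, beta); h_1 is initialized to the sample variance of eps."""
    c, alpha, beta = params
    T = len(eps)
    h = sum(e * e for e in eps) / T  # initial conditional variance h_1
    ll = 0.0
    for t in range(T):
        # per-observation term: -(1/2) * (ln(2*pi) + ln h_t + eps_t**2 / h_t)
        ll -= 0.5 * (math.log(2.0 * math.pi) + math.log(h) + eps[t] ** 2 / h)
        h = c + alpha * eps[t] ** 2 + beta * h  # advance to h_{t+1}
    return ll
```

In practice this function would be handed to a numerical optimizer to maximize over (c, α, β) subject to the constraints above.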
ARCH and GARCH models
Stationarity properties
An important issue in GARCH parameter estimation is the
stationarity of the resulting model.
It is unclear whether one can assume that the model parameters
for financial time series are stationary over time.
However, forecasting and estimation are easier on stationary
models.
Theorem
If the α_i's, the β_j's, and the scalar c are strictly positive and
Σ_{i=1}^{q} α_i + Σ_{j=1}^{p} β_j < 1,
then the univariate GARCH model is stationary.
Black-Scholes-Merton (BSM) equations
BSM model for European option pricing
dS_t / S_t = μ · dt + σ · dW_t,
where
S_t - the underlying security price at time t,
μ - the drift,
σ - the constant volatility,
W_t - a standard Wiener process (Brownian motion).
Black-Scholes-Merton (BSM) equations
The solutions
C(K, T) = S_0 · Φ(d_1) − K · e^{−r·T} · Φ(d_2),
P(K, T) = K · e^{−r·T} · Φ(−d_2) − S_0 · Φ(−d_1),
where
d_1 = [log(S_0/K) + (r + σ²/2) · T] / (σ · √T),
d_2 = d_1 − σ · √T,
Φ(·) - the cumulative distribution function of the standard normal
distribution,
r - the continuously compounded risk-free interest rate (taken as a
constant observable in US markets).
Black-Scholes-Merton (BSM) equations
Implied volatility
In order to determine the price, we just need the value of, or a close
approximation to, σ.
Conversely, given the market price of a particular European call or
put, one can determine the volatility implied by that price, called the
implied volatility, by solving these equations for the unknown σ.
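These equations can be inverted numerically. The sketch below recovers the implied volatility of a call by bisection on σ (my choice of root-finder; practitioners often use Newton's method with vega instead), using math.erf for the normal CDF.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bsm_call(S0, K, T, r, sigma):
    """BSM European call price C(K, T)."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S0, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    """Bisection on sigma; the call price is increasing in sigma."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bsm_call(S0, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Pricing a call at σ = 0.2 and feeding the price back into `implied_vol` should recover σ ≈ 0.2.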
Linear vs. Nonlinear
Definition of linear and nonlinear functions
f(x) = x1 + 2·x2 − 3.4·x3 is linear in x = [x1, x2, x3]^T
f(x) = x1·x2 + 2·x2 − 3.4·x3 is nonlinear in x
f(x) = cos(x1) + 2·x2 − 3.4·x3 is nonlinear in x
Existence and Uniqueness of an Optimal Solution
Existence and uniqueness of the solution
Usually we cannot guarantee that we have found the global optimum.
Multiple solutions may exist.
For unconstrained problems, at the minimum, ∇f(x∗) = 0.
Calculus: at a minimum, the second derivative is greater than zero.
Vector case: at a minimum, the Hessian is positive definite.
Remark Necessary and sufficient conditions for a minimum of an
unconstrained problem:
The gradient must equal zero, ∇f(x∗) = 0.
The Hessian must be positive definite.
Line Search Methods
The Formulation of the Line Search Method
Consider an unconstrained optimization problem,
min f(x)
x ∈ Rn
Assume the function f(x) is smooth and continuous.
The objective is to find a minimum of f(x).
Optimization Algorithm
Initial point
The optimization algorithm starts from an initial point x0 and performs
a series of iterations.
Optimal Point
The goal is to find the "optimal point" x∗.
Iteration Equation / Iteration Scheme
x_{k+1} = x_k + α_k · d_k
d_k = "search direction"
α_k = "step length"
The step length α_k determines how far to go along the search direction d_k.
Constraints
Definition
The function value at the new point, f(x_{k+1}), should be less than or
equal to the function value at the previous point, f(x_k).
Constraint
f(x_{k+1}) ≤ f(x_k)
f(x_k + α_k · d_k) ≤ f(x_k)
Remark The optimization algorithm:
starts with an initial point x0,
finds a descent "search direction" d_k,
determines the "step size" α_k,
checks the iteration criteria,
checks the stopping conditions,
outputs the optimal point x∗.
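The descent condition above can be sketched in code. This toy one-dimensional version (function and parameter names are mine, not from the slides) simply shrinks the step length until the slide's condition f(x_k + α_k·d_k) ≤ f(x_k) holds; practical line searches use the stronger Armijo sufficient-decrease condition.

```python
def backtracking_step(f, x, d, alpha0=1.0, shrink=0.5, max_halvings=50):
    """Shrink the step length alpha until f(x + alpha * d) <= f(x) holds
    (the acceptance condition from the slide). One-dimensional toy version."""
    fx = f(x)
    alpha = alpha0
    for _ in range(max_halvings):
        if f(x + alpha * d) <= fx:
            return alpha
        alpha *= shrink
    return 0.0  # no acceptable step found (d may not be a descent direction)
```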
Newton’s Method
Definition
Newton's method (also known as the Newton–Raphson method)
is used to find successively better approximations to the roots (or
zeroes) of a real-valued function defined on the interval [a, b],
f(x) = 0,
where f(x) is continuous and differentiable.
Definition
Given a function f(x) defined over x ∈ R with first derivative f'(x),
the calculation begins with an initial guess x0 for a root of the
function f(x). A new, considerably better approximation x1
is obtained by
x1 = x0 − f(x0) / f'(x0).
Newton’s Method
Iterative Scheme
As the iterative process repeats, the current approximation x_n is used
to derive a new, better approximation x_{n+1}:
x_{n+1} = x_n − f(x_n) / f'(x_n)
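The iteration can be sketched as a short helper (names and the stopping rule |f(x_n)| < tol are my own choices):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Newton-Raphson iteration x_{n+1} = x_n - f(x_n) / f'(x_n),
    stopping once |f(x_n)| falls below tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x
```

For example, the root of f(x) = x² − 2 starting from x0 = 1 converges to √2 in a handful of iterations.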
Newton’s Method
Tangent Line
Suppose we have some current approximation x_n. Then we can derive
the formula for a better approximation, x_{n+1}. The equation of the
tangent line to the curve y = f(x) at the point x = x_n is
y = f'(x_n)(x − x_n) + f(x_n).
Definition
The x-intercept of this line (the value of x such that y = 0) is then used
as the next approximation to the root, x_{n+1}. In other words, setting
y = 0 and x = x_{n+1} gives
0 = f'(x_n)(x_{n+1} − x_n) + f(x_n).
Steepest Descent Method
Steepest Descent Algorithm for Unconstrained Optimization
Consider an unconstrained optimization problem,
min f(x)
x ∈ Rn
Assume the function f(x) is smooth, continuous, and differentiable.
If x = x̄ is a given point, f(x) can be approximated by its linear
expansion
f(x̄ + d) ≈ f(x̄) + ∇f(x̄)^T d.
If d is small, the approximation above is good.
We choose the value of d so that the inner product ∇f(x̄)^T d is as
small as possible.
Steepest Descent Method
Gradient vector
Let us normalize d so that ‖d‖ = 1.
Then, among all directions d with norm ‖d‖ = 1,
the direction
d̃ = −∇f(x̄) / ‖∇f(x̄)‖
makes the smallest inner product with the gradient ∇f(x̄).
This fact follows from the following inequalities: for any d with ‖d‖ = 1,
∇f(x̄)^T d ≥ −‖∇f(x̄)‖ · ‖d‖ = ∇f(x̄)^T (−∇f(x̄) / ‖∇f(x̄)‖) = ∇f(x̄)^T d̃.
Direction of the Steepest Descent
For this reason, the steepest descent direction is
d̄ = −∇f(x̄).
Steepest Descent Algorithm
Steps of the SD Algorithm
Steps are
Step 1. Initialize x0 and the machine accuracy ε; set k = 0.
Step 2. Compute d_k = −∇f(x_k). If d_k = 0, then stop.
Step 3. Solve min_α f(x_k + α · d_k) for the step size α_k.
Step 4. Set x_{k+1} ← x_k + α_k · d_k, k ← k + 1; go to Step 2.
Note from Step 3 that d_k = −∇f(x_k) is a descent direction, so it
satisfies the condition
f(x_k + α_k · d_k) ≤ f(x_k).
Example of Steepest Descent Method
Using SD Method to minimize f(x)
Find the first iterate using the Steepest Descent Method:
min f(x) = 0.5·x1² + 2.5·x2²
The gradient is given by ∇f(x) = [x1, 5·x2]^T.
Taking x_0 = [5, 1]^T, we have ∇f(x_0) = [5, 5]^T.
Performing a line search along the negative gradient direction,
min_{α_0} f(x_0 − α_0 · ∇f(x_0)),
the exact minimum along the line is given by α_0 = 1/3,
so the next approximation is
x_0 − (1/3) · [5, 5]^T = [10/3, −2/3]^T.
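The example can be checked in code. This sketch hard-codes the quadratic f(x) = 0.5·x1² + 2.5·x2² and uses the closed-form exact line search α = (gᵀg)/(gᵀHg) with H = diag(1, 5), a standard identity for quadratics that is not stated on the slide; the function name is mine.

```python
def steepest_descent_quadratic(x0, iters):
    """Steepest descent on f(x) = 0.5*x1**2 + 2.5*x2**2 with exact line search.
    Gradient g = (x1, 5*x2); for a quadratic with Hessian H = diag(1, 5) the
    exact step along -g is alpha = (g.g) / (g.Hg)."""
    x1, x2 = x0
    for _ in range(iters):
        g1, g2 = x1, 5.0 * x2            # gradient at the current point
        gg = g1 * g1 + g2 * g2
        if gg == 0.0:
            break                        # gradient vanished: at the minimizer
        gHg = g1 * g1 + 5.0 * g2 * g2    # g^T H g
        alpha = gg / gHg                 # exact minimizer of f(x - alpha*g)
        x1, x2 = x1 - alpha * g1, x2 - alpha * g2
    return x1, x2
```

One iteration from (5, 1) reproduces the slide's α_0 = 1/3 and the iterate (10/3, −2/3); further iterations zigzag toward the minimizer at the origin.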
Golden Section Search Method
Definition
Suppose f(x) is unimodal on [a, b].
Let x1 and x2 be two points within [a, b], where x1 < x2.
Evaluating and comparing the values of f(x1) and f(x2), we can
discard either (x2, b] or [a, x1), with the minimum known to lie in the
remaining subinterval.
In order to repeat the process, we need to compute only one new
function evaluation per iteration.
To reduce the length of the interval by a fixed fraction at each iteration,
each new pair of points must have the same relationship with respect
to the new interval that the previous pair had with respect to the
previous interval.
Golden Section Search Algorithm
Consider τ = (√5 − 1)/2.
x1 = a + (1 − τ)(b − a); f1 = f(x1)
x2 = a + τ(b − a); f2 = f(x2)
while (b − a) > tol do
    if f1 > f2 then
        a = x1
        x1 = x2; f1 = f2
        x2 = a + τ(b − a); f2 = f(x2)
    else
        b = x2
        x2 = x1; f2 = f1
        x1 = a + (1 − τ)(b − a); f1 = f(x1)
    end if
end while
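A direct Python transcription of this pseudocode (the function name and the choice to return the midpoint of the final interval are mine):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden section search for the minimum of a unimodal f on [a, b]."""
    tau = (math.sqrt(5.0) - 1.0) / 2.0
    x1 = a + (1.0 - tau) * (b - a); f1 = f(x1)
    x2 = a + tau * (b - a);         f2 = f(x2)
    while (b - a) > tol:
        if f1 > f2:                 # minimum lies in [x1, b]
            a = x1
            x1, f1 = x2, f2
            x2 = a + tau * (b - a); f2 = f(x2)
        else:                       # minimum lies in [a, x2]
            b = x2
            x2, f2 = x1, f1
            x1 = a + (1.0 - tau) * (b - a); f1 = f(x1)
    return 0.5 * (a + b)
```

Each pass needs only one new function evaluation, and the interval shrinks by the factor τ ≈ 0.618 per iteration.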
Conjugate Gradient Method
Definition
Another method that does not require explicit second derivatives,
and does not even store an approximation to the Hessian matrix, is
the Conjugate Gradient (CG) method.
CG generates a sequence of conjugate search directions, implicitly
accumulating information about the Hessian matrix.
For a quadratic objective function, CG is theoretically exact after at
most n iterations, where n is the dimension of the problem.
CG is effective for general unconstrained minimization.
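To make the n-step property concrete, here is a pure-Python linear CG sketch for minimizing the quadratic f(x) = ½·xᵀAx − bᵀx (equivalently, solving Ax = b with A symmetric positive definite); the function names and list-of-lists representation are my own choices.

```python
def conjugate_gradient(A, b, x0, tol=1e-12):
    """Linear conjugate gradient for A x = b with A symmetric positive
    definite, i.e. minimization of f(x) = 0.5 * x^T A x - b^T x.
    Exact (in exact arithmetic) after at most n iterations."""
    n = len(b)

    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

    x = list(x0)
    Ax = matvec(A, x)
    r = [b[i] - Ax[i] for i in range(n)]      # residual = -gradient of f
    d = list(r)                               # first search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(n):
        if rs < tol:
            break
        Ad = matvec(A, d)
        alpha = rs / sum(d[i] * Ad[i] for i in range(n))     # exact line search
        x = [x[i] + alpha * d[i] for i in range(n)]
        r = [r[i] - alpha * Ad[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        d = [r[i] + (rs_new / rs) * d[i] for i in range(n)]  # next conjugate direction
        rs = rs_new
    return x
```

For a 2×2 system the loop terminates after at most two iterations, illustrating the "exact after at most n iterations" property on quadratics.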
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,noida100girls
 
Investment in The Coconut Industry by Nancy Cheruiyot
Investment in The Coconut Industry by Nancy CheruiyotInvestment in The Coconut Industry by Nancy Cheruiyot
Investment in The Coconut Industry by Nancy Cheruiyotictsugar
 
Call US-88OO1O2216 Call Girls In Mahipalpur Female Escort Service
Call US-88OO1O2216 Call Girls In Mahipalpur Female Escort ServiceCall US-88OO1O2216 Call Girls In Mahipalpur Female Escort Service
Call US-88OO1O2216 Call Girls In Mahipalpur Female Escort Servicecallgirls2057
 
Future Of Sample Report 2024 | Redacted Version
Future Of Sample Report 2024 | Redacted VersionFuture Of Sample Report 2024 | Redacted Version
Future Of Sample Report 2024 | Redacted VersionMintel Group
 
MAHA Global and IPR: Do Actions Speak Louder Than Words?
MAHA Global and IPR: Do Actions Speak Louder Than Words?MAHA Global and IPR: Do Actions Speak Louder Than Words?
MAHA Global and IPR: Do Actions Speak Louder Than Words?Olivia Kresic
 
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdf
NewBase  19 April  2024  Energy News issue - 1717 by Khaled Al Awadi.pdfNewBase  19 April  2024  Energy News issue - 1717 by Khaled Al Awadi.pdf
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdfKhaled Al Awadi
 
Case study on tata clothing brand zudio in detail
Case study on tata clothing brand zudio in detailCase study on tata clothing brand zudio in detail
Case study on tata clothing brand zudio in detailAriel592675
 
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...lizamodels9
 
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In.../:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...lizamodels9
 
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607dollysharma2066
 
Intro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdfIntro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdfpollardmorgan
 
FULL ENJOY Call girls in Paharganj Delhi | 8377087607
FULL ENJOY Call girls in Paharganj Delhi | 8377087607FULL ENJOY Call girls in Paharganj Delhi | 8377087607
FULL ENJOY Call girls in Paharganj Delhi | 8377087607dollysharma2066
 
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...lizamodels9
 

Recently uploaded (20)

Call Us 📲8800102216📞 Call Girls In DLF City Gurgaon
Call Us 📲8800102216📞 Call Girls In DLF City GurgaonCall Us 📲8800102216📞 Call Girls In DLF City Gurgaon
Call Us 📲8800102216📞 Call Girls In DLF City Gurgaon
 
8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR8447779800, Low rate Call girls in Saket Delhi NCR
8447779800, Low rate Call girls in Saket Delhi NCR
 
Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...
Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...
Call Girls In Connaught Place Delhi ❤️88604**77959_Russian 100% Genuine Escor...
 
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
8447779800, Low rate Call girls in New Ashok Nagar Delhi NCR
 
8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR
8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR
8447779800, Low rate Call girls in Kotla Mubarakpur Delhi NCR
 
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
BEST Call Girls In Greater Noida ✨ 9773824855 ✨ Escorts Service In Delhi Ncr,
 
Investment in The Coconut Industry by Nancy Cheruiyot
Investment in The Coconut Industry by Nancy CheruiyotInvestment in The Coconut Industry by Nancy Cheruiyot
Investment in The Coconut Industry by Nancy Cheruiyot
 
Call US-88OO1O2216 Call Girls In Mahipalpur Female Escort Service
Call US-88OO1O2216 Call Girls In Mahipalpur Female Escort ServiceCall US-88OO1O2216 Call Girls In Mahipalpur Female Escort Service
Call US-88OO1O2216 Call Girls In Mahipalpur Female Escort Service
 
Future Of Sample Report 2024 | Redacted Version
Future Of Sample Report 2024 | Redacted VersionFuture Of Sample Report 2024 | Redacted Version
Future Of Sample Report 2024 | Redacted Version
 
MAHA Global and IPR: Do Actions Speak Louder Than Words?
MAHA Global and IPR: Do Actions Speak Louder Than Words?MAHA Global and IPR: Do Actions Speak Louder Than Words?
MAHA Global and IPR: Do Actions Speak Louder Than Words?
 
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdf
NewBase  19 April  2024  Energy News issue - 1717 by Khaled Al Awadi.pdfNewBase  19 April  2024  Energy News issue - 1717 by Khaled Al Awadi.pdf
NewBase 19 April 2024 Energy News issue - 1717 by Khaled Al Awadi.pdf
 
Case study on tata clothing brand zudio in detail
Case study on tata clothing brand zudio in detailCase study on tata clothing brand zudio in detail
Case study on tata clothing brand zudio in detail
 
Corporate Profile 47Billion Information Technology
Corporate Profile 47Billion Information TechnologyCorporate Profile 47Billion Information Technology
Corporate Profile 47Billion Information Technology
 
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
Call Girls In Sikandarpur Gurgaon ❤️8860477959_Russian 100% Genuine Escorts I...
 
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In.../:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
/:Call Girls In Indirapuram Ghaziabad ➥9990211544 Independent Best Escorts In...
 
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
(Best) ENJOY Call Girls in Faridabad Ex | 8377087607
 
Intro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdfIntro to BCG's Carbon Emissions Benchmark_vF.pdf
Intro to BCG's Carbon Emissions Benchmark_vF.pdf
 
Japan IT Week 2024 Brochure by 47Billion (English)
Japan IT Week 2024 Brochure by 47Billion (English)Japan IT Week 2024 Brochure by 47Billion (English)
Japan IT Week 2024 Brochure by 47Billion (English)
 
FULL ENJOY Call girls in Paharganj Delhi | 8377087607
FULL ENJOY Call girls in Paharganj Delhi | 8377087607FULL ENJOY Call girls in Paharganj Delhi | 8377087607
FULL ENJOY Call girls in Paharganj Delhi | 8377087607
 
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
Lowrate Call Girls In Sector 18 Noida ❤️8860477959 Escorts 100% Genuine Servi...
 

Optimization Methods in Finance

  • 1. Non Linear Optimization: Finance Applications
    Thilanka Munasinghe
    Department of Mathematics, West Virginia University
    February 24, 2015
    Thilanka Munasinghe (WVU), Optimization Methods in Finance
  • 2. Overview
    1 Volatility estimation and ARCH and GARCH models
    2 Line Search Methods
    3 Newton’s Method
    4 Steepest Descent Method
    5 Golden Section Search Method
    6 Conjugate Gradient Method
  • 3. Volatility in Finance
    Volatility is a term used to describe how much security prices, market
    indices, interest rates, etc., move up and down around their mean.
    Volatility is measured by the standard deviation of the random variable
    that represents the financial quantity we are interested in.
    Most investors prefer low volatility to high volatility, and therefore
    expect to be rewarded with higher long-term returns for holding
    higher-volatility securities.
  • 4. Volatility in Finance: Volatility estimation
    Many financial computations require volatility estimates.
    Most volatility estimation techniques can be classified as either a
    historical or an implied method. One either uses historical time series to
    infer patterns and estimate the volatility using a statistical technique,
    or considers the known prices of related securities, such as options, that
    may reveal the market sentiment on the volatility of the security in
    question.
    GARCH models exemplify the first approach, while the implied volatilities
    calculated from the Black, Scholes and Merton (BSM) formulas are the best
    known examples of the second approach.
  • 5. ARCH and GARCH models: The basics
    Let Y be a stochastic process indexed by the natural numbers.
    Its value at time t, Y_t, is an n-dimensional vector of random variables.
    The behavior of these random variables is modeled as
        Y_t = Σ_{i=1}^{m} Φ_i · Y_{t−i} + ε_t.
    Here m is a positive integer representing the number of periods we look
    back in our model, and ε_t satisfies
        E[ε_t | ε_1, ..., ε_{t−1}] = 0.
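A minimal sketch of the autoregressive recursion above, in the univariate case. The shock sequence `eps` stands in for the random ε_t, and pre-sample values of Y are taken as zero; both are illustrative assumptions, not stated on the slide.

```python
def simulate_ar(phi, eps):
    """Univariate version of Y_t = sum_i Phi_i * Y_{t-i} + eps_t,
    driven by a given shock sequence eps. Pre-sample Y values are
    taken as zero (an assumed convention)."""
    m, y = len(phi), []
    for t in range(len(eps)):
        mean = sum(phi[i] * (y[t - 1 - i] if t - 1 - i >= 0 else 0.0)
                   for i in range(m))
        y.append(mean + eps[t])
    return y
```

With a single unit shock, the series simply decays geometrically at rate Φ_1, which makes the recursion easy to check by hand.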
  • 6. ARCH and GARCH models: The model for the univariate case
    We set h_t = E[ε_t² | ε_1, ..., ε_{t−1}].
    Then we model the conditional time dependence as follows:
        h_t = c + Σ_{i=1}^{q} α_i · ε_{t−i}² + Σ_{j=1}^{p} β_j · h_{t−j}.
    This model is called GARCH(p, q). The ARCH model corresponds to choosing
    p = 0.
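The GARCH(p, q) variance recursion can be sketched as follows. Proxying pre-sample values of ε² and h by a starting value `h0` is an assumed convention for illustration, not part of the slide.

```python
def garch_variance(eps, c, alpha, beta, h0):
    """Conditional variances h_1..h_T from the GARCH(p, q) recursion
    h_t = c + sum_i alpha[i]*eps[t-i]^2 + sum_j beta[j]*h[t-j].
    Pre-sample values of eps^2 and h are proxied by h0 (assumption)."""
    q, p = len(alpha), len(beta)
    h = []
    for t in range(len(eps)):
        arch = sum(alpha[i] * (eps[t - 1 - i] ** 2 if t - 1 - i >= 0 else h0)
                   for i in range(q))
        garch = sum(beta[j] * (h[t - 1 - j] if t - 1 - j >= 0 else h0)
                    for j in range(p))
        h.append(c + arch + garch)
    return h
```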
  • 7. ARCH and GARCH models: The optimization problem in the univariate case
    After choosing p and q, the objective is to determine the optimal
    parameters Φ_i, α_i, β_j. Usually this is achieved via maximum likelihood
    estimation. If we assume that Y_t has a normal distribution conditional on
    the historical observations, the log-likelihood function can be written as
    follows:
        −(T/2) · ln(2π) − (1/2) · Σ_{t=1}^{T} ln h_t − (1/2) · Σ_{t=1}^{T} ε_t²/h_t.
  • 8. ARCH and GARCH models: The optimization problem in the univariate case
    max  −(T/2) · ln(2π) − (1/2) · Σ_{t=1}^{T} ln h_t − (1/2) · Σ_{t=1}^{T} ε_t²/h_t
    subject to
        Y_t = Σ_{i=1}^{m} Φ_i · Y_{t−i} + ε_t
        E[ε_t | ε_1, ..., ε_{t−1}] = 0
        h_t = E[ε_t² | ε_1, ..., ε_{t−1}] ≥ 0, for all t
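A sketch of the objective for the common GARCH(1,1) special case, assuming a given residual series `eps` and a starting variance `h0` (both assumptions for illustration). In practice one would hand the negative of this function to a numerical optimizer to estimate c, α, β.

```python
import math

def garch11_loglik(eps, c, alpha, beta, h0):
    """Gaussian log-likelihood from the slide, specialized to GARCH(1,1):
    -T/2*ln(2*pi) - 1/2*sum(ln h_t) - 1/2*sum(eps_t^2 / h_t),
    with h_t = c + alpha*eps_{t-1}^2 + beta*h_{t-1} and h_1 = h0."""
    T = len(eps)
    h, ll = h0, -0.5 * T * math.log(2 * math.pi)
    for t in range(T):
        if t > 0:
            h = c + alpha * eps[t - 1] ** 2 + beta * h
        ll -= 0.5 * (math.log(h) + eps[t] ** 2 / h)
    return ll
```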
  • 9. ARCH and GARCH models: Stationarity properties
    An important issue in GARCH parameter estimation is the stationarity of
    the resulting model. It is unclear whether one can assume that the model
    parameters of financial time series are stationary over time. However,
    forecasting and estimation are easier on stationary models.
    Theorem. If the α_i’s, the β_j’s and the scalar c are strictly positive
    and
        Σ_{i=1}^{q} α_i + Σ_{j=1}^{p} β_j < 1,
    then the univariate GARCH model is stationary.
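The theorem's sufficient condition is directly checkable on a candidate parameter set; a small helper (names are illustrative):

```python
def satisfies_stationarity(alpha, beta, c):
    """Check the slide's sufficient condition for univariate GARCH
    stationarity: all parameters strictly positive and the alpha/beta
    coefficients summing to less than one."""
    positive = c > 0 and all(a > 0 for a in alpha) and all(b > 0 for b in beta)
    return positive and sum(alpha) + sum(beta) < 1
```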
  • 10. Black-Scholes-Merton (BSM) equations: European option pricing
        dS_t / S_t = µ · dt + σ · dW_t,
    where S_t is the underlying security price at time t, µ is the drift,
    σ is the constant volatility, and W_t is a standard Brownian motion
    (Wiener process).
  • 11. Black-Scholes-Merton (BSM) equations: The solutions
        C(K, T) = S_0 · Φ(d_1) − K · e^{−r·T} · Φ(d_2),
        P(K, T) = K · e^{−r·T} · Φ(−d_2) − S_0 · Φ(−d_1),
    where
        d_1 = (log(S_0/K) + (r + σ²/2) · T) / (σ · √T),
        d_2 = d_1 − σ · √T,
    Φ(·) is the cumulative distribution function of the standard normal
    distribution, and r is the continuously compounded risk-free interest
    rate (a constant available in US markets).
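The two pricing formulas translate directly to code. A minimal sketch using the error function for Φ, assuming the slide's setting (no dividends, constant r and σ):

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bsm_call_put(S0, K, T, r, sigma):
    """European call and put prices from the BSM formulas on the slide."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    call = S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
    put = K * math.exp(-r * T) * norm_cdf(-d2) - S0 * norm_cdf(-d1)
    return call, put
```

A quick sanity check is put-call parity, C − P = S_0 − K·e^{−rT}, which holds for the two formulas by construction.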
  • 12. Black-Scholes-Merton (BSM) equations: Implied volatility
    In order to determine the price, we just need the value of σ, or a close
    approximation to it. Conversely, given the market price of a particular
    European call or put, one can determine the volatility implied by that
    price, called the implied volatility, by solving these equations for the
    unknown σ.
  • 13. Linear vs Non Linear
    Definition of linear and non-linear:
    f(x) = x_1 + 2·x_2 − 3.4·x_3 is linear in x = [x_1, x_2, x_3]^T.
    f(x) = x_1·x_2 + 2·x_2 − 3.4·x_3 is non-linear in x.
    f(x) = cos(x_1) + 2·x_2 − 3.4·x_3 is non-linear in x.
  • 14. Existence and Uniqueness of an Optimum Solution
    Usually we cannot guarantee that we have found the global optimum;
    multiple solutions may exist.
    For unconstrained problems, at a minimum, ∇f(x∗) = 0.
    Calculus: at a minimum, the second derivative is greater than zero.
    Vector case: at a minimum, the Hessian is positive definite.
    Remark. Sufficient conditions for a (local) minimum of an unconstrained
    problem:
    the gradient must equal zero, ∇f(x∗) = 0;
    the Hessian must be positive definite.
  • 15. Line Search Methods: The formulation
    Consider an unconstrained optimization problem:
        min f(x), x ∈ R.
    Assume the function f(x) is smooth and continuous.
    The objective is to find a minimum of f(x).
  • 16. Optimization Algorithm
    Initial point: the optimization algorithm starts from an initial point x_0
    and performs a series of iterations.
    Optimal point: the goal is to find the “optimal point” x∗.
    Iteration scheme:
        x_{k+1} = x_k + α_k · d_k,
    where d_k is the “search direction” and α_k is the “step length”.
    The step length α_k determines how far to go along the search direction
    d_k.
  • 17. Constraints
    The function value at the new point, f(x_{k+1}), should be less than or
    equal to the function value at the previous point, f(x_k):
        f(x_{k+1}) ≤ f(x_k),  i.e.  f(x_k + α_k · d_k) ≤ f(x_k).
    Remark. The optimization algorithm: starts with an initial point x_0;
    finds a descent “search direction” d_k; determines the “step size” α_k;
    checks the iteration criteria; checks the stopping conditions; outputs
    the optimal point x∗.
  • 18. Newton’s Method
    Newton’s method (also known as the Newton-Raphson method) is used to find
    successively better approximations to the roots (or zeroes) of a
    real-valued function defined on an interval [a, b]:
        f(x) = 0,
    where f(x) is continuous and differentiable.
    Given a function f(x) defined for x ∈ R, with first derivative f′(x), the
    calculation begins with a first guess x_0 for a root of f(x). A new,
    considerably better approximation x_1 is obtained by
        x_1 = x_0 − f(x_0) / f′(x_0).
  • 19. Newton’s Method: Iterative scheme
    As the iterative process repeats, the current approximation x_n is used
    to derive a new, better approximation x_{n+1}:
        x_{n+1} = x_n − f(x_n) / f′(x_n).
  • 20. Newton’s Method: Tangent line
    Suppose we have some current approximation x_n. Then we can derive the
    formula for a better approximation, x_{n+1}. The equation of the tangent
    line to the curve y = f(x) at the point x = x_n is
        y = f′(x_n)(x − x_n) + f(x_n).
    The x-intercept of this line (the value of x such that y = 0) is then
    used as the next approximation to the root, x_{n+1}. In other words,
    setting y = 0 and x = x_{n+1} gives
        0 = f′(x_n)(x_{n+1} − x_n) + f(x_n).
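The iteration above translates directly to code (note the minus sign in the update, which the original slides misprinted as a plus). The tolerance and iteration cap are illustrative choices.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson root finding: x_{n+1} = x_n - f(x_n)/f'(x_n),
    stopping when the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x
```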
  • 21. Steepest Descent Method: Algorithm for unconstrained optimization
    Consider an unconstrained optimization problem:
        min f(x), x ∈ R^n.
    Assume the function f(x) is smooth, continuous and differentiable.
    If x = x̄ is a given point, f(x) can be approximated by its linear
    expansion
        f(x̄ + d) ≈ f(x̄) + ∇f(x̄)^T d,
    and the approximation is good if d is small. We choose the value of d so
    that the inner product ∇f(x̄)^T d is as small as possible.
  • 22. Steepest Descent Method: Gradient vector
    Let us normalize d so that ‖d‖ = 1. Then, among all directions d with
    norm ‖d‖ = 1, the direction
        d̃ = −∇f(x̄) / ‖∇f(x̄)‖
    makes the smallest inner product with the gradient ∇f(x̄). This fact
    follows from the following inequality:
        ∇f(x̄)^T d ≥ −‖∇f(x̄)‖ · ‖d‖ = ∇f(x̄)^T (−∇f(x̄)/‖∇f(x̄)‖) = ∇f(x̄)^T d̃.
    For this reason, the steepest descent direction is d̄ = −∇f(x̄).
  • 23. Steepest Descent Algorithm: Steps
    Step 1. Initialize x_0 and the machine accuracy ε; set k = 0.
    Step 2. Compute d_k = −∇f(x_k). If d_k = 0, then stop.
    Step 3. Solve min_α f(x_k + α · d_k) for the step size α_k.
    Step 4. Set x_{k+1} ← x_k + α_k · d_k, k ← k + 1; go to Step 2.
    Note from Step 3 that d_k = −∇f(x_k) is a descent direction, so the
    condition f(x_k + α_k · d_k) ≤ f(x_k) holds.
  • 24. Example of Steepest Descent Method
    Find the first iterate x_1 using the Steepest Descent Method for
        min f(x) = 0.5·x_1² + 2.5·x_2².
    The gradient is given by ∇f(x) = [x_1, 5·x_2]^T.
    Taking x_0 = [5, 1]^T, we have ∇f(x_0) = [5, 5]^T.
    Performing a line search along the negative gradient direction,
        min_{α_0} f(x_0 − α_0 · ∇f(x_0)),
    the exact minimum along the line is given by α_0 = 1/3, so the next
    approximation is x_1 = x_0 − (1/3) · [5, 5]^T = [10/3, −2/3]^T.
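The example can be reproduced in code. For a quadratic f(x) = ½·xᵀAx the exact line-search step has the closed form α = (gᵀg)/(gᵀAg); this sketch hard-codes the slide's A = diag(1, 5) and is not a general-purpose implementation.

```python
def steepest_descent_quadratic(x, n_iter=1):
    """Steepest descent on f(x) = 0.5*x1^2 + 2.5*x2^2 (the slide's
    example) with the exact minimizing step for a quadratic:
    alpha = (g.g)/(g.A.g), where A = diag(1, 5)."""
    A = (1.0, 5.0)
    for _ in range(n_iter):
        g = [A[0] * x[0], A[1] * x[1]]          # gradient [x1, 5*x2]
        gAg = A[0] * g[0] ** 2 + A[1] * g[1] ** 2
        alpha = (g[0] ** 2 + g[1] ** 2) / gAg   # exact line-search step
        x = [x[0] - alpha * g[0], x[1] - alpha * g[1]]
    return x, alpha
```

Starting from x_0 = [5, 1] this reproduces α_0 = 1/3 and x_1 = [10/3, −2/3], matching the slide.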
  • 25. Golden Section Search Method
    Suppose f(x) is unimodal on [a, b], and let x_1 and x_2 be two points
    within [a, b] with x_1 < x_2.
    Evaluating and comparing the values of f(x_1) and f(x_2), we can discard
    either (x_2, b] or [a, x_1), with the minimum known to lie in the
    remaining subinterval.
    To repeat the process, we need to compute only one new function
    evaluation. To reduce the length of the interval by a fixed fraction at
    each iteration, each new pair of points must have the same relationship
    to the new interval that the previous pair had to the previous interval.
  • 26. Golden Section Search Algorithm
    τ = (√5 − 1)/2
    x1 = a + (1 − τ)(b − a); f1 = f(x1)
    x2 = a + τ(b − a); f2 = f(x2)
    while (b − a) > tol do
        if f1 > f2 then
            a = x1
            x1 = x2; f1 = f2
            x2 = a + τ(b − a); f2 = f(x2)
        else
            b = x2
            x2 = x1; f2 = f1
            x1 = a + (1 − τ)(b − a); f1 = f(x1)
        end if
    end while
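A runnable version of the pseudocode above; the tolerance and the test function are illustrative.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden-section search for a minimum of a unimodal f on [a, b].
    The ratio tau = (sqrt(5)-1)/2 keeps the bracket geometry fixed, so
    only one new function evaluation is needed per iteration."""
    tau = (math.sqrt(5.0) - 1.0) / 2.0
    x1, x2 = a + (1 - tau) * (b - a), a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 > f2:                      # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)
        else:                            # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - tau) * (b - a)
            f1 = f(x1)
    return 0.5 * (a + b)
```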
  • 27. Conjugate Gradient Method
    Another method that does not require explicit second derivatives, and
    does not even store an approximation to the Hessian matrix, is the
    Conjugate Gradient (CG) method.
    CG generates a sequence of conjugate search directions, implicitly
    accumulating information about the Hessian matrix.
    For a quadratic objective function, CG is theoretically exact after at
    most n iterations, where n is the dimension of the problem.
    CG is effective for general unconstrained minimization.
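For the quadratic case the slide mentions, minimizing f(x) = ½·xᵀAx − bᵀx with A symmetric positive definite amounts to solving Ax = b, and CG does so in at most n steps in exact arithmetic. A minimal sketch on plain Python lists:

```python
import math

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Linear conjugate gradient for Ax = b, A symmetric positive
    definite; exact after at most n iterations in exact arithmetic."""
    n = len(b)
    x = x0[:]
    r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    d = r[:]                                   # first direction: residual
    rs = sum(ri * ri for ri in r)
    for _ in range(n):
        Ad = [sum(A[i][j] * d[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(d[i] * Ad[i] for i in range(n))   # exact step
        x = [x[i] + alpha * d[i] for i in range(n)]
        r = [r[i] - alpha * Ad[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if math.sqrt(rs_new) < tol:
            break
        # Next direction is A-conjugate to the previous ones.
        d = [r[i] + (rs_new / rs) * d[i] for i in range(n)]
        rs = rs_new
    return x
```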