On the Growth of the Extreme Fluctuations
of SDEs with Markovian Switching
Terry Lynch
(joint work with Dr. John Appleby)
Dublin City University, Ireland.
Leverhulme International Network
University of Chester, UK.
November 7th 2008
Supported by the Irish Research Council for Science, Engineering and Technology
Outline
1 Introduction
    Motivation
    Regular Variation
2 Bounded Noise
    Main Results
    Outline of Proofs
3 Extensions
4 Future work
Recap - one year ago

Theorem 1
Let X be the unique adapted continuous solution satisfying
\[ dX(t) = f(X(t), t)\, dt + g(X(t), t)\, dB(t). \]
If there exist \rho > 0 and real numbers K_1 and K_2 such that, for all (x, t) \in \mathbb{R} \times [0, \infty),
\[ x f(x, t) \le \rho, \qquad 0 < K_2 \le g^2(x, t) \le K_1, \]
then X satisfies
\[ \limsup_{t \to \infty} \frac{|X(t)|}{\sqrt{2t \log \log t}} \le \sqrt{K_1}, \quad \text{a.s.} \]

Question: why does the iterated logarithm appear?
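As a numerical sanity check (not from the talk), the sketch below simulates a scalar SDE satisfying the hypotheses of Theorem 1, with the illustrative choices f(x) = 1/(1+x^2) and g(x) = 1 + 0.5 sin x, so that x f(x) <= 1/2 and 0 < 0.25 <= g(x)^2 <= 2.25, and tracks the running maximum of |X(t)|/sqrt(2t log log t):

```python
import numpy as np

# Illustrative coefficients (not from the talk) satisfying the hypotheses:
# x*f(x) = x/(1+x^2) <= 1/2 = rho, and 0 < K2 = 0.25 <= g(x)^2 <= K1 = 2.25.
def f(x):
    return 1.0 / (1.0 + x * x)

def g(x):
    return 1.0 + 0.5 * np.sin(x)

rng = np.random.default_rng(0)
dt, n = 0.01, 200_000                    # Euler-Maruyama up to t = 2000
dW = np.sqrt(dt) * rng.standard_normal(n)
x, ratio_max = 0.0, 0.0
for k in range(n):
    x += f(x) * dt + g(x) * dW[k]
    t = (k + 1) * dt
    if t > np.e ** 2:                    # keep log log t bounded away from 0
        ratio_max = max(ratio_max, abs(x) / np.sqrt(2 * t * np.log(np.log(t))))

# Theorem 1 bounds the limsup by sqrt(K1) = 1.5; the finite-horizon running
# maximum should be of that order.
print(ratio_max)
```

On a finite horizon the running maximum only approximates the limsup, but it illustrates why sqrt(2t log log t) is the natural normalisation for bounded noise.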
Introduction

We consider the rate of growth of the partial maxima of the solution \{X(t)\}_{t \ge 0} of a nonlinear finite-dimensional stochastic differential equation (SDE) with Markovian switching.
We find upper and lower estimates on this rate of growth by finding constants \alpha and \beta and an increasing function \rho such that
\[ 0 < \alpha \le \limsup_{t \to \infty} \frac{\|X(t)\|}{\rho(t)} \le \beta, \quad \text{a.s.} \]
We look at processes which have mean-reverting drift terms and which have either (a) bounded noise intensity, or (b) unbounded noise intensity.
Motivation

Mean-reverting drift terms make it possible to model a self-regulating economic system which is subjected to persistent stochastic shocks.
The type of finite-dimensional system studied allows for the analysis of heavy-tailed returns distributions in stochastic volatility market models in which many assets are traded.
Equations with Markovian switching are motivated by econometric evidence which suggests that security prices often move from confident to nervous (or other) regimes.
Motivational case study
To demonstrate the potential impact of swift changes in investor
sentiment, we look at a case study of Elan Corporation.
Regular Variation

Our analysis is aided by the use of regularly varying functions.
In its basic form, regular variation may be viewed as a relation such as
\[ \lim_{x \to \infty} \frac{f(\lambda x)}{f(x)} = \lambda^{\zeta} \in (0, \infty) \quad \forall\, \lambda > 0. \]
We say that f is regularly varying at infinity with index \zeta, or f \in RV_\infty(\zeta).
Regularly varying functions have useful properties such as:
\[ f \in RV_\infty(\zeta) \;\Rightarrow\; F(x) := \int_1^x f(t)\, dt \in RV_\infty(\zeta + 1), \]
\[ f \in RV_\infty(\zeta) \;\Rightarrow\; f^{-1}(x) \in RV_\infty(1/\zeta). \]
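These limits can be illustrated numerically with the hypothetical test function f(x) = x^2 log x, which lies in RV_\infty(2): the ratio f(\lambda x)/f(x) approaches \lambda^2, and the integral F(x) = \int_1^x f(t) dt lies in RV_\infty(3), so F(2x)/F(x) approaches 2^3:

```python
import math

# Hypothetical example: f(x) = x^2 * log(x) is regularly varying with index 2.
def f(x):
    return x * x * math.log(x)

# Closed form of F(x) = int_1^x f(t) dt, which lies in RV_inf(3).
def F(x):
    return x ** 3 * math.log(x) / 3.0 - x ** 3 / 9.0 + 1.0 / 9.0

x = 1e12
for lam in (0.5, 2.0, 10.0):
    print(lam, f(lam * x) / f(x), lam ** 2)   # ratio tends to lam^zeta = lam^2
print(F(2 * x) / F(x), 2.0 ** 3)              # ratio tends to 2^(zeta+1) = 8
```

The slowly varying factor log x makes the convergence slow, which is why a very large x is used here.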
SDE with Markovian switching

We study the finite-dimensional SDE with Markovian switching
\[ dX(t) = f(X(t), Y(t))\, dt + g(X(t), Y(t))\, dB(t) \tag{1} \]
where
Y is a Markov chain with finite irreducible state space S,
f : \mathbb{R}^d \times S \to \mathbb{R}^d and g : \mathbb{R}^d \times S \to \mathbb{R}^{d \times r} are locally Lipschitz continuous functions,
B is an r-dimensional Brownian motion, independent of Y.
Under these conditions, there is a unique continuous and adapted process which satisfies (1).
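A minimal simulation sketch of such a system (all parameters below are illustrative choices, not the talk's model): d = r = 1, a two-state chain Y with equal switching intensities approximated to first order in the step size, and mean-reverting drift in both regimes.

```python
import numpy as np

# Illustrative two-regime system: S = {0, 1}, switching intensity q = 1,
# mean-reverting drift and bounded noise in both regimes (not from the talk).
f = {0: lambda x: -x, 1: lambda x: -2.0 * x}
g = {0: lambda x: 1.0, 1: lambda x: 0.5}

rng = np.random.default_rng(1)
dt, n = 0.001, 100_000                 # Euler-Maruyama up to t = 100
x, y = 0.0, 0
path = np.empty(n)
for k in range(n):
    if rng.random() < 1.0 * dt:        # first-order approximation of the chain
        y = 1 - y
    x += f[y](x) * dt + g[y](x) * np.sqrt(dt) * rng.standard_normal()
    path[k] = x

print(path.min(), path.max())          # stationary-looking fluctuations around 0
```

Because both regimes are mean-reverting, the path stays stochastically bounded; the results below quantify how large its extreme excursions grow with t.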
Bounded Noise

Consider the i-th component of our SDE
\[ dX_i(t) = f_i(X(t), Y(t))\, dt + \sum_{j=1}^{r} g_{ij}(X(t), Y(t))\, dB_j(t). \]
We suppose that the noise intensity g is bounded in the sense that
there exists K_2 > 0 such that \|g(x, y)\|_F \le K_2 for all y \in S,
there exists K_1 > 0 such that
\[ \inf_{x \in \mathbb{R}^d,\, y \in S} \frac{\sum_{j=1}^{r} \left( \sum_{i=1}^{d} x_i\, g_{ij}(x, y) \right)^2}{\|x\|^2} \ge K_1^2. \]
By Cauchy-Schwarz we get the following upper and lower estimate:
\[ K_1^2 \le \|g(x, y)\|_F^2 \le K_2^2, \quad \text{for all } y \in S. \]
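The two conditions can be checked numerically for a toy diffusion matrix (an illustrative choice, not taken from the talk); the Cauchy-Schwarz step is visible in that the quadratic-form ratio is always dominated by the squared Frobenius norm:

```python
import numpy as np

# Toy d = r = 2 diffusion matrix, bounded above and below (illustrative).
def g(x, y):
    return np.array([[1.0, 0.0],
                     [0.0, 1.0 + 0.5 * np.sin(x[0]) * y]])   # y in S = {0, 1}

rng = np.random.default_rng(2)
inf_quad, sup_fro = np.inf, 0.0
for _ in range(1000):
    x = 10.0 * rng.standard_normal(2)
    for y in (0, 1):
        G = g(x, y)
        # sum_j (sum_i x_i g_ij)^2 / ||x||^2 -- the quantity bounded below by K1^2
        quad = float(np.sum((x @ G) ** 2) / (x @ x))
        fro2 = float(np.linalg.norm(G, "fro") ** 2)
        assert quad <= fro2 + 1e-12    # Cauchy-Schwarz
        inf_quad = min(inf_quad, quad)
        sup_fro = max(sup_fro, fro2)

print(inf_quad, sup_fro)   # empirical K1^2 and K2^2, with K1^2 <= K2^2
```

For this g the diagonal entries lie in [0.5, 1.5], so the empirical infimum stays above 0.25 and the empirical supremum below 3.25, consistent with the sandwich above.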
Upper bound

The strength of the restoring force is characterised by a locally Lipschitz continuous function \phi : [0, \infty) \to (0, \infty) with x\phi(x) \to \infty as x \to \infty, where
\[ \limsup_{\|x\| \to \infty} \sup_{y \in S} \frac{\langle x, f(x, y) \rangle}{\|x\|\, \phi(\|x\|)} \le -c_2 \in (-\infty, 0). \tag{2} \]

Theorem 1
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function \phi satisfying (2) and that g satisfies the bounded noise conditions. Then, for each \varepsilon > 0, X satisfies
\[ \limsup_{t \to \infty} \frac{\|X(t)\|}{\Phi^{-1}\!\left( \frac{K_2^2 (1 + \varepsilon)}{2 c_2} \log t \right)} \le 1, \quad \text{a.s. on } \Omega_\varepsilon, \]
where \Phi(x) = \int_1^x \phi(u)\, du and \Omega_\varepsilon is an almost sure event.
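For a concrete (hypothetical) restoring force \phi(x) = x^3, the rate function has the closed form \Phi(x) = (x^4 - 1)/4, which lets us cross-check a generic numeric pipeline, quadrature for \Phi and bisection for \Phi^{-1}:

```python
import math

# Hypothetical restoring-force function phi(x) = x^3 (zeta = 3), for which
# Phi(x) = int_1^x phi(u) du = (x^4 - 1)/4 in closed form.
def phi(u):
    return u ** 3

def Phi(x, steps=10_000):
    # trapezoidal rule for int_1^x phi(u) du
    h = (x - 1.0) / steps
    s = 0.5 * (phi(1.0) + phi(x)) + sum(phi(1.0 + k * h) for k in range(1, steps))
    return s * h

def Phi_inv(y, lo=1.0, hi=1e6):
    # bisection on the increasing map x -> (x^4 - 1)/4
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if (mid ** 4 - 1.0) / 4.0 < y else (lo, mid)
    return 0.5 * (lo + hi)

print(Phi(10.0), (10.0 ** 4 - 1.0) / 4.0)   # quadrature vs closed form
rate = Phi_inv(math.log(1e6))               # growth order Phi^{-1}(log t) at t = 1e6
print(rate)
```

The same recipe works for any admissible \phi without a closed-form integral; only the integrand changes.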
Lower bound

Moreover, we ensure that the degree of non-linearity in f is also characterised by \phi, by means of the assumption
\[ \limsup_{\|x\| \to \infty} \sup_{y \in S} \frac{|\langle x, f(x, y) \rangle|}{\|x\|\, \phi(\|x\|)} \le c_1 \in (0, \infty). \tag{3} \]

Theorem 2
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function \phi satisfying (3) and that g satisfies the bounded noise conditions. Then, for each \varepsilon > 0, X satisfies
\[ \limsup_{t \to \infty} \frac{\|X(t)\|}{\Phi^{-1}\!\left( \frac{K_1^2 (1 - \varepsilon)}{2 c_1} \log t \right)} \ge 1, \quad \text{a.s. on } \Omega_\varepsilon, \]
where \Phi(x) = \int_1^x \phi(u)\, du and \Omega_\varepsilon is an almost sure event.
Growth rate of large fluctuations

Taking both theorems together, in the special case where \phi is a regularly varying function, we get the following:

Theorem 3
Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function \phi \in RV_\infty(\zeta) satisfying (2) and (3) and that g satisfies the bounded noise conditions. Then X satisfies
\[ \left( \frac{K_1^2}{2 c_1} \right)^{\frac{1}{\zeta + 1}} \le \limsup_{t \to \infty} \frac{\|X(t)\|}{\Phi^{-1}(\log t)} \le \left( \frac{K_2^2}{2 c_2} \right)^{\frac{1}{\zeta + 1}} \quad \text{a.s.}, \]
where \Phi(x) = \int_1^x \phi(u)\, du \in RV_\infty(\zeta + 1).
Comments and Example

Note that \phi small \Rightarrow \Phi small \Rightarrow \Phi^{-1} large. Thus, as expected, weak mean-reversion results in large fluctuations.
Take a simple one-dimensional Ornstein-Uhlenbeck process
\[ dX(t) = -c X(t)\, dt + K\, dB(t) \]
where, in our notation,
c_1 = c_2 = c and K_1 = K_2 = K,
\phi(x) = x \in RV_\infty(1) \Rightarrow \zeta = 1,
\Phi(x) = \frac{x^2}{2} and \Phi^{-1}(x) = \sqrt{2x}.
Then applying Theorem 3 recovers the well-known result
\[ \limsup_{t \to \infty} \frac{|X(t)|}{\sqrt{2 \log t}} = \limsup_{t \to \infty} \frac{|X(t)|}{\Phi^{-1}(\log t)} = \left( \frac{K^2}{2c} \right)^{1/2}, \quad \text{a.s.} \]
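The OU prediction is easy to probe by simulation (an illustration with parameters c = 0.5 and K = 1, so the predicted limsup constant is sqrt(K^2/(2c)) = 1): using the exact OU transition, the running maximum of |X(t)|/sqrt(2 log t) should be of the order of that constant.

```python
import numpy as np

# OU check (illustrative parameters): dX = -c X dt + K dB with c = 0.5, K = 1,
# so the predicted limsup of |X(t)| / sqrt(2 log t) is sqrt(K^2/(2c)) = 1.
c, K = 0.5, 1.0
rng = np.random.default_rng(3)
h, n = 0.2, 500_000                       # exact transition, up to t = 100,000
decay = np.exp(-c * h)
sd = K * np.sqrt((1.0 - decay ** 2) / (2.0 * c))
z = rng.standard_normal(n)
x, ratio_max = 0.0, 0.0
for k in range(n):
    x = x * decay + sd * z[k]             # exact one-step OU transition
    t = (k + 1) * h
    if t > np.e:
        ratio_max = max(ratio_max, abs(x) / np.sqrt(2.0 * np.log(t)))

print(ratio_max)   # should be of the order of the predicted constant 1
```

Since log t grows so slowly, convergence of the running maximum to the limsup is itself slow; the finite-horizon value is only indicative.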
Outline of proofs

Apply a time-change and Itô transformation to reduce dimensionality and facilitate the use of one-dimensional theory,
Use the bounding conditions on f and g to construct upper and lower comparison processes whose dynamics are not determined by the switching parameter Y,
Use a stochastic comparison theorem to prove that our transformed process is bounded above and below by the comparison processes for all t \ge 0 a.s.,
Determine the large deviations of the comparison processes by means of a classical theorem of Motoo.
Extensions

We can impose a rate of nonlinear growth of g with x as \|x\| \to \infty through an increasing scalar function \gamma. Then the growth rate of the deviations is of the order \Psi^{-1}(\log t), where \Psi(x) = \int_1^x \phi(u)/\gamma^2(u)\, du.
Using norm equivalence in \mathbb{R}^d we can study the size of the largest component of the system rather than the norm, to get upper and lower bounds of the form
\[ 0 < \frac{\alpha}{\sqrt{d}} \le \limsup_{t \to \infty} \frac{\max_{1 \le j \le d} |X_j(t)|}{\rho(t)} \le \beta \le +\infty, \quad \text{a.s.} \]
We can extend the state space of the Markov chain to a countable state space.
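For instance (hypothetical choices, not from the talk), with \phi(u) = u and noise growth \gamma(u) = u^{1/4}, the rate function is \Psi(x) = \int_1^x u^{1/2} du = (2/3)(x^{3/2} - 1), which inverts in closed form:

```python
import math

# Hypothetical example: phi(u) = u and gamma(u) = u**0.25, so
# Psi(x) = int_1^x phi(u)/gamma(u)^2 du = int_1^x sqrt(u) du = (2/3)(x**1.5 - 1).
def Psi(x):
    return (2.0 / 3.0) * (x ** 1.5 - 1.0)

def Psi_inv(y):
    return (1.5 * y + 1.0) ** (2.0 / 3.0)

t = 1e6
rate = Psi_inv(math.log(t))    # the deviations grow like Psi^{-1}(log t)
print(rate, Psi(rate))         # Psi(rate) recovers log t
```

Compared with the bounded-noise case, dividing \phi by \gamma^2 makes \Psi smaller, so \Psi^{-1}(log t) and hence the fluctuations are larger, as one expects when the noise intensity grows.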
Future work

To study finite-dimensional processes which are always positive, e.g. the Cox-Ingersoll-Ross (CIR) model.
To investigate the growth, explosion and simulation of our developed results. This will involve stochastic numerical analysis and Monte Carlo simulation.
This simulation will allow us to determine whether the qualitative features of our dynamical systems are preserved when analysed in discrete time.
To investigate whether our results can be extended to SDEs with delay.