Introduction                Bounded Noise                  Extensions             Future work




On the Growth of the Extreme Fluctuations
    of SDEs with Markovian Switching

                               Terry Lynch
                   (joint work with Dr. John Appleby)

                          Dublin City University, Ireland.

                        Leverhulme International Network
                           University of Chester, UK.
                              November 7th 2008


    Supported by the Irish Research Council for Science, Engineering and Technology



Outline


  1   Introduction
        Motivation
        Regular Variation

  2   Bounded Noise
       Main Results
       Outline of Proofs

  3   Extensions

  4   Future work



Recap - one year ago

  Theorem 1
  Let X be the unique adapted continuous solution satisfying

  \[ dX(t) = f(X(t), t)\,dt + g(X(t), t)\,dB(t). \]

  If there exist ρ > 0 and real numbers K1 and K2 such that, for all
  (x, t) ∈ R × [0, ∞),

  \[ x f(x, t) \le \rho, \qquad 0 < K_2 \le g^2(x, t) \le K_1, \]

  then X satisfies

  \[ \limsup_{t \to \infty} \frac{|X(t)|}{\sqrt{2t \log \log t}} \le \sqrt{K_1} \quad \text{a.s.} \]

  Question: why does that iterated logarithm keep appearing?!
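As a numerical sketch (not part of the talk), one seeded Brownian path can be compared against the envelope √(2t log log t) from the recap theorem, taking the simplest case f = 0 and g = 1, so K1 = 1. A single path only illustrates the almost-sure bound; it does not prove it.

```python
import numpy as np

# One discretised Brownian path versus the iterated-logarithm envelope.
# Illustrative only: f = 0, g = 1 is the simplest case of the recap theorem.
rng = np.random.default_rng(7)
dt, n = 1e-2, 1_000_000
increments = np.sqrt(dt) * rng.standard_normal(n)
B = np.cumsum(increments)              # B(t) sampled on a grid
t = dt * np.arange(1, n + 1)

mask = t > 10.0                        # keep log log t safely positive
envelope = np.sqrt(2.0 * t[mask] * np.log(np.log(t[mask])))
ratio = np.abs(B[mask]) / envelope

assert np.isfinite(ratio).all()
```

For a single finite-horizon path the ratio stays finite; the theorem's content is the almost-sure limsup bound over all paths and all time.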



Introduction

      We consider the rate of growth of the partial maxima of the
      solution {X (t)}t≥0 of a nonlinear finite-dimensional stochastic
      differential equation (SDE) with Markovian Switching.

      We find upper and lower estimates on this rate of growth by
      finding constants α and β and an increasing function ρ such that

      \[ 0 < \alpha \le \limsup_{t \to \infty} \frac{X(t)}{\rho(t)} \le \beta \quad \text{a.s.} \]

      We look at processes which have mean-reverting drift terms
      and which have either (a) bounded noise intensity, or (b)
      unbounded noise intensity.



Motivation


     Mean-reverting drift terms let us model a self-regulating
     economic system that is subjected to persistent stochastic shocks.

     The type of finite-dimensional system studied allows for the
     analysis of heavy-tailed returns distributions in stochastic
     volatility market models in which many assets are traded.

     Equations with Markovian switching are motivated by
     econometric evidence which suggests that security prices often
     move from confident to nervous (or other) regimes.



Motivational case study

  To demonstrate the potential impact of swift changes in investor
  sentiment, we look at a case study of Elan Corporation.



Regular Variation

      Our analysis is aided by the use of regularly varying functions.
      In its basic form, regular variation may be viewed through limit
      relations such as

      \[ \lim_{x \to \infty} \frac{f(\lambda x)}{f(x)} = \lambda^{\zeta} \in (0, \infty) \quad \forall\, \lambda > 0. \]

      We say that f is regularly varying at infinity with index ζ, or
      f ∈ RV∞(ζ).
      Regularly varying functions have useful properties such as:

      \[ f \in RV_\infty(\zeta) \;\Rightarrow\; F(x) := \int_1^x f(t)\,dt \in RV_\infty(\zeta + 1), \]
      \[ f \in RV_\infty(\zeta) \;\Rightarrow\; f^{-1}(x) \in RV_\infty(1/\zeta). \]
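These properties can be checked numerically for a concrete power function, a sketch not taken from the talk: f(x) = x² lies in RV∞(2), and its integral from 1 lies in RV∞(3).

```python
# Numerical check of the defining limit of regular variation for
# f(x) = x**2 (so zeta = 2), and of the integral property F in RV(3).
def f(x):
    return x ** 2

def rv_ratio(h, lam, x):
    """Return h(lam*x)/h(x); for h in RV_inf(zeta) this tends to lam**zeta."""
    return h(lam * x) / h(x)

for lam in (0.5, 2.0, 10.0):
    r = rv_ratio(f, lam, x=1e6)
    assert abs(r - lam ** 2) < 1e-6 * lam ** 2   # limit is lam**zeta, zeta = 2

# Integral property: F(x) = int_1^x t**2 dt = (x**3 - 1)/3 is in RV_inf(3).
F = lambda x: (x ** 3 - 1.0) / 3.0
assert abs(rv_ratio(F, 2.0, x=1e6) - 2.0 ** 3) < 1e-3
```

For pure powers the ratio is exact at any x; for a general regularly varying function it only approaches λ^ζ as x grows.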



SDE with Markovian switching


  We study the finite-dimensional SDE with Markovian switching

           dX (t) = f (X (t), Y (t)) dt + g (X (t), Y (t)) dB(t)          (1)

  where
      Y is a Markov chain with finite irreducible state space S,
      f : R^d × S → R^d and g : R^d × S → R^{d×r} are locally
      Lipschitz continuous functions,
      B is an r-dimensional Brownian motion, independent of Y.
  Under these conditions, there is a unique continuous and adapted
  process which satisfies (1).
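A minimal Euler–Maruyama discretisation of (1) can be sketched as follows; the two-state chain, the scalar mean-reverting drift −c[y]·x, and all parameter values here are hypothetical illustrations, not the authors' setup.

```python
import numpy as np

# Illustrative Euler-Maruyama scheme for dX = f(X,Y) dt + g(X,Y) dB with
# a hypothetical two-state Markov chain Y (states 0 and 1, switching rate q),
# scalar drift f(x,y) = -c[y]*x and constant noise intensity g(x,y) = k[y].
rng = np.random.default_rng(0)

def simulate(T=10.0, dt=1e-3, q=1.0, c=(1.0, 3.0), k=(0.5, 1.5)):
    n = int(T / dt)
    x, y = 0.0, 0
    path = np.empty(n)
    for i in range(n):
        if rng.random() < q * dt:       # chain jumps with probability ~ q*dt
            y = 1 - y
        x += -c[y] * x * dt + k[y] * np.sqrt(dt) * rng.standard_normal()
        path[i] = x
    return path

path = simulate()
assert path.shape == (10000,) and np.isfinite(path).all()
```

The independence of B and Y is reflected in the two separate draws per step: one uniform for the chain, one Gaussian for the Brownian increment.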



Bounded Noise

  Consider the i-th component of our SDE:

  \[ dX_i(t) = f_i(X(t), Y(t))\,dt + \sum_{j=1}^{r} g_{ij}(X(t), Y(t))\,dB_j(t). \]

  We suppose that the noise intensity g is bounded in the sense that
      there exists K2 > 0 s.t. \|g(x, y)\|_F \le K_2 for all x ∈ R^d, y ∈ S,
      there exists K1 > 0 s.t.
      \[ \inf_{x \in \mathbb{R}^d,\, y \in S} \frac{\sum_{j=1}^{r} \big( \sum_{i=1}^{d} x_i\, g_{ij}(x, y) \big)^2}{\|x\|^2} \ge K_1^2. \]
  By Cauchy–Schwarz we get the following upper and lower estimates:

  \[ K_1^2 \le \|g(x, y)\|_F^2 \le K_2^2, \quad \text{for all } y \in S. \]
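The Cauchy–Schwarz step can be verified numerically on sample data; the matrix g and the test vectors below are hypothetical, chosen only to exercise the inequality behind the sandwich estimate.

```python
import numpy as np

# Check: sum_j <x, g_j>^2 / ||x||^2  <=  ||g||_F^2  (Cauchy-Schwarz, applied
# column by column), which is what makes the lower-bound quantity at most
# the squared Frobenius norm. Sample data only, not from the talk.
rng = np.random.default_rng(1)
g = rng.standard_normal((3, 2))          # d = 3 components, r = 2 noise sources
frob2 = float(np.sum(g ** 2))            # ||g||_F^2

for _ in range(100):
    x = rng.standard_normal(3)
    quad = float(np.sum((x @ g) ** 2) / (x @ x))
    assert quad <= frob2 + 1e-12         # upper estimate holds for every x
```

Equality in Cauchy–Schwarz would require x to be parallel to every column of g simultaneously, which is why the infimum over x can sit strictly below the Frobenius norm.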



Upper bound

  The strength of the restoring force is characterised by a locally
  Lipschitz continuous function φ : [0, ∞) → (0, ∞) with
  xφ(x) → ∞ as x → ∞, where

  \[ \limsup_{\|x\| \to \infty} \sup_{y \in S} \frac{\langle x, f(x, y) \rangle}{\|x\|\,\varphi(\|x\|)} \le -c_2 \in (-\infty, 0). \tag{2} \]

  Theorem 1
  Let X be the unique adapted continuous solution satisfying (1).
  Suppose there exists a function φ satisfying (2) and that g satisfies
  the bounded noise conditions. Then, for each ε > 0, X satisfies

  \[ \limsup_{t \to \infty} \frac{\|X(t)\|}{\Phi^{-1}\!\big( \frac{K_2^2 (1 + \varepsilon)}{2 c_2} \log t \big)} \le 1 \quad \text{a.s. on } \Omega_\varepsilon, \]

  where \Phi(x) = \int_1^x \varphi(u)\,du and Ω_ε is an almost sure event.
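Evaluating the envelope requires Φ and its inverse. A sketch of computing both numerically, with the hypothetical choice φ(u) = u (so Φ(x) = (x² − 1)/2 exactly, and Φ⁻¹(y) = √(2y + 1)):

```python
# Phi(x) = int_1^x phi(u) du by the trapezoid rule, inverted by bisection;
# phi here is a hypothetical restoring-force function, phi(u) = u.
def phi(u):
    return u

def Phi(x, n=1000):
    """Trapezoidal approximation of int_1^x phi(u) du."""
    h = (x - 1.0) / n
    s = 0.5 * (phi(1.0) + phi(x)) + sum(phi(1.0 + i * h) for i in range(1, n))
    return s * h

def Phi_inv(y, lo=1.0, hi=1e6):
    """Invert Phi by bisection; valid since phi > 0 makes Phi increasing."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if Phi(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

assert abs(Phi(3.0) - 4.0) < 1e-9        # trapezoid rule is exact for linear phi
assert abs(Phi_inv(4.0) - 3.0) < 1e-6    # since Phi_inv(y) = sqrt(2y + 1)
```

Bisection is the natural inversion here because the only structural fact available is that Φ is continuous and strictly increasing.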



Lower bound

  Moreover, we ensure that the degree of non-linearity in f is
  also characterised by φ, by means of the assumption

  \[ \limsup_{\|x\| \to \infty} \sup_{y \in S} \frac{|\langle x, f(x, y) \rangle|}{\|x\|\,\varphi(\|x\|)} \le c_1 \in (0, \infty). \tag{3} \]

  Theorem 2
  Let X be the unique adapted continuous solution satisfying (1).
  Suppose there exists a function φ satisfying (3) and that g satisfies
  the bounded noise conditions. Then, for each ε > 0, X satisfies

  \[ \limsup_{t \to \infty} \frac{\|X(t)\|}{\Phi^{-1}\!\big( \frac{K_1^2 (1 - \varepsilon)}{2 c_1} \log t \big)} \ge 1 \quad \text{a.s. on } \Omega_\varepsilon, \]

  where \Phi(x) = \int_1^x \varphi(u)\,du and Ω_ε is an almost sure event.



Growth rate of large fluctuations

  Taking both theorems together, in the special case where φ is a
  regularly varying function, we get the following:

  Theorem 3
  Let X be the unique adapted continuous solution satisfying (1).
  Suppose there exists a function φ ∈ RV∞(ζ) satisfying (2) and (3),
  and that g satisfies the bounded noise conditions. Then X satisfies

  \[ \Big( \frac{K_1^2}{2 c_1} \Big)^{\frac{1}{\zeta + 1}} \le \limsup_{t \to \infty} \frac{\|X(t)\|}{\Phi^{-1}(\log t)} \le \Big( \frac{K_2^2}{2 c_2} \Big)^{\frac{1}{\zeta + 1}} \quad \text{a.s.}, \]

  where \Phi(x) = \int_1^x \varphi(u)\,du \in RV_\infty(\zeta + 1).
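The two constants in Theorem 3 are simple arithmetic once K1, K2, c1, c2, ζ are fixed; a small sketch with hypothetical parameter values:

```python
import math

# The lower and upper constants of Theorem 3:
# (K1^2 / (2 c1))^(1/(zeta+1))  and  (K2^2 / (2 c2))^(1/(zeta+1)).
def theorem3_bounds(K1, K2, c1, c2, zeta):
    lower = (K1 ** 2 / (2 * c1)) ** (1 / (zeta + 1))
    upper = (K2 ** 2 / (2 * c2)) ** (1 / (zeta + 1))
    return lower, upper

# Hypothetical values: K1 = 1, K2 = 2, c1 = c2 = 1/2, zeta = 1.
lo, up = theorem3_bounds(K1=1.0, K2=2.0, c1=0.5, c2=0.5, zeta=1.0)
assert lo <= up
assert math.isclose(lo, 1.0)   # (1/1)**(1/2) = 1
assert math.isclose(up, 2.0)   # (4/1)**(1/2) = 2
```

When K1 = K2 and c1 = c2 the two constants coincide and the limsup is pinned down exactly, which is what the Ornstein–Uhlenbeck example exploits.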



Comments and Example

     Note that φ small ⇒ Φ small ⇒ Φ⁻¹ large. Thus, as
     expected, weak mean-reversion results in large fluctuations.

     Take a simple one-dimensional Ornstein–Uhlenbeck process

     \[ dX(t) = -cX(t)\,dt + K\,dB(t) \]

     where, in our notation,
          c1 = c2 = c and K1 = K2 = K,
          φ(x) = x ∈ RV∞(1) ⇒ ζ = 1,
          Φ(x) = x²/2 and Φ⁻¹(x) = √(2x).
     Then applying Theorem 3 recovers the well-known result

     \[ \limsup_{t \to \infty} \frac{|X(t)|}{\sqrt{2 \log t}} = \frac{K}{\sqrt{2c}} \quad \text{a.s.} \]
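A seeded simulation of this Ornstein–Uhlenbeck example, tracking the running maximum of |X(t)| whose long-run size the result identifies as (K/√(2c))·√(2 log t); the parameters are chosen for illustration only.

```python
import numpy as np

# Euler-Maruyama for dX = -cX dt + K dB with hypothetical c = K = 1,
# recording the running maximum of |X(t)|.
rng = np.random.default_rng(42)
c, K, dt, n = 1.0, 1.0, 1e-3, 200_000
noise = K * np.sqrt(dt) * rng.standard_normal(n)

x = 0.0
path = np.empty(n)
for i in range(n):
    x += -c * x * dt + noise[i]          # Euler-Maruyama step
    path[i] = x

running_max = np.maximum.accumulate(np.abs(path))
assert np.isfinite(running_max).all()
assert (np.diff(running_max) >= 0).all()  # running max is nondecreasing
```

On any finite horizon the running maximum grows slowly, consistent with the √(log t) envelope: most of its growth happens early, and ever-longer waits are needed for each new record.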
Introduction     Bounded Noise      Extensions      Future work



Outline of proofs

      Apply a time-change and Itô transformation to reduce
      dimensionality and facilitate the use of one-dimensional theory,

      Use the bounding conditions on f and g to construct upper
      and lower comparison processes whose dynamics are not
      determined by the switching parameter Y ,

      Use a stochastic comparison theorem to prove that our
      transformed process is bounded above and below by the
      comparison processes for all t ≥ 0 a.s.,

      The large deviations of the comparison processes are
      determined by means of a classical theorem of Motoo.
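The comparison step can be illustrated with a toy simulation: two Euler-discretised processes driven by the same Brownian path, with ordered drifts (the drifts -c·x + 1 vs. -c·x and all parameters below are hypothetical illustrations, not the ones from the proof), stay ordered along the whole path:

```python
import numpy as np

# Toy illustration of the stochastic comparison step: two processes
# driven by the SAME Brownian increments, with ordered drifts
# (hypothetical choices: -c*x + 1 vs. -c*x) and equal constant
# diffusion. The diffusion terms cancel in the difference, so the
# gap obeys a deterministic recursion and stays >= 0.
rng = np.random.default_rng(1)
c, K = 1.0, 0.5
dt, n = 0.01, 10_000

x_up = x_lo = 0.0
gap_min = float("inf")
for _ in range(n):
    dw = np.sqrt(dt) * rng.standard_normal()   # one shared noise increment
    x_up += (-c * x_up + 1.0) * dt + K * dw    # larger drift
    x_lo += (-c * x_lo) * dt + K * dw          # smaller drift
    gap_min = min(gap_min, x_up - x_lo)

print(gap_min >= 0.0)  # ordering x_up >= x_lo holds along the whole path
```

With a state-dependent diffusion the cancellation is no longer exact, which is why the proof needs a genuine stochastic comparison theorem rather than this pathwise argument.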
Introduction     Bounded Noise     Extensions      Future work



Extensions

      We can impose a rate of nonlinear growth of g with x as
       x → ∞ through an increasing scalar function γ.

      Then the growth rate of the deviations is of the order
      Ψ⁻¹(log t), where Ψ(x) = ∫₁ˣ φ(u)/γ²(u) du.

      Using norm equivalence in Rd we can study the size of the
      largest component of the system rather than the norm, to get
      upper and lower bounds of the form

              α             max1≤j≤d |Xj (t)|
         0 < ──── ≤ lim sup ───────────────── ≤ β ≤ +∞,           a.s.
              √d      t→∞          ρ(t)

      We can extend the state space of the Markov chain to a
      countable state space.
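The 1/√d factor in the lower bound comes from the elementary norm equivalence max_j |x_j| ≤ |x|₂ ≤ √d · max_j |x_j| in Rᵈ, which converts a bound on the Euclidean norm of X(t) into one on its largest component. A quick numerical sanity check:

```python
import numpy as np

# Norm equivalence in R^d:  max_j |x_j| <= |x|_2 <= sqrt(d) * max_j |x_j|.
# Bounding the Euclidean norm of X(t) therefore bounds the largest
# component up to a factor sqrt(d), which is where the 1/sqrt(d) on the
# left-hand side of the displayed bound comes from.
rng = np.random.default_rng(2)
d = 5
for _ in range(1000):
    v = rng.standard_normal(d)
    inf_norm = np.max(np.abs(v))
    two_norm = np.linalg.norm(v)
    assert inf_norm <= two_norm <= np.sqrt(d) * inf_norm + 1e-12
print("norm equivalence verified on 1000 random vectors")
```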
Introduction     Bounded Noise     Extensions        Future work



Future work

     To study finite-dimensional processes which are always
     positive, e.g. the Cox-Ingersoll-Ross (CIR) model.

     To investigate the growth and explosion results developed here
     by simulation. This will involve stochastic numerical
     analysis and Monte Carlo methods.

     This simulation will allow us to determine whether the
     qualitative features of our dynamical systems are preserved
     when analysed in discrete time.

     To investigate whether our results can be extended to SDEs
     with delay.
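On the discrete-time question: for the CIR model dX(t) = κ(θ − X(t))dt + σ√X(t) dB(t), a naive Euler step can ask for the square root of a negative number, so positivity is exactly the kind of qualitative feature that can fail under discretisation. A minimal full-truncation Euler sketch (illustrative parameters, not taken from the talk):

```python
import numpy as np

# Full-truncation Euler scheme for the CIR model
#   dX = kappa*(theta - X) dt + sigma*sqrt(X) dB.
# Drift and diffusion are evaluated at max(X, 0), which keeps the scheme
# well defined even though the iterates themselves may dip below zero;
# the continuous-time CIR process, by contrast, is non-negative.
rng = np.random.default_rng(3)
kappa, theta, sigma = 1.0, 0.04, 0.6   # illustrative; Feller condition fails
dt, n = 0.01, 50_000

x = theta
min_seen = x
for _ in range(n):
    xp = max(x, 0.0)                   # full truncation
    x += kappa * (theta - xp) * dt + sigma * np.sqrt(xp * dt) * rng.standard_normal()
    min_seen = min(min_seen, x)

# min_seen is typically negative here: the discrete scheme need not
# preserve the positivity of the continuous model.
print(min_seen)
```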
(Vedika) Low Rate Call Girls in Pune Call Now 8250077686 Pune Escorts 24x7
 
TEST BANK For Corporate Finance, 13th Edition By Stephen Ross, Randolph Weste...
TEST BANK For Corporate Finance, 13th Edition By Stephen Ross, Randolph Weste...TEST BANK For Corporate Finance, 13th Edition By Stephen Ross, Randolph Weste...
TEST BANK For Corporate Finance, 13th Edition By Stephen Ross, Randolph Weste...
 
Stock Market Brief Deck (Under Pressure).pdf
Stock Market Brief Deck (Under Pressure).pdfStock Market Brief Deck (Under Pressure).pdf
Stock Market Brief Deck (Under Pressure).pdf
 
Booking open Available Pune Call Girls Shivane 6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Shivane  6297143586 Call Hot Indian Gi...Booking open Available Pune Call Girls Shivane  6297143586 Call Hot Indian Gi...
Booking open Available Pune Call Girls Shivane 6297143586 Call Hot Indian Gi...
 
Call Girls Service Nagpur Maya Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Maya Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Maya Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Maya Call 7001035870 Meet With Nagpur Escorts
 
The Economic History of the U.S. Lecture 19.pdf
The Economic History of the U.S. Lecture 19.pdfThe Economic History of the U.S. Lecture 19.pdf
The Economic History of the U.S. Lecture 19.pdf
 
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
Best VIP Call Girls Noida Sector 18 Call Me: 8448380779
 
Malad Call Girl in Services 9892124323 | ₹,4500 With Room Free Delivery
Malad Call Girl in Services  9892124323 | ₹,4500 With Room Free DeliveryMalad Call Girl in Services  9892124323 | ₹,4500 With Room Free Delivery
Malad Call Girl in Services 9892124323 | ₹,4500 With Room Free Delivery
 
The Economic History of the U.S. Lecture 20.pdf
The Economic History of the U.S. Lecture 20.pdfThe Economic History of the U.S. Lecture 20.pdf
The Economic History of the U.S. Lecture 20.pdf
 
The Economic History of the U.S. Lecture 21.pdf
The Economic History of the U.S. Lecture 21.pdfThe Economic History of the U.S. Lecture 21.pdf
The Economic History of the U.S. Lecture 21.pdf
 
VIP Call Girls LB Nagar ( Hyderabad ) Phone 8250192130 | ₹5k To 25k With Room...
VIP Call Girls LB Nagar ( Hyderabad ) Phone 8250192130 | ₹5k To 25k With Room...VIP Call Girls LB Nagar ( Hyderabad ) Phone 8250192130 | ₹5k To 25k With Room...
VIP Call Girls LB Nagar ( Hyderabad ) Phone 8250192130 | ₹5k To 25k With Room...
 
The Economic History of the U.S. Lecture 30.pdf
The Economic History of the U.S. Lecture 30.pdfThe Economic History of the U.S. Lecture 30.pdf
The Economic History of the U.S. Lecture 30.pdf
 
Veritas Interim Report 1 January–31 March 2024
Veritas Interim Report 1 January–31 March 2024Veritas Interim Report 1 January–31 March 2024
Veritas Interim Report 1 January–31 March 2024
 
(ANIKA) Budhwar Peth Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANIKA) Budhwar Peth Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANIKA) Budhwar Peth Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANIKA) Budhwar Peth Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
Booking open Available Pune Call Girls Wadgaon Sheri 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Wadgaon Sheri  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Wadgaon Sheri  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Wadgaon Sheri 6297143586 Call Hot Ind...
 
Call Girls Koregaon Park Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Koregaon Park Call Me 7737669865 Budget Friendly No Advance BookingCall Girls Koregaon Park Call Me 7737669865 Budget Friendly No Advance Booking
Call Girls Koregaon Park Call Me 7737669865 Budget Friendly No Advance Booking
 
20240429 Calibre April 2024 Investor Presentation.pdf
20240429 Calibre April 2024 Investor Presentation.pdf20240429 Calibre April 2024 Investor Presentation.pdf
20240429 Calibre April 2024 Investor Presentation.pdf
 
03_Emmanuel Ndiaye_Degroof Petercam.pptx
03_Emmanuel Ndiaye_Degroof Petercam.pptx03_Emmanuel Ndiaye_Degroof Petercam.pptx
03_Emmanuel Ndiaye_Degroof Petercam.pptx
 
Top Rated Pune Call Girls Viman Nagar ⟟ 6297143586 ⟟ Call Me For Genuine Sex...
Top Rated  Pune Call Girls Viman Nagar ⟟ 6297143586 ⟟ Call Me For Genuine Sex...Top Rated  Pune Call Girls Viman Nagar ⟟ 6297143586 ⟟ Call Me For Genuine Sex...
Top Rated Pune Call Girls Viman Nagar ⟟ 6297143586 ⟟ Call Me For Genuine Sex...
 

Chester Nov08 Terry Lynch

  • 1. Introduction Bounded Noise Extensions Future work On the Growth of the Extreme Fluctuations of SDEs with Markovian Switching Terry Lynch (joint work with Dr. John Appleby) Dublin City University, Ireland. Leverhulme International Network University of Chester, UK. November 7th 2008 Supported by the Irish Research Council for Science, Engineering and Technology
  • 2. Introduction Bounded Noise Extensions Future work Outline 1 Introduction Motivation Regular Variation 2 Bounded Noise Main Results Outline of Proofs 3 Extensions 4 Future work
  • 4. Recap — one year ago.
    Theorem 1. Let X be the unique adapted continuous solution satisfying
      dX(t) = f(X(t), t) dt + g(X(t), t) dB(t).
    If there exist ρ > 0 and real numbers K1 and K2 such that, for all (x, t) ∈ ℝ × [0, ∞),
      x f(x, t) ≤ ρ  and  0 < K2 ≤ g²(x, t) ≤ K1,
    then X satisfies
      lim sup_{t→∞} |X(t)| / √(2t log log t) ≤ K1  a.s.
    Question: why is everyone using that iterated logarithm?!
  • 8. Introduction.
    We consider the rate of growth of the partial maxima of the solution {X(t)}_{t≥0} of a nonlinear finite-dimensional stochastic differential equation (SDE) with Markovian switching.
    We find upper and lower estimates on this rate of growth by finding constants α and β and an increasing function ρ such that
      0 < α ≤ lim sup_{t→∞} X(t) / ρ(t) ≤ β  a.s.
    We look at processes which have mean-reverting drift terms and which have either: (a) bounded noise intensity, or (b) unbounded noise intensity.
  • 13. Introduction Bounded Noise Extensions Future work Motivation The ability to model a self-regulating economic system which is subjected to persistent stochastic shocks is facilitated by the use of mean-reverting drift terms. The type of finite-dimensional system studied allows for the analysis of heavy tailed returns distributions in stochastic volatility market models in which many assets are traded. Equations with Markovian switching are motivated by econometric evidence which suggests that security prices often move from confident to nervous (or other) regimes.
  • 16. Introduction Bounded Noise Extensions Future work Motivational case study To demonstrate the potential impact of swift changes in investor sentiment, we look at a case study of Elan Corporation.
  • 21. Regular Variation.
    Our analysis is aided by the use of regularly varying functions.
    In its basic form, regular variation may be viewed through relations such as
      lim_{x→∞} f(λx) / f(x) = λ^ζ ∈ (0, ∞)  for all λ > 0.
    We say that f is regularly varying at infinity with index ζ, or f ∈ RV_∞(ζ).
    Regularly varying functions have useful properties such as:
      f ∈ RV_∞(ζ) ⇒ F(x) := ∫₁ˣ f(t) dt ∈ RV_∞(ζ + 1),
      f ∈ RV_∞(ζ) ⇒ f⁻¹ ∈ RV_∞(1/ζ).
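The three displayed properties can be checked numerically. The sketch below is not from the talk; the function f(x) = x^1.5 and all parameter values are illustrative choices, using a pure power function, the simplest member of RV_∞(ζ):

```python
import math

# Illustrative check (assumed example, not from the talk): f(x) = x**1.5
# lies in RV_infinity(zeta) with zeta = 1.5.
zeta = 1.5
f = lambda x: x**zeta

lam, x = 3.0, 1e6

# Defining limit: f(lam*x)/f(x) -> lam**zeta as x -> infinity.
ratio = f(lam * x) / f(x)                      # ~ 3**1.5 ~ 5.196

# Integral property: F(x) = int_1^x f(t) dt lies in RV_infinity(zeta + 1).
F = lambda x: (x**(zeta + 1) - 1.0) / (zeta + 1)
ratio_F = F(lam * x) / F(x)                    # ~ 3**2.5 ~ 15.588

# Inverse property: f^{-1}(y) = y**(1/zeta) lies in RV_infinity(1/zeta).
f_inv = lambda y: y**(1.0 / zeta)
ratio_inv = f_inv(lam * x) / f_inv(x)          # ~ 3**(2/3) ~ 2.080
```

For a general f ∈ RV_∞(ζ) the same limits hold, with the slowly varying factor washing out as x → ∞.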
  • 25. Introduction Bounded Noise Extensions Future work Outline 1 Introduction Motivation Regular Variation 2 Bounded Noise Main Results Outline of Proofs 3 Extensions 4 Future work
  • 26. SDE with Markovian switching.
    We study the finite-dimensional SDE with Markovian switching
      dX(t) = f(X(t), Y(t)) dt + g(X(t), Y(t)) dB(t)   (1)
    where
      Y is a Markov chain with finite irreducible state space S,
      f : ℝ^d × S → ℝ^d and g : ℝ^d × S → ℝ^{d×r} are locally Lipschitz continuous functions,
      B is an r-dimensional Brownian motion, independent of Y.
    Under these conditions, there is a unique continuous and adapted process which satisfies (1).
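For intuition, an equation of the form (1) can be simulated with an Euler–Maruyama scheme. The sketch below is a hedged illustration, not the authors' code: the scalar mean-reverting drift/diffusion pairs, the two-state chain on S = {0, 1}, and the transition rates q01, q10 are all assumed for demonstration.

```python
import math
import random

# Hedged sketch: Euler-Maruyama for a scalar SDE with Markovian switching,
#   dX(t) = f(X(t), Y(t)) dt + g(X(t), Y(t)) dB(t),
# with illustrative regime-dependent coefficients.
random.seed(1)

f = lambda x, y: -(1.0 if y == 0 else 3.0) * x   # mean-reverting drift per regime
g = lambda x, y: (0.5 if y == 0 else 1.5)        # bounded noise intensity per regime
q01, q10 = 0.2, 0.4                              # assumed chain transition rates

def simulate(T=20.0, dt=1e-2, x0=0.0, y0=0):
    x, y = x0, y0
    for _ in range(int(T / dt)):
        # switch the chain with probability ~ rate * dt
        u = random.random()
        if y == 0 and u < q01 * dt:
            y = 1
        elif y == 1 and u < q10 * dt:
            y = 0
        dB = random.gauss(0.0, math.sqrt(dt))    # Brownian increment
        x += f(x, y) * dt + g(x, y) * dB
    return x

xs = [simulate() for _ in range(200)]
mean = sum(xs) / len(xs)   # mean reversion keeps this near 0
```

The mean-reverting drift pulls the sample paths back towards zero in both regimes, which is the behaviour the bounded-noise theorems that follow quantify.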
  • 31. Bounded Noise.
    Consider the i-th component of our SDE:
      dX_i(t) = f_i(X(t), Y(t)) dt + Σ_{j=1}^r g_ij(X(t), Y(t)) dB_j(t).
    We suppose that the noise intensity g is bounded in the sense that
      there exists K2 > 0 s.t. ‖g(x, y)‖_F ≤ K2 for all y ∈ S,
      there exists K1 > 0 s.t. inf_{x ∈ ℝ^d, y ∈ S} Σ_{j=1}^r (Σ_{i=1}^d x_i g_ij(x, y))² / ‖x‖² ≥ K1².
    By Cauchy–Schwarz we get the following upper and lower estimate:
      K1² ≤ ‖g(x, y)‖²_F ≤ K2²  for all y ∈ S.
  • 36. Upper bound.
    The strength of the restoring force is characterised by a locally Lipschitz continuous function φ : [0, ∞) → (0, ∞) with xφ(x) → ∞ as x → ∞, where
      lim sup_{‖x‖→∞} sup_{y∈S} ⟨x, f(x, y)⟩ / (‖x‖ φ(‖x‖)) ≤ −c2 ∈ (−∞, 0).   (2)
    Theorem 1. Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and that g satisfies the bounded noise conditions. Then X satisfies
      lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹( (K2²(1+ε) / (2c2)) log t ) ≤ 1  a.s. on Ω_ε,
    where Φ(x) = ∫₁ˣ φ(u) du and Ω_ε is an almost sure event.
  • 39. Lower bound.
    Moreover, we ensure that the degree of non-linearity in f is characterised by φ also, by means of the assumption
      lim sup_{‖x‖→∞} sup_{y∈S} |⟨x, f(x, y)⟩| / (‖x‖ φ(‖x‖)) ≤ c1 ∈ (0, ∞).   (3)
    Theorem 2. Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (3) and that g satisfies the bounded noise conditions. Then X satisfies
      lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹( (K1²(1−ε) / (2c1)) log t ) ≥ 1  a.s. on Ω_ε,
    where Φ(x) = ∫₁ˣ φ(u) du and Ω_ε is an almost sure event.
  • 42. Growth rate of large fluctuations.
    Taking both theorems together, in the special case where φ is a regularly varying function, we get the following:
    Theorem 3. Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ ∈ RV_∞(ζ) satisfying (2) and (3) and that g satisfies the bounded noise conditions. Then X satisfies
      (K1² / (2c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹(log t) ≤ (K2² / (2c2))^{1/(ζ+1)}  a.s.,
    where Φ(x) = ∫₁ˣ φ(u) du ∈ RV_∞(ζ + 1).
  • 45. Comments and Example.
    Note that φ small ⇒ Φ small ⇒ Φ⁻¹ large. Thus, as expected, weak mean-reversion results in large fluctuations.
    Take a simple one-dimensional Ornstein–Uhlenbeck process
      dX(t) = −cX(t) dt + K dB(t)
    where, in our notation, c1 = c2 = c and K1 = K2 = K. Then
      φ(x) = x ∈ RV_∞(1) ⇒ ζ = 1,
      Φ(x) = x²/2 and Φ⁻¹(x) = √(2x).
    Applying Theorem 3 recovers the well-known result
      lim sup_{t→∞} |X(t)| / √(2 log t) = K / √(2c)  a.s.
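As a quick arithmetic check of this slide's algebra (the values of c and K below are illustrative, not from the talk):

```python
import math

# With phi(x) = x (so zeta = 1), Phi(x) = x**2 / 2 has inverse
# Phi_inv(y) = sqrt(2*y), and the Theorem 3 constant
# (K**2 / (2*c))**(1/(zeta+1)) reduces to K / sqrt(2*c).
c, K = 0.7, 1.3      # assumed example values
zeta = 1.0

Phi = lambda x: x**2 / 2.0
Phi_inv = lambda y: math.sqrt(2.0 * y)

const_thm3 = (K**2 / (2.0 * c)) ** (1.0 / (zeta + 1.0))
const_ou = K / math.sqrt(2.0 * c)

# Sanity check: Phi_inv really inverts Phi.
x = 3.1
roundtrip = Phi_inv(Phi(x))
```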
  • 54. Introduction Bounded Noise Extensions Future work Outline of proofs Apply time–change and Itˆ transformation to reduce o dimensionality and facilitate the use of one-dimensional theory, Use the bounding conditions on f and g to construct upper and lower comparison processes whose dynamics are not determined by the switching parameter Y , Use stochastic comparison theorem to prove that our transformed process is bounded above and below by the comparison processes for all t ≥ 0 a.s., The large deviations of the comparison processes are determined by means of a classical theorem of Motoo.
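The stochastic-comparison step can be illustrated with a toy computation. This is a sketch under invented drifts (−x − 1 and −x + 1), not the comparison processes used in the actual proof: two Euler–Maruyama schemes driven by the same Brownian increments, whose drifts are pointwise ordered, remain ordered for all time.

```python
import math
import random

def euler_paths(drift1, drift2, sigma, x0, T, dt, seed=0):
    """Drive two Euler-Maruyama schemes with the SAME Brownian
    increments; when drift1(x) <= drift2(x) for all x, the comparison
    theorem says path 1 stays below path 2."""
    rng = random.Random(seed)
    n = int(T / dt)
    x1 = x2 = x0
    path1, path2 = [x1], [x2]
    s = math.sqrt(dt)
    for _ in range(n):
        dW = rng.gauss(0.0, 1.0) * s   # shared noise increment
        x1 += drift1(x1) * dt + sigma * dW
        x2 += drift2(x2) * dt + sigma * dW
        path1.append(x1)
        path2.append(x2)
    return path1, path2

# drift1(x) = -x - 1 <= drift2(x) = -x + 1 for every x
p1, p2 = euler_paths(lambda x: -x - 1, lambda x: -x + 1,
                     sigma=1.0, x0=0.0, T=10.0, dt=0.01, seed=1)
print(all(a <= b for a, b in zip(p1, p2)))  # prints True
```

Because the diffusion coefficients agree, the noise cancels in the difference of the two schemes, and the ordered drifts keep the gap non-negative step by step.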
  • 58. Introduction Bounded Noise Extensions Future work Outline 1 Introduction Motivation Regular Variation 2 Bounded Noise Main Results Outline of Proofs 3 Extensions 4 Future work
• 59. Introduction Bounded Noise Extensions Future work Extensions
We can impose a rate of nonlinear growth of g with x as x → ∞ through an increasing scalar function γ. Then the growth rate of the deviations is of the order Ψ⁻¹(log t), where Ψ(x) = ∫₁ˣ φ(u)/γ²(u) du.
Using norm equivalence in Rᵈ we can study the size of the largest component of the system rather than the norm, to get upper and lower bounds of the form
    0 < α/√d ≤ lim sup_{t→∞} max_{1≤j≤d} |X_j(t)| / ρ(t) ≤ β < +∞,   a.s.
We can extend the state space of the Markov chain to a countable state space.
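The rate Ψ⁻¹(log t) from the first extension can be computed numerically for a given φ and γ. In the sketch below the specific choices φ(u) = u, γ(u) = 1 and the trapezoid/bisection routines are illustrative assumptions; this case has the closed form Ψ(x) = (x² − 1)/2 and Ψ⁻¹(y) = √(2y + 1), which the numerical inverse is checked against.

```python
import math

def Psi(x, phi, gamma, lower=1.0, n=2000):
    """Approximate Psi(x) = integral from 1 to x of phi(u)/gamma(u)^2 du
    by the trapezoid rule."""
    if x <= lower:
        return 0.0
    h = (x - lower) / n
    f = lambda u: phi(u) / gamma(u) ** 2
    s = 0.5 * (f(lower) + f(x)) + sum(f(lower + i * h) for i in range(1, n))
    return s * h

def Psi_inv(y, phi, gamma, hi=1e6):
    """Invert the increasing function Psi by bisection on [1, hi]."""
    lo = 1.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if Psi(mid, phi, gamma) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

phi = lambda u: u          # example: phi in RV_infinity(1)
gamma = lambda u: 1.0      # example: constant diffusion scaling
t = 1000.0
rate = Psi_inv(math.log(t), phi, gamma)      # numerical Psi^{-1}(log t)
print(rate, math.sqrt(2 * math.log(t) + 1))  # closed-form comparison
```

The same two routines apply unchanged to any increasing γ and regularly varying φ for which no closed form is available.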
  • 63. Introduction Bounded Noise Extensions Future work Outline 1 Introduction Motivation Regular Variation 2 Bounded Noise Main Results Outline of Proofs 3 Extensions 4 Future work
• 64. Introduction Bounded Noise Extensions Future work Future work
To study finite-dimensional processes which are always positive, e.g. the Cox–Ingersoll–Ross (CIR) model.
To investigate the growth and explosion of solutions, and to simulate our developed results. This will involve stochastic numerical analysis and Monte Carlo simulation, which will allow us to determine whether the qualitative features of our dynamical systems are preserved when analysed in discrete time.
To investigate whether our results can be extended to SDEs with delay.
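As a starting point for the simulation work mentioned above, here is a minimal positivity-preserving Euler scheme for the CIR model. This is a common textbook variant (truncating the state at zero after each step), not a scheme proposed in the talk, and all parameter values are illustrative:

```python
import math
import random

def cir_euler(kappa, theta, sigma, x0, T, dt, seed=0):
    """Euler scheme for the CIR model dX = kappa*(theta - X) dt + sigma*sqrt(X) dB,
    truncating the state at zero after each step so sqrt(X) stays well defined
    and the discretised path is non-negative."""
    rng = random.Random(seed)
    n = int(T / dt)
    x = x0
    path = [x]
    s = math.sqrt(dt)
    for _ in range(n):
        dW = rng.gauss(0.0, 1.0) * s
        x = x + kappa * (theta - x) * dt + sigma * math.sqrt(x) * dW
        x = max(x, 0.0)   # truncation keeps the scheme non-negative
        path.append(x)
    return path

# Illustrative parameters; the Feller condition 2*kappa*theta > sigma^2 holds
path = cir_euler(kappa=2.0, theta=1.0, sigma=1.0, x0=1.0, T=50.0, dt=0.01, seed=7)
print(min(path) >= 0.0)  # prints True
```

Whether such a discretisation preserves the qualitative features of the continuous-time model (here, strict positivity rather than mere non-negativity) is exactly the kind of question the simulation study would address.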