Auto-regressive Processes

B. Nag and J. Christophersen

MET - 6155

November 09, 2011
Outline of the talk

- Introduction of AR(p) Processes
- Formal Definition
- White Noise
- Deriving the First Moment
- Deriving the Second Moment
- Lag 1: AR(1)
- Lag 2: AR(2)
Introduction



Dynamics of many physical processes:

\[ a_2 \frac{d^2 x(t)}{dt^2} + a_1 \frac{dx(t)}{dt} + a_0 x(t) = z(t) \tag{1} \]

where z(t) is some external forcing function.
Time discretization yields

\[ x_t = \alpha_1 x_{t-1} + \alpha_2 x_{t-2} + z_t \tag{2} \]
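To make eq. (2) concrete, here is a minimal simulation sketch; the coefficient values and Gaussian white noise are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha1, alpha2 = 0.5, 0.3       # illustrative AR coefficients (assumed)
n = 1000
z = rng.standard_normal(n)      # white noise forcing z_t
x = np.zeros(n)
for t in range(2, n):
    # eq. (2): x_t = alpha1*x_{t-1} + alpha2*x_{t-2} + z_t
    x[t] = alpha1 * x[t-1] + alpha2 * x[t-2] + z[t]
```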




Formal Definition



{X_t : t ∈ Z} is an auto-regressive process of order p if there exist real
constants α_k, k = 0, . . . , p, with α_p ≠ 0, and a white noise process
{Z_t : t ∈ Z} such that

\[ X_t = \alpha_0 + \sum_{k=1}^{p} \alpha_k X_{t-k} + Z_t \tag{3} \]

Note: X_t is independent of the part of Z_t that is in the future, but
depends on the parts of the noise process that are in the present and
the past.
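Eq. (3) translates directly into a general AR(p) simulator; the helper name simulate_ar and its defaults below are assumptions for illustration, and the later sketches reuse this hypothetical helper:

```python
import numpy as np

def simulate_ar(alphas, alpha0=0.0, n=1000, sigma_z=1.0, seed=0):
    """Simulate X_t = alpha0 + sum_{k=1..p} alphas[k-1]*X_{t-k} + Z_t, eq. (3)."""
    rng = np.random.default_rng(seed)
    alphas = np.asarray(alphas, dtype=float)
    p = len(alphas)
    z = rng.normal(0.0, sigma_z, n)          # white noise Z_t
    x = np.zeros(n)
    for t in range(p, n):
        # x[t-p:t][::-1] is (X_{t-1}, ..., X_{t-p})
        x[t] = alpha0 + alphas @ x[t-p:t][::-1] + z[t]
    return x
```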




White Noise

Consider a time series:

\[ X_t = D_t + N_t \tag{4} \]

with D_t and N_t being the determined and stochastic (random) components,
respectively. If D_t is independent of N_t, then D_t is deterministic.
N_t masks deterministic oscillations when present.
Let us consider the case k = 1:

\begin{align*}
X_t &= \alpha_1 X_{t-1} + N_t \\
    &= \alpha_1 (D_{t-1} + N_{t-1}) + N_t \\
    &= \alpha_1 D_{t-1} + \alpha_1 N_{t-1} + N_t
\end{align*}

where α_1 N_{t-1} can be regarded as the contribution from the dynamics of
the white noise. The spectrum of a white noise process is flat, hence the
name.
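The flat-spectrum claim can be checked numerically. A minimal sketch, assuming a simple periodogram averaged over a 100-member ensemble as in the plots:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_ens = 512, 100
psd = np.zeros(n // 2 + 1)
for _ in range(n_ens):                    # average over the ensemble
    z = rng.standard_normal(n)
    psd += np.abs(np.fft.rfft(z))**2 / n  # periodogram of one member
psd /= n_ens
# For white noise the averaged periodogram is roughly constant in frequency
print(psd.std() / psd.mean())             # small relative spread => flat
```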


Figure: A realization of a process X_t = D_t + N_t for which the dynamical
component D_t = 0.7 X_{t-1} is affected by the stochastic component N_t.
(a) N_t; (b) X_t. All plots are made from a 100-member ensemble.
First Order Moment: Mean of an AR(p) Process

Assumptions: μ_X and σ_X^2 are independent of time.
Taking expectations on both sides of the generalized eqn. (3), and using
ε(Z_t) = 0 together with stationarity, ε(X_{t-k}) = ε(X_t):

\begin{align}
\varepsilon(X_t) &= \varepsilon(\alpha_0) + \varepsilon\Big(\sum_{k=1}^{p} \alpha_k X_{t-k}\Big) + \varepsilon(Z_t) \nonumber \\
&= \alpha_0 + \sum_{k=1}^{p} \alpha_k \varepsilon(X_{t-k}) \nonumber \\
&= \alpha_0 + \sum_{k=1}^{p} \alpha_k \varepsilon(X_t) \nonumber \\
\Rightarrow \quad \varepsilon(X_t) &= \frac{\alpha_0}{1 - \sum_{k=1}^{p} \alpha_k} \tag{5}
\end{align}
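A sketch checking eq. (5) against a long simulation, reusing the hypothetical simulate_ar helper from above with illustrative parameters:

```python
alphas, alpha0 = [0.5, 0.3], 2.0
x = simulate_ar(alphas, alpha0=alpha0, n=200_000)
print(x.mean())                    # sample mean of the simulated series
print(alpha0 / (1 - sum(alphas)))  # eq. (5): 2.0 / (1 - 0.8) = 10.0
```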




Second Order Moment: Variance of an AR(p) Process

Proposition:

\[ \mathrm{Var}(X_t) = \sum_{k=1}^{p} \alpha_k \rho_k \,\mathrm{Var}(X_t) + \mathrm{Var}(Z_t) \]

Proof: Let μ = ε(X_t); then, rewriting eqn. (3),

\[ X_t - \mu = \sum_{k=1}^{p} \alpha_k (X_{t-k} - \mu) + Z_t \tag{6} \]

Multiplying both sides by X_t − μ and taking expectations:

\begin{align*}
\mathrm{Var}(X_t) &= \varepsilon\big((X_t - \mu)^2\big) \\
&= \varepsilon\Big(\sum_{k=1}^{p} \alpha_k (X_t - \mu)(X_{t-k} - \mu)\Big) + \varepsilon\big((X_t - \mu) Z_t\big) \\
&= \sum_{k=1}^{p} \alpha_k \varepsilon\big((X_t - \mu)(X_{t-k} - \mu)\big) + \varepsilon\big((X_t - \mu) Z_t\big)
\end{align*}
\[ \mathrm{Var}(X_t) = \sum_{k=1}^{p} \alpha_k \rho_k \,\mathrm{Var}(X_t) + \varepsilon\big((X_t - \mu) Z_t\big) \tag{7} \]

where ρ_k is the auto-correlation function, defined as

\[ \rho_k = \frac{\varepsilon\big((X_t - \mu)(X_{t-k} - \mu)\big)}{\mathrm{Var}(X_t)} \tag{8} \]
Lemma: ε((X_t − μ) Z_t) = Var(Z_t)
Proof:

\begin{align}
\varepsilon\big((X_t - \mu) Z_t\big) &= \varepsilon(X_t Z_t - \mu Z_t) \nonumber \\
&= \varepsilon(X_t Z_t) - \varepsilon(\mu Z_t) \tag{9}
\end{align}

Again, substituting eqn. (6) for X_t,

\begin{align*}
\varepsilon(X_t Z_t) &= \varepsilon\Big(\Big(\sum_{k=1}^{p} \alpha_k (X_{t-k} - \mu) + Z_t + \mu\Big) Z_t\Big) \\
&= \varepsilon\Big(\sum_{k=1}^{p} \alpha_k X_{t-k} Z_t\Big) - \Big(\sum_{k=1}^{p} \alpha_k\Big)\varepsilon(\mu Z_t) + \varepsilon(Z_t^2) + \varepsilon(\mu Z_t)
\end{align*}
\begin{align}
\varepsilon(X_t Z_t) &= \sum_{k=1}^{p} \alpha_k \varepsilon(X_{t-k} Z_t) + \varepsilon(Z_t^2) \nonumber \\
&= \sum_{k=1}^{p} \alpha_k \varepsilon(X_{t-k} Z_t) + \mathrm{Var}(Z_t) \tag{10}
\end{align}

(since ε(Z_t) = 0, ε(Z_t^2) = Var(Z_t)). Since X_t is independent of the
part of Z_t that is in the future, X_{t-k} and Z_t are independent for
k ≥ 1. Hence

\[ \varepsilon(X_{t-k} Z_t) = 0 \]

and we get

\[ \varepsilon(X_t Z_t) = \mathrm{Var}(Z_t) \tag{11} \]

From equation (5),

\begin{align*}
\varepsilon(\mu Z_t) &= \mu\, \varepsilon(Z_t) \\
&= \frac{\alpha_0}{1 - \sum_{k=1}^{p} \alpha_k}\, \varepsilon(Z_t) \\
&= \frac{\alpha_0}{1 - \sum_{k=1}^{p} \alpha_k} \times 0 \\
&= 0
\end{align*}
Thus

\[ \varepsilon\big((X_t - \mu) Z_t\big) = \mathrm{Var}(Z_t) \tag{12} \]

and eqn. (7) reduces to

\[ \mathrm{Var}(X_t) = \sum_{k=1}^{p} \alpha_k \rho_k \,\mathrm{Var}(X_t) + \mathrm{Var}(Z_t) \]

\[ \mathrm{Var}(X_t) = \frac{\mathrm{Var}(Z_t)}{1 - \sum_{k=1}^{p} \alpha_k \rho_k} \tag{13} \]
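Eq. (13) can likewise be checked by simulation. A sketch, again using the hypothetical simulate_ar helper, with sample autocorrelations standing in for ρ_k:

```python
import numpy as np

alphas = np.array([0.5, 0.3])
x = simulate_ar(alphas, n=500_000)       # hypothetical helper from above
xc = x - x.mean()
var_x = xc.var()
# sample autocorrelations rho_1 and rho_2, eq. (8)
rho = np.array([np.mean(xc[k:] * xc[:-k]) / var_x for k in (1, 2)])
# eq. (13) with Var(Z_t) = 1; both printed values should nearly agree
print(var_x, 1.0 / (1.0 - np.sum(alphas * rho)))
```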




AR(1) Processes

Consider the following equation:

\[ a_1 \frac{dx}{dt} + a_0 x = z(t) \tag{14} \]

Discretizing again:

\begin{align*}
a_1 (x_t - x_{t-1}) + a_0 x_t &= z_t \\
a_1 x_t - a_1 x_{t-1} + a_0 x_t &= z_t \\
(a_1 + a_0)\, x_t - a_1 x_{t-1} &= z_t
\end{align*}

Therefore we obtain:

\[ x_t = \alpha_1 x_{t-1} + \tilde{z}_t \tag{15} \]

where α_1 = a_1/(a_1 + a_0) and the rescaled noise is \tilde{z}_t = z_t/(a_1 + a_0).




AR(1) Processes Continued

Hence an AR(1) process can be represented as

\[ X_t = \alpha_1 X_{t-1} + Z_t \tag{16} \]

For convenience we assume α_0 = 0 and ε(X_t) = μ = 0.
The expectation of the product of X_t with X_{t-1} is

\[ \varepsilon(X_t X_{t-1}) = \alpha_1 \varepsilon(X_{t-1}^2) + \varepsilon(Z_t X_{t-1}) \]

Since X_t does not depend on the part of Z_t that is in the future,

\[ \varepsilon(Z_t X_{t-1}) = 0 \]

Also, since the variance is independent of time,

\[ \varepsilon(X_t X_{t-1}) = \alpha_1 \varepsilon(X_t^2) \tag{17} \]

Hence,

\[ \alpha_1 = \frac{\varepsilon(X_t X_{t-1})}{\mathrm{Var}(X_t)} \tag{18} \]

AR(1) Processes Continued

Substituting k = 1 in eqn. (8) yields

\[ \rho_1 = \frac{\varepsilon(X_t X_{t-1})}{\mathrm{Var}(X_t)} \tag{19} \]

Hence ρ_1 = α_1.
Using this we can write eqn. (13) for an AR(1) process as

\begin{align}
\mathrm{Var}(X_t) &= \frac{\mathrm{Var}(Z_t)}{1 - \sum_{k=1}^{p} \alpha_k \rho_k} \nonumber \\
&= \frac{\sigma_Z^2}{1 - \alpha_1^2} \tag{20}
\end{align}

This result shows that the variance of the random variable X_t is a linear
function of the variance of the white noise, σ_Z^2, but a nonlinear function
of α_1. If α_1 ≈ 0, then Var(X_t) ≈ Var(Z_t). For α_1 ∈ (0, 1), we see that
Var(X_t) > Var(Z_t). As α_1 approaches 1, Var(X_t) approaches ∞.
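A sketch verifying both ρ_1 ≈ α_1 and eq. (20) for a strongly persistent case; the hypothetical simulate_ar helper and α_1 = 0.9 are illustrative assumptions:

```python
import numpy as np

alpha1 = 0.9
x = simulate_ar([alpha1], n=500_000)
xc = x - x.mean()
rho1 = np.mean(xc[1:] * xc[:-1]) / xc.var()
print(rho1)                            # ~0.9, i.e. rho_1 = alpha_1
print(xc.var(), 1 / (1 - alpha1**2))   # eq. (20): 1/(1-0.81) ~ 5.26
```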

Figure: AR(1) processes with α_1 = 0.3 (top) and α_1 = 0.9 (bottom).


AR(2) Processes

\[ a_2 \frac{d^2 x(t)}{dt^2} + a_1 \frac{dx(t)}{dt} + a_0 x(t) = z(t) \tag{21} \]

where z(t) is some external forcing function.
Time discretization yields

\begin{align*}
a_2 (x_t - 2 x_{t-1} + x_{t-2}) + a_1 (x_t - x_{t-1}) + a_0 x_t &= z_t \\
(a_0 + a_1 + a_2)\, x_t &= (a_1 + 2 a_2)\, x_{t-1} - a_2 x_{t-2} + z_t
\end{align*}

Alternatively,

\[ x_t = \alpha_1 x_{t-1} + \alpha_2 x_{t-2} + \tilde{z}_t \tag{22} \]

where

\[ \alpha_1 = \frac{a_1 + 2 a_2}{a_0 + a_1 + a_2}, \qquad \alpha_2 = -\frac{a_2}{a_0 + a_1 + a_2}, \qquad \tilde{z}_t = \frac{z_t}{a_0 + a_1 + a_2} \]
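The coefficient mapping in eq. (22) as a small sketch; the function name and the test values are illustrative:

```python
def ar2_coeffs(a0, a1, a2):
    """Map ODE coefficients (a0, a1, a2) to AR(2) coefficients, eq. (22)."""
    s = a0 + a1 + a2
    return (a1 + 2 * a2) / s, -a2 / s   # (alpha_1, alpha_2)

print(ar2_coeffs(1.0, 0.5, 0.25))       # illustrative values, assumed
```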
Figure: AR(2) processes with α_1 = 0.9 and α_2 = −0.8 (top) and with
α_1 = α_2 = 0.3 (bottom).

Parameterizing AR(2) Processes

In order for an AR(2) process to be stationary, α_1 and α_2 must satisfy
three conditions:

(1) α_1 + α_2 < 1
(2) α_2 − α_1 < 1
(3) −1 < α_2 < 1

This defines a triangular region in the (α_1, α_2)-plane.
Note that if α_2 = 0 we recover AR(1) processes, for which −1 < α_1 < 1
defines the range of α_1 over which an AR(1) model is stationary.
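The three conditions translate directly into a check, sketched below with an illustrative function name:

```python
def is_stationary_ar2(alpha1, alpha2):
    """AR(2) stationarity: the triangle conditions (1)-(3)."""
    return alpha1 + alpha2 < 1 and alpha2 - alpha1 < 1 and -1 < alpha2 < 1

print(is_stationary_ar2(0.9, -0.6))  # True: the damped oscillatory case below
print(is_stationary_ar2(0.9, 0.3))   # False: violates alpha1 + alpha2 < 1
```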




Parameterizing AR(2) Processes Continued




Figure: Region of stationarity for AR(1) and AR(2) processes.




Parameterizing AR(2) Processes Continued



The figure above shows:

- AR(1) processes are special cases:
  - α_1 > 0 gives exponential decay
  - α_1 < 0 gives damped oscillations
  - α_1 > 0 for most meteorological phenomena
- The second parameter, α_2, adds:
  - A more complex relationship between lags
  - For (α_1, α_2) = (0.9, −0.6), a slow damped oscillation around 0
  - The ability to represent pseudoperiodicity (see the sketch below)
  - Barometric pressure variations due to midlatitude synoptic systems
    follow such pseudoperiodic behavior
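The pseudo-period can be read off the complex characteristic roots. This computation is an addition, not from the slides, but follows standard AR(2) theory: for roots r·e^{±iθ} of m² − α_1 m − α_2 = 0, the pseudo-period is 2π/θ.

```python
import numpy as np

alpha1, alpha2 = 0.9, -0.6
# roots of the characteristic equation m^2 - alpha1*m - alpha2 = 0
roots = np.roots([1.0, -alpha1, -alpha2])
if roots[0].imag != 0:                      # complex => pseudoperiodicity
    period = 2 * np.pi / abs(np.angle(roots[0]))
    print(period)                           # ~6.6 time steps for (0.9, -0.6)
```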




Parameterizing AR(2) Processes Continued




Figure: Four synthetic time series illustrating some properties of
autoregressive models. (a) α_1 = 0.0, α_2 = 0.1; (b) α_1 = 0.5, α_2 = 0.1;
(c) α_1 = 0.9, α_2 = −0.6; (d) α_1 = 0.09, α_2 = 0.11.

References




von Storch, H., and F. W. Zwiers, 1999: Statistical Analysis in Climate
    Research. 1st ed., Cambridge University Press, 494 pp.
Wilks, D., 1995: Statistical Methods in the Atmospheric Sciences. 1st ed.,
    Academic Press, 467 pp.
Scheaffer, R., 1994: Introduction to Probability and Its Applications.
    2nd ed., Duxbury Press, 377 pp.




Questions




                               ??




