Time Series
Ashutosh
Third-Year UG student
Mechanical Engineering Department
IIT Kharagpur
Time Series
• Continuous Time Series
o E.g., the adjustment of a price P in response to non-zero excess demand for a product can be modeled in continuous time as dP/dt = λ.
• Discrete Time Series
o A discrete signal or discrete-time signal is a time series consisting of a sequence of quantities, e.g., weather data.
Components of Time Series [1]
• Trend
• Seasonality
• Cyclicity
General Approach [1]
• Plot the series and examine the main features of the graph, checking in particular whether there is:
o A trend,
o A seasonal component,
o Any apparent sharp changes in behavior,
o Any outlying observations.
• Remove the trend and seasonal components to get stationary residuals.
• Choose a model to fit the residuals, making use of various sample statistics including the sample autocorrelation function.
• Forecast by forecasting the residuals and then inverting the transformations described above to arrive at forecasts of the original series {𝑋𝑡}. (A minimal end-to-end sketch follows below.)
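As a concrete illustration of this workflow, here is a minimal Python sketch, assuming numpy, pandas, and statsmodels are installed; the synthetic monthly series and the period 12 are illustrative choices, not from the original slides:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: linear trend + period-12 seasonality + noise.
rng = np.random.default_rng(0)
t = np.arange(120)
x = pd.Series(0.1 * t + 2.0 * np.sin(2.0 * np.pi * t / 12)
              + rng.standard_normal(120))

# Plot the series, then split it into trend, seasonal, and residual parts.
decomp = seasonal_decompose(x, model="additive", period=12)
residuals = decomp.resid.dropna()      # stationary residuals to be modeled
print(residuals.describe())
```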
Stationarity [1]
• Strict Stationarity
• Weak Stationarity
{𝑋𝑡} is (weakly) stationary if
• the mean function µ_X(t) = E(𝑋𝑡) is independent of t, and
• the covariance function γ_X(t + h, t) is independent of t for each h, where γ_X(r, s) = Cov(𝑋𝑟, 𝑋𝑠) = E[(𝑋𝑟 − µ_X(r))(𝑋𝑠 − µ_X(s))] for all integers r and s; we then write γ_X(h) := γ_X(h, 0).
{𝑋𝑡} is a strictly stationary time series if
• (𝑋1, …, 𝑋𝑛)′ and (𝑋1+ℎ, …, 𝑋𝑛+ℎ)′ have the same joint distribution for all integers h and n ≥ 1.
Removal of Trend [1]
𝑋𝑡 = 𝑚𝑡 + 𝑠𝑡 + 𝑌𝑡, t = 1, …, n, where E(𝑌𝑡) = 0, 𝑠𝑡+𝑑 = 𝑠𝑡, and Σ_{j=1}^{d} 𝑠𝑗 = 0
• By estimation of 𝑚𝑡 and 𝑠𝑡
o Smoothing with a finite moving average filter
o Exponential smoothing
o Smoothing by elimination of high-frequency components
o Polynomial fitting
• By differencing the series {𝑋𝑡} (lag-1 difference operator ∇)
Removal of Trend [1]
• By estimation of 𝑚𝑡 and 𝑠𝑡: smoothing with a finite moving average filter
• Let q be a nonnegative integer and consider the two-sided moving average 𝑊𝑡 = (2q + 1)⁻¹ Σ_{j=−q}^{q} 𝑋𝑡−𝑗.
• Assume that 𝑚𝑡 is approximately linear over the interval [t − q, t + q] and that the average of the error terms over this interval is close to zero.
• The moving average then provides the estimate m̂𝑡 = (2q + 1)⁻¹ Σ_{j=−q}^{q} 𝑋𝑡−𝑗.
• For large q the filter attenuates the noise well, but it also distorts any trend that is not close to linear over the smoothing window. To overcome this we can use the Spencer 15-point moving average, a filter that passes polynomials of degree 3 without distortion.
Linear filter: {𝑥𝑡} → {m̂𝑡 = Σ𝑗 𝑎𝑗 𝑥𝑡−𝑗}
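A minimal numpy sketch of this two-sided filter; the helper name and the NaN treatment of the endpoints are my own choices, not from the slides:

```python
import numpy as np

def moving_average_trend(x, q):
    """Trend estimate m_t = (2q+1)^(-1) * sum_{j=-q}^{q} x_{t-j};
    the q points at each end, where the window does not fit, stay NaN."""
    x = np.asarray(x, dtype=float)
    m = np.full(len(x), np.nan)
    for t in range(q, len(x) - q):
        m[t] = x[t - q : t + q + 1].mean()
    return m
```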
Removal of Trend [1]
• By estimation of 𝑚𝑡 and 𝑠𝑡: exponential smoothing
• For any fixed α ∈ [0, 1], the one-sided moving averages m̂𝑡, t = 1, …, n, are defined by the recursions m̂𝑡 = α𝑋𝑡 + (1 − α)m̂𝑡−1, t = 2, …, n, with m̂1 = 𝑋1.
• This is a weighted moving average with weights decreasing exponentially (except for the last one).
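The recursion translates directly into code; a short sketch, with an illustrative function name and test values:

```python
import numpy as np

def exponential_smoothing(x, alpha):
    """m_1 = x_1 and m_t = alpha*x_t + (1 - alpha)*m_{t-1}, t = 2,...,n."""
    x = np.asarray(x, dtype=float)
    m = np.empty_like(x)
    m[0] = x[0]
    for t in range(1, len(x)):
        m[t] = alpha * x[t] + (1.0 - alpha) * m[t - 1]
    return m

# Small alpha -> heavy smoothing; alpha = 1 reproduces the series itself.
print(exponential_smoothing([1.0, 2.0, 3.0, 4.0], alpha=0.5))
```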
Removal of Trend [1]
• By differencing the series {𝑋𝑡}: lag-1 difference operator ∇
• We define the lag-1 difference operator ∇ by ∇𝑋𝑡 = 𝑋𝑡 − 𝑋𝑡−1 = (1 − B)𝑋𝑡, where B is the backward shift operator, B𝑋𝑡 = 𝑋𝑡−1.
• Powers of the operators B and ∇ are defined by B^j(𝑋𝑡) = 𝑋𝑡−𝑗 and ∇^j(𝑋𝑡) = ∇(∇^{j−1}(𝑋𝑡)), j ≥ 1, with ∇⁰(𝑋𝑡) = 𝑋𝑡.
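In code, repeated lag-1 differencing is just repeated adjacent subtraction (numpy's built-in np.diff does the same thing):

```python
import numpy as np

def difference(x, j=1):
    """Apply the lag-1 difference operator j times: nabla^j X_t."""
    x = np.asarray(x, dtype=float)
    for _ in range(j):
        x = x[1:] - x[:-1]           # (1 - B) applied once
    return x

t = np.arange(10, dtype=float)
print(difference(t ** 2, j=2))       # a quadratic trend reduces to a constant
```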
Removal of Seasonality [1]
𝑋𝑡 = 𝑚𝑡 + 𝑠𝑡 + 𝑌𝑡, t = 1, …, n, where E(𝑌𝑡) = 0, 𝑠𝑡+𝑑 = 𝑠𝑡, and Σ_{j=1}^{d} 𝑠𝑗 = 0
• Estimation of trend and then seasonal components
• Lag-d difference operator ∇𝑑
Removal of Seasonality [1]
Estimation of trend and then seasonal components
• The trend is first estimated by applying a moving average filter specially chosen to eliminate the seasonal component and to dampen the noise.
• If the period d is even, say d = 2q, we use m̂𝑡 = (0.5𝑥𝑡−𝑞 + 𝑥𝑡−𝑞+1 + ⋯ + 𝑥𝑡+𝑞−1 + 0.5𝑥𝑡+𝑞)/d, q < t ≤ n − q.
• If the period is odd, say d = 2q + 1, we use the simple moving average.
• The second step is to estimate the seasonal component. For each k = 1, …, d, we compute the average 𝑤𝑘 of the deviations {(𝑥𝑘+𝑗𝑑 − m̂𝑘+𝑗𝑑), q < k + jd ≤ n − q}.
• Since these average deviations do not necessarily sum to zero, we estimate the seasonal component 𝑠𝑘 as ŝ𝑘 = 𝑤𝑘 − (1/d) Σ_{i=1}^{d} 𝑤𝑖, k = 1, …, d, and ŝ𝑘 = ŝ𝑘−𝑑, k > d.
• The deseasonalized data are then defined as the original series with the estimated seasonal component removed, i.e., 𝑑𝑡 = 𝑥𝑡 − ŝ𝑡, t = 1, …, n. Finally, we re-estimate the trend from the deseasonalized data {𝑑𝑡} using one of the methods already described.
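A sketch of these steps for an even period d = 2q; the code uses 0-based indexing (seasons k = 0, …, d−1), and the helper name is illustrative:

```python
import numpy as np

def classical_decomposition(x, d):
    """Estimate the trend by the half-weighted moving average, then the
    seasonal component from averaged deviations, forced to sum to zero."""
    x = np.asarray(x, dtype=float)
    n, q = len(x), d // 2
    m = np.full(n, np.nan)
    for t in range(q, n - q):
        m[t] = (0.5 * x[t - q] + x[t - q + 1 : t + q].sum()
                + 0.5 * x[t + q]) / d
    # Average deviation w_k from the trend for each season k.
    w = np.array([np.nanmean(x[k::d] - m[k::d]) for k in range(d)])
    s = w - w.mean()                  # seasonal effects now sum to zero
    seasonal = np.resize(s, n)        # periodic extension: s_k = s_{k-d}
    return m, seasonal, x - seasonal  # trend, seasonality, deseasonalized
```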
Removal of Seasonality [1]
Lag-d difference operator ∇𝑑
• The lag-d difference operator ∇𝑑 is defined by ∇𝑑𝑋𝑡 = 𝑋𝑡 − 𝑋𝑡−𝑑 = (1 − B^d)𝑋𝑡.
• Applying the operator ∇𝑑 to the model 𝑋𝑡 = 𝑚𝑡 + 𝑠𝑡 + 𝑌𝑡, where {𝑠𝑡} has period d, we obtain ∇𝑑𝑋𝑡 = 𝑚𝑡 − 𝑚𝑡−𝑑 + 𝑌𝑡 − 𝑌𝑡−𝑑, which gives a decomposition of the difference ∇𝑑𝑋𝑡 into a trend component (𝑚𝑡 − 𝑚𝑡−𝑑) and a noise term (𝑌𝑡 − 𝑌𝑡−𝑑).
• The trend, 𝑚𝑡 − 𝑚𝑡−𝑑, can then be eliminated using the methods already described, in particular by applying a power of the operator ∇.
• This doubly differenced series can in fact be well represented by a stationary time series model.
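The same operation in code; combining it with a further lag-1 difference gives the doubly differenced series mentioned above:

```python
import numpy as np

def seasonal_difference(x, d):
    """nabla_d X_t = X_t - X_{t-d}."""
    x = np.asarray(x, dtype=float)
    return x[d:] - x[:-d]

# e.g. monthly data with period 12, then one lag-1 difference on top:
# y = np.diff(seasonal_difference(x, 12))
```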
Tests of Randomness [1]
• The portmanteau test
• The turning point test
• The difference-sign test
• The rank test
• Fitting an autoregressive model
• The Ljung-Box test
• The McLeod-Li test
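As one example from this list, the Ljung-Box test is available in statsmodels (assumed installed); a small p-value is evidence against the residuals being iid noise:

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
residuals = rng.standard_normal(200)          # stand-in for model residuals
print(acorr_ljungbox(residuals, lags=[10]))   # lb_stat and lb_pvalue, lag 10
```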
Stationary Processes [2]
• Linear Processes
o The time series {𝑋𝑡} is a linear process if it has the representation 𝑋𝑡 = Σ_{j=−∞}^{∞} 𝜓𝑗𝑍𝑡−𝑗 for all t, or 𝑋𝑡 = 𝜓(B)𝑍𝑡 with 𝜓(B) = Σ_{j=−∞}^{∞} 𝜓𝑗B^j, where {𝑍𝑡} ∼ WN(0, σ²) and {𝜓𝑗} is a sequence of constants with Σ_{j=−∞}^{∞} |𝜓𝑗| < ∞.
o The class of linear time series models includes the class of autoregressive moving-average (ARMA) models.
o Every second-order stationary process is either a linear process or can be transformed to a linear process by subtracting a deterministic component.
AR(p), MA(q) and ARMA(p, q) Processes [2]
• MA(q) Process
o {𝑋𝑡} is a moving-average process of order q if 𝑋𝑡 = 𝑍𝑡 + θ1𝑍𝑡−1 + ⋯ + θq𝑍𝑡−q, where {𝑍𝑡} ∼ WN(0, σ²) and θ1, …, θq are constants.
o Every q-correlated process is an MA(q) process.
• AR(p) Process
o {𝑋𝑡} is an autoregressive process of order p if 𝑋𝑡 = φ1𝑋𝑡−1 + ⋯ + φp𝑋𝑡−p + 𝑍𝑡, where {𝑍𝑡} ∼ WN(0, σ²) and 𝑍𝑡 is uncorrelated with 𝑋𝑠 for each s < t.
• ARMA(p, q) Process
o {𝑋𝑡} is an ARMA(p, q) process if 𝑋𝑡 − φ1𝑋𝑡−1 − ⋯ − φp𝑋𝑡−p = 𝑍𝑡 + θ1𝑍𝑡−1 + ⋯ + θq𝑍𝑡−q, where {𝑍𝑡} ∼ WN(0, σ²) and the polynomials (1 − φ1z − ⋯ − φp z^p) and (1 + θ1z + ⋯ + θq z^q) have no common factors.
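A small simulator for these definitions; the burn-in length and parameter values are illustrative choices:

```python
import numpy as np

def simulate_arma(n, phi=(), theta=(), sigma=1.0, seed=0):
    """X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p} + Z_t + theta_1 Z_{t-1} + ...
    with {Z_t} ~ WN(0, sigma^2); an initial transient is discarded."""
    rng = np.random.default_rng(seed)
    p, q, burn = len(phi), len(theta), 200
    z = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(max(p, q), n + burn):
        x[t] = (sum(phi[i] * x[t - 1 - i] for i in range(p))
                + z[t]
                + sum(theta[j] * z[t - 1 - j] for j in range(q)))
    return x[burn:]

x = simulate_arma(500, phi=[0.5], theta=[0.4])  # causal, invertible ARMA(1,1)
```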
ARMA(p, q) Processes [2]
• ARMA(p, q) Process
o {𝑋𝑡} is an ARMA(p, q) process if 𝑋𝑡 − φ1𝑋𝑡−1 − ⋯ − φp𝑋𝑡−p = 𝑍𝑡 + θ1𝑍𝑡−1 + ⋯ + θq𝑍𝑡−q, where {𝑍𝑡} ∼ WN(0, σ²) and the polynomials φ(z) = 1 − φ1z − ⋯ − φp z^p and θ(z) = 1 + θ1z + ⋯ + θq z^q have no common factors.
o {𝑋𝑡} in the above definition must be stationary.
o A stationary solution {𝑋𝑡} of the above equation exists (and is also the unique stationary solution) if and only if φ(z) ≠ 0 for all |z| = 1.
o An ARMA(p, q) process {𝑋𝑡} is causal, or a causal function of {𝑍𝑡}, if there exist constants {𝜓𝑗} with Σ_{j=0}^{∞} |𝜓𝑗| < ∞ such that 𝑋𝑡 = Σ_{j=0}^{∞} 𝜓𝑗𝑍𝑡−𝑗 for all t. Causality is equivalent to the condition φ(z) ≠ 0 for all |z| ≤ 1.
o An ARMA(p, q) process {𝑋𝑡} is invertible if there exist constants {𝜋𝑗} with Σ_{j=0}^{∞} |𝜋𝑗| < ∞ such that 𝑍𝑡 = Σ_{j=0}^{∞} 𝜋𝑗𝑋𝑡−𝑗 for all t. Invertibility is equivalent to the condition θ(z) ≠ 0 for all |z| ≤ 1.
o We will focus our attention principally on causal and invertible ARMA processes.
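These polynomial conditions are easy to check numerically; a sketch using numpy's polynomial roots (coefficients listed in increasing powers of z):

```python
import numpy as np

def all_roots_outside_unit_circle(coeffs):
    """coeffs = [c_0, c_1, ..., c_k] for c_0 + c_1 z + ... + c_k z^k."""
    return np.all(np.abs(np.polynomial.polynomial.polyroots(coeffs)) > 1.0)

def is_causal(phi):        # phi(z) = 1 - phi_1 z - ... - phi_p z^p
    return all_roots_outside_unit_circle(np.r_[1.0, -np.asarray(phi)])

def is_invertible(theta):  # theta(z) = 1 + theta_1 z + ... + theta_q z^q
    return all_roots_outside_unit_circle(np.r_[1.0, np.asarray(theta)])

print(is_causal([0.5]), is_invertible([0.4]))   # True True
```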
ACF and PACF of an ARMA(p, q) Process [2]
• PACF
o The partial autocorrelation function (PACF) of an ARMA process {𝑋𝑡} is the function α(·) defined by α(0) = 1 and α(h) = φℎℎ, h ≥ 1, where φℎℎ is the last component of 𝜙ℎ = Γℎ⁻¹𝛾ℎ, with Γℎ = [γ(i − j)]_{i,j=1}^{h} and 𝛾ℎ = [γ(1), γ(2), …, γ(h)]′.
o The PACF of a causal AR(p) process is zero for lags greater than p.
• ACF
o If the sample ACF ρ̂(h) is significantly different from zero for 0 ≤ h ≤ q and negligible for h > q, this suggests an MA(q) process.
o To apply this criterion we need to take into account the random variation expected in the sample autocorrelation function before we can classify ACF values as "negligible." To resolve this problem we can use Bartlett's formula ([1], Section 2.4), which implies that for a large sample of size n from an MA(q) process, the sample ACF values at lags greater than q are approximately normally distributed with mean 0 and variance 𝑤ℎℎ/n = (1 + 2ρ²(1) + ⋯ + 2ρ²(q))/n.
o This means that if the sample is from an MA(q) process and h > q, then ρ̂(h) should fall between the bounds ±1.96√(𝑤ℎℎ/n) with probability approximately 0.95. In practice we frequently use the more stringent values ±1.96/√n.
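A sketch of the sample ACF together with the ±1.96/√n bound; this is a manual implementation, and statsmodels' plot_acf offers similar bands out of the box:

```python
import numpy as np

def sample_acf(x, max_lag):
    """rho_hat(h) = sum (x_t - xbar)(x_{t+h} - xbar) / sum (x_t - xbar)^2."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[h:] * x[:-h]) / denom
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
x = rng.standard_normal(400)
rho = sample_acf(x, 20)
bound = 1.96 / np.sqrt(len(x))        # the "stringent" white-noise bound
print(np.sum(np.abs(rho) > bound), "of 20 lags outside the bounds")
```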
Forecasting ARMA Processes [2]
• Innovations Algorithm
o It provides a recursive method for forecasting second-order zero-mean processes that are not necessarily stationary.
o For the causal ARMA process φ(B)𝑋𝑡 = θ(B)𝑍𝑡, {𝑍𝑡} ∼ WN(0, σ²), it is possible to simplify the application of the algorithm drastically.
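For concreteness, here is a direct transcription of the innovations recursions of [1], specialized to a zero-mean stationary series with known autocovariance function γ; the stationary specialization and the function names are my own:

```python
import numpy as np

def innovations_predict(x, gamma):
    """One-step predictors and mean squared errors via the innovations
    algorithm; gamma(h) is the autocovariance at lag h, so the general
    kernel is kappa(i, j) = gamma(|i - j|) in the stationary case."""
    n = len(x)
    kappa = lambda i, j: gamma(abs(i - j))
    theta = np.zeros((n + 1, n + 1))
    v = np.zeros(n + 1)
    v[0] = kappa(1, 1)
    xhat = np.zeros(n + 1)                 # xhat[m] predicts X_{m+1}; xhat[0]=0
    for m in range(1, n + 1):
        for k in range(m):
            s = sum(theta[k, k - j] * theta[m, m - j] * v[j] for j in range(k))
            theta[m, m - k] = (kappa(m + 1, k + 1) - s) / v[k]
        v[m] = kappa(m + 1, m + 1) - sum(theta[m, m - j] ** 2 * v[j]
                                         for j in range(m))
        xhat[m] = sum(theta[m, j] * (x[m - j] - xhat[m - j])
                      for j in range(1, m + 1))
    return xhat, v

# MA(1) with theta = 0.4, sigma^2 = 1: gamma(0) = 1.16, gamma(1) = 0.4.
g = lambda h: {0: 1.16, 1: 0.4}.get(h, 0.0)
xhat, v = innovations_predict(np.array([0.5, -0.2, 0.3]), g)
print(xhat[1:], v)                         # predictors of X_2, X_3, X_4
```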
Modelling and Forecasting with ARMA Processes [3]
• General
o Estimation of the parameters φ = (φ1, …, φ𝑝), θ = (θ1, …, θ𝑞), and σ² when p and q are assumed to be known.
o We assume the data have been "mean-corrected" by subtraction of the sample mean, so that it is appropriate to fit a zero-mean ARMA model to the adjusted data 𝑥1, …, 𝑥𝑛. The model fitted to the mean-corrected data is φ(B)𝑋𝑡 = θ(B)𝑍𝑡, {𝑍𝑡} ∼ WN(0, σ²).
o When p and q are known, good estimators of φ and θ can be found by imagining the data to be observations of a stationary Gaussian time series and maximizing the likelihood with respect to the p + q + 1 parameters φ1, …, φ𝑝, θ1, …, θ𝑞 and σ². The estimators obtained by this procedure are known as maximum likelihood (or maximum Gaussian likelihood) estimators.
• Preliminary Estimation of Parameters
o Yule-Walker estimation: the Yule-Walker and Burg procedures apply to the fitting of pure autoregressive models. (Although the former can be adapted to models with q > 0, its performance is less efficient than when q = 0.) The assumption is that the ACF of {𝑋𝑡} coincides with the sample ACF at lags 1, …, p.
o Burg's algorithm: the assumption is that the PACF of {𝑋𝑡} coincides with the sample PACF at lags 1, …, p.
o The innovations algorithm
o The Hannan-Rissanen algorithm
• After obtaining preliminary estimates, we apply maximum likelihood estimation (MLE) (or maximum Gaussian likelihood) to estimate the parameters; see the sketch below.
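A hedged sketch of this final MLE step using statsmodels (assumed available); order=(p, 0, q) with trend="n" fits a zero-mean ARMA(p, q), and the stand-in data are illustrative:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
x = rng.standard_normal(300)          # stand-in for the observed series
x = x - x.mean()                      # mean-correct before fitting

result = ARIMA(x, order=(1, 0, 1), trend="n").fit()
print(result.params)                  # phi_1, theta_1, sigma^2 estimates
print(result.llf)                     # maximized Gaussian log-likelihood
```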
Diagnostic Checking [3]
• Residuals are defined by Ŵ𝑡 = (𝑋𝑡 − X̂𝑡(φ̂, θ̂)) / (𝑟𝑡−1(φ̂, θ̂))^{1/2}, t = 1, …, n.
o E(𝑋𝑛+1 − X̂𝑛+1)² = σ² E(𝑊𝑛+1 − Ŵ𝑛+1)² = σ² 𝑟𝑛
• Rescaled residuals R̂𝑡, t = 1, …, n, are obtained by dividing the residuals Ŵ𝑡, t = 1, …, n, by the estimate σ̂ = (n⁻¹ Σ_{t=1}^{n} Ŵ𝑡²)^{1/2} of the white noise standard deviation; thus R̂𝑡 = Ŵ𝑡/σ̂.
• If the fitted model is appropriate, the rescaled residuals should have properties similar to those of a WN(0, 1) sequence, or of an iid(0, 1) sequence if we make the stronger assumption that the white noise {𝑍𝑡} driving the ARMA process is independent white noise.
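The rescaling itself is one line; a small helper (name illustrative):

```python
import numpy as np

def rescaled_residuals(w):
    """R_t = W_t / sigma_hat, with sigma_hat = sqrt(mean(W_t^2)); for an
    adequate model these should behave like a WN(0, 1) sequence."""
    w = np.asarray(w, dtype=float)
    return w / np.sqrt(np.mean(w ** 2))
```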
Diagnostic Checking [3]
The Graph of {R̂𝑡, t = 1, …, n}
• If the fitted model is appropriate, then the graph of the rescaled residuals {R̂𝑡, t = 1, …, n} should resemble that of a white noise sequence with variance one.
[Figure: rescaled residuals after fitting an ARMA(1,1) model to sample data]
Order Selection [3]
• p from the PACF and q from the ACF
• AICC criterion
Order Selection [3]
• AICC Criterion
• Corrected version of the Akaike Information Criterion (AIC).
• Choose p, q, φ𝑝, and θ𝑞 to minimize AICC = −2 ln L(φ𝑝, θ𝑞, S(φ𝑝, θ𝑞)/n) + 2(p + q + 1)n/(n − p − q − 2).
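Given a model's maximized log-likelihood, the criterion is a one-liner (a hedged helper; n is the sample size):

```python
def aicc(loglik, n, p, q):
    """AICC = -2 ln L + 2 (p + q + 1) n / (n - p - q - 2)."""
    return -2.0 * loglik + 2.0 * (p + q + 1) * n / (n - p - q - 2)

# Example: compare candidate (p, q) orders and keep the minimizer, e.g.
# best = min(candidates, key=lambda pq: aicc(fit(pq).llf, n, *pq))
```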
Thank You
Reference(s)
[1] Brockwell, Peter J., and Richard A. Davis, Introduction to Time Series and Forecasting, Springer, https://www.springer.com/us/book/9781475777505
[2] Discrete-time signal, https://www.wikiwand.com/en/Discrete-time_signal