TIME SERIES ANALYSIS
FOR
FORECASTING
Rupika Abeynayake
Professor in Applied Statistics
 Frequently there is a time lag between awareness of an impending
event or need and occurrence of the event
 This lead-time is the main reason for planning and forecasting. If
the lead-time is zero or very small, there is no need for planning
 If the lead-time is long, and the outcome of the final event is
conditional on identifiable factors, planning can perform an
important role
 In management and administrative situations the need for
planning is great because the lead-time for decision making
ranges from several years down to a few days, hours, or even seconds.
Therefore, forecasting is an important aid in effective and
efficient planning
Introduction
 Forecasting is a prediction of what will occur in the future, and it
is an uncertain process
 One of the most powerful methodologies for generating forecasts
is time series analysis
 A data set containing observations on a single phenomenon
observed over multiple time periods is called a time series. In time
series data, both the values and the ordering of the data points
carry meaning. For many agricultural products, data are usually
collected over time
Introduction…
 Time series analysis and its applications have become
increasingly important in various fields of research, such as
business, economics, agriculture, engineering, medicine, social
sciences, politics etc.
 Recognizing that “time is money” in business
activities, the time series analysis techniques presented here
are a necessary tool for successfully supporting a wide range of
managerial decisions where time and money are
directly related
Introduction
 On the time scale we stand at a certain point, called the
point of reference (Yt), and we look backward over past
observations (Yt-1, Yt-2, …, Yt-n+1) and forward into the future
(Ft+1, Ft+2, …, Ft+m)
 Once a forecasting model has been selected, we fit the model
to the known data and obtain the fitted values. For the known
observations this allows calculation of the fitted errors (Yt-1 - Ft-1)
 These provide a measure of the goodness-of-fit of the model, and as new
observations become available we can examine the forecasting
errors (Yt+1 - Ft+1)
Forecasting scenario
Measuring Forecast accuracy

Mean error:                        ME = (1/n) Σ_{t=1..n} e_t
Mean absolute error:               MAE = (1/n) Σ_{t=1..n} |e_t|
Mean squared error:                MSE = (1/n) Σ_{t=1..n} e_t²
Percentage error (PE):             PE_t = 100 (Y_t − F_t) / Y_t
Mean percentage error (MPE):       MPE = (1/n) Σ_{t=1..n} PE_t
Mean absolute percentage error:    MAPE = (1/n) Σ_{t=1..n} |PE_t|

where e_t = Y_t − F_t is the forecast error.
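These measures are straightforward to compute. The following minimal sketch (not part of the original slides; the names `accuracy`, `y`, and `f` are illustrative) evaluates all five on a pair of observed/forecast vectors:

```python
import numpy as np

def accuracy(y, f):
    """Forecast-accuracy measures from the formulas above (a sketch;
    y = observed values, f = forecasts)."""
    y, f = np.asarray(y, float), np.asarray(f, float)
    e = y - f                 # forecast errors e_t
    pe = 100.0 * e / y        # percentage errors PE_t
    return {"ME": e.mean(),
            "MAE": np.abs(e).mean(),
            "MSE": (e ** 2).mean(),
            "MPE": pe.mean(),
            "MAPE": np.abs(pe).mean()}

# Example with made-up numbers:
print(accuracy([100.0, 110.0, 120.0], [95.0, 115.0, 118.0]))
```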
Main components of time series data
There are four types of components in time series analysis:
Seasonal component (S)
Trend component (T)
Cyclical component (C)
Irregular component (I)

Yt = f(St, Tt, Ct, It)

Decomposition proceeds step by step:
Time series (Yt = f(S, T, C, I)) → remove the seasonal component using smoothing → remove the trend using regression → remove the cyclical component using % ratios → irregular component I remains
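A classical decomposition of this kind can be reproduced with statsmodels' `seasonal_decompose` (a library routine, not part of the slides); the repeated monthly series and the 12-month period below are illustrative assumptions:

```python
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Three repetitions of the 12-month example series used later in the notes.
y = pd.Series([266.0, 145.9, 183.1, 119.3, 180.3, 168.5,
               231.8, 224.5, 192.8, 122.9, 336.5, 185.9] * 3,
              index=pd.date_range("2008-01-01", periods=36, freq="MS"))

parts = seasonal_decompose(y, model="multiplicative", period=12)
print(parts.seasonal.head(12))      # seasonal factors S
print(parts.trend.dropna().head())  # trend-cycle component (T and C)
```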
Moving averages
 Smoothing techniques are used to reduce irregularities (random
fluctuations) in time series data
 Moving averages rank among the most popular techniques for the
preprocessing of time series. They are used to filter random "white
noise" from the data, to make the time series smoother
 There are several types of moving averages
 Simple Moving Averages
 Double moving averages
 Centered Moving Average
 Weighted Moving Average
Simple Moving Averages
 The moving average (MA) is an effective and efficient approach provided
the time series is stationary in both mean and variance
 The simple moving average requires an odd number of observations
in each average, so that the average is centered at the middle of the
values being averaged
 Take a certain number of past periods and add them together; dividing
by the number of periods gives the simple moving average
 The following formula gives the moving average of
order n, MA(n), for period t+1
MAt+1 = [Yt + Yt-1 + ... +Yt-n+1] / n
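A minimal sketch of this forecast rule (not from the slides; names are illustrative). Applied to the example data in the table below, it reproduces the first 3 MA value of 198.333 at period 4:

```python
import numpy as np

def sma_forecast(y, n):
    """MA(n) forecasts: forecast[t] predicts y[t] from the n
    observations ending at t-1; the first n entries are undefined."""
    y = np.asarray(y, float)
    f = np.full(len(y), np.nan)
    for t in range(n, len(y)):
        f[t] = y[t - n:t].mean()
    return f

y = [266.0, 145.9, 183.1, 119.3, 180.3, 168.5, 231.8,
     224.5, 192.8, 122.9, 336.5, 185.9, 194.3, 149.5]
print(np.round(sma_forecast(y, 3), 3))  # 198.333 appears at period 4 (Apr)
```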
Example

Month   Time    Observed   Three-month moving   Five-month moving
        period  values     average (3 MA)       average (5 MA)
Jan       1     266.0           -                    -
Feb       2     145.9           -                    -
Mar       3     183.1           -                    -
Apr       4     119.3        198.333                 -
May       5     180.3        149.433                 -
Jun       6     168.5        160.900              178.92
Jul       7     231.8        156.033              159.42
Aug       8     224.5        193.533              176.60
Sep       9     192.8        208.267              184.88
Oct      10     122.9        216.367              199.58
Nov      11     336.5        180.067              188.10
Dec      12     185.9        217.400              221.70
Jan      13     194.3        215.100              212.52
Feb      14     149.5        238.900              206.48

(Each entry is the moving-average forecast for that period, computed from the preceding 3 or 5 observations.)
Example…

[Figure: three-month moving average (length 3) with actual, smoothed, and forecast values. Accuracy: MAPE = 32.03, MAD = 57.39, MSD = 5165.47]
Example…

[Figure: five-month moving average (length 5) with actual, smoothed, and forecast values. Accuracy: MAPE = 26.23, MAD = 52.74, MSD = 4496.37]
Centered Moving Average
 Suppose we wish to calculate a moving average with an even
number of observations, for example a 4-term
moving average (4 MA) of the data
 The center of the first moving average is at 2.5, while the center
of the second moving average is at 3.5
 The average of these two moving averages is centered at 3
 Therefore, this alignment problem can be overcome by taking an
additional 2-period moving average of the 4-period moving
average
 This centered moving average is denoted as 2 × 4 MA
Example

Month   Time    Observed   Four-month moving   2 × 4 MA
        period  values     average (4 MA)
Jan       1     266.0           -                  -
Feb       2     145.9        178.575               -
Mar       3     183.1        157.150            167.863
Apr       4     119.3        162.800            159.975
May       5     180.3        174.975            168.887
Jun       6     168.5        201.275            188.125
Jul       7     231.8        204.400            202.837
Aug       8     224.5        193.000            198.700
Sep       9     192.8        219.175            206.088
Oct      10     122.9        209.525            214.350
Nov      11     336.5        209.900            209.712
Dec      12     185.9        216.550               -
Jan      13     194.3           -                  -
Feb      14     149.5           -                  -

(The 4 MA on each row is centered between that period and the next: 178.575 averages periods 1–4 and is centered at 2.5. The 2 × 4 MA averages two successive 4 MAs and is centered on the period shown.)
Weighted Moving Average
 This method is very powerful compared with simple moving
averages
 A weighted MA(3) can be expressed as
Weighted MA(3) = w1·Yt + w2·Yt-1 + w3·Yt-2
where w1, w2, and w3 are weights
 There are many schemes for selecting appropriate weights (Kendall,
Stuart, and Ord, 1983)
 The weights are any positive numbers such that
w1 + w2 + w3 = 1
 One method of calculating the weights is
w1 = 3/(1 + 2 + 3) = 3/6, w2 = 2/6, and w3 = 1/6
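A minimal sketch of a weighted MA(3) with the weights above (not from the slides; the function name and data are illustrative):

```python
import numpy as np

def weighted_ma3(y, w=(3/6, 2/6, 1/6)):
    """Weighted 3-term moving average; w1 weights the most
    recent observation and the weights sum to 1."""
    y = np.asarray(y, float)
    w1, w2, w3 = w
    out = np.full(len(y), np.nan)
    for t in range(2, len(y)):
        out[t] = w1 * y[t] + w2 * y[t - 1] + w3 * y[t - 2]
    return out

print(np.round(weighted_ma3([266.0, 145.9, 183.1, 119.3]), 2))
```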
Exponential Smoothing Techniques
 One of the most successful forecasting methods is exponential
smoothing (ES)
 ES is an averaging technique that uses unequal weights, assigning
exponentially decreasing weights as the observations get older
 There are several exponential smoothing techniques
Single Exponential Smoothing
Holt’s linear method
Holt-Winters’ trend and seasonality method
Single Exponential Smoothing
 The method of single exponential smoothing takes the
forecast for the previous period and adjusts it using the
forecast error (forecast error = Yt – Ft):
Ft+1 = Ft + a (Yt - Ft)
Ft+1 = a Yt + (1 - a) Ft
where:
Yt is the actual value
Ft is the forecast value
a is the weighting factor (smoothing constant), which ranges from 0 to 1
t is the current time period
Choosing the Best Value for Parameter a (alpha)
 In practice, the smoothing parameter is often chosen by a grid
search of the parameter space
 That is, values of a from 0.1 to 0.9 are tried,
in increments of 0.1
 The value of a producing the smallest sum of squares (or
mean square) of the residuals is then chosen
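A minimal sketch of the smoothing recursion and this grid search (not from the slides; names are illustrative). With a = 0.1 and the data in the table below it reproduces the smoothed values 200.0 and 193.5 for February and March:

```python
import numpy as np

def ses(y, a, f1=None):
    """Single exponential smoothing: F[t+1] = a*Y[t] + (1-a)*F[t],
    initialized with the first observation unless f1 is given."""
    y = np.asarray(y, float)
    f = np.empty(len(y))
    f[0] = y[0] if f1 is None else f1
    for t in range(len(y) - 1):
        f[t + 1] = a * y[t] + (1 - a) * f[t]
    return f

def best_alpha(y):
    """Grid search a = 0.1, ..., 0.9, minimizing the residual SSE."""
    grid = np.arange(0.1, 1.0, 0.1)
    sse = [np.sum((np.asarray(y[1:]) - ses(y, a)[1:]) ** 2) for a in grid]
    return grid[int(np.argmin(sse))]

y = [200.0, 135.0, 195.0, 197.5, 310.0, 175.0,
     155.0, 130.0, 220.0, 277.5, 235.0]
print(best_alpha(y))
```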
Month Time
period
Observed
values
Exponentially Smoothed values
a = 0.1 a = 0.5 a = 0.9
Jan 1 200.0
Feb 2 135.0 200.0 200.0 200.0
Mar 3 195.0 193.5 167.5 141.5
Apr 4 197.5 193.7 181.3 189.7
May 5 310.0 194.0 189.4 196.7
Jun 6 175.0 205.6 249.7 298.7
Jul 7 155.0 202.6 212.3 187.4
Aug 8 130.0 197.8 183.7 158.2
Sep 9 220.0 191.0 156.8 132.8
Oct 10 277.5 193.9 188.4 211.3
Nov 11 235.0 202.3 233.0 270.9
Dec 12 - 205.6 234.0 238.6
Analysis of Errors (Test period : 2 – 11)
a = 0.1 a = 0.5 a = 0.9
Mean Error 5.56 6.80 4.29
Mean Absolute Error 47.76 56.94 61.32
Mean Absolute percentage Error
(MAPE)
24.58 29.20 30.81
Mean Square Error (MSE) 3438.33 4347.24 5039.37
Theil’s U-statistics 0.81 0.92 0.98
Time Series Plots

[Figure: observed shipments by month (1–12) with single-exponential-smoothing curves for SES = 0.1, 0.5, and 0.9]
Holt’s linear method
 Holt (1957) extended single exponential smoothing to linear
exponential smoothing to allow forecasting of data with trends
 The forecast for Holt’s linear exponential smoothing is found
using two smoothing constants, a and β (values between 0 and 1),
and three equations:

Lt = a Yt + (1 - a)(Lt-1 + bt-1)        Smoothing of data (level)
bt = β (Lt - Lt-1) + (1 - β) bt-1        Smoothing of trend
Ft+m = Lt + bt m                         Forecast for m periods ahead

The initialization process: L1 = Y1 and
b1 = Y2 – Y1 or b1 = (Y4-Y1) / 3
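A minimal sketch of these three equations (not from the slides; data and names are illustrative):

```python
import numpy as np

def holt(y, alpha, beta, m=1):
    """Holt's linear method: level/trend recursions and m-step forecasts.
    Initialization follows the slides: L1 = Y1, b1 = Y2 - Y1."""
    y = np.asarray(y, float)
    L, b = y[0], y[1] - y[0]
    fitted = [np.nan]
    for t in range(1, len(y)):
        fitted.append(L + b)              # one-step-ahead forecast
        L_new = alpha * y[t] + (1 - alpha) * (L + b)
        b = beta * (L_new - L) + (1 - beta) * b
        L = L_new
    forecasts = [L + b * h for h in range(1, m + 1)]
    return np.array(fitted), np.array(forecasts)

fitted, fc = holt([12, 14, 15, 18, 21, 22, 25], alpha=0.5, beta=0.3, m=3)
print(np.round(fc, 2))
```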
Holt-Winters’ trend and seasonality method
 Holt’s method was extended by Winters (1960) to capture
seasonality
 The Holt-Winters method is based on three smoothing equations,
one for the level, one for trend, and one for seasonality
(multiplicative form, with season length s):

Lt = a (Yt / St-s) + (1 - a)(Lt-1 + bt-1)     Level
bt = β (Lt - Lt-1) + (1 - β) bt-1              Trend
St = γ (Yt / Lt) + (1 - γ) St-s                Seasonal
Ft+m = (Lt + bt m) St-s+m                      Forecast
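These recursions are implemented in statsmodels' `ExponentialSmoothing` (a library routine, not the slides' own software); the short monthly series below, taken from the pepper-price example later in the notes, and the smoothing constants are illustrative:

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

y = pd.Series(
    [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118,
     115, 126, 141, 135, 125, 149, 170, 170, 158, 133, 114, 140],
    index=pd.date_range("1949-01-01", periods=24, freq="MS"))

# Additive trend, multiplicative seasonality (Winters' multiplicative model).
model = ExponentialSmoothing(y, trend="add", seasonal="mul",
                             seasonal_periods=12)
fit = model.fit(smoothing_level=0.2, smoothing_trend=0.2,
                smoothing_seasonal=0.2)
print(fit.forecast(12))  # forecasts for the next 12 months
```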
Example…

[Figure: Winters' multiplicative model for Yt with actual, predicted, and forecast values. Smoothing constants: alpha (level) = 0.200, gamma (trend) = 0.200, delta (season) = 0.200. Accuracy: MAPE = 5.163, MAD = 12.557, MSD = 356.695]
Season Sales Average Sales Seasonal
Factor
Spring 200 250 200/250 = 0.8
Summer 350 250 350/250 = 1.4
Fall 300 250 300/250 = 1.2
Winter 150 250 150/250 = 0.6
Total 1000 1000
Seasonal Factor
Ratio-to-moving-average
Season    Average sales    Next year      Forecast
          (1100/4)         forecast
Spring       275           275 × 0.8        220
Summer       275           275 × 1.4        385
Fall         275           275 × 1.2        330
Winter       275           275 × 0.6        165
Total       1100                           1100

If next year's expected sales increase by 10% (1000 → 1100), the quarterly average is 1100/4 = 275, and each quarter's forecast is this average multiplied by its seasonal factor.
The table below represents the quarterly sales
figures
Year Q1 Q2 Q3 Q4
2008 20 30 39 60
2009 40 51 62 81
2010 50 64 74 85
Time
Period
Quarter Time
index
Sales Centered
MA (4)
Sales/MA*100
2008 Q1 1 20
2008 Q2 2 30
2008 Q3 3 39
2008 Q4 4 60 39.750 150.943
2009 Q1 5 40 44.875 89.136
2009 Q2 6 51 50.375 101.241
2009 Q3 7 62 55.875 110.962
2009 Q4 8 81 59.750 135.565
2010 Q1 9 50 62.625 79.840
2010 Q2 10 64 65.750 97.338
2010 Q3 11 74 67.750 109.225
2010 Q4 12 85
Year Q1 Q2 Q3 Q4
2008 150.94
2009 89.14 101.24 110.96 135.56
2010 79.84 97.33 109.22
Mean 84.49 99.29 110.095 143.25 437.125
AF 0.915 0.915 0.915 0.915
Seasonal
Index
77.3085 90.85 100.736 131.0737 399.9693
Adjusted Factor (AF) = 400/437.125=0.915
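A minimal sketch of this ratio-to-moving-average calculation (not from the slides; names are illustrative), reproducing the centered MA(4), the ratios, and the adjusted seasonal indices from the tables above:

```python
import numpy as np
import pandas as pd

sales = pd.Series([20, 30, 39, 60, 40, 51, 62, 81, 50, 64, 74, 85])
ma4 = sales.rolling(4).mean()             # 4-quarter moving average
cma = ma4.rolling(2).mean().shift(-1)     # centered 2x4 MA (39.750 at t=4)
ratio = 100 * sales / cma                 # Sales / MA * 100

quarter = pd.Series(np.arange(len(sales)) % 4, name="Q")
mean_ratio = ratio.groupby(quarter).mean()
# Adjust so that the four indices sum to 400 (AF = 400 / 437.125 = 0.915).
index = mean_ratio * (400 / mean_ratio.sum())
print(np.round(index, 2))
```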
Component Analysis

[Figures: original data; seasonally adjusted data; detrended data (ratios roughly 0.5–1.4); seasonally adjusted and detrended data]
General overview of forecasting techniques

Classification of the widely used forecasting techniques:
 Causal model – regression analysis
 Time series model – Box-Jenkins processes
 Smoothing techniques model – moving averages and exponential smoothing techniques
Box-Jenkins ARIMA
models (1970)

Box-Jenkins modeling approach to forecasting:
1. Plot the series
2. Is the variance stable? If not, apply a transformation
3. Obtain the ACFs and PACFs
4. Is the mean stable? If not, apply regular and seasonal differencing
5. Model selection
6. Estimate parameter values
7. Are the residuals uncorrelated? If not, modify the model
8. Are the parameters significant? If not, modify the model; if yes, proceed to forecasting
 The key statistic in time series analysis is the autocorrelation
coefficient (the correlation of the time series with itself, lagged by
1, 2, or more periods), which is given by the following formula

Autocorrelation function

r_k = Σ_{t=k+1..n} (Yt − Ȳ)(Yt−k − Ȳ) / Σ_{t=1..n} (Yt − Ȳ)²

 Then r1 indicates how successive values of Y relate to each other, r2
indicates how Y values two periods apart relate to each other, and so on
 The autocorrelations at lags 1, 2, …, make up the autocorrelation function
or ACF
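A minimal sketch of this formula (not from the slides; names are illustrative), applied to the small series used in the worked example below, where r1 = 29/52 ≈ 0.5577:

```python
import numpy as np

def acf(y, k):
    """Autocorrelation r_k per the formula above, for lags k >= 1."""
    y = np.asarray(y, float)
    ybar = y.mean()
    num = np.sum((y[k:] - ybar) * (y[:-k] - ybar))
    den = np.sum((y - ybar) ** 2)
    return num / den

y = [2, 3, 5, 7, 9, 10]
print(round(acf(y, 1), 6))   # 0.557692
```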
Partial autocorrelation function
 Partial autocorrelations are used to measure the degrees of
association between Yt and Yt-k, when the effects of other time
lags 1, 2, 3,…,k-1 are removed
 Suppose there was a significant autocorrelation between Yt and
Yt-1. Then there will also be a significant correlation between Yt-1
and Yt-2 since they are also one time unit apart
 Consequently, there will be a correlation between Yt and Yt-2
because both are related to Yt-1
 So, to measure the real correlation between Yt and Yt-2, we need to
take out the effect of the intervening value Yt-1. This is what partial
autocorrelation does.
Autocorrelation function

Yt    Yt-1   Yt-Ȳ   Yt-1-Ȳ   (Yt-Ȳ)(Yt-1-Ȳ)   (Yt-Ȳ)²
 2      -     -4       -            -            16
 3      2     -3      -4           12             9
 5      3     -1      -3            3             1
 7      5      1      -1           -1             1
 9      7      3       1            3             9
10      9      4       3           12            16
Total                              29            52

(Ȳ = 36/6 = 6)

r1 = Σ(Yt − Ȳ)(Yt−1 − Ȳ) / Σ(Yt − Ȳ)² = 29/52 = 0.557692
Sampling distribution of autocorrelations

Portmanteau Tests
An alternative is to examine a whole set of rk values,
say the first 10 of them (r1 to r10), all at once and then test
whether the set is significantly different from a zero set. Such a test
is known as a portmanteau test, and the two most common are the
Box-Pierce test and the Ljung-Box Q* statistic.
The Box-Pierce Test
The Box-Pierce statistic for the first h autocorrelations (the slides' formula image is lost; this is the standard form) is
Q = n Σ_{k=1..h} r_k²
which is approximately chi-square distributed with h degrees of freedom when the series is white noise. The Ljung-Box refinement is Q* = n(n+2) Σ_{k=1..h} r_k² / (n−k).
Checking for Error Autocorrelation
• Test: the Durbin-Watson statistic,

d = Σ_{i=2..n} (e_i − e_{i−1})² / Σ_{i=1..n} e_i²

with critical values tabulated for n and K degrees of freedom, where K is the number of parameters in the model minus one.
Decision zones:

0 …… d-lower …… d-upper …… 2 …… 4−d-upper …… 4−d-lower …… 4
[positive       [zone of     [no                [zone of      [negative
autocorrelation] indecision]  autocorrelation]   indecision]   autocorrelation]

Below d-lower (or above 4−d-lower) autocorrelation is clearly evident; in the zones of indecision the result is ambiguous and autocorrelation cannot be ruled out; between d-upper and 4−d-upper autocorrelation is not evident.
• Value near 2 indicates non-autocorrelation
• Value toward 0 indicates positive autocorrelation
• Value toward 4 indicates negative autocorrelation
To test for positive autocorrelation at significance α,
the test statistic d is compared to lower and upper
critical values (dL,α and dU,α):
•If d < dL,α, there is statistical evidence that the error
terms are positively autocorrelated
•If d > dU,α, there is no statistical evidence that the
error terms are positively autocorrelated
•If dL,α < d < dU,α, the test is inconclusive
Positive serial correlation is serial correlation in which
a positive error for one observation increases the
chances of a positive error for another observation
To test for negative autocorrelation at significance α, the
test statistic (4 − d) is compared to lower and upper
critical values (dL,α and dU,α):
•If (4 − d) < dL,α, there is statistical evidence that the
error terms are negatively autocorrelated.
•If (4 − d) > dU,α, there is no statistical evidence that the
error terms are negatively autocorrelated.
•If dL,α < (4 − d) < dU,α, the test is inconclusive.
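A minimal sketch of the statistic itself (not from the slides; the residuals below are made up for illustration):

```python
import numpy as np

def durbin_watson(e):
    """d = sum((e_i - e_{i-1})^2) / sum(e_i^2), per the formula above."""
    e = np.asarray(e, float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Slowly drifting residuals: d comes out well below 2, suggesting
# positive autocorrelation (compare d to the tabulated dL, dU).
e = np.array([0.5, 0.8, 0.9, 0.7, -0.2, -0.6, -0.4, 0.1])
print(round(durbin_watson(e), 3))
```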
Stationarity of the time series data
There is no growth or decline in the data; the data must be
roughly horizontal along the time axis. In other words, the data
fluctuate around a constant mean, independent of time, and the
variance of the fluctuations remains essentially constant over time.

[Figure: a series stationary in mean and variance]
[Figure: ACF for non-stationary time series data, lags 1–16]

Non-stationarity of the time series data
[Figures: a series non-stationary in mean and variance; a series non-stationary in mean]
Unit Roots
• Consider an AR(1) model:
yt = a1 yt-1 + εt   (eq. 1)
εt ~ N(0, σ²)
• Rewrite equation 1 by subtracting yt-1 from
both sides:
yt – yt-1 = a1 yt-1 – yt-1 + εt   (eq. 2)
Δyt = δ yt-1 + εt
δ = (a1 – 1)
Unit Roots
• H0: δ = 0 (there is a unit root) not stationary
• HA: δ ≠ 0 (there is not a unit root) stationary
• If δ = 0, then we can rewrite equation 2 as
Δyt = εt
Thus first differences of a random walk time series are
stationary, because by assumption, εt is purely
random.
In general, a time series must be differenced d times to
become stationary; it is integrated of order d or I(d).
A stationary series is I(0). A random walk series is
I(1).
Tests for Unit Roots
• Dickey-Fuller test
– Estimates a regression equation
– The usual t-statistic is not valid, thus D-F
developed appropriate critical values.
– You can include a constant, trend, or both in the
test.
– If you accept the null hypothesis, you conclude
that the time series has a unit root.
– In that case, you should first difference the
series before proceeding with analysis.
Tests for Unit Roots
• Augmented Dickey-Fuller test
– We can use this version if we suspect there is autocorrelation in
the residuals.
– This model is the same as the DF test, but includes lags of the
residuals too.
• Phillips-Perron test
– Makes milder assumptions concerning the error term, allowing
for the εt to be weakly dependent and heterogenously
distributed.
• Other tests include Variance Ratio test, Modified
Rescaled Range test, & KPSS test.
• There are also unit root tests for panel data (Levin et al
2002).
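A minimal sketch of the augmented Dickey-Fuller test via statsmodels (a library routine, not the slides' software), applied to a simulated random walk, which should fail to reject the unit-root null until it is differenced:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))   # random walk: has a unit root

stat, pvalue, lags, nobs, crit, _ = adfuller(y, regression="c")
print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")

# First-differencing should produce a stationary series:
stat_d, pvalue_d, *_ = adfuller(np.diff(y), regression="c")
print(f"after differencing: p-value = {pvalue_d:.3f}")
```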
Removing non-stationarity in a time series
 One way of removing non-stationarity is through the method of
differencing
 The differenced series can be expressed as
Y′t = Yt − Yt−1
 The differenced series will have only n−1 values, since it is not possible
to calculate a difference for the first observation
 With seasonal data that is non-stationary, it may be appropriate to
take seasonal differences. A seasonal difference is the difference
between an observation and the corresponding observation from the
previous year. So for monthly data having an annual 12-month pattern,
Y′t = Yt − Yt−12
Backshift Notation
• A very useful notational device is the backward
shift operator B, defined by
B Yt = Yt−1
• B operating on Yt has the effect of shifting
the data back one period:
B(B Yt) = B² Yt = Yt−2
B³ Yt = Yt−3, and in general B^d Yt = Yt−d

Therefore the first difference is
Y′t = Yt − Yt−1 = Yt − B Yt = (1 − B) Yt

A two-period difference is
Yt − Yt−2 = Yt − B² Yt = (1 − B²) Yt
while the second- and third-order differences are (1 − B)² Yt and (1 − B)³ Yt.

First-order difference:
Y′t = Yt − Yt−1 = (1 − B) Yt

Second-order difference:
Y″t = Y′t − Y′t−1
    = (Yt − Yt−1) − (Yt−1 − Yt−2)
    = Yt − 2Yt−1 + Yt−2
    = (1 − 2B + B²) Yt
    = (1 − B)² Yt
Second Order Difference
Linear time series models

AR(p) ≡ ARIMA(p,0,0):
Yt = C + φ1 Yt−1 + φ2 Yt−2 + φ3 Yt−3 + … + φp Yt−p + εt

MA(q) ≡ ARIMA(0,0,q):
Yt = C + εt − θ1 εt−1 − θ2 εt−2 − θ3 εt−3 − … − θq εt−q

ARMA(p,q):
Yt = C + φ1 Yt−1 + … + φp Yt−p + εt − θ1 εt−1 − … − θq εt−q

ARIMA(p,d,q): the ARMA(p,q) model applied to the d-times differenced series
Y′t = (1 − B)^d Yt
where B is the backshift operator (B Yt = Yt−1)

ARIMA(p,1,q):
(1 − φ1 B)(1 − B) Yt = C + (1 − θ1 B) εt
ARIMA models for time series data
 The general model introduced by Box and Jenkins (1970) includes
autoregressive as well as moving average parameters, and explicitly
includes differencing in the formulation of the model
 In the notation introduced by Box and Jenkins, models are
summarized as ARIMA (p, d, q)
 Three types of parameters in the model are:
autoregressive parameters (p), number of differencing passes (d), and
moving average parameters (q)
 For example, a model described as (0, 1, 2) means that it contains 0
(zero) autoregressive parameters and 2 moving average parameters, computed
for the series after it was differenced once
 The specific number and type of ARIMA parameters to be
estimated must be decided on (identified)
 The major tools used in the identification phase are plots of the
series, correlograms of auto correlation (ACF), and partial
autocorrelation (PACF)
 A majority of empirical time series patterns can be sufficiently
approximated using one of the 5 basic models that can be
identified based on the shape of the autocorrelogram (ACF) and
partial auto correlogram (PACF)
Estimation of parameters
AR(1): ACF - exponential decay; PACF - spike at lag 1, no
correlation for other lags
AR(2): ACF - a sine-wave pattern or a set of exponential
decays; PACF - spikes at lags 1 and 2, no correlation for
other lags
MA(1): ACF - spike at lag 1, no correlation for other lags; PACF
- damps out exponentially
MA(2): ACF - spikes at lags 1 and 2, no correlation for other lags;
PACF - a sine-wave pattern or a set of exponential
decays
ARMA(1,1): ACF - exponential decay starting at lag 1; PACF -
exponential decay starting at lag 1
ACF and PACF functions for AR(1) models
[Figures: ACF and PACF for AR(1) with φ > 0; ACF and PACF for AR(1) with φ < 0]

ACF and PACF functions for MA(1) models
[Figures: ACF and PACF for MA(1) with θ > 0; ACF and PACF for MA(1) with θ < 0]
Seasonal models
 In addition to the non-seasonal parameters, seasonal parameters
for a specified lag need to be estimated
 Analogous to the simple ARIMA parameters, these are:
seasonal autoregressive (Ps), seasonal differencing (Ds), and
seasonal moving average parameters (Qs) and seasonal models are
summarized as ARIMA (p, d, q) (P, D, Q)
 For example, the model (0,1,2)(0,1,1) describes a model that
includes no autoregressive parameters, 2 regular moving average
parameters and 1 seasonal moving average parameter, and these
parameters were computed for the series after it was differenced
once with lag 1, and once seasonally differenced
Seasonal models…
ARIMA(p,d,q)(P,D,Q)S:
The general recommendations concerning the selection of
parameters to be estimated (based on the ACF and PACF) also apply to
seasonal models

For example, an ARIMA(1,1,1)(1,1,1)S model is written in backshift notation as

(1 − φ1 B)(1 − Φ1 B^S)(1 − B)(1 − B^S) Yt = (1 − θ1 B)(1 − Θ1 B^S) εt

where the factors are, in order: non-seasonal AR(1), seasonal AR(1),
non-seasonal difference, seasonal difference, non-seasonal MA(1), and
seasonal MA(1).
ACF/PACF
• The seasonal part of an AR or MA model will be seen in the
seasonal lags of the PACF and ACF respectively
• For example, an ARIMA(0,0,0)(0,0,1)12 model will show: a
spike at lag 12 in the ACF but no other significant spikes
• The PACF will show exponential decay in the seasonal lags;
that is, at lags 12, 24, 36, ….
• Similarly, an ARIMA(0,0,0)(1,0,0)12 model will show:
exponential decay in the seasonal lags of the ACF and a single
significant spike at lag 12 in the PACF.
Example
Fitting of a time series model for describing the average monthly Sri Lankan
spot price of black pepper 1949 – 1960 in Sri Lankan Rupees per Kg
’49 ‘50 ‘51 ‘52 ‘53 ‘54 ‘55 ‘56 ‘57 ‘58 ‘59 ‘60
Jan. 112 115 145 171 196 204 242 284 315 340 360 417
Feb 118 126 150 180 196 188 233 277 301 318 342 391
Mar 132 141 178 193 236 135 267 317 356 362 406 419
Apr 129 135 163 181 235 227 269 313 348 348 396 461
May 121 125 172 183 229 234 270 318 355 363 420 472
Jun 135 149 178 218 243 264 315 374 422 435 472 535
July 148 170 199 230 264 302 364 413 465 491 548 622
Aug 148 170 199 242 272 293 347 405 467 505 559 606
Sep 136 158 184 209 237 259 312 355 404 404 463 508
Oct. 119 133 162 191 211 229 274 306 347 359 407 461
Nov 104 114 146 172 180 203 237 271 205 310 362 390
Dec 118 140 166 194 201 229 278 306 336 337 405 432
Time series plot for original data
[Figure: monthly series plotted against case number 1–144; values range roughly 100–700]
Time series plot for transformed (log) data
‘49 ‘50 ‘51 ‘52 ‘53 ‘54 ‘55 ‘56 ‘57 ‘58 ‘59 ‘60
4.72 4.74 4.98 5.14 5.28 5.32 5.49 5.65 5.75 5.83 5.89 6.03
4.77 4.84 5.01 5.19 5.28 5.24 5.45 5.62 5.71 5.76 5.83 5.97
4.88 4.95 5.18 5.26 5.46 4.91 5.59 5.76 5.87 5.89 6.01 6.04
4.86 4.91 5.09 5.20 5.46 5.42 5.59 5.75 5.85 5.85 5.98 6.13
4.80 4.83 5.15 5.21 5.43 5.46 5.60 5.76 5.87 5.89 6.04 6.16
4.91 5.00 5.18 5.38 5.49 5.58 5.75 5.92 6.05 6.08 6.16 6.28
5.00 5.14 5.29 5.44 5.58 5.71 5.90 6.02 6.14 6.20 6.31 6.43
5.00 5.14 5.29 5.49 5.61 5.68 5.85 6.00 6.15 6.22 6.33 6.41
4.91 5.06 5.21 5.34 5.47 5.56 5.74 5.87 6.00 6.00 6.14 6.23
4.78 4.89 5.09 5.25 5.35 5.43 5.61 5.72 5.85 5.88 6.01 6.13
4.64 4.74 4.98 5.15 5.19 5.31 5.47 5.60 5.32 5.74 5.89 5.97
4.77 4.94 5.11 5.27 5.30 5.43 5.63 5.72 5.82 5.82 6.00 6.07
[Figure: time series plot of the log-transformed series, Log_yt]

Time series plots after differencing
[Figure: graph after a seasonal difference (D = 1), SDIFF(Yt,1,12)]
[Figure: graph after the first difference of the seasonally differenced data (d = 1), DIFF(Yt_D1,1)]
ACF and PACF plots
[Figure: ACF of DIFF(Yt_D1,1) for lags 1–36, with upper and lower confidence limits]
[Figure: PACF of DIFF(Yt_D1,1) for lags 1–36, with upper and lower confidence limits]
Comparison of ARIMA models with AIC

Akaike’s Information Criterion (AIC) = -2 log L + 2m
where L = likelihood and m = p + q + P + Q

S. No.   Model                       AIC
1        ARIMA (1,1,1)(0,1,1)12      1172.362
2        ARIMA (0,1,1)(0,1,1)12      1169.650
3        ARIMA (0,1,2)(0,1,1)12      1171.052
4        ARIMA (0,1,1)(0,1,2)12      1172.295
5        ARIMA (0,1,1)(1,1,1)12      1172.631
6        ARIMA (0,1,3)(0,1,1)12      1172.205
7        ARIMA (1,1,1)(1,1,1)12      1171.065
8        ARIMA (0,1,1)(1,1,0)12      1171.286
9        ARIMA (1,1,1)(1,1,0)12      1170.986
10       ARIMA (1,1,0)(0,1,1)12      1185.438
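A minimal sketch of such an AIC comparison using statsmodels' `SARIMAX` estimator (a library routine, not the slides' software); `log_price` below is synthetic placeholder data standing in for the log-transformed pepper series:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
months = 144
# Synthetic stand-in: repeated seasonal pattern + trend + noise.
season = np.tile(np.log([112, 118, 132, 129, 121, 135,
                         148, 148, 136, 119, 104, 118]), months // 12)
log_price = pd.Series(season + 0.01 * np.arange(months)
                      + rng.normal(scale=0.05, size=months))

for order in [(1, 1, 1), (0, 1, 1), (0, 1, 2)]:
    res = SARIMAX(log_price, order=order,
                  seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    print(order, "x (0,1,1,12)  AIC =", round(res.aic, 2))
```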
Conclusions
 The moving average methods and single exponential smoothing
emphasize the short-range perspective, on the condition that
there is no trend and no seasonality
 Holt’s linear exponential smoothing captures information about
a recent trend
 Holt’s method was extended by Winters (1960) to capture
seasonality
 The class of ARIMA models is useful for both stationary and
non-stationary time series
 There are numerical indicators for assessing the accuracy of a
forecasting technique; the most widely used approach is to apply
several indicators together
References

Box, G. E. P. and Jenkins, G. M. (1970). Time Series Analysis: Forecasting and Control, San Francisco: Holden-Day.
Box, G. E. P. and Pierce, D. A. (1970). Distribution of the residual autocorrelations in autoregressive-integrated moving-average time series models, Journal of the American Statistical Association, 65, 1509-1526.
Gardner, E. S. (1985). Exponential smoothing: the state of the art, Journal of Forecasting, 4, 1-28.
Holt, C. C. (1957). Forecasting seasonals and trends by exponentially weighted moving averages, Office of Naval Research, Research Memorandum No. 52.
Makridakis, S. and Hibon, M. (1979). Accuracy of forecasting: an empirical investigation, Journal of the Royal Statistical Society A, 142, 97-145.
Makridakis, S., Wheelwright, S. C. and Hyndman, R. J. (1998). Forecasting: Methods and Applications, 3rd edition, New York: John Wiley & Sons.
Winters, P. R. (1960). Forecasting sales by exponentially weighted moving averages, Management Science, 6, 324-342.
Yar, M. and Chatfield, C. (1990). Prediction intervals for the Holt-Winters forecasting procedure, International Journal of Forecasting, 6, 127-137.
ARCH Model
• At any point in a series, the error terms will
have a characteristic size or variance.
• In particular ARCH models assume the
variance of the current error term or innovation
to be a function of the actual sizes of the
previous time periods' error terms: often the
variance is related to the squares of the previous
innovations.
ARCH(q) model Specification
(The slide's formula image is not reproduced here; the following is the standard specification.) The innovation is written as εt = σt zt, with zt white noise, and the conditional variance is modeled as

σt² = α0 + α1 εt−1² + α2 εt−2² + … + αq εt−q²
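A minimal ARCH(1) simulation sketch (parameters a0, a1 and the seed are illustrative), showing how the conditional variance depends on the previous squared innovation:

```python
import numpy as np

rng = np.random.default_rng(1)
a0, a1, n = 0.2, 0.6, 500
eps = np.zeros(n)
for t in range(1, n):
    sigma2 = a0 + a1 * eps[t - 1] ** 2       # conditional variance
    eps[t] = rng.normal(scale=np.sqrt(sigma2))

# Sample standard deviation vs. the unconditional value sqrt(a0/(1-a1)).
print(eps.std(), np.sqrt(a0 / (1 - a1)))
```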

More Related Content

Similar to time series.ppt [Autosaved].pdf

Forecasting demand planning
Forecasting demand planningForecasting demand planning
Forecasting demand planningManonmaniA3
 
2b. Decomposition.pptx
2b. Decomposition.pptx2b. Decomposition.pptx
2b. Decomposition.pptxgigol12808
 
Business forecasting decomposition & exponential smoothing - bhawani nandan...
Business forecasting   decomposition & exponential smoothing - bhawani nandan...Business forecasting   decomposition & exponential smoothing - bhawani nandan...
Business forecasting decomposition & exponential smoothing - bhawani nandan...Bhawani N Prasad
 
Trend and Seasonal Components
Trend and Seasonal ComponentsTrend and Seasonal Components
Trend and Seasonal ComponentsAnnaRevin
 
Time series mnr
Time series mnrTime series mnr
Time series mnrNH Rao
 
Forecasting of demand (management)
Forecasting of demand (management)Forecasting of demand (management)
Forecasting of demand (management)Manthan Chavda
 
Trend and seasonal components/Abshor Marantika/Cindy Aprilia Anggaresta
Trend and seasonal components/Abshor Marantika/Cindy Aprilia AnggarestaTrend and seasonal components/Abshor Marantika/Cindy Aprilia Anggaresta
Trend and seasonal components/Abshor Marantika/Cindy Aprilia AnggarestaCindyAprilia15
 
Wavelet Multi-resolution Analysis of High Frequency FX Rates
Wavelet Multi-resolution Analysis of High Frequency FX RatesWavelet Multi-resolution Analysis of High Frequency FX Rates
Wavelet Multi-resolution Analysis of High Frequency FX RatesaiQUANT
 
Operations management forecasting
Operations management   forecastingOperations management   forecasting
Operations management forecastingTwinkle Constantino
 
Test of Random Walk Hypothesis: Before & After the 2007-09 Crisis
Test of Random Walk Hypothesis: Before & After the 2007-09 CrisisTest of Random Walk Hypothesis: Before & After the 2007-09 Crisis
Test of Random Walk Hypothesis: Before & After the 2007-09 CrisisGabriel Koh
 
Quantitative forecasting
Quantitative forecastingQuantitative forecasting
Quantitative forecastingRavi Loriya
 
Demand forecasting methods 1 gp
Demand forecasting methods 1 gpDemand forecasting methods 1 gp
Demand forecasting methods 1 gpPUTTU GURU PRASAD
 
Forecasting and methods of forecasting
Forecasting and methods of forecastingForecasting and methods of forecasting
Forecasting and methods of forecastingMilind Pelagade
 

Similar to time series.ppt [Autosaved].pdf (20)

Chap003 Forecasting
Chap003    ForecastingChap003    Forecasting
Chap003 Forecasting
 
Forecasting demand planning
Forecasting demand planningForecasting demand planning
Forecasting demand planning
 
2b. Decomposition.pptx
2b. Decomposition.pptx2b. Decomposition.pptx
2b. Decomposition.pptx
 
Business forecasting decomposition & exponential smoothing - bhawani nandan...
Business forecasting   decomposition & exponential smoothing - bhawani nandan...Business forecasting   decomposition & exponential smoothing - bhawani nandan...
Business forecasting decomposition & exponential smoothing - bhawani nandan...
 
Trend and Seasonal Components
Trend and Seasonal ComponentsTrend and Seasonal Components
Trend and Seasonal Components
 
forecasting
forecastingforecasting
forecasting
 
Time series mnr
Time series mnrTime series mnr
Time series mnr
 
Forecasting of demand (management)
Forecasting of demand (management)Forecasting of demand (management)
Forecasting of demand (management)
 
Trend and seasonal components/Abshor Marantika/Cindy Aprilia Anggaresta
Trend and seasonal components/Abshor Marantika/Cindy Aprilia AnggarestaTrend and seasonal components/Abshor Marantika/Cindy Aprilia Anggaresta
Trend and seasonal components/Abshor Marantika/Cindy Aprilia Anggaresta
 
forecasting methods
forecasting methodsforecasting methods
forecasting methods
 
Wavelet Multi-resolution Analysis of High Frequency FX Rates
Wavelet Multi-resolution Analysis of High Frequency FX RatesWavelet Multi-resolution Analysis of High Frequency FX Rates
Wavelet Multi-resolution Analysis of High Frequency FX Rates
 
Operations management forecasting
Operations management   forecastingOperations management   forecasting
Operations management forecasting
 
Test of Random Walk Hypothesis: Before & After the 2007-09 Crisis
Test of Random Walk Hypothesis: Before & After the 2007-09 CrisisTest of Random Walk Hypothesis: Before & After the 2007-09 Crisis
Test of Random Walk Hypothesis: Before & After the 2007-09 Crisis
 
Quantitative forecasting
Quantitative forecastingQuantitative forecasting
Quantitative forecasting
 
BS6_Measurement of Trend.pptx
BS6_Measurement of Trend.pptxBS6_Measurement of Trend.pptx
BS6_Measurement of Trend.pptx
 
Demand forecasting methods 1 gp
Demand forecasting methods 1 gpDemand forecasting methods 1 gp
Demand forecasting methods 1 gp
 
Master_Thesis_Harihara_Subramanyam_Sreenivasan
Master_Thesis_Harihara_Subramanyam_SreenivasanMaster_Thesis_Harihara_Subramanyam_Sreenivasan
Master_Thesis_Harihara_Subramanyam_Sreenivasan
 
Forecasting
ForecastingForecasting
Forecasting
 
Forecasting
ForecastingForecasting
Forecasting
 
Forecasting and methods of forecasting
Forecasting and methods of forecastingForecasting and methods of forecasting
Forecasting and methods of forecasting
 

Recently uploaded

result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college projectTonystark477637
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordAsst.prof M.Gokilavani
 
Glass Ceramics: Processing and Properties
Glass Ceramics: Processing and PropertiesGlass Ceramics: Processing and Properties
Glass Ceramics: Processing and PropertiesPrabhanshu Chaturvedi
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations120cr0395
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINESIVASHANKAR N
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...Call Girls in Nagpur High Profile
 
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfKamal Acharya
 
MANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTING
MANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTINGMANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTING
MANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTINGSIVASHANKAR N
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlysanyuktamishra911
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Christo Ananth
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...ranjana rawat
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdfKamal Acharya
 
UNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and workingUNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and workingrknatarajan
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 

Recently uploaded (20)

DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINEDJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
 
result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college project
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
 
Glass Ceramics: Processing and Properties
Glass Ceramics: Processing and PropertiesGlass Ceramics: Processing and Properties
Glass Ceramics: Processing and Properties
 
Extrusion Processes and Their Limitations
Extrusion Processes and Their LimitationsExtrusion Processes and Their Limitations
Extrusion Processes and Their Limitations
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
 
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
 
MANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTING
MANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTINGMANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTING
MANUFACTURING PROCESS-II UNIT-1 THEORY OF METAL CUTTING
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghly
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdf
 
UNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and workingUNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and working
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 

time series.ppt [Autosaved].pdf

  • 1. TIME SERIES ANALYSIS FOR FORECASTING Rupika Abeynayake Professor in Applied Statistics
  • 2.  Frequently there is a time lag between awareness of an impending event or need and occurrence of the event  This lead-time is the main reason for planning and forecasting. If the lead-time is zero or very small, there is no need for planning  If the lead-time is long, and the outcome of the final event is conditional on identifiable factors, planning can perform an important role  In management and administrative situations the need for planning is great because the lead-time for decision making ranges from several years to a few days or hours to a few second. Therefore, forecasting is an important aid in effective and efficient planning Introduction
  • 3.  Forecasting is a prediction of what will occur in the future, and it is an uncertain process  One of the most powerful methodologies for generating forecasts is time series analysis  A data set containing observations on a single phenomenon observed over multiple time periods is called time-series. In time series data, both values and the ordering of the data points have meaning. For many agricultural products, data are usually collected over time Introduction…
  • 4.  Time series analysis and its applications have become increasingly important in various fields of research, such as business, economics, agriculture, engineering, medicine, social sciences, politics etc.  Realization of the fact that “ Time is Money " in business activities, the time series analysis techniques presented here, would be a necessary tool for applying to a wide range of managerial decisions successfully where time and money are directly related Introduction
  • 5.  On the time scale we are standing at a certain point called the point of reference (Yt) and we look backward over past observations (Yt-1, Yt-2,…,Yt-n+1) and foreword into the future (Ft+1, Ft+2, …,Ft+m)  Once a forecasting model has been selected, we fit the model to the known data and obtain the fitted values. For the known observations this allows calculation fitted errors (Yt-1- Ft-1)  A measure of goodness-of-fit of the model and as new observations become available we can examine forecasting errors (Yt+1- Ft+1) Forecasting scenario
  • 6. Measuring Forecast accuracy    n 1 t t e n 1 ME Mean error :    n 1 t t | e | n 1 MAE Mean absolute error : Mean squared error : Percentage error (PE) :    n 1 t 2 t | e | n 1 MSE PEt = 100*(Yt - Ft )/Yt Mean percentage error (MPE) :    n 1 t t PE n 1 MPE Mean absolute percentage error (MAPE) :    n 1 t t | PE | n 1 MAPE
  • 7. Main components of time series data There are four types of components in time series analysis Seasonal component (S) Trend component (T) Cyclical component (C) Irregular component ( I ) Yt = St , Tt , Ct , I Time Series Yt=S,T,C, I Seasonal removing using smoothing Trend removal using regression Cyclical removing using % ratio I
  • 8. Moving averages  Smoothing techniques are used to reduce irregularities (random fluctuations) in time series data  Moving averages rank among the most popular techniques for the preprocessing of time series. They are used to filter random "white noise" from the data, to make the time series smoother  There are several methods of Moving averages  Simple Moving Averages  Double moving averages  Centered Moving Average  Weighted Moving Average
  • 9. Simple Moving Averages  Moving Averages (MA) is effective and efficient approach provided the time series is stationary in both mean and variance  The simple moving average required an odd number of observations to be included in each average at the middle of the data value being averaged  Takes a certain number of past periods and add them together; then divide by the number of periods gives the simple moving average  The following formula is used in finding the moving average of order n, MA(n) for a period t+1 MAt+1 = [Yt + Yt-1 + ... +Yt-n+1] / n
  • 10. Month Time period Observed values Three-month moving average 3 MA Five-month moving average 5 MA Jan 1 266.0 - - Fab 2 145.9 - - - - Mar 3 183.1 Apr 4 119.3 198.333 149.433 160.900 156.033 193.533 208.267 216.367 180.067 217.400 215.100 238.900 - May 5 180.3 - Jun 6 168.5 Jul 7 231.8 Aug 8 224.5 Sep 9 192.8 Oct 10 122.9 Nov 11 336.5 Dec 12 185.9 Jan 13 194.3 . Feb 14 149.5 . Example 178.92 159.42 176.60 184.88 199.58 188.10 221.70 212.52 206.48 .
  • 13. 1Centered Moving Average  Suppose we wish to calculate a moving average with an even number of observations for example, to calculate a 4-term moving average or 4 MA for, the data  The center of the fist moving average is at 2.5 while the center of the second moving average is at 3.5  The average of the two moving averages is centered at 4  Therefore, this problem can be overcome by taking an additional 2-period moving average of the 4-period moving average  This centered moving average is denoted as 2 X 4 MA
  • 14. 2.1Example Month Time period Observed values Four-month moving average 4 MA 2 X 4 MA Jan 1 266.0 - 178.6 - Fab 2 145.9 - Mar 3 183.1 157.6 - Apr 4 119.3 162.8 - May 5 180.3 174.9 - Jun 6 168.5 201.3 Jul 7 231.8 204.4 Aug 8 224.5 193.0 Sep 9 192.8 219.2 Oct 10 122.9 209.5 Nov 11 336.5 209.9 Dec 12 185.9 216.6 Jan 13 194.3 . Feb 14 149.5 . 167.863 159.975 168.887 188.125 202.837 198.700 206.088 214.350 209.712 167.863
  • 15.  This method is very powerful with comparing simple moving averages  Weighted MA(3) can be expressed as, Weighted MA(3) = w1.Yt + w2.Yt-1 + w3.Yt-2 where w1, w2, & w3 are weights  There are many schemes selecting appropriate weights (Kendall, Stuart, and Ord (1983)  Weights are any positive numbers such that, w1 + w2 + w3 = 1  One of the methods of calculating weights is, w1 = 3/(1 + 2 + 3) = 3/6, w2 = 2/6, and w3 = 1/6 Weighted Moving Average
  • 16. 2.Exponential Smoothing Techniques  One of the most successful forecasting methods is the exponential smoothing (ES)  ES is an averaging technique that uses unequal weights and assigns exponentially decreasing weights as the observations get older  There are several exponential smoothing techniques Single Exponential Smoothing Holt’s linear method Holt-Winters’ trend and seasonality method
  • 17. Single Exponential Smoothing  The method of single exponential forecasting takes the forecast for the previous period and adjust it using the forecast error. [(Forecast error = (Yt – Ft)] Ft+1 = Ft + a (Yt - Ft) Ft+1 = a Yt + (1 - a) Ft where: Yt is the actual value Ft is the forecasted value a is the weighting factor, which ranges from 0 to 1 t is the current time period
  • 18. Choosing the Best Value for Parameter a (alpha)  In practice, the smoothing parameter is often chosen by a grid search of the parameter space  That is, different solutions for = 0.1 to are tried starting with 0.9, with increments of 0.1.  Then is chosen so as to produce the smallest sums of squares (or mean squares) for the residuals.
  • 19. Month Time period Observed values Exponentially Smoothed values a = 0.1 a = 0.5 a = 0.9 Jan 1 200.0 Feb 2 135.0 200.0 200.0 200.0 Mar 3 195.0 193.5 167.5 141.5 Apr 4 197.5 193.7 181.3 189.7 May 5 310.0 194.0 189.4 196.7 Jun 6 175.0 205.6 249.7 298.7 Jul 7 155.0 202.6 212.3 187.4 Aug 8 130.0 197.8 183.7 158.2 Sep 9 220.0 191.0 156.8 132.8 Oct 10 277.5 193.9 188.4 211.3 Nov 11 235.0 202.3 233.0 270.9 Dec 12 - 205.6 234.0 238.6
  • 20. Analysis of Errors (Test period : 2 – 11) a = 0.1 a = 0.5 a = 0.9 Mean Error 5.56 6.80 4.29 Mean Absolute Error 47.76 56.94 61.32 Mean Absolute percentage Error (MAPE) 24.58 29.20 30.81 Mean Square Error (MSE) 3438.33 4347.24 5039.37 Theil’s U-statistics 0.81 0.92 0.98
  • 21. Time Series Plots 0 50 100 150 200 250 300 350 1 2 3 4 5 6 7 8 9 10 11 12 Month Shipments Observed values SES = 0.1 SES = 0.5 SES = 0.9
  • 22. Holt’s linear method  Holt (1957) extended single exponential smoothing to linear exponential smoothing to allow forecasting of data with trends  The forecast for Hollt’s linear exponential smoothing is found using two smoothing constants, a and  (values between 0 & 1), and three equations ) )( 1 ( 1 1       t t t t b L Y L a a 1 1 ) 1 ( ) (       t t t t b L L b   m b L F t t m t    Smoothing of data Smoothing of trend Forecast for m period a head The initialization process : L1 = Y1 and b1 = Y2 – Y1 or b1 = (Y4-Y1) / 3
  • 23. Holt-Winters’ trend and seasonality method  Holt’s method was extended by Winters (1960) to capture seasonality  The Holt-Winter’s method is based on three smoothing equations, one for level, one for trend, and one for seasonality ) )( 1 ( 1 1        t t s t t t b L S Y L a a 1 1 ) 1 ( ) (       t t t t b L L b   s t t t t S L Y S     ) 1 (   m s t t t m t S m b L F      ) ( Level Trend Seasonal Forecast
  • 25. Season Sales Average Sales Seasonal Factor Spring 200 250 200/250 = 0.8 Summer 350 250 350/250 = 1.4 Fall 300 250 300/250 = 1.2 Winter 150 250 150/250 = 0.6 Total 1000 1000 Seasonal Factor Ratio-to-moving-average
  • 26. Season Average Sales (1100/4) Next Year Forecast FORECAST Spring 275 275*0.8 220 Summer 275 275*1.4 385 Fall 275 275*1.2 330 Winter 275 275*0.6 165 Total 1100 1100 If Next year expected sale increment is 10%
  • 27. The table below represent the Quarterly sales Figures Year Q1 Q2 Q3 Q4 2008 20 30 39 60 2009 40 51 62 81 2010 50 64 74 85
  • 28. Time Period Quarter Time index Sales Centered MA (4) Sales/MA*100 2008 Q1 1 20 2008 Q2 2 30 2008 Q3 3 39 2008 Q4 4 60 39.750 150.943 2009 Q1 5 40 44.875 89.136 2009 Q2 6 51 50.375 101.241 2009 Q3 7 62 55.875 110.962 2009 Q4 8 81 59.750 135.565 2010 Q1 9 50 62.625 79.840 2010 Q2 10 64 65.750 97.338 2010 Q3 11 74 67.750 109.225 2010 Q4 12 85
  • 29. Year Q1 Q2 Q3 Q4 2008 150.94 2009 89.14 101.24 110.96 135.56 2010 79.84 97.33 109.22 Mean 84.49 99.29 110.095 143.25 437.125 AF 0.915 0.915 0.915 0.915 Seasonal Index 77.3085 90.85 100.736 131.0737 399.9693 Adjusted Factor (AF) = 400/437.125=0.915
  • 30.
  • 31. 150 100 50 0 600 500 400 300 200 100 Original Data 150 100 50 0 500 400 300 200 100 Seasonally Adjusted Data 150 100 50 0 1.4 1.3 1.2 1.1 1.0 0.9 0.8 0.7 0.6 0.5 Detrended Data 150 100 50 0 50 0 -50 -100 Seasonally Adj. and Detrended Data Component Analysis
  • 32. General overview of forecasting techniques Classification of the widely used Forecasting Techniques Causal Model Time Series Model Smoothing Techniques Model Regression Analysis Box-Jenkins Processes Moving averages and Exponential Smoothing Techniques
  • 33. Box-Jenkins ARIMA models (1970) Plot series Is variance stable Apply transformation Obtain ACFs and PACFs Is mean stable Apply regular and seasonal differencing Model selection Estimate parameter values Are residual uncorrelated Modify Model Are parameters significant Forecasting No No Yes No Yes Yes Yes Box-Jenkins Modeling Approach to Forecasting
  • 34.  The key statistics in time series analysis is the autocorrelation coefficient (or the correlation of the time series with itself, lagged by 1, 2, or more periods), which is given by the following formula Autocorrelation function           n t t n k t K t t k Y Y Y Y Y Y r 1 2 1 ) ( ) )( (  Then r1 indicates how successive values of Y relate to each other, r2 indicates how Y values two period apart relate to each other, and so on  Autocorrelations at lags 1, 2, …, make up the autocorrelation function or ACF
  • 35. Partial autocorrelation function  Partial autocorrelations are used to measure the degrees of association between Yt and Yt-k, when the effects of other time lags 1, 2, 3,…,k-1 are removed  Suppose there was a significant autocorrelation between Yt and Yt-1. Then there will also be a significant correlation between Yt-1 and Yt-2 since they are also one time unit apart  Consequently, there will be a correlation between Yt and Yt-2 because both are related to Yt-1  So, to measure the real correlation between Yt and Yt-2, we need to take out the effect of the intervening value Yt-1. This is what partial autocorrelation does.
  • 36. Autocorrelation function Y Y t-1 Yt-Ӯ Yt-1- Ӯ (Yt-Ӯ)( Yt-1- Ӯ) (Yt-Ӯ) (Yt-Ӯ) 2 -4 16 3 2 -3 -4 12 9 5 3 -1 -3 3 1 7 5 1 -1 -1 1 9 7 3 1 3 9 10 9 4 3 12 16 Total 29 52 0.557692 52 29 ) Y (Y ) Y )(Y Y (Y r n 1 t 2 t n 1 k t K t t k            
• 37. Sampling distribution of autocorrelations: portmanteau tests
An alternative is to examine a whole set of rk values, say the first 10 of them (r1 to r10), all at once, and test whether the set is significantly different from a zero set. Such a test is known as a portmanteau test, and the two most common are the Box-Pierce test and the Ljung-Box Q* statistic.
The Box-Pierce test
The Box-Pierce statistic is

$Q = n \sum_{k=1}^{h} r_k^2$

where n is the number of observations and h is the number of lags tested. For white-noise residuals, Q approximately follows a chi-square distribution with h − m degrees of freedom, where m is the number of fitted parameters.
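A sketch of both portmanteau tests with statsmodels; `resid` stands in for the residuals of a fitted model (simulated white noise here, so neither test should reject):

```python
# Ljung-Box Q* and Box-Pierce Q for the first 10 lags of a residual
# series (white noise used as a placeholder).
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
resid = rng.normal(size=200)

out = acorr_ljungbox(resid, lags=10, boxpierce=True)
print(out[["lb_stat", "lb_pvalue", "bp_stat", "bp_pvalue"]])
```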
• 38. Checking for error autocorrelation
• Test: the Durbin-Watson statistic

$d = \dfrac{\sum_{i=2}^{n}(e_i - e_{i-1})^2}{\sum_{i=1}^{n} e_i^2}$

The critical values d-lower and d-upper depend on n and on K, the number of parameters in the model minus one. The 0-4 scale divides into zones:

0 to d-lower: positive autocorrelation (autocorrelation is clearly evident)
d-lower to d-upper: zone of indecision (ambiguous, cannot rule out autocorrelation)
d-upper to 4-d-upper: no autocorrelation (autocorrelation is not evident)
4-d-upper to 4-d-lower: zone of indecision
4-d-lower to 4: negative autocorrelation
• 39.
• A value near 2 indicates no autocorrelation
• A value toward 0 indicates positive autocorrelation
• A value toward 4 indicates negative autocorrelation
  • 40. To test for positive autocorrelation at significance α, the test statistic d is compared to lower and upper critical values (dL,α and dU,α): •If d < dL,α, there is statistical evidence that the error terms are positively autocorrelated •If d > dU,α, there is no statistical evidence that the error terms are positively autocorrelated •If dL,α < d < dU,α, the test is inconclusive Positive serial correlation is serial correlation in which a positive error for one observation increases the chances of a positive error for another observation
  • 41. To test for negative autocorrelation at significance α, the test statistic (4 − d) is compared to lower and upper critical values (dL,α and dU,α): •If (4 − d) < dL,α, there is statistical evidence that the error terms are negatively autocorrelated. •If (4 − d) > dU,α, there is no statistical evidence that the error terms are negatively autocorrelated. •If dL,α < (4 − d) < dU,α, the test is inconclusive.
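A minimal sketch of computing d with statsmodels; the function returns only the statistic, so dL,α and dU,α must still be looked up in Durbin-Watson tables for the given n and number of regressors:

```python
# Durbin-Watson statistic for a residual series (white noise here,
# so d should come out near 2).
import numpy as np
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
e = rng.normal(size=100)   # residuals from some fitted model

d = durbin_watson(e)
print(d)                   # near 2 -> no evidence of autocorrelation
```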
• 43. Stationarity of the time series data
There is no growth or decline in the data: the data must be roughly horizontal along the time axis. In other words, the data fluctuate around a constant mean, independent of time, and the variance of the fluctuations remains essentially constant over time.
[Figures: a series stationary in mean and variance; ACF for a non-stationary time series]
• 44. Non-stationarity of the time series data
[Figures: a series non-stationary in mean and variance; a series non-stationary in mean]
• 45. Unit roots
• Consider an AR(1) model:
$y_t = a_1 y_{t-1} + \varepsilon_t, \quad \varepsilon_t \sim N(0, \sigma^2)$   (eq. 1)
• Rewrite equation 1 by subtracting $y_{t-1}$ from both sides:
$y_t - y_{t-1} = a_1 y_{t-1} - y_{t-1} + \varepsilon_t$   (eq. 2)
$\Delta y_t = \delta y_{t-1} + \varepsilon_t, \quad \delta = (a_1 - 1)$
  • 46. Unit Roots • H0: δ = 0 (there is a unit root) not stationary • HA: δ ≠ 0 (there is not a unit root) stationary • If δ = 0, then we can rewrite equation 2 as Δyt = εt Thus first differences of a random walk time series are stationary, because by assumption, εt is purely random. In general, a time series must be differenced d times to become stationary; it is integrated of order d or I(d). A stationary series is I(0). A random walk series is I(1).
• 47. Tests for unit roots
• Dickey-Fuller test
 - Estimates a regression equation.
 - The usual t-statistic is not valid, so Dickey and Fuller developed appropriate critical values.
 - You can include a constant, a trend, or both in the test.
 - If you fail to reject the null hypothesis, you conclude that the time series has a unit root.
 - In that case, you should first-difference the series before proceeding with the analysis.
• 48. Tests for unit roots
• Augmented Dickey-Fuller test
 - Use this version if you suspect there is autocorrelation in the residuals.
 - The model is the same as the DF test but includes lags of the residuals too.
• Phillips-Perron test
 - Makes milder assumptions concerning the error term, allowing the εt to be weakly dependent and heterogeneously distributed.
• Other tests include the Variance Ratio test, the Modified Rescaled Range test, and the KPSS test.
• There are also unit root tests for panel data (Levin et al. 2002).
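A sketch of the (augmented) Dickey-Fuller test with statsmodels; a random walk is simulated, so the unit-root null should not be rejected for the level but should be rejected after first differencing:

```python
# ADF test on a simulated random walk and on its first difference.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
y = np.cumsum(rng.normal(size=300))   # random walk: I(1)

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="c")
print(stat, pvalue)        # large p-value -> cannot reject a unit root

stat_d, pvalue_d, *_ = adfuller(np.diff(y), regression="c")
print(stat_d, pvalue_d)    # small p-value -> differenced series is I(0)
```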
• 49. Removing non-stationarity in a time series
• One way of removing non-stationarity is through the method of differencing
• The differenced series can be expressed as $Y'_t = Y_t - Y_{t-1}$
• The differenced series will have only n-1 values, since it is not possible to calculate a difference for the first observation
• With seasonal data that is non-stationary, it may be appropriate to take seasonal differences. A seasonal difference is the difference between an observation and the corresponding observation from the previous year. So for monthly data having an annual 12-month pattern, $Y'_t = Y_t - Y_{t-12}$
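Both differences are one-liners in pandas; the series below is made up (a linear trend plus a 12-month cycle) purely to have something to difference:

```python
# Regular and seasonal differencing with pandas.
import numpy as np
import pandas as pd

idx = pd.date_range("1949-01", periods=48, freq="MS")
y = pd.Series(np.arange(48) + 10 * np.sin(np.arange(48) * 2 * np.pi / 12),
              index=idx)             # made-up trend + seasonal series

d1 = y.diff(1)                # first difference:    Y't = Yt - Y(t-1)
d12 = y.diff(12)              # seasonal difference: Y't = Yt - Y(t-12)
d12_d1 = y.diff(12).diff(1)   # d=1, D=1, as used in the example later
print(d12_d1.dropna().head())
```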
• 50. Backshift notation
• A very useful notational device is the backward shift operator B, which is used as follows:
$B Y_t = Y_{t-1}$
• B, operating on Yt, has the effect of shifting the data back one period
• 51. Two applications of B shift the data back two periods:
$B(BY_t) = B^2 Y_t = Y_{t-2}$
Therefore, $B^3 Y_t = Y_{t-3}$ and, in general, $B^d Y_t = Y_{t-d}$
• 52. First difference
$Y'_t = Y_t - Y_{t-1} = Y_t - BY_t = (1 - B) Y_t$
• 53. A difference at lag 2 (not to be confused with the second-order difference on slide 55):
$Y_t - Y_{t-2} = Y_t - B^2 Y_t = (1 - B^2) Y_t$
Likewise, a difference at lag 3 is $(1 - B^3) Y_t$
• 55. Second-order difference
$Y''_t = Y'_t - Y'_{t-1} = (Y_t - Y_{t-1}) - (Y_{t-1} - Y_{t-2}) = Y_t - 2Y_{t-1} + Y_{t-2}$
$= Y_t - 2BY_t + B^2 Y_t = (1 - 2B + B^2) Y_t = (1 - B)^2 Y_t$
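A quick numeric check that differencing twice matches the expanded form $(1 - 2B + B^2)Y_t$:

```python
# Second-order difference equals Yt - 2*Y(t-1) + Y(t-2).
import numpy as np

y = np.array([2.0, 3, 5, 7, 9, 10])

second_diff = np.diff(y, n=2)              # difference the differences
expanded = y[2:] - 2 * y[1:-1] + y[:-2]    # (1 - 2B + B^2) Yt
print(np.allclose(second_diff, expanded))  # True
```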
• 56. Linear time series models
AR(p) = ARIMA(p,0,0):
$Y_t = C + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t$
MA(q) = ARIMA(0,0,q):
$Y_t = C + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}$
ARMA(p,q):
$Y_t = C + \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \cdots - \theta_q \varepsilon_{t-q}$
ARIMA(p,d,q): the ARMA model applied to the differenced series $(1 - B)^d Y_t$
ARIMA(p,1,q): $(1 - \phi_1 B)(1 - B) Y_t = C + (1 - \theta_1 B)\varepsilon_t$
where B is the backshift operator ($BY_t = Y_{t-1}$)
• 57. ARIMA models for time series data
• The general model introduced by Box and Jenkins (1970) includes autoregressive as well as moving average parameters, and explicitly includes differencing in the formulation of the model
• In the notation introduced by Box and Jenkins, models are summarized as ARIMA (p, d, q)
• The three types of parameters in the model are: autoregressive parameters (p), number of differencing passes (d), and moving average parameters (q)
• For example, a model described as (0, 1, 2) contains zero autoregressive parameters and two moving average parameters, computed for the series after it was differenced once
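As a rough sketch, a non-seasonal ARIMA in the (p, d, q) notation can be fitted with statsmodels; the simulated I(1) series and the (0,1,1) order here are illustrative choices, not taken from the slides:

```python
# Fit an ARIMA(0,1,1) to a simulated integrated series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
y = np.cumsum(rng.normal(size=200))   # an I(1) series

model = ARIMA(y, order=(0, 1, 1))     # (p, d, q)
res = model.fit()

print(res.summary())                  # estimates, AIC, diagnostics
print(res.forecast(steps=6))          # out-of-sample forecasts
```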
• 58. Estimation of parameters
• We need to decide on (identify) the specific number and type of ARIMA parameters to be estimated
• The major tools used in the identification phase are plots of the series and correlograms of the autocorrelation function (ACF) and partial autocorrelation function (PACF)
• A majority of empirical time series patterns can be sufficiently approximated using one of five basic models, which can be identified from the shape of the autocorrelogram (ACF) and partial autocorrelogram (PACF)
• 59. Estimation of parameters…
AR(1): ACF - exponential decay; PACF - spike at lag 1, no correlation for other lags
AR(2): ACF - a sine-wave shape pattern or a set of exponential decays; PACF - spikes at lags 1 and 2, no correlation for other lags
MA(1): ACF - spike at lag 1, no correlation for other lags; PACF - damps out exponentially
MA(2): ACF - spikes at lags 1 and 2, no correlation for other lags; PACF - a sine-wave shape pattern or a set of exponential decays
ARMA(1,1): ACF - exponential decay starting at lag 1; PACF - exponential decay starting at lag 1
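These identification patterns can be checked numerically; the sketch below simulates an AR(1) process with φ = 0.7 (an arbitrary value chosen for the demo), so the sample ACF should decay exponentially while the PACF shows a single spike at lag 1:

```python
# Sample ACF and PACF of a simulated AR(1) series.
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(4)
y = np.zeros(500)
for t in range(1, 500):          # AR(1): y_t = 0.7*y_{t-1} + eps_t
    y[t] = 0.7 * y[t - 1] + rng.normal()

print(acf(y, nlags=10))          # exponential decay
print(pacf(y, nlags=10))         # spike at lag 1 only
```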
• 60. ACF and PACF functions: AR(1) models
[Figures: ACF and PACF for an AR(1) model with φ > 0; ACF and PACF for an AR(1) model with φ < 0]
• 61. ACF and PACF functions: MA(1) model
[Figures: ACF and PACF for an MA(1) model with θ > 0; ACF and PACF for an MA(1) model with θ < 0]
• 62. Seasonal models
• In addition to the non-seasonal parameters, seasonal parameters for a specified lag need to be estimated
• Analogous to the simple ARIMA parameters, these are: seasonal autoregressive (P), seasonal differencing (D), and seasonal moving average (Q) parameters, and seasonal models are summarized as ARIMA (p, d, q)(P, D, Q)
• For example, the model (0,1,2)(0,1,1) describes a model that includes no autoregressive parameters, 2 regular moving average parameters and 1 seasonal moving average parameter, and these parameters were computed for the series after it was differenced once at lag 1 and once seasonally
• 63. Seasonal models…
ARIMA(p,d,q)(P,D,Q)S: the general recommendations concerning the selection of parameters to be estimated (based on the ACF and PACF) also apply to seasonal models. For example, an ARIMA(1,1,1)(1,1,1)S model can be written as

$(1 - \phi_1 B)(1 - \Phi_1 B^S)(1 - B)(1 - B^S) Y_t = (1 - \theta_1 B)(1 - \Theta_1 B^S)\varepsilon_t$

where the factors are, in order: non-seasonal AR(1), seasonal AR(1), non-seasonal difference, seasonal difference, non-seasonal MA(1), seasonal MA(1)
• 64. ACF/PACF
• The seasonal part of an AR or MA model will be seen in the seasonal lags of the PACF and ACF respectively
• For example, an ARIMA(0,0,0)(0,0,1)12 model will show a spike at lag 12 in the ACF but no other significant spikes; the PACF will show exponential decay in the seasonal lags, that is, at lags 12, 24, 36, …
• Similarly, an ARIMA(0,0,0)(1,0,0)12 model will show exponential decay in the seasonal lags of the ACF, and a single significant spike at lag 12 in the PACF
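A sketch of fitting a seasonal ARIMA in the (p,d,q)(P,D,Q)s notation with statsmodels' SARIMAX; the trend-plus-seasonal series and the (0,1,1)(0,1,1)12 order are illustrative assumptions:

```python
# Seasonal ARIMA fit on a made-up monthly series.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)
t = np.arange(144)
y = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(size=144)

model = SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
res = model.fit(disp=False)
print(res.aic)
print(res.forecast(steps=12))   # one seasonal cycle ahead
```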
• 72. Example
Fitting a time series model for describing the average monthly Sri Lankan spot price of black pepper, 1949-1960, in Sri Lankan Rupees per kg:

Month   '49  '50  '51  '52  '53  '54  '55  '56  '57  '58  '59  '60
Jan     112  115  145  171  196  204  242  284  315  340  360  417
Feb     118  126  150  180  196  188  233  277  301  318  342  391
Mar     132  141  178  193  236  135  267  317  356  362  406  419
Apr     129  135  163  181  235  227  269  313  348  348  396  461
May     121  125  172  183  229  234  270  318  355  363  420  472
Jun     135  149  178  218  243  264  315  374  422  435  472  535
Jul     148  170  199  230  264  302  364  413  465  491  548  622
Aug     148  170  199  242  272  293  347  405  467  505  559  606
Sep     136  158  184  209  237  259  312  355  404  404  463  508
Oct     119  133  162  191  211  229  274  306  347  359  407  461
Nov     104  114  146  172  180  203  237  271  205  310  362  390
Dec     118  140  166  194  201  229  278  306  336  337  405  432
• 73. [Figure: time series plot of the original data, yt]
• 74. Log-transformed data (natural log of the original series)

Month   '49   '50   '51   '52   '53   '54   '55   '56   '57   '58   '59   '60
Jan     4.72  4.74  4.98  5.14  5.28  5.32  5.49  5.65  5.75  5.83  5.89  6.03
Feb     4.77  4.84  5.01  5.19  5.28  5.24  5.45  5.62  5.71  5.76  5.83  5.97
Mar     4.88  4.95  5.18  5.26  5.46  4.91  5.59  5.76  5.87  5.89  6.01  6.04
Apr     4.86  4.91  5.09  5.20  5.46  5.42  5.59  5.75  5.85  5.85  5.98  6.13
May     4.80  4.83  5.15  5.21  5.43  5.46  5.60  5.76  5.87  5.89  6.04  6.16
Jun     4.91  5.00  5.18  5.38  5.49  5.58  5.75  5.92  6.05  6.08  6.16  6.28
Jul     5.00  5.14  5.29  5.44  5.58  5.71  5.90  6.02  6.14  6.20  6.31  6.43
Aug     5.00  5.14  5.29  5.49  5.61  5.68  5.85  6.00  6.15  6.22  6.33  6.41
Sep     4.91  5.06  5.21  5.34  5.47  5.56  5.74  5.87  6.00  6.00  6.14  6.23
Oct     4.78  4.89  5.09  5.25  5.35  5.43  5.61  5.72  5.85  5.88  6.01  6.13
Nov     4.64  4.74  4.98  5.15  5.19  5.31  5.47  5.60  5.32  5.74  5.89  5.97
Dec     4.77  4.94  5.11  5.27  5.30  5.43  5.63  5.72  5.82  5.82  6.00  6.07
  • 75. Time series plot for trance formed data 1 4 3 1 4 1 1 3 9 1 3 7 1 3 5 1 3 3 1 3 1 1 2 9 1 2 7 1 2 5 1 2 3 1 2 1 1 1 9 1 1 7 1 1 5 1 1 3 1 1 1 1 0 9 1 0 7 1 0 5 1 0 3 1 0 1 9 9 9 7 9 5 9 3 9 1 8 9 8 7 8 5 8 3 8 1 7 9 7 7 7 5 7 3 7 1 6 9 6 7 6 5 6 3 6 1 5 9 5 7 5 5 5 3 5 1 4 9 4 7 4 5 4 3 4 1 3 9 3 7 3 5 3 3 3 1 2 9 2 7 2 5 2 3 2 1 1 9 1 7 1 5 1 3 1 1 9 7 5 3 1 Case Number 6.50 6.00 5.50 5.00 4.50 Value Log_yt
• 76. Time series plots after differencing
[Figures: the series after a seasonal difference (D=1); the series after a first difference of the seasonally differenced data (d=1)]
• 77. ACF and PACF plots
[Figures: ACF and PACF of the differenced series, DIFF(Yt_D1,1), with upper and lower confidence limits]
• 78. Comparison of ARIMA models with AIC
Akaike's Information Criterion (AIC) = -2 log L + 2m
where L is the likelihood of the fitted model and m = p + q + P + Q is the number of estimated parameters. With Gaussian errors this is equivalent, up to a constant, to n log σ̂² + 2m, where σ̂² is the variance of the residuals.
  • 79. Comparison of ARIMA models with AIC S. No. Model AIC 1 ARIMA (1,1,1) (0,1,1)12 1172.362 2 ARIMA (0,1,1) (0,1,1)12 1169.650 3 ARIMA (0,1,2) (0,1,1)12 1171.052 4 ARIMA (0,1,1) (0,1,2)12 1172.295 5 ARIMA (0,1,1) (1,1,1)12 1172.631 6 ARIMA (0,1,3) (0,1,1)12 1172.205 7 ARIMA (1,1,1) (1,1,1)12 1171.065 8 ARIMA (0,1,1) (1,1,0)12 1171.286 9 ARIMA (1,1,1) (1,1,0)12 1170.986 10 ARIMA (1,1,0) (0,1,1)12 1185.438 Akaike’s Information Criteria (AIC) = -2 log L + 2m
• 80. Conclusions
• Moving average methods and single exponential smoothing emphasize the short-range perspective, on the condition that there is no trend and no seasonality
• Holt's linear exponential smoothing captures information about a recent trend
• Holt's method was extended by Winters (1960) to capture seasonality
• The class of ARIMA models is useful for both stationary and non-stationary time series
• There are numerical indicators for assessing the accuracy of a forecasting technique; the most widely used approach is to apply several indicators together
• 81. References
Box, G. E. P. and Jenkins, G. M. (1970). Time Series Analysis: Forecasting and Control, San Francisco: Holden-Day.
Box, G. E. P. and Pierce, D. A. (1970). Distribution of residual autocorrelations in autoregressive-integrated moving average time series models, Journal of the American Statistical Association, 65, 1509-1526.
Gardner, E. S. (1985). Exponential smoothing: the state of the art, Journal of Forecasting, 4, 1-28.
Holt, C. C. (1957). Forecasting seasonals and trends by exponentially weighted moving averages, Office of Naval Research, Research Memorandum No. 52.
Makridakis, S. and Hibon, M. (1979). Accuracy of forecasting: an empirical investigation, Journal of the Royal Statistical Society, Series A, 142, 97-145.
Makridakis, S., Wheelwright, S. C. and Hyndman, R. J. (1998). Forecasting: Methods and Applications, 3rd edition, John Wiley & Sons, New York.
Winters, P. R. (1960). Forecasting sales by exponentially weighted moving averages, Management Science, 6, 324-342.
Yar, M. and Chatfield, C. (1990). Prediction intervals for the Holt-Winters forecasting procedure, International Journal of Forecasting, 6, 127-137.
• 82. ARCH model
• At any point in a series, the error terms will have a characteristic size or variance
• In particular, ARCH models assume the variance of the current error term (innovation) to be a function of the sizes of the error terms in previous time periods: typically, the variance is related to the squares of the previous innovations
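A minimal sketch of an ARCH(1) fit using the third-party `arch` package (not statsmodels); the return series here is simulated noise used only as a placeholder:

```python
# ARCH(1): variance of the current innovation depends on the square
# of the previous innovation.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(7)
returns = rng.normal(scale=1.0, size=500)   # placeholder returns

am = arch_model(returns, mean="Constant", vol="ARCH", p=1)
res = am.fit(disp="off")
print(res.summary())   # omega and alpha[1] are the ARCH parameters
```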