This document summarizes a session on financial econometric models. It covers time series analysis principles, including autoregressive (AR) processes, moving average (MA) processes, and ARMA models. The session opens with a reminder of key concepts from the previous session and concludes by reviewing the main steps of time series analysis: identifying trends and seasonality, fitting appropriate models, and forecasting.
2. ESGF 5IFM Q1 2012 (vinzjeannin@hotmail.com)
Summary of the session (Est. 3h)
• Reminder of Last Session
• Time Series Analysis Principles
• Auto Regressive Process
• Moving Average Process
• ARMA
• Conclusion
3. Be logical!
Reminder of Last Session
11. Differentiate series to obtain a stationary series
Time series analysis and forecasting are simpler with a stationary series
Different models apply depending on whether the series is stationary or heteroscedastic
12. Properties of stationary series
Same distribution of the following:
(X_1, X_2, X_3, …, X_n)
(X_2, X_3, X_4, …, X_{n+1})
Distribution not time dependent
Rare occurrence
Stationarity accepted if:
E(X_t) = μ   constant in time
Cov(X_t, X_{t−n})   depends only on n
13. Acceptable Shortcut
A series is stationary if the mean and the variance are stable
Which one is more likely to be stationary?
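The shortcut above can be checked numerically. A minimal sketch (Python for illustration; the white-noise and random-walk series are simulated, not the session's dataset) comparing mean and variance across the two halves of each series:

```python
import random

def half_stats(x):
    """Return (mean, variance) for each half of a series."""
    h = len(x) // 2
    out = []
    for part in (x[:h], x[h:]):
        m = sum(part) / len(part)
        v = sum((u - m) ** 2 for u in part) / len(part)
        out.append((m, v))
    return out

rng = random.Random(3)
noise = [rng.gauss(0, 1) for _ in range(2000)]  # stationary candidate
walk = [0.0]
for e in noise[1:]:
    walk.append(walk[-1] + e)  # random walk: mean and variance drift

for name, series in (("noise", noise), ("walk", walk)):
    (m1, v1), (m2, v2) = half_stats(series)
    print(name, round(m1, 2), round(m2, 2), round(v1, 2), round(v2, 2))
```

For the white noise both halves show a mean near 0 and variance near 1; the random walk's halves typically differ markedly, which is the instability the shortcut flags.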
14. About the residuals…
White noise!
Normality test
Get a first idea with:
Skewness
Kurtosis
Proper tests: Kolmogorov–Smirnov, Durbin–Watson, Portmanteau, …
15. Auto Regressive Process
Main principle
There is a correlation between current data and previous data
X_t = c + φ_1·X_{t−1} + φ_2·X_{t−2} + ⋯ + φ_n·X_{t−n} + ε_t
φ_i   Parameters of the model
ε_t   White noise
AR(n)
If the parameters are identified, the prediction will be easy
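The AR recursion above is easy to simulate. A minimal sketch (Python for illustration; c, φ and the sample size are arbitrary choices, not values from the session):

```python
import random

def simulate_ar1(c, phi, n, seed=42):
    """Simulate X_t = c + phi * X_{t-1} + eps_t with Gaussian white noise."""
    rng = random.Random(seed)
    x = [c / (1 - phi)]  # start at the unconditional mean
    for _ in range(n - 1):
        x.append(c + phi * x[-1] + rng.gauss(0, 1))
    return x

series = simulate_ar1(c=0.5, phi=0.6, n=500)
# For |phi| < 1 the process is stationary, with mean c / (1 - phi) = 1.25
print(round(sum(series) / len(series), 2))
```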
16. Let’s upload some data
DATA <- read.csv(file="C:/Users/vin/Desktop/Series1.csv", header=TRUE)  # load the series
plot(DATA$Val, type="l")  # line plot of the raw values
17. Is this a white noise?
hist(DATA$Val, breaks=20)
18. Probably not…
Portmanteau test
ESGF 4IFM Q1 2012
Test the autocorrelation of a series
If there is autocorrelation, data aren’t independently distributed
Let's use the Ljung–Box statistic
H0: Data are independently distributed
H1: Data aren't independently distributed

Q = n(n+2) Σ_{k=1}^{h} ρ̂_k² / (n−k)   following a Chi-Square distribution with h degrees of freedom
ρ̂_k   Autocorrelation at lag k
Rejection at confidence level α if Q > χ²_{1−α,h}
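The Q statistic above can be computed directly. A minimal sketch (Python for illustration; the inputs are simulated, and the chi-square comparison is left to a stats table or library):

```python
import random

def acf(x, k):
    """Sample autocorrelation of x at lag k."""
    n = len(x)
    m = sum(x) / n
    denom = sum((v - m) ** 2 for v in x)
    return sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / denom

def ljung_box(x, h):
    """Ljung-Box Q = n(n+2) * sum_{k=1..h} acf_k^2 / (n - k)."""
    n = len(x)
    return n * (n + 2) * sum(acf(x, k) ** 2 / (n - k) for k in range(1, h + 1))

rng = random.Random(1)
noise = [rng.gauss(0, 1) for _ in range(500)]
trend = [float(t) for t in range(500)]  # strongly autocorrelated
print(round(ljung_box(noise, 10), 2), round(ljung_box(trend, 10), 2))
```

Q is then compared with the χ²_{1−α,h} quantile; a heavily autocorrelated series like the ramp produces a far larger Q than white noise does.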
19. > Box.test(DATA$Val)

Box-Pierce test

data: DATA$Val
X-squared = 188.3263, df = 1, p-value < 2.2e-16

H0 is rejected: the data aren't independently distributed
(Note: Box.test defaults to the Box–Pierce statistic; use type="Ljung-Box" for the Ljung–Box version)
20. Let's try a regression and analyse the residuals
TReg <- lm(DATA$Val ~ DATA$t)  # linear trend regression
plot(DATA$Val, type="l")
abline(TReg, col="blue")       # overlay the fitted trend line
22. Box-Pierce test

data: eps
X-squared = 187.6299, df = 1, p-value < 2.2e-16

Residuals aren't white noise
Regression rejected
Not a surprise: did the series look stationary?
What next then?
24. Does the differentiation create a stationary series?
plot(diff(DATA$Val), type="l")
25. ACF & PACF
par(mfrow=c(2,1))
acf(diff(DATA$Val),20)
pacf(diff(DATA$Val),20)
ACF decreasing
PACF cuts off after lag 1
26. Decreasing ACF
PACF cuts off after lag 1
Typically an Autoregressive Process
AR(1)?
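The identification rule can be reproduced numerically. A minimal sketch (Python for illustration; the AR(1) series is simulated with an arbitrary coefficient of 0.6) computing the sample PACF via the Durbin–Levinson recursion:

```python
import random

def sample_acf(x, max_lag):
    """Sample autocorrelations r_0..r_max_lag."""
    n = len(x)
    m = sum(x) / n
    d = sum((v - m) ** 2 for v in x)
    return [1.0] + [sum((x[t] - m) * (x[t - k] - m) for t in range(k, n)) / d
                    for k in range(1, max_lag + 1)]

def sample_pacf(x, max_lag):
    """PACF via the Durbin-Levinson recursion on the sample ACF."""
    r = sample_acf(x, max_lag)
    phi = [r[1]]   # fitted AR coefficients at the current order
    pacf = [r[1]]
    for k in range(2, max_lag + 1):
        num = r[k] - sum(phi[j] * r[k - 1 - j] for j in range(k - 1))
        den = 1.0 - sum(phi[j] * r[j + 1] for j in range(k - 1))
        phi_kk = num / den
        phi = [phi[j] - phi_kk * phi[k - 2 - j] for j in range(k - 1)] + [phi_kk]
        pacf.append(phi_kk)
    return pacf

rng = random.Random(0)
x = [0.0]
for _ in range(1999):
    x.append(0.6 * x[-1] + rng.gauss(0, 1))
# For an AR(1), the PACF is large at lag 1 and near zero afterwards
print([round(v, 2) for v in sample_pacf(x, 5)])
```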
27. Let’s try to fit an AR(1) model
Modl<-ar(diff(DATA$Val),order.max=20)
plot(Modl$aic)
The likelihood for order 1 is significant
28. > ar(diff(DATA$Val),order.max=20)

Call:
ar(x = diff(DATA$Val), order.max = 20)

Coefficients:
      1        2        3
 0.5925  -0.1669   0.1385

Order selected 3   sigma^2 estimated as 0.8514

We have our coefficients and the innovation variance

> ARDif<-diff(DATA$Val)
> ARDif[1]
[1] 0.3757723

We know the first term of our series

X_t = 0.3757723 + 0.5925·X_{t−1} + ε_t

Here is our model
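Once the coefficient is identified, prediction is just iterating the recursion. A minimal sketch (Python for illustration; it treats the slide's first term 0.3757723 as the starting value and 0.5925 as the AR(1) coefficient, and sets future noise to zero for a point forecast):

```python
def ar1_forecast(x0, phi, eps):
    """Iterate X_t = phi * X_{t-1} + eps_t from the starting value x0."""
    x = [x0]
    for e in eps:
        x.append(phi * x[-1] + e)
    return x

# Point forecast: with future noise set to 0 the path decays geometrically
path = ar1_forecast(0.3757723, 0.5925, [0.0] * 5)
print([round(v, 3) for v in path])
```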
29. Need to test the residuals

Box.test(Modl$resid)

Box-Pierce test

data: Modl$resid
X-squared = 7e-04, df = 1, p-value = 0.9789

H0 accepted: residuals are independently distributed (white noise)
The differentiated series is an AR(1)
41. > Box.test(Modl$resid)

Box-Pierce test

data: Modl$resid
X-squared = 0.0023, df = 1, p-value = 0.9619

Model accepted
The more parameters a model has, the harder prediction becomes
The more parameters there are, the more stationary the series needs to be for a good prediction
42. Moving Average Process
Main principle
Stationary series with autocorrelation of the errors

X_t = μ + ε_t + θ_1·ε_{t−1} + θ_2·ε_{t−2} + ⋯ + θ_n·ε_{t−n}

θ_i   Parameters of the model
ε_t   White noise
MA(n)
More difficult to estimate than an AR(n)
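The MA mechanics can be illustrated the same way. A minimal sketch (Python for illustration; μ and θ are arbitrary) simulating an MA(1) and checking its lag-1 autocorrelation against the theoretical θ/(1+θ²):

```python
import random

def simulate_ma1(mu, theta, n, seed=7):
    """Simulate X_t = mu + eps_t + theta * eps_{t-1}."""
    rng = random.Random(seed)
    eps = [rng.gauss(0, 1) for _ in range(n + 1)]
    return [mu + eps[t] + theta * eps[t - 1] for t in range(1, n + 1)]

x = simulate_ma1(mu=0.0, theta=0.8, n=2000)
n = len(x)
m = sum(x) / n
d = sum((v - m) ** 2 for v in x)
rho1 = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, n)) / d
# Theoretical lag-1 autocorrelation of an MA(1): theta / (1 + theta^2)
print(round(rho1, 2), round(0.8 / (1 + 0.8 ** 2), 2))
```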
52. ARMA
Main principle
The series is a function of its past values plus current and past values of the noise

X_t = c + φ_1·X_{t−1} + ⋯ + φ_p·X_{t−p} + ε_t + θ_1·ε_{t−1} + ⋯ + θ_q·ε_{t−q}

ARMA(p,q)
Combines AR(p) & MA(q)
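Combining both pieces gives the simulation directly. A minimal sketch (Python for illustration; c, φ and θ are arbitrary choices) of an ARMA(1,1):

```python
import random

def simulate_arma11(c, phi, theta, n, seed=11):
    """Simulate X_t = c + phi * X_{t-1} + eps_t + theta * eps_{t-1}."""
    rng = random.Random(seed)
    x = [c / (1 - phi)]  # start at the unconditional mean
    prev_eps = 0.0
    for _ in range(n - 1):
        eps = rng.gauss(0, 1)
        x.append(c + phi * x[-1] + eps + theta * prev_eps)
        prev_eps = eps
    return x

x = simulate_arma11(c=0.2, phi=0.5, theta=0.4, n=3000)
# Unconditional mean of a stationary ARMA(1,1): c / (1 - phi) = 0.4
print(round(sum(x) / len(x), 2))
```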
60. Identification can be difficult
Easiest model is AR
Imagine when the series is not stationary…
Step by step approach, exploration, tries, …
Sometimes you find a satisfying model
Sometimes you don't!
61. Conclusion
AR
MA
ARMA
Time series