Project report
On
Study and Forecasting of Financial Time Series Data
By
AMAR SUBHASH PATIL
AMIT BALKRISHNA DOIFODE
HEENAKAUSHAR INAYATBHAI VHORA
MANISHA JAYANTILAL KANANI
PRAMOD BALKRISHNA GHADAGE
SACHIN KRISHNA RASANKAR
TRUPTI RAMESHBHAI RATHOD
Introduction
 Stock Market
When people talk about the Stock Market, it's not always immediately clear what
they're referring to. Is the Stock Market a place? Or is it something different? To
many people it is an abstract idea. They buy stocks in "the stock market" without ever
leaving the comfort of their computer terminal. But the stock market is indeed a
physical place with buildings and addresses, a place you can go visit.
 Current Stock Market
The current "stock market" consists of some 300,000 computers on professional
traders' desks, networked together using sophisticated protocols. This level of
information sharing makes pricing an almost exact science.
These 300,000 computers are further linked to another 26 million computers
worldwide, located in banks, small businesses, and large corporations. Together
they form the banking networks that make computerized transactions possible.
Finally, these computers are connected to another 300 million+ computers
which connect to and disconnect from the financial markets daily. In New York City
alone, these transactions amount to over $2.2 trillion daily.
 Bombay Stock Exchange (BSE)
The Bombay Stock Exchange is known as the oldest exchange in Asia. It traces
its history to the 1850s, when stockbrokers would gather under banyan trees in front
of Mumbai's Town Hall. The location of these meetings changed many times, as the
number of brokers constantly increased. The group eventually moved to Dalal Street
in 1874 and in 1875 became an official organization known as 'The Native Share &
Stock Brokers Association'. In 1956, the BSE became the first stock exchange to be
recognized by the Indian Government under the Securities Contracts (Regulation)
Act. It is the 11th largest stock exchange in the world.
 Two main goals of time series analysis
There are two main goals of time series analysis:
(a) identifying the nature of the phenomenon represented by the
sequence of observations
(b) forecasting (predicting future values of the time series variable).
Both of these goals require that the pattern of observed time-series data be
identified.
 Identifying Patterns in Time Series Data
 Systematic pattern and random noise
 Two general aspects of time series patterns
1. Trend Analysis
I. Smoothing
II. Fitting a function
2. Analysis of Seasonality
 White Noise
 Autocorrelation Correlogram (ACF)
 Partial Autocorrelation Correlogram (PACF)
 Removing serial dependency
 TIME SERIES MODELS
The task facing the modern time-series econometrician is to develop
reasonably simple models capable of forecasting, interpreting, and testing
hypotheses concerning economic data. The challenge has grown over time:
the original use of time-series analysis was primarily as an aid to forecasting.
As such, a methodology was developed to decompose a series into a trend, a
seasonal, a cyclical, and an irregular component. Uncovering the dynamic
path of a series improves forecast accuracy. Using time-series methods,
it is possible to decompose a series into its trend, seasonal, and irregular
components.
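As an illustrative aside (not part of the original report), this classical decomposition is available in base R through decompose(); the monthly series x below is simulated purely as a stand-in for real data.

# Hedged sketch: classical decomposition into trend, seasonal, and
# irregular components with base R's decompose().
set.seed(0)
x <- ts(cumsum(rnorm(120, 0.5)) + rep(sin(2 * pi * (1:12) / 12), 10),
        frequency = 12, start = c(2005, 1))   # simulated monthly series
dec <- decompose(x)     # additive split: trend + seasonal + random
plot(dec)               # one panel per component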
 Autoregressive Model
A stationary process is somewhat parsimonious in its parameters, but a
general linear representation is not sufficiently parsimonious: like the
general non-stationary process, it involves an infinite number of parameters.
What we need is a class of stationary time series models with only finitely
many parameters, preferably a small number. That is why the simplest
autoregressive (AR) models are used.
 AR (1) Process
Most time series consist of elements that are serially dependent in the
sense that one can estimate a coefficient or a set of coefficients that describe
consecutive elements of the series from specific, time-lagged (previous) elements.
This can be summarized in the equation:
Note that an autoregressive process will only be stable if the parameters lie
within a certain range; for example, if there is only one autoregressive parameter,
then it must fall within the interval -1 < a < 1. Otherwise the series would not be
stationary. If there is more than one autoregressive parameter, similar restrictions
on the parameter values can be defined.
y_t = a_0 + Σ_{i=1}^{p} a_i y_{t-i} + x_t
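As a minimal illustration of this stability condition (not the report's own code), arima.sim() in base R simulates a stationary AR(1) and refuses coefficients outside the admissible range:

# Hedged sketch: simulate a stationary AR(1) with a = 0.8.
set.seed(1)
y <- arima.sim(model = list(ar = 0.8), n = 200)
plot(y, main = "Simulated AR(1), a = 0.8")
# arima.sim(model = list(ar = 1.1), n = 200)  # error: 'ar' part not stationary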
 Moving Average (1) Process
Independent of the autoregressive process, each element in the series can also be
affected by past errors that cannot be accounted for by the autoregressive component, that is:
y_t = a_0 + Σ_{i=0}^{q} β_i ε_{t-i}
There is a "duality" between the moving average process and the autoregressive process;
that is, the moving average equation above can be rewritten in an autoregressive form.
However, analogous to the stationarity condition described above, this can only be done
if the moving average parameters satisfy a certain condition, that is, if the model is
invertible. Otherwise the series will not be stationary.
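A minimal sketch, again only illustrative: a simulated MA(1) series has an ACF that cuts off after lag 1, which is the property used later to identify q.

# Hedged sketch: simulate an MA(1) and inspect its ACF.
set.seed(2)
x <- arima.sim(model = list(ma = 0.7), n = 300)
acf(x, lag.max = 20)   # only the lag-1 autocorrelation should stand out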
 ARMA model
An autoregressive model of order p is conventionally classified as
AR(p). A moving average model with q terms is classified as MA(q). A
combination model containing p autoregressive terms and q moving average
terms is classified as ARMA (p, q).
It is possible to combine a moving average process with a linear
difference equation to obtain an autoregressive moving average model.
Consider the p-th order difference equation:
y_t = a_0 + Σ_{i=1}^{p} a_i y_{t-i} + x_t ……………… (6)
Now let {x_t} be an MA(q) process built from the white noise process:
11
x_t = Σ_{i=0}^{q} β_i ε_{t-i}
so that we can write,
y_t = a_0 + Σ_{i=1}^{p} a_i y_{t-i} + Σ_{i=0}^{q} β_i ε_{t-i}
We follow the convention of normalizing units so that β_0 is always equal to unity. If the
characteristic roots of the above equation all lie in the unit circle, {y_t} is called an
autoregressive moving-average (ARMA) model for y_t. The autoregressive part of the model is
given by the homogeneous portion of the difference equation, and the moving average part is
the x_t sequence.
If the homogeneous part of the difference equation contains p lags and the model for x_t
contains q lags, the model is called an ARMA(p, q) model, i.e. a model with p AR parameters,
q MA parameters, and the variance of the error term.
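As a hedged illustration of ARMA(p, q) behaviour, ARMAacf() in base R returns the theoretical ACF and PACF for chosen coefficients; the values 0.8 and 0.6 are assumptions for the example only.

# Hedged sketch: theoretical ACF and PACF of an ARMA(1,1) process.
ARMAacf(ar = 0.8, ma = 0.6, lag.max = 10)                # ACF decays geometrically
ARMAacf(ar = 0.8, ma = 0.6, lag.max = 10, pacf = TRUE)   # PACF also tails off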
 ARIMA model
If the object series is differenced d times to achieve stationarity, the
model is classified as ARIMA (p, d, q), where the symbol "I" signifies
"integrated." An ARIMA(p,0,q) is the same as an ARMA(p, q) model; likewise,
an ARIMA(p,0,0) is the same as an AR(p) model, and an ARIMA(0,0,q) is the
same as an MA(q) model.
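A minimal sketch of the "I" step, assuming a simulated integrated series z: fitting ARIMA(1,1,1) on the levels gives essentially the same coefficients as fitting ARMA(1,1) on the first differences.

# Hedged sketch: ARIMA(p,1,q) on levels vs ARMA(p,q) on diff(z).
set.seed(3)
z <- cumsum(arima.sim(model = list(ar = 0.5, ma = 0.4), n = 300))
coef(arima(z, order = c(1, 1, 1)))                              # d = 1 on levels
coef(arima(diff(z), order = c(1, 0, 1), include.mean = FALSE))  # on differences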
Data analysis steps
 First of all, we check whether the given time series is stationary; if not, we
try to make it stationary by taking differences. Most financial time series
become stationary after first-order differencing.
 Identify the parameters p and q of AR(p) and MA(q) from the ACF and PACF
correlograms: from the ACF we identify the parameter q, and from the PACF we
identify the parameter p.
 We then fit the ARIMA(p, d, q) model and estimate the parameters of the
model. If the parameters are not significant, we choose another combination
of p and q, fit another model, and identify the appropriate p and q by trial
and error.
 Check the model selection criteria to identify the best model.
 Find the predicted values for the in-sample period and save the residuals.
 Check the normality assumption for the model residuals.
 If the residuals are not normally distributed, detect the influential points
which affect the normality assumption and try to make the residuals normal.
 Generate normal deviates from the mean and variance obtained in the step
above.
 Find the future predicted values from the fitted model, replacing the error
terms by the generated normal deviates.
 Interpret the result. (A minimal R sketch of these steps follows.)
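The steps above can be sketched end to end in R, purely as an illustration: ret is a simulated stand-in for the return series, and adf.test() comes from the tseries package (an assumption; the report's own tables come from SPSS).

# Hedged sketch of the workflow above; "ret" is a placeholder series.
library(tseries)                        # for adf.test()
set.seed(4)
ret <- arima.sim(model = list(ar = 0.5, ma = 0.4), n = 130)
adf.test(ret)                           # 1. stationarity check
acf(ret); pacf(ret)                     # 2. read q from the ACF, p from the PACF
fit <- arima(ret, order = c(1, 0, 1))   # 3. fit ARIMA(p, d, q), here (1, 0, 1)
fit                                     # 4. estimates and standard errors
res <- residuals(fit)                   # 5. in-sample residuals
shapiro.test(res)                       # 6. a normality check (K-S is used below)
e <- rnorm(13, mean(res), sd(res))      # 7. normal deviates from residual mean/sd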
Analysis of MRF (MONTHLY) data
[Sequence chart: monthly MRF return series, observations 1-129; values range from about -0.4 to 0.5.]
Autocorrelation Function (ACF)
Interpretation (ACF):-
Here we examine whether consecutive lags are serially dependent, i.e. whether
the first element is closely related to the second, and the second to the third,
which determines the serial dependency. In the figure above the autocorrelations
alternate between positive and negative while decaying, so we can infer that there
is an MA effect in the model. From this correlogram we choose the lag q of the
ARMA(p, q) model: the series was found to be stationary and the ACF shows a
single large spike that is significant. So we take q = 1.
Partial Autocorrelation Function (PACF)
Interpretation (PACF):-
From the figure above we choose the lag p of the ARMA(p, q) model. The series
was found to be stationary and the PACF shows a single large spike that is
significant, so we take p = 1. From the two correlograms we therefore get
p = 1 and q = 1, and we fit an ARIMA(1, 0, 1) model: the autoregressive (AR)
parameter is p = 1, the differencing parameter is d = 0 since our return data
are already stationary, and the moving average (MA) parameter is q = 1. These
parameters are chosen from the ACF and PACF graphs.
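As a hedged sketch of this identification step: the dashed bands drawn by R's acf() and pacf() are the approximate ±1.96/√n significance limits, and a spike outside the band is what the text calls "significant" (ret is the stand-in series from the earlier sketch).

# Hedged sketch: count significant ACF/PACF spikes against the 5% band.
n <- length(ret)
band <- 1.96 / sqrt(n)                                     # approximate 5% limit
a <- acf(ret, plot = FALSE);  sum(abs(a$acf[-1]) > band)   # candidate q
p <- pacf(ret, plot = FALSE); sum(abs(p$acf) > band)       # candidate p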
Time Series Modeler

Model Description
Model ID: return, Model_1    Model Type: ARIMA(1,0,1)

Model Fit
Fit Statistic          Mean
Stationary R-squared   0.061
R-squared              0.061
RMSE                   0.16
MAPE                   176.6
MaxAPE                 2477.7
MAE                    0.10
MaxAE                  1.11
Normalized BIC         -3.53
ARIMA Model Parameters (return, Model_1, no transformation)

            Estimate    SE          T           Sig.
Constant    0.03219     0.002311    13.9264     2.08E-27
AR Lag 1    0.858689    0.082952    10.35163    1.28E-18
MA Lag 1    0.998545    0.803012    1.243499    0.215957
Interpretation:-
The table above summarizes the ARIMA(1,0,1) fit: the estimate of the constant is 0.03219,
the estimate of the AR coefficient is 0.858689, and the estimate of the MA coefficient is
0.998545. The standard errors of the constant and of the AR lag-1 coefficient are very
small, near zero.
Therefore these two parameters are statistically significant. The same conclusion follows
from the p-value criterion: if the p-value is greater than 0.05, we cannot reject the null
hypothesis that the estimate is statistically insignificant. It is clear from the table
that the estimates of the constant and of the AR lag-1 coefficient are statistically
significant, while the estimate of the MA coefficient is statistically insignificant.
The computed model is as follows:
R_t = β_0 + β_1 R_{t-1} + ε_t
R_t = 0.03219 + 0.858689 R_{t-1} + ε_t
and the predicted values can be found from
R_t = β_0 + β_1 R_{t-1}
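The table itself comes from SPSS's Time Series Modeler; as a hedged sketch, comparable estimates and t-statistics could be reproduced in R from the same ARIMA(1,0,1) specification (ret again being the placeholder series):

# Hedged sketch: fit ARIMA(1,0,1) and compute t-statistics by hand.
fit <- arima(ret, order = c(1, 0, 1))
est <- coef(fit)                       # AR, MA, and intercept estimates
se  <- sqrt(diag(vcov(fit)))           # standard errors
cbind(estimate = est, se = se, t = est / se)   # |t| > 1.96 ~ significant at 5%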
Graph for return and predicted values
Interpretation:-
The graph shows that the predicted values obtained from the model are close
to the original values, so we can say that our predictive model is good for
the in-sample data.
[Line chart: actual return versus predicted values for observations 1-127; values range from about -0.6 to 1.4.]
K-S test for checking normality assumption
In the K-S test our null hypothesis is: the test distribution is normal.
From the p-value (0.757) we do not reject the null hypothesis, so we can
say that the model residuals are normally distributed.
One-Sample Kolmogorov-Smirnov Test (noise residual from VAR00003-Model_1)

N                                         131
Normal Parameters(a,b)    Mean            -0.0037
                          Std. Deviation  0.12701
Most Extreme              Absolute        0.059
Differences               Positive        0.059
                          Negative        -0.057
Kolmogorov-Smirnov Z                      0.672
Asymp. Sig. (2-tailed)                    0.757

a. Test distribution is Normal.
b. Calculated from data.
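As a hedged sketch, the same one-sample K-S test can be run in R on the model residuals (res from the earlier sketch); plugging in the sample mean and standard deviation approximates SPSS's behaviour of estimating the normal parameters from the data.

# Hedged sketch: one-sample Kolmogorov-Smirnov test of residual normality.
ks.test(res, "pnorm", mean = mean(res), sd = sd(res))
# A p-value above 0.05 means we do not reject normality of the residuals.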
MRF Forecast plot:-
The chart above displays the forecast values of the return. It is easy to see from the
graph that the return values are on an increasing trend, and they may continue increasing
with time.
Program for forecasting:-
# Forecast the next 13 monthly returns from the fitted ARIMA(1,0,1) model,
# replacing the error term by normal deviates generated from the residual
# mean and standard deviation.
b0 <- 0.032            # estimated constant
b1 <- 0.858            # estimated AR(1) coefficient
et <- 0.998            # estimated MA(1) coefficient
r.prev <- 0.048721335  # last observed in-sample return
rt <- numeric(13)
for (i in 1:13) {
  e <- rnorm(1, mean = 0.0265, sd = 0.131)  # simulated error term
  rt[i] <- b0 + b1 * r.prev + et * e        # one-step-ahead recursion
  r.prev <- rt[i]                           # roll the forecast forward
}
print(rt)
Forecasted values
[1] 0.459984384 0.003063379 0.266967056 0.071628528 0.215974207 0.115193849
[7] 0.027578631 0.015844679 0.122600279 0.199623276 0.316388738 0.141274116
[13] 0.141274116
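For comparison, a hedged sketch of the standard alternative: predict() on the fitted model object (fit from the earlier sketch) produces mean forecasts and standard errors directly, without simulating the error terms.

# Hedged sketch: model-based forecasts from the fitted ARIMA object.
fc <- predict(fit, n.ahead = 13)
fc$pred                       # point forecasts for the next 13 months
fc$pred + 1.96 * fc$se        # upper 95% forecast band
fc$pred - 1.96 * fc$se        # lower 95% forecast band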