The International Journal of Soft Computing, Mathematics and Control (IJSCMC) is a quarterly peer-reviewed and refereed open-access journal that publishes articles contributing new results in all areas of soft computing, pure, applied and numerical mathematics, and control. The focus of the journal is on theoretical and numerical methods in soft computing, mathematics and control theory, with applications in science and industry. Its goal is to bring together researchers and practitioners from academia and industry to focus on the latest topics in soft computing, pure, applied and numerical mathematics and control engineering, and to establish new collaborations in these areas.
Authors are solicited to contribute to this journal by submitting articles that illustrate new algorithms, theorems, modeling results, research results, projects, surveys and industrial experiences describing significant advances in soft computing, mathematics and control engineering.
1) This paper proposes an adaptive quasi-maximum likelihood estimation approach for GARCH models when the distribution of volatility data is unspecified or heavy-tailed.
2) The approach works by using a scale parameter ηf to identify the discrepancy between the wrongly specified innovation density and the true innovation density.
3) Simulation studies and an application show that the adaptive approach gains better efficiency compared to other methods, especially when the innovation error is heavy-tailed.
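The adaptive construction with the scale parameter ηf is specific to the paper, but the Gaussian QMLE baseline it modifies is easy to sketch. Below is a minimal, self-contained illustration on simulated heavy-tailed returns; all names and parameter values are assumptions for the example, not the authors' code.

```python
# Minimal sketch of Gaussian quasi-maximum likelihood for a GARCH(1,1),
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
# Illustrative only: the paper's adaptive step (estimating eta_f) is not shown.
import numpy as np
from scipy.optimize import minimize

def garch11_neg_qll(params, r):
    """Negative Gaussian quasi-log-likelihood of returns r."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf  # enforce positivity and covariance stationarity
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()  # initialize at the sample variance
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(sigma2) + r ** 2 / sigma2)

rng = np.random.default_rng(0)
r = rng.standard_t(df=5, size=2000) * 0.01   # heavy-tailed toy returns
res = minimize(garch11_neg_qll, x0=[1e-5, 0.05, 0.90], args=(r,),
               method="Nelder-Mead")
print("omega, alpha, beta =", res.x)
```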
Nonlinear Extension of Asymmetric Garch Model within Neural Network Framework csandit
The importance of volatility for all market participants has led to the development and application of various econometric models. The most popular models for modelling volatility are GARCH-type models, because they can account for the excess kurtosis and asymmetric effects of financial time series. Since the standard GARCH(1,1) model usually indicates high persistence in the conditional variance, empirical research has turned to the GJR-GARCH model and revealed its superiority in fitting the asymmetric heteroscedasticity in the data. In order to capture both asymmetry and nonlinearity in the data, the goal of this paper is to develop a parsimonious NN model as an extension of the GJR-GARCH model and to determine whether GJR-GARCH-NN outperforms the GJR-GARCH model.
This document discusses methods for clustering time series data in a way that allows the cluster structure to change over time. It begins by introducing the problem and defining relevant terms. It then provides spectral clustering as a preliminary benchmark approach before exploring an alternative method using triangular potentials within a graphical model framework. The document presents the proposed method and provides illustrative examples and discussion of extensions.
The document summarizes a study on modeling risk aggregation and sensitivity analysis for economic capital at banks. It finds that different risk aggregation methodologies, such as historical bootstrap, normal approximation, and copula models, produce economic capital estimates that differ significantly, by 10% to 60%. The empirical copula approach tends to be the most conservative, while the normal approximation is the least conservative. The results indicate banks should take a conservative approach to quantifying integrated risk and consider the impact of methodology choice and parameter uncertainty on economic capital estimates.
This project looks at the ability of GARCH family models to forecast stock market volatility. FTSE 100 stock market returns are covered over a 10-year period in an attempt to contribute to the wide range of studies made on GARCH models.
The dissertation received 93% and was highly appreciated at the University of Portsmouth.
This document introduces stochastic processes that can help model risk factors in risk management. It discusses processes for modeling fat tails and mean reversion, including geometric Brownian motion, GARCH models, jump diffusion models, variance gamma processes, Vasicek models, CIR models, and combinations of mean reversion and fat tails. The document provides an overview of these processes and guidance on selecting an appropriate initial model based on analyzing a time series for stationarity, autoregressive properties, and fat tails. The goal is to select a simple practical model to begin quantitative risk analysis.
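As a concrete starting point for two of the processes surveyed, here is a minimal simulation sketch of geometric Brownian motion (no mean reversion) and the Vasicek model (mean reversion); all parameter values are arbitrary illustrations, not values from the document.

```python
# Minimal sketch: simulating geometric Brownian motion and a Vasicek
# short-rate path (Euler discretization). Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n, dt = 1000, 1 / 252

# Geometric Brownian motion: dS = mu*S dt + sigma*S dW
mu, sigma, S0 = 0.05, 0.2, 100.0
z = rng.standard_normal(n)
S = S0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z))

# Vasicek: dr = kappa*(theta - r) dt + sigma_r dW
kappa, theta, sigma_r, r0 = 2.0, 0.03, 0.01, 0.05
r = np.empty(n); r[0] = r0
for t in range(1, n):
    r[t] = r[t-1] + kappa * (theta - r[t-1]) * dt \
           + sigma_r * np.sqrt(dt) * rng.standard_normal()
```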
The efficiency of volatility financial model with…ikhwanecdc
This document summarizes a study that investigates the effectiveness of volatility financial models in the presence of additive outliers via Monte Carlo simulation. The study simulates data using an ARMA(1,0)-GARCH(1,2) model with sample sizes of 500, 1000, and 1400, both with and without 10% additive outliers added. The effectiveness of the models is evaluated based on error metrics and information criteria. The results indicate that the effectiveness of the ARMA-GARCH model diminishes as sample size increases in the presence of additive outliers.
MODELING THE AUTOREGRESSIVE CAPITAL ASSET PRICING MODEL FOR TOP 10 SELECTED...IAEME Publication
Systematic risk is the uncertainty inherent in the entire market or an entire market segment, while unsystematic risk is the uncertainty that comes with the company or industry in which we invest; it can be reduced through diversification. The study addresses the selection of a non-linear capital asset pricing model for top securities on the BSE and attempts to identify the marketable and non-marketable risk faced by investors in top companies. The analysis was conducted in stages, using vector autoregression of systematic and unsystematic risk.
A Fuzzy Arithmetic Approach for Perishable Items in Discounted Entropic Order...Waqas Tariq
This paper applies a fuzzy arithmetic approach to the system cost for perishable items with instant deterioration in the discounted entropic order quantity model. The traditional crisp system cost overlooks that some costs may belong to uncertain factors, so it is necessary to extend the system cost to treat vague costs as well. We introduce a new concept, which we call entropy, and show that the total payoff satisfies the optimization property. We show how a special case of this problem reduces to perfect results, and how the post-deteriorated discounted entropic order quantity model is a generalization of the optimization. The model is demonstrated by analysis, which reveals important characteristics of the discounted structure. Further numerical experiments are conducted to evaluate the relative performance of the fuzzy and crisp cases in EnOQ and EOQ separately.
Single period inventory model with stochastic demand and partial backloggingIAEME Publication
This document summarizes an article from the International Journal of Management that discusses inventory models with stochastic demand and partial backlogging. It presents three key points:
1) It develops an approximate closed-form solution for the expected total cost of a single period inventory model where demand is stochastic and follows the SCBZ property.
2) It examines the model through three cases: the first considers holding cost without salvage value; the second varies only the holding cost satisfying the SCBZ property; the third incorporates partial backlogging with time-dependent and time-independent shortage costs.
3) Optimal solutions are derived for each case and numerical examples are provided to validate the models.
This document discusses two methods for calculating Value-at-Risk (VaR): 1) Assuming a normal distribution of portfolio returns and using a GARCH model to estimate conditional volatility, and 2) A nonparametric bootstrap method. The normal distribution assumption is appropriate only during calm periods but will underestimate risk during turbulent times. The bootstrap method does not rely on distributional assumptions and better accounts for uncertainty in conditional variance dynamics to provide more accurate VaR estimates. An empirical exercise applies the two methods to the CAC40 index to demonstrate how the normal distribution method fails VaR tests during turbulence while the bootstrap method passes.
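As an illustration of the first method, here is a minimal sketch of a one-day 99% GARCH-normal VaR. It assumes the third-party Python `arch` package, and a placeholder return series stands in for the CAC40 data.

```python
# Sketch: one-day 99% VaR from a GARCH(1,1) forecast under normality.
import numpy as np
from scipy.stats import norm
from arch import arch_model

# Placeholder: ~1% daily moves standing in for CAC40 percent returns.
returns = np.random.default_rng(1).standard_normal(1500)

am = arch_model(returns, vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")
sigma = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
mu = res.params["mu"]
var_99 = -(mu + norm.ppf(0.01) * sigma)   # loss threshold, in percent
print(f"one-day 99% VaR: {var_99:.2f}%")
```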
This document discusses estimating stochastic relative risk aversion from interest rates. It first introduces a model for deriving relative risk aversion from interest rates using a time inhomogeneous single factor short rate model. It then details the estimation methodology used, which calibrates the model to US LIBOR data to estimate a time series for the market price of risk and ex-ante bond Sharpe ratio. This allows deducing a stochastic process for relative risk aversion under a power utility function. Estimated mean relative risk aversion is 49.89. The document then introduces modifying a Real Business Cycle model to allow time-varying relative risk aversion, finding it better matches empirical consumption volatility than a baseline model.
This document summarizes a study that empirically models monthly Treasury bill rates in Ghana from 1998 to 2012. Specifically, it models the rates of the 91-day and 182-day Treasury bills using ARIMA models. For the 91-day bills, the ARIMA(3,1,1) model provided the best fit, with a log-likelihood of -328.58; for the 182-day bills, the ARIMA(1,1,0) model fit best, with a log-likelihood of -356.50. Residual tests on both models showed the residuals were free from heteroscedasticity and serial correlation. The study aims to determine appropriate time series models for predicting and forecasting future Treasury bill rates in Ghana.
This document provides a critical review of the 1996 paper "The Conditional CAPM and the Cross-Section of Expected Returns" by Jagannathan and Wang. The review summarizes the key findings of the original paper, which showed that conditional CAPM can explain the cross-sectional variation in stock returns better than static CAPM. However, the review also notes some limitations in the assumptions around time-varying betas and use of R-squared. Overall, it evaluates the original paper as influential but also discusses subsequent research that built on its findings or identified weaknesses.
Undergraduate Project written by EBERE on ANALYSIS OF VARIATION IN GSKEbere Uzowuru
This document discusses analysis of variance (ANOVA) and its use in quality control and manufacturing processes. It provides background on the development of ANOVA, beginning in the 1930s and growing rapidly after World War II. It then discusses key concepts in ANOVA like partitioning variance, comparing multiple group means, and hypothesis testing. The rest of the document discusses specific tools used with ANOVA like control charts, histograms, Pareto charts, fishbone diagrams and their uses in identifying sources of variation and improving processes.
USE OF PLS COMPONENTS TO IMPROVE CLASSIFICATION ON BUSINESS DECISION MAKINGIJDKP
This paper presents a methodology that eliminates multicollinearity among the predictor variables in supervised classification by transforming them into orthogonal components obtained from the application of Partial Least Squares (PLS) logistic regression. PLS logistic regression was developed by Bastien, Esposito-Vinzi, and Tenenhaus [1]. We apply supervised classification techniques to data based on the original variables and to data based on the PLS components; the error rates are calculated and the results compared. The implementation of the classification methodology rests upon computer programs written in the R language that compute the PLS components and the classification error rates. The impact of this research will be disseminated based on evidence that the PLS logistic regression methodology is fundamental when working on supervised classification with data containing many predictor variables.
This document discusses the concept of quality management. It provides an overview of quality management and defines supply chain quality management. It also lists several quality management tools including check sheets, control charts, Pareto charts, scatter plots, Ishikawa diagrams, and histograms. Finally, it lists several related topics to quality management such as quality management systems, courses, standards, and strategies.
The document describes the scenario simulation method for quantitative risk analysis. It discusses principal component analysis (PCA) to reduce variables and identify key factors that influence yield curve movements. The methodology involves using PCA to represent changes in key rates as a linear combination of independent principal factors. These factors are then discretized into a finite number of scenarios to simulate changes in key rates and portfolio values over time, enabling faster risk analysis compared to Monte Carlo simulation. Several examples are provided to illustrate applying this scenario simulation approach to analyze risk for single-currency and multi-currency fixed income portfolios.
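A minimal sketch of the PCA step described above, using an assumed placeholder matrix of daily key-rate changes; the subsequent discretization of the factors into scenarios is not shown.

```python
# Sketch: extract the dominant principal factors (level, slope, curvature)
# from daily key-rate changes. `rate_changes` is a placeholder T x k matrix.
import numpy as np

rng = np.random.default_rng(7)
rate_changes = rng.standard_normal((500, 8)) @ np.diag(np.linspace(1.0, 0.2, 8))

X = rate_changes - rate_changes.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]                 # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("variance explained by first 3 factors:", explained[:3].sum())

# Key-rate changes are then approximated by the leading factors:
#   dX ~ scores[:, :3] @ eigvecs[:, :3].T
scores = X @ eigvecs
```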
Testing and extending the capital asset pricing modelGabriel Koh
This paper attempts to test whether the conventional Capital Asset Pricing Model (CAPM) holds with respect to a set of asset returns. Starting with the Fama-MacBeth cross-sectional regression, we show through the significance of pricing errors that the CAPM does not hold. Hence, we extend the original CAPM by including risk factors and factor-mimicking portfolios built on firm-specific characteristics and test for their significance in the model. Ultimately, by adding significant factors, we find that the extended model better explains asset returns but still does not entirely capture pricing errors.
This document describes stress testing a portfolio of four assets using hypothetical variance-covariance matrix stress testing. It investigates two methods - one proposed by Finger that modifies return vectors, and one by Numpacharoen and Bunwong that directly adjusts the correlation matrix. The document applies these methods to a small four asset portfolio using real index return data to demonstrate how the methods work and their strengths and weaknesses. It calculates initial portfolio risk measures and then selects scenarios to stress correlations and re-calculate portfolio risks to assess the impact of difficult market conditions.
Predicting U.S. business cycles: an analysis based on credit spreads and mark...Gabriel Koh
Our paper aims to empirically test the significance of credit spreads and the excess returns of the market portfolio in predicting U.S. business cycles. We adopt the probit model to estimate the partial effects of the variables using data from the Federal Reserve Economic Data – St. Louis Fed (FRED) and the National Bureau of Economic Research (NBER) from 1993:12 to 2014:08. Results show that the contemporaneous regression model is not significant while the predictive regression model is significant. Our tests show that the credit spread variable lagged by one period is significant and that lagged values of the excess returns of the market portfolio are also significant. Therefore, we can conclude that credit spreads and excess returns of the market portfolio can predict U.S. business cycles to a certain extent.
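A minimal sketch of the predictive probit specification described above, using statsmodels; the simulated placeholder series stand in for the FRED/NBER data.

```python
# Sketch: probit of a recession indicator on one-period-lagged predictors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 240
credit_spread = rng.standard_normal(T)           # placeholder series
excess_ret = rng.standard_normal(T)              # placeholder series
recession = (0.8 * credit_spread + rng.standard_normal(T) > 1.5).astype(int)

# Predictive regression: regressors lagged by one period.
X = sm.add_constant(np.column_stack([credit_spread[:-1], excess_ret[:-1]]))
y = recession[1:]
probit_res = sm.Probit(y, X).fit(disp=0)
print(probit_res.summary())
```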
This document presents a final year project report on using quasi-Monte Carlo methods for market risk management. It first outlines two existing variance reduction methods - importance sampling and stratified importance sampling - and how they are applied to estimate tail loss probabilities in a stock portfolio model. The report then introduces quasi-Monte Carlo and proposes combining it with stratified importance sampling to achieve further variance reduction. Numerical results show that this combined method does not significantly improve variance reduction compared to existing methods.
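The report's stratified importance sampling is not reproduced here, but the basic quasi-Monte Carlo ingredient can be sketched: estimating a tail-loss probability with scrambled Sobol points versus plain Monte Carlo, on an assumed toy linear portfolio.

```python
# Sketch: tail-loss probability P(L > c) under standard-normal risk factors,
# estimated with Sobol points (scipy.stats.qmc) and with plain Monte Carlo.
import numpy as np
from scipy.stats import norm, qmc

d, n, c = 10, 1 << 14, 8.0          # risk factors, sample size, loss level
w = np.ones(d)                       # toy linear portfolio loss L = w'Z

sobol = qmc.Sobol(d, scramble=True, seed=0)
z_qmc = norm.ppf(sobol.random(n))    # map uniforms to normals
z_mc = np.random.default_rng(0).standard_normal((n, d))

print("QMC estimate:", np.mean(z_qmc @ w > c))
print("MC  estimate:", np.mean(z_mc @ w > c))
```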
COMPARING NET PROFIT FORECASTS OF INDIAN BANKS USING OLS AND GARCH 1,1 FRAMEWORKSCHOLEDGE R&D CENTER
In the present paper, bivariate Ordinary Least Squares (OLS) and GARCH(1,1) (generalized autoregressive conditional heteroskedasticity) models are applied to generate fitted net-profit series for two nationalized banks, State Bank of India (SBI, a sector leader) and ING Vysya Bank (a non-leader), in the Indian banking sector. OLS is a non-parameterized method, while QMLE (quasi-maximum likelihood estimation) is a parameterized technique for coefficient estimation, so their robustness needs to be assessed against the data in question. The overall approach is to measure how both models produce earnings forecasts and to analyze the behavior of the regression coefficients. A second objective is to see how the "leader" bank's earnings estimation process differs from the non-leader's in the Indian banking setup. The results clearly show differences between the two banks in terms of coefficient values, residual behavior and R-squared values.
This document discusses the evolution of research on the Efficient Market Hypothesis (EMH) in finance. It begins by outlining the three forms of market efficiency put forth in EMH. It then describes how Mandelbrot and others challenged EMH by finding long-term dependence and non-linear relationships in asset price movements, contrary to EMH assumptions of randomness. The document outlines Mandelbrot's rescaled range statistical technique and discusses how later researchers used non-linearity tests to further analyze non-random patterns in markets. It questions the validity of conventional linear tests used to support EMH and argues considering non-linearity is crucial to understanding market efficiency.
This document discusses validating risk models using intraday value-at-risk (VaR) and expected shortfall (ES) approaches with the Multiplicative Component GARCH (MC-GARCH) model. The study assesses different distributional assumptions for innovations in the MC-GARCH model and evaluates their effects on modeling and forecasting performance. Backtesting procedures are used to validate the models' predictive power for VaR and ES. Results show non-normal distributions best fit the intraday data and forecast ES, while an asymmetric distribution best forecasts VaR.
NONLINEAR EXTENSION OF ASYMMETRIC GARCH MODEL WITHIN NEURAL NETWORK FRAMEWORKcscpconf
The importance of volatility for all market participants has led to the development and application of various econometric models. The most popular models for modelling volatility are GARCH-type models, because they can account for the excess kurtosis and asymmetric effects of financial time series. Since the standard GARCH(1,1) model usually indicates high persistence in the conditional variance, empirical research has turned to the GJR-GARCH model and revealed its superiority in fitting the asymmetric heteroscedasticity in the data. In order to capture both asymmetry and nonlinearity in the data, the goal of this paper is to develop a parsimonious NN model as an extension of the GJR-GARCH model and to determine whether GJR-GARCH-NN outperforms the GJR-GARCH model.
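For reference, a standard way to write the GJR-GARCH(1,1) conditional variance this abstract builds on (the notation here is assumed, not taken from the paper) is

\[
\sigma_t^2 = \omega + \left(\alpha + \gamma\, \mathbf{1}_{\{\varepsilon_{t-1} < 0\}}\right)\varepsilon_{t-1}^2 + \beta\, \sigma_{t-1}^2 ,
\]

where the indicator term adds the extra loading \(\gamma\) after negative shocks, producing the asymmetric (leverage) effect; the paper's NN extension augments this recursion, but its exact form is not reproduced here.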
The Use of ARCH and GARCH Models for Estimating and Forecasting Volatility-ru...Ismet Kale
This document discusses volatility modeling using ARCH and GARCH models. It first provides background on ARCH and GARCH models, noting they were developed to model characteristics of financial time series data like volatility clustering and fat tails. It then describes the specific ARCH and GARCH models that will be used in the study, including the ARCH, GARCH, EGARCH, GJR, APARCH, IGARCH, FIGARCH and FIAPARCH models. The document aims to apply these models to daily stock index data from the IMKB 100 to analyze and forecast volatility, and better understand risk in the Turkish market.
Can we use Mixture Models to Predict Market Bottoms? by Brian Christopher - 2...QuantInsti
Session Details:
This session explains Mixture Models and explores their application to predicting an asset's return distribution and identifying outlier returns that are likely to mean revert.
The objective of this session is to explain and illustrate the use of Mixture Models with a sample strategy in Python.
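A minimal sketch of the core idea, assuming scikit-learn and simulated returns in place of real market data; the session's actual strategy is not reproduced.

```python
# Sketch: fit a Gaussian mixture to daily returns and flag low-likelihood
# (outlier) returns as candidates for mean reversion.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(11)
returns = np.concatenate([rng.normal(0.0005, 0.01, 950),
                          rng.normal(-0.01, 0.04, 50)])  # calm + stressed regimes
X = returns.reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(X)
log_dens = gm.score_samples(X)
outliers = log_dens < np.quantile(log_dens, 0.05)  # bottom 5% likelihood
print("flagged", outliers.sum(), "outlier returns")
```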
Who should attend?
- Traders/quants/analysts interested in algorithmic trading research
- Python/software/strategy developers
- Algorithmic/Systematic traders
- Portfolio Managers and consultants
- Students and academicians
Guest Speaker
Mr. Brian Christopher
Quantitative researcher, Python developer, CFA charterholder, and founder of Blackarbs LLC, a quantitative research firm.
Six years ago he learned to code in Python for the purpose of creating algorithmic trading strategies. Four years ago he decided to self-publish his research with a focus on practical, reproducible applications.
Now he continues his open research initiatives for a growing community of traders, researchers, developers, engineers, architects and practitioners across various industries.
He attained a BSc in Economics from Northeastern University in Boston, MA and received the Chartered Financial Analyst (CFA) designation in 2016.
Access the webinar recording here: https://www.youtube.com/watch?v=o5BFAQK_Acw
Know more about EPAT™ by QuantInsti™ at http://www.quantinsti.com/epat/
Garch Models in Value-At-Risk Estimation for REITIJERDJOURNAL
Abstract: In this study we investigate volatility forecasting for REITs, from January 03, 2007 to November 18, 2016, using four GARCH models (GARCH, EGARCH, GARCH-GJR and APARCH). We examine the performance of each of these GARCH-type models, and backtesting procedures are conducted to analyze model adequacy. The empirical results show that, for estimating REIT volatility, the EGARCH, GARCH-GJR and APARCH models are adequate; among these, the GARCH-GJR model outperforms the others.
This document proposes a stock market forecasting system that uses both a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) model and a decision tree algorithm. The GARCH model is used to predict stock prices and their volatility over time. A decision tree algorithm is then applied to optimize the GARCH model by reducing errors and false predictions. The decision tree assigns weights to parameters like earnings per share, sales revenue, and trading volume to classify the quality of the input data. This combined GARCH and decision tree approach aims to more accurately forecast stock market movements and prices.
What is wrong with the quantitative standards for market riskAlexander Decker
This document evaluates the quantitative standards laid out in the Basel Accords for implementing internal market risk models. It finds that some standards may not accurately reflect research findings. For example, the standards do not specify a VaR method despite evidence that volatility is time-varying and returns are fat-tailed. Additionally, requiring a minimum historical period runs contrary to evidence of clustered volatility. Several standards effectively smooth the market risk charge over time in ways that make it unresponsive. Overall, the document argues that some quantitative standards could be improved by better aligning with available research findings.
This document provides instructions for modeling stock return volatility using daily stock price data from Hong Kong, Japan, and Singapore markets from 1990 to 2005. It outlines steps to estimate Threshold GARCH and GARCH-in-mean models to examine the volatility and asymmetry of returns in the Singapore market. Specifically, it describes how to: 1) Estimate a TGARCH model to analyze asymmetry in volatility; 2) Estimate a GARCH-in-mean model to investigate the return-risk relationship; and 3) Estimate a TGARCH-in-mean model and compare the results.
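As a sketch of the first step, the GJR form of threshold GARCH can be fit with the third-party Python `arch` package, which expresses the threshold term through the asymmetric order `o`. Simulated placeholder returns stand in for the Singapore series; the -in-mean variants in steps 2 and 3 generally require a custom likelihood and are not shown.

```python
# Sketch: threshold (GJR) GARCH fit via the asymmetric order o=1.
import numpy as np
from arch import arch_model

returns = np.random.default_rng(5).standard_normal(2000)  # placeholder % returns
tgarch = arch_model(returns, vol="GARCH", p=1, o=1, q=1)  # o=1 adds the threshold term
res = tgarch.fit(disp="off")
print(res.params)   # gamma[1] > 0 indicates higher volatility after negative shocks
```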
Investment portfolio optimization with garch modelsEvans Tee
Since the introduction of the Markowitz mean-variance optimization model, several extensions have been made to improve optimality. This study examines the application of two models - the ARMA-GARCH model and the ARMA-DCC GARCH model - for the Mean-VaR optimization of funds managed by HFC Investment Limited. Weekly prices of the funds from 2009 to 2012 were examined. The funds analyzed were the Equity Trust Fund, the Future Plan Fund and the Unit Trust Fund. The returns of the funds are modelled with the Autoregressive Moving Average (ARMA), while volatility is modelled with the univariate Generalized Autoregressive Conditional Heteroskedasticity (GARCH) as well as the multivariate Dynamic Conditional Correlation GARCH (DCC GARCH). This was based on the assumption of non-constant mean and volatility of fund returns. In this study the risk of a portfolio is measured using the value-at-risk. A single constrained Mean-VaR optimization problem was obtained based on the assumption that investors' preference is based solely on risk and return. The optimization was performed using the Lagrange multiplier approach and the solution was obtained via the Kuhn-Tucker theorems. The conclusions drawn from the results pointed to the fact that a more efficient portfolio is obtained when the value-at-risk (VaR) is modelled with a multivariate GARCH.
To analyze the factors affecting the price volatility of stocks, microeconomic and macroeconomic elements must be considered. This paper selects elements that are appropriate to the daily stock price data to build the GARCH family models. External variables such as global oil prices, the consumer price index, short interest rates and the exchange rate between the United States Dollar and the Euro are examined. The GARCH models are developed in order to analyze and forecast the stock prices of the companies in the DAX 30, which is Germany's most important stock exchange barometer. The volatility of the residual of the mean function is the key point in the GARCH approach. This financial application can be extended to analyze other specific shares or stock indexes in any stock market in the world. Therefore, it is necessary to understand the operating procedures of their pricing for risk management, profitability strategies, cost minimization and, in addition, for constructing the optimal portfolio depending on the investor's preferences.
Multifactorial Heath-Jarrow-Morton model using principal component analysisIJECEIAES
In this study, we propose an implementation of the multifactor Heath-Jarrow-Morton (HJM) interest rate model using an approach that integrates principal component analysis (PCA) and Monte Carlo simulation (MCS) techniques. By integrating PCA and MCS with the multifactor HJM model, we successfully capture the principal factors driving the evolution of short-term interest rates in the US market. Additionally, we provide a framework for deriving spot interest rates through parameter calibration and forward rate estimation. For this, we use daily data from the US yield curve from June 2017 to December 2019. The integration of PCA and MCS with the multifactor HJM model in this study represents a robust and precise approach to characterizing interest rate dynamics and, compared to previous approaches, this method provided greater accuracy and improved understanding of the factors influencing US Treasury yield interest rates.
This document provides an update to a previous study on the performance of passive and active collar strategies applied to the Powershares QQQ ETF (QQQ). The update extends the analysis period through September 2010. It finds that during market declines like the tech bubble and credit crisis, collar strategies provided downside protection and strong returns compared to a long position in QQQ. However, collars underperformed during strong market climbs. The document also analyzes applying collar strategies to a small cap mutual fund and finds similar beneficial results. It concludes that active collars, which dynamically adjust based on momentum, volatility, and macroeconomic signals, tended to outperform passive collars both in-sample and out-of-sample.
Review of Quantitative Finance and Accounting, 13 (1999) 171–188.docxronak56
Review of Quantitative Finance and Accounting, 13 (1999): 171–188
© 1999 Kluwer Academic Publishers, Boston. Manufactured in The Netherlands.
Random Walks and Market Efficiency Tests: Evidence from Emerging Equity Markets
DAVID KAREMERA, South Carolina State University, Orangeburg, SC 29117
KALU OJAH, Saint Louis University, St. Louis, MO 63108
JOHN A. COLE, Benedict College, Columbia, SC 29204
Abstract. We use the multiple variance-ratio test of Chow and Denning (1993) to examine the stochastic properties of local currency- and US dollar-based equity returns in 15 emerging capital markets. The technique is based on the Studentized Maximum Modulus distribution and provides a multiple statistical comparison of variance-ratios, with control of the joint-test's size. We find that the random walk model is consistent with the dynamics of returns in most of the emerging markets analyzed, which contrasts many random walk test results documented with the use of single variance-ratio techniques. Further, a runs test suggests that most of the emerging markets are weak-form efficient. Overall, our results suggest that investors are unlikely to make systematic nonzero profit by using past information in many of the examined markets, thus, investors should predicate their investment strategies on the assumption of random walks. Additionally, our results suggest exchange rate matters in returns' dynamics determination for some of the emerging equity markets we analyzed.
Key words: random walk, stock prices, multiple variance-ratio test, emerging capital markets, weak-form efficiency
JEL Classification: G15, G14
Introduction
The random walk properties of security prices have an important bearing on the determination of security return dynamics and on associated potential trading strategies, as is amply suggested by Poterba and Summers (1988, pp. 53–54), Lo and MacKinlay (1989), and Eckbo and Liu (1993). Random walks, which are a special case of unit root processes, help identify the kinds of shocks that drive stock prices. If a given equity price series is, for instance, a random walk, the generating process is dominated by permanent components and hence has no mean-reversion tendency.¹ A shock to the series from an initial equilibrium will lead to increasing deviations from its long-run equilibrium. Moreover, the random walk properties of stock returns are considered an outcome of the efficient market hypothesis (i.e., stock prices exhibit unpredictable behavior, given available information). Accordingly, Liu and Maddala (1992) demonstrate how the presence or absence of random walks in security returns is crucial to both the formulation of rational expectation models and the testing of the market efficiency hypothesis.
Several studies (e.g., Hakkio (1986), Summer (1986), Fama and French (1988), and Poterba and Summers (1988)) demonstrate that standard random walk hypothesis (RWH)² tests (e.g., unit root tests) lack power and are ...
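To make the excerpt's central statistic concrete, here is a minimal sketch of the single variance ratio VR(q) that the Chow-Denning multiple comparison builds on, run on simulated random-walk prices; small-sample bias corrections are omitted.

```python
# Sketch: under a random walk, q-period returns have q times the variance
# of 1-period returns, so VR(q) should be close to 1.
import numpy as np

def variance_ratio(log_prices, q):
    r = np.diff(log_prices)                  # 1-period log returns
    rq = log_prices[q:] - log_prices[:-q]    # overlapping q-period returns
    mu = r.mean()
    var1 = np.mean((r - mu) ** 2)
    varq = np.mean((rq - q * mu) ** 2) / q
    return varq / var1

rng = np.random.default_rng(9)
p = np.cumsum(rng.standard_normal(2000) * 0.01)  # simulated random-walk log prices
print([round(variance_ratio(p, q), 3) for q in (2, 4, 8, 16)])
```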
Measuring the volatility in ghana’s gross domestic product (gdp) rate using t...Alexander Decker
This document summarizes a study that analyzed volatility in Ghana's GDP growth rate using GARCH models. The study found that GDP volatility exhibited characteristics like clustering and leverage effects. A GARCH(1,1) model provided a reasonably good fit to quarterly GDP data. Volatility and leverage effects were found to have significantly increased. The best-fitting models for GDP volatility were ARIMA(1,1,1)(0,0,1)₁₂ and ARIMA(1,1,2)(0,0,1)₁₂ models.
Econometric Modeling in relation to Foreign Exchange RiskMatthew Adj
This document discusses different econometric models for estimating the optimal hedge ratio when hedging foreign exchange risk, and whether the model specification matters. It argues that the optimal hedge ratio and hedging effectiveness are more dependent on the correlation between the unhedged position and hedging instrument, rather than the model used. Four models - levels, first difference, non-linear, and error correction model - are used to estimate hedge ratios for hedging Swiss franc, British pound, and Hong Kong dollar exposures with money markets and cross currencies from 2001-2009. Empirical evidence shows that when correlation is strong with money markets, all four models exhibit similar hedge ratios and nearly 99% hedging effectiveness. However, with
Statistical Arbitrage
Pairs Trading, Long-Short Strategy
Cyrille BEN LEMRID
Contents
1 Pairs Trading Model
1.1 General discussion
1.2 Cointegration
1.3 Spread dynamics
2 State of the art and model overview
2.1 Stochastic Dependencies in Financial Time Series
2.2 Cointegration-based trading strategies
2.3 Formulation as a Stochastic Control Problem
2.4 Fundamental analysis
3 Strategies Analysis
3.1 Roadmap for strategy design
3.2 Identification of potential pairs
3.3 Testing cointegration
3.4 Risk control and feasibility
4 Results
Introduction
This report presents my research work carried out at Credit Suisse from May to September 2012. This study has been pursued in collaboration with the Global Arbitrage Strategies team.
Quantitative analysis strategy developers use sophisticated statistical and optimization techniques to discover and construct new algorithms. These algorithms take advantage of short-term deviations from the "fair" prices of securities. Pairs trading is one such quantitative strategy: it is a process of identifying securities that generally move together but are currently "drifting away".
Pairs trading is a common strategy among many hedge funds and banks. However, there is not a significant amount of academic literature devoted to it, due to its proprietary nature. For a review of some of the existing academic models, see [6], [8], [11].
Our focus for this analysis is the study of two quantitative approaches to the problem of pairs trading. The first uses the properties of cointegrated financial time series as the basis for a trading strategy; in the second, we model the log-relationship between a pair of stock prices as an Ornstein-Uhlenbeck process and use this to formulate a portfolio-optimization stochastic control problem.
This study was performed to show that under certain assumptions the two approaches are equivalent.
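A minimal sketch of the key step in the second approach, on simulated data rather than the report's pairs: recover Ornstein-Uhlenbeck parameters for a pair's log-price spread from its AR(1) discretization.

```python
# Sketch: simulate an OU spread, ds = kappa*(theta - s) dt + sigma dW,
# then recover kappa and theta from the regression s_t = a + b*s_{t-1} + e.
import numpy as np

rng = np.random.default_rng(13)
dt, n = 1 / 252, 2000
kappa_true, theta_true, sig_true = 8.0, 0.0, 0.1
s = np.empty(n); s[0] = 0.05
for t in range(1, n):
    s[t] = s[t-1] + kappa_true * (theta_true - s[t-1]) * dt \
           + sig_true * np.sqrt(dt) * rng.standard_normal()

# OLS slope b maps to the mean-reversion speed: kappa = -ln(b)/dt.
b, a = np.polyfit(s[:-1], s[1:], 1)
kappa_hat = -np.log(b) / dt
theta_hat = a / (1 - b)
print(f"kappa ~ {kappa_hat:.2f}, theta ~ {theta_hat:.4f}")
```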
Practitioners most often use a fundamentally driven approach, analyzing the performance of stocks around a market event and implement strategies using back-tested trading levels.
We also study an example of a fundamentally driven strategy, using market reaction to a stock being dropped or added to the MSCI World Standard, as a signal for a pair trading strategy on those stocks once their inclusion/exclusion has been made effective.
This report is organized as follows. Section 1 provides some background on pairs trading strategy. The theoretical results are described in Section 2. Section 3
This document discusses using extreme value theory (EVT) to model policyholder behavior in extreme market conditions using variable annuity lapse data. EVT allows predicting behavior in the extremes based on nonextreme data. The paper applies EVT by fitting bivariate distributions to lapse and market indicator data above a large threshold. This provides insights into policyholder behavior in extreme markets without direct observations. The goal is a dynamic lapse formula capturing different characteristics than traditional methods.
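A minimal sketch of the peaks-over-threshold step in EVT, assuming scipy and a simulated heavy-tailed sample in place of the lapse and market data.

```python
# Sketch: fit a generalized Pareto distribution to exceedances over a high
# threshold and extrapolate a far-tail quantile (peaks-over-threshold).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(21)
x = rng.standard_t(df=3, size=10_000)        # placeholder heavy-tailed data
u = np.quantile(x, 0.95)                     # high threshold
exceed = x[x > u] - u

shape, loc, scale = genpareto.fit(exceed, floc=0)
p_exceed = (x > u).mean()
# 99.9% quantile via the POT formula: P(X > u + y) = p_exceed * (1 - F_GPD(y)).
q999 = u + genpareto.ppf(1 - 0.001 / p_exceed, shape, loc=0, scale=scale)
print(f"estimated 99.9% quantile: {q999:.2f}")
```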
Superior performance by combining Risk Parity with Momentum?Wilhelm Fritsche
This document examines different strategies for global asset allocation between equities, bonds, commodities and real estate. It finds that applying trend following rules substantially improves risk-adjusted performance compared to traditional buy-and-hold portfolios. It also finds trend following to be superior to risk parity approaches. Combining momentum strategies with trend following further improves returns while reducing volatility and drawdowns. A flexible approach that allocates capital based on volatility-weighted momentum rankings of 95 markets produces attractive, consistent risk-adjusted returns.
This document discusses using institutional ownership data to predict future stock returns. It analyzes institutional ownership data from the US between 2004-2014 using machine learning algorithms. A support vector machine was able to classify stocks into 3 bins of future 4-quarter returns with 37.3% accuracy, significantly better than chance. The study aims to identify which institutional ownership features best predict returns and whether their combination improves predictions over individual features.
This document presents a novel approach for combining individual realized volatility measures to form new estimators of asset price variability. It analyzes 30 different realized measures estimated from high frequency IBM stock price data from 1996-2007. It finds that a simple equally-weighted average of the realized measures is not outperformed by any individual measure and that combining measures provides benefits by incorporating information from different estimators. Optimal linear and multiplicative combination estimators are estimated and none of the individual measures are found to encompass all the information in other measures, further supporting the use of combination estimators.
International Journal of Soft Computing, Mathematics and Control (IJSCMC), Vol. 5, No. 2/3, August 2016
VOLATILITY FORECASTING - A PERFORMANCE MEASURE OF GARCH TECHNIQUES WITH DIFFERENT DISTRIBUTION MODELS

Hemanth Kumar P.1 and Basavaraj Patil S.2

1 Computer Science Engineering, VTURRC, Belagavi, hemanth00kumar@gmail.com
2 Computer Science Engineering, VTURRC, Belagavi, dr.sharan9@gmal.com
Abstract
Volatility forecasting is an interesting and challenging topic in current financial instruments, as it is directly associated with profits. Many risks and rewards are directly associated with volatility, which makes forecasting it indispensable in finance. GARCH innovation distributions play an important role in risk measurement and option pricing. The main motive of this paper is to measure the performance of GARCH techniques for forecasting volatility using different distribution models. We have used 9 variations of distribution models to forecast the volatility of a stock entity. The different GARCH distribution models examined in this paper are Std, Norm, SNorm, GED, SSTD, SGED, NIG, GHYP and JSU. Volatility is forecasted 10 days in advance and the values are compared with the actual values to find the best distribution model for volatility forecasting. From the results obtained, it has been observed that GARCH with the GED distribution model outperformed all other models.
Keywords
Volatility, Forecasts, GARCH, Distribution models, Stock market
1. INTRODUCTION
Volatility plays a key role in finance: it drives option pricing and risk management. Volatility is directly associated with risks and returns; the higher the volatility, the more unstable the financial market. Rapidly changing volatility may result in either high profits or huge losses. Volatility directly or indirectly drives asset return series, equity prices and foreign exchange rates. If the pattern of volatility clusters is studied over a long duration, we observe that once volatility reaches its highest point, it tends to persist for a long time. Such clusters are readily captured by the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model introduced by Bollerslev [1986]. Volatility models identify and track volatility clusters that reach either higher or lower peaks by modelling the clusters directly: in each period, a new innovation term arrives with a variance scaled by the returns and volatilities of past periods. When the volatility dynamics involve a single lagged period, the GARCH(1,1) model has become a workhorse in both academia and practice because of its simplicity and intuitive interpretation.
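For reference, the GARCH(1,1) dynamics referred to here take the standard form (a sketch in our notation, not reproduced from the paper):

r_t = \mu + \varepsilon_t, \qquad \varepsilon_t = \sigma_t z_t, \qquad \sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2

where the z_t are i.i.d. innovations with zero mean and unit variance, ω > 0, α, β ≥ 0, and α + β < 1 ensures covariance stationarity.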
DOI: 10.14810/ijscmc.2016.5301
When applying GARCH models in financial risk management, the distribution of the GARCH innovations plays a critical role. From the definition of the GARCH model, it is clear that the conditional distribution of future returns has the same shape as the distribution of the innovations. Consequently, an inappropriate model for the distribution of the innovations may lead to either underestimation or overestimation of future risks. Furthermore, different distributions of the GARCH innovations may also lead to different option pricing results. This paper examines an existing diagnostic framework for the distribution of GARCH innovations and exhibits its drawbacks when applied to financial time series. Further, we develop an alternative technique, particularly geared towards applications to financial time series.
Recent work in finance using GARCH techniques is discussed in this section. Francesco Audrino [2016][1] discusses volatility forecasting on the SP 500 data set considering downside risk, jumps and the leverage effect; the paper separates the leverage effect into continuous and discontinuous components, and past volatility into good and bad volatility. Momtchil Dojarliev [2014][2] researched volatility and value-at-risk evaluation for the MSCI North American Index; the paper compares techniques such as Naive, GARCH, AGARCH and the BEKK model in forecasting volatility. Of all the techniques, Naive has the highest failure rate and the BEKK model the highest success rate. Karunanithy Banumathy [2015][3], in their work on modelling stock market volatility using GARCH, the Akaike Information Criterion (AIC) and the Schwarz Information Criterion (SIC), show that GARCH and TGARCH estimations are the most appropriate models for capturing symmetric and asymmetric volatility respectively.

Amadeus Wennström [2014][4] researched volatility forecasting and the performance of six commonly used forecasting models: the simple moving average, the exponentially weighted moving average, the ARCH model, the GARCH model, the EGARCH model and the GJR-GARCH model. The data sets used in that report are three different Nordic equity indices: OMXS30, OMXH25 and OMXC20. The results of that work suggest that EGARCH has better MSE (Mean Square Error) rates than the other techniques. Yiannis Dendramis [2012][5] shows that the performance of alternative parametric volatility models, such as EGARCH or GARCH, can be considerably improved if they are combined with skewed distributions of return innovations. The performance of these models is found to be similar to that of the EVT (extreme value theory) approach, and better than that of their extensions based on Markov regime-switching effects with or without EGARCH effects. The paper suggests that the performance of the latter approach can be further and significantly improved if it is based on filtered residuals obtained through volatility models that allow for skewed distributions of return innovations.
BACKGROUND
The last few decades have seen a huge number of different proposals for how to model the second moment of returns, referred to as volatility. Among the models that have performed best are the autoregressive conditional heteroskedasticity (ARCH) family of models introduced by Engle (1982) and the stochastic volatility (SV) models pioneered by Taylor (1986). Over the last few years, ARFIMA-type modelling of high-frequency squared returns has also proved very productive [6]. Forecasting the volatility of returns is key to many areas of finance. It is well understood that financial return series exhibit many non-normal characteristics that cannot be captured by the standard GARCH model with a normal error distribution. However, which GARCH model and which error distribution to use is still an open question, particularly since the model that best fits the in-sample data may not give the best out-of-sample volatility forecasting ability, which we use as the criterion for selecting the most effective model from among the alternatives. In one such study, six simulated experiments in GARCH(p,q) with six different error distributions are carried out [7]. The Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model, designed to model volatility clustering, exhibits heavy-tailedness regardless of the distribution of its innovation term. When applying the model to financial time series, the distribution of the innovations plays an important role in risk estimation and option pricing [8].
Financial return series are essentially characterized by a zero mean, high kurtosis and little, if any, autocorrelation. The squares of these returns often exhibit high correlation and persistence, which makes ARCH-type models suitable for estimating the conditional volatility of such processes; see Engle (1982) for the original work, Bollerslev et al. (1994) for a review of volatility models and Engle and Patton (2001) for some extensions. The ARCH parameters are usually estimated using maximum likelihood (ML) methods that are optimal when the data are drawn from a Gaussian distribution [9].
Peters [10] examines the forecasting performance of four GARCH(1,1) models (GARCH, EGARCH, GJR and APARCH) used with three distributions (Normal, Student-t and Skewed Student-t). The paper investigates and compares different possible sources of forecast improvement: asymmetry in the conditional variance, fat-tailed distributions and skewed distributions. Two major European stock indices (FTSE 100 and DAX 30) are considered using daily data over a 15-year period. The results suggest that improvements in the overall estimation are achieved when asymmetric GARCH models are used and when fat-tailed densities are taken into account in the conditional variance. It is also found that GJR and APARCH give better forecasts than symmetric GARCH. Finally, improved forecast performance is not clearly observed when using non-normal distributions [10].
Teräsvirta [11] provides a general introduction to univariate GARCH models.
Hansen and Lunde [12] compare 330 ARCH-type models in terms of their ability to describe the conditional variance. The models are compared out-of-sample using DM-$ exchange rate data and IBM return data, where the latter is based on a new data set of realized variance. They find no evidence that a GARCH(1,1) is outperformed by more sophisticated models in the analysis of exchange rates, while the GARCH(1,1) is clearly inferior to models that can accommodate a leverage effect in the analysis of IBM returns. The models are compared using the test for superior predictive ability (SPA) and the reality check for data snooping (RC). Their empirical results show that the RC lacks power to the extent that it is unable to distinguish between "good" and "bad" models in the analysis [12].
2. VOLATILITY
In finance, volatility is the degree of variation of a traded price series over time, as measured by the standard deviation of returns. Historic volatility is derived from time series of past market prices. Implied volatility is derived from the market price of a market-traded derivative (in particular, an option). The symbol σ is used for volatility and corresponds to the standard deviation, which should not be confused with the similarly named variance, which is its square, σ². The three principal purposes of estimating volatility are measuring risk, allocating assets, and making profits in financial trading. A large part of volatility forecasting is measuring the potential future losses of a portfolio of assets; in order to estimate these potential losses, estimates must be made of future volatilities and correlations. In asset management, the Markowitz approach of minimizing risk for a given level of expected return has become a standard methodology, and clearly an estimate of the variance-covariance matrix is required. Perhaps the most challenging use of forecasting volatility is to employ risk factors for building a risk-oriented return model.
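For completeness, the Markowitz problem that requires this variance-covariance estimate can be sketched as follows (our notation, not spelled out in the paper):

\min_{w} \; w^{\top}\Sigma w \quad \text{s.t.} \quad w^{\top}\mu = r^{*}, \;\; w^{\top}\mathbf{1} = 1

where Σ is the variance-covariance matrix of asset returns, µ the vector of expected returns, w the portfolio weights and r* the target return.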
3. METHODOLOGY
The methodology can be split into five steps: data acquisition, data preprocessing, estimation of volatility, forecasting using GARCH techniques, and result comparison. Data acquisition is the first step, in which stock market closing prices are collected. The steps are explained in detail below.

Data Acquisition - this paper uses 10 years of stock market data for volatility forecasting. We have selected the SP 500 index as the input data set. SP 500 end-of-day stock data is downloaded from Yahoo Finance. This paper uses 10 years of historical data ranging from 5th December 2005 to 4th December 2015, which results in 2514 samples. One row of data is generated per trading day; no rows are generated for Saturdays and Sundays, as there are no transactions on weekends.
Figure 1. Closing prices of the SP 500 Index for a period of 10 years
3.1 Data Preprocessing
The downloaded data consists of seven columns: Date, Open, High, Low, Close, Volume and Adjusted Close. The data, as downloaded, is in most-recent-date-first order; it is rearranged so that the most recent date comes last, so that volatility values can be predicted for the next 10 dates. The data is checked for missing or NA values; such rows are either replaced with the mean or median, or deleted.
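A minimal preprocessing sketch in Python (pandas), assuming the Yahoo Finance export has been saved locally as sp500.csv; the file name and column names are illustrative, not taken from the paper:

import numpy as np
import pandas as pd

# Load the Yahoo Finance export (hypothetical local file name).
df = pd.read_csv("sp500.csv", parse_dates=["Date"])

# Yahoo exports list the most recent date first; reorder so it comes last.
df = df.sort_values("Date").reset_index(drop=True)

# Handle missing/NA values: here rows with a missing close are dropped;
# replacing them with the column mean or median is the alternative the paper mentions.
df = df.dropna(subset=["Close"])

# Daily log returns from closing prices, used in the volatility estimate below.
df["ret"] = np.log(df["Close"]).diff()
df = df.dropna(subset=["ret"])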
3.2 Estimating Volatility

The volatility is estimated from the open, high, low and close values of the stock data; generally, volatility is calculated as the standard deviation of the returns of the stock data. Here, volatility is calculated over intervals of 10 trading days. The close-to-close method is the most commonly used volatility calculation technique and works on the closing prices of the stock data. The plot of the estimated volatility for the SP 500 using closing prices is shown in Figure 2.

\sigma_{cc} = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(r_i - \bar{r}\right)^2}, \qquad r_i = \ln\frac{C_i}{C_{i-1}}  (1)
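A sketch of the 10-day close-to-close estimate in Python, continuing the frame built in the previous sketch (the window length follows the paper; whether the series is annualized is not stated there, so the factor is optional here):

import numpy as np
import pandas as pd

def close_to_close_vol(close: pd.Series, window: int = 10,
                       annualize: bool = False) -> pd.Series:
    # Rolling close-to-close volatility: sample stdev of log returns over `window` days.
    r = np.log(close).diff()
    vol = r.rolling(window).std(ddof=1)
    if annualize:
        vol = vol * np.sqrt(252)  # trading-day convention; our assumption
    return vol

# Usage with the preprocessed frame from the previous sketch:
# df["vol10"] = close_to_close_vol(df["Close"], window=10)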
Forecasting of Volatility - the volatility estimated with the close-to-close technique is forecasted using the GARCH technique. In this paper we apply GARCH with different distribution models in order to forecast accurately. The volatility is forecasted 10 days in advance.
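The nine distribution codes used in this paper (std, norm, snorm, ged, sstd, sged, nig, ghyp and jsu) match those of the R rugarch package, which is presumably the tool behind the experiments. As a Python illustration, the arch package supports a subset of these innovations; the paper's wording suggests the estimated volatility series is fed to GARCH, but the conventional workflow sketched here fits the model to returns and forecasts the conditional volatility directly:

from arch import arch_model

# Percentage log returns are numerically friendlier for the optimizer.
returns = 100 * df["ret"]

# GARCH(1,1) with GED innovations; other options include 'normal', 't', 'skewt'.
am = arch_model(returns, vol="GARCH", p=1, q=1, dist="ged")
res = am.fit(disp="off")

# 10-step-ahead variance forecast from the last observation.
fc = res.forecast(horizon=10)
sigma = (fc.variance.iloc[-1] ** 0.5) / 100  # back to daily return units
print(sigma)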
3.3 Result Comparison

The results of the GARCH technique with the different distribution models are compared using the mean square error (MSE). The distribution model with the lowest MSE value is considered the most accurate distribution model. The main aim of this research is to find the distribution model with the lowest error.
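A small sketch of the comparison step (names are illustrative: `forecasts` maps each distribution code to its 10 forecasted values, `actual` holds the realized values):

import numpy as np

def mse(actual, forecast) -> float:
    # Mean square error between realized and forecasted volatility.
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return float(np.mean((actual - forecast) ** 2))

# forecasts = {"ged": [...], "norm": [...], ...}; actual = realized volatilities
# ranking = sorted(forecasts, key=lambda d: mse(actual, forecasts[d]))
# print("best distribution:", ranking[0])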
Figure 2. Volatility of the SP 500 Index for a period of 10 years
4. GARCH TECHNIQUES
Generally, GARCH refers to the Generalized Autoregressive Conditional Heteroskedasticity model, intended to model volatility clustering; it displays heavy-tailedness depending on the choice of innovation term. When applying a GARCH model to forecast volatility, the distribution of the innovation terms plays a critical role in risk estimation and option pricing. GARCH models were created to explain volatility clustering. In the basic GARCH model, the innovation (or residual) distribution is assumed to be standard normal, despite the fact that this assumption is frequently rejected empirically. Consequently, GARCH models with non-normal innovation distributions have been developed. In this research, GARCH techniques are applied with different innovation models in order to identify the better forecasting model. Financial models with long-tailed distributions and volatility clustering have been introduced to overcome problems with the realism of traditional models, which fail to explain the heteroskedasticity, skewness, heavy tails and volatility clustering of empirical asset returns.
In GARCH models, the probability density function is written in terms of the scale and location parameters, standardized to have mean zero and variance equal to one:

\alpha_t = (\mu_t, \sigma_t, \omega)  (2)

where the conditional mean is given by

\mu_t = \mu(\theta, x_t) = E(y_t \mid x_t)  (3)

and the conditional variance is

\sigma_t^2 = \sigma^2(\theta, x_t) = E\big((y_t - \mu_t)^2 \mid x_t\big)  (4)

with ω = ω(θ, x_t) denoting the remaining parameters of the distribution, such as a shape and a skew parameter. The conditional mean and variance are used to scale the innovations,

z_t(\theta) = \frac{y_t - \mu(\theta, x_t)}{\sigma(\theta, x_t)}  (5)

which have conditional density

g(z \mid \omega) = \frac{d}{dz} P\big(z_t < z \mid \omega\big)  (6)
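Combining (2)-(6), the log-likelihood maximized when fitting such a model is, in the standard construction (our notation, not spelled out in the paper):

LL(\theta) = \sum_{t=1}^{T} \left[ \log g\big(z_t(\theta) \mid \omega\big) - \log \sigma_t(\theta) \right]

where the −log σ_t term is the Jacobian of the standardization in (5); the scaling property used in Section 5.2 below is exactly this 1/σ_t factor.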
5.1 Student distribution
The GARCH-Student model was first described in Bollerslev (1987) as an alternative to the Normal distribution for fitting the standardized innovations. It is described completely by a shape parameter ν, but for standardization we proceed by using its 3-parameter representation, as follows:

f(x) = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\beta\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \left(1 + \frac{(x-\alpha)^2}{\beta\nu}\right)^{-\frac{\nu+1}{2}}  (7)

where α, β and ν are the location, scale and shape parameters respectively, and Γ is the Gamma function. Like the GED distribution described later, this is a unimodal and symmetric distribution where the location parameter α is the mean (and mode) of the distribution, while the variance is

Var(x) = \frac{\beta\nu}{\nu-2}  (8)

For the purposes of standardization we require that

Var(x) = \frac{\beta\nu}{\nu-2} = 1  (9)

which implies

\beta = \frac{\nu-2}{\nu}  (10)

Substituting β into f(x), we obtain the standardized Student's distribution:

f\left(\frac{x-\alpha}{\beta}\right) = f(z) = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{(\nu-2)\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \left(1 + \frac{z^2}{\nu-2}\right)^{-\frac{\nu+1}{2}}  (11)
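As a quick numerical sanity check (our illustration, not from the paper), the standardized density (11) coincides with a Student-t rescaled to unit variance, e.g. via scipy:

import numpy as np
from scipy.stats import t
from scipy.special import gammaln

def std_t_pdf(z, nu):
    # Standardized Student-t density of eq. (11): zero mean, unit variance.
    logc = gammaln((nu + 1) / 2) - gammaln(nu / 2) - 0.5 * np.log((nu - 2) * np.pi)
    return np.exp(logc) * (1 + z ** 2 / (nu - 2)) ** (-(nu + 1) / 2)

# A t_nu variable scaled by sqrt((nu-2)/nu) has unit variance.
z, nu = np.linspace(-3, 3, 7), 5.0
assert np.allclose(std_t_pdf(z, nu), t.pdf(z, nu, scale=np.sqrt((nu - 2) / nu)))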
5.2 Normal distribution
The Normal distribution is a spherical distribution described completely by its first two moments, the mean and the variance. Formally, the random variable x is said to be normally distributed with mean µ and variance σ², with density given by

f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}  (12)

Following a mean filtration or whitening process, the residuals ε, standardized by σ, yield the standard normal density given by

f\left(\frac{x-\mu}{\sigma}\right) = f(z) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}}  (13)

To obtain the conditional likelihood of the GARCH process at each point in time, the conditional standard deviation σ_t from the GARCH dynamics acts as a scaling factor on the density, so that

LL_t(z_t; \sigma_t) = \frac{1}{\sigma_t}\, f(z_t)  (14)

which illustrates the importance of the scaling property. Finally, the normal distribution has zero skewness and zero excess kurtosis.
5.3 Skew Normal distribution
In probability theory and statistics, the skew normal distribution is a continuous probability
distribution that generalizes the normal distribution to allow for non-zero skewness.
Let φ(x) denote the standard normal probability density function

\phi(x) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2}{2}}  (15)

with cumulative distribution function given by

\Phi(x) = \int_{-\infty}^{x} \phi(t)\,dt = \frac{1}{2}\left[1 + \operatorname{erf}\left(\frac{x}{\sqrt{2}}\right)\right]  (16)

A stochastic process that underpins the distribution was described by Andel, Netuka and Zvara (1984). Both the distribution and its stochastic process underpinnings are consequences of the symmetry argument developed in Chan and Tong (1986), which applies to multivariate cases beyond normality, e.g. the skew multivariate t distribution and others. The distribution is a particular case of a general class of distributions with probability density functions of the form f(x) = 2φ(x)Φ(x), where φ(·) is any PDF symmetric about zero and Φ(·) is any CDF whose PDF is symmetric about zero:

f(x) = 2\,\phi(x)\,\Phi(x)  (17)
5.4 Generalized Error distribution
The Generalized Error Distribution (GED) is a 3-parameter distribution belonging to the exponential family, with conditional density given by

f(x) = \frac{\kappa\, e^{-\frac{1}{2}\left|\frac{x-\alpha}{\beta}\right|^{\kappa}}}{2^{1+\kappa^{-1}}\, \beta\, \Gamma(\kappa^{-1})}  (18)

with α, β and κ representing the location, scale and shape parameters. Since the distribution is symmetric and unimodal, the location parameter is also the mode, median and mean of the distribution. By symmetry, all odd moments beyond the mean are zero. The variance and kurtosis are given by

Var(x) = \beta^2\, 2^{2/\kappa}\, \frac{\Gamma(3\kappa^{-1})}{\Gamma(\kappa^{-1})}  (19)

Ku(x) = \frac{\Gamma(5\kappa^{-1})\,\Gamma(\kappa^{-1})}{\Gamma(3\kappa^{-1})^2}  (20)

As κ decreases, the density gets flatter and flatter, while in the limit as κ → ∞ the distribution tends towards the uniform. Special cases are the Normal when κ = 2 and the Laplace when κ = 1. Standardization is simple and involves rescaling the density to have unit standard deviation:

Var(x) = \beta^2\, 2^{2/\kappa}\, \frac{\Gamma(3\kappa^{-1})}{\Gamma(\kappa^{-1})} = 1 \;\Longrightarrow\; \beta = \sqrt{2^{-2/\kappa}\, \frac{\Gamma(\kappa^{-1})}{\Gamma(3\kappa^{-1})}}  (21)
5.5 Skewed Distributions
Fernandez and Steel (1998) proposed introducing skewness into unimodal and symmetric distributions by introducing inverse scale factors in the positive and negative real half lines. Given a skew parameter ξ, the density of a random variable z can be represented as

f(z \mid \xi) = \frac{2}{\xi + \xi^{-1}} \left[ f(\xi z)\, H(-z) + f\left(\frac{z}{\xi}\right) H(z) \right]  (22)

where ξ ∈ ℝ⁺ and H(·) is the Heaviside function. The absolute moments, required for deriving the central moments, are

M_r = 2 \int_{0}^{\infty} x^r f(x)\, dx  (23)

The mean and variance are then defined as

E(z) = M_1 \left(\xi - \xi^{-1}\right)  (24)

Var(z) = \left(M_2 - M_1^2\right)\left(\xi^2 + \xi^{-2}\right) + 2M_1^2 - M_2  (25)
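To make the Fernandez and Steel construction concrete, a small Python sketch of eq. (22), using the standard normal as the symmetric base density f (our illustration):

import numpy as np
from scipy.stats import norm

def fs_skew_pdf(z, xi, base_pdf=norm.pdf):
    # Fernandez-Steel skewed density built from a symmetric base pdf.
    z = np.asarray(z, dtype=float)
    scale = 2.0 / (xi + 1.0 / xi)
    # The negative half line uses f(xi*z), the positive half line f(z/xi).
    return scale * np.where(z < 0, base_pdf(xi * z), base_pdf(z / xi))

# xi = 1 recovers the symmetric base density; xi > 1 skews to the right.
# Example: fs_skew_pdf(np.linspace(-4, 4, 9), xi=1.5)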
5.6 Skew Student distribution
The Normal, Student and GED distributions have skew variants which have been standardized to
zero mean, unit variance by making use of the moment conditions given above.
5.7 Normal Inverse Gaussian distribution
The normal-inverse Gaussian distribution (NIG) is a continuous probability distribution defined as the normal variance-mean mixture where the mixing density is the inverse Gaussian distribution. The NIG distribution was noted by Blaesild in 1977 as a subclass of the generalized hyperbolic distribution discovered by Ole Barndorff-Nielsen; in the following year Barndorff-Nielsen published the NIG in another paper. It was introduced into the mathematical finance literature in 1997.

The NIG distribution is parameterized by a location µ, a tail-heaviness parameter α, an asymmetry parameter β and a scale δ, related through

\gamma = \sqrt{\alpha^2 - \beta^2}  (26)

The probability density function is given by

f(x) = \frac{\alpha\,\delta\, K_1\left(\alpha\sqrt{\delta^2 + (x-\mu)^2}\right)}{\pi\, \sqrt{\delta^2 + (x-\mu)^2}}\; e^{\delta\gamma + \beta(x-\mu)}  (27)

where K_1 denotes the modified Bessel function of the third kind of order 1.
5.8 Generalized Hyperbolic distribution
The variant described here, following Aas and Haff (2006), is the GH Skew-Student special case of the Generalized Hyperbolic family, popularized because of its uniqueness in the GH family in having one tail with polynomial and one with exponential behaviour. This distribution is a limiting case of the GH when λ = −ν/2 and α → |β|, where ν is the shape parameter of the Student distribution. The domain of variation of the parameters is β ∈ ℝ and ν > 0, but for the variance to be finite ν > 4, while for the existence of skewness and kurtosis, ν > 6 and ν > 8 respectively. The density of the random variable x is then given by

f(x) = \frac{2^{\frac{1-\nu}{2}}\, \delta^{\nu}\, |\beta|^{\frac{\nu+1}{2}}\, K_{\frac{\nu+1}{2}}\left(\sqrt{\beta^2\left(\delta^2 + (x-\mu)^2\right)}\right) e^{\beta(x-\mu)}}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi}\,\left(\sqrt{\delta^2 + (x-\mu)^2}\right)^{\frac{\nu+1}{2}}}  (28)
5.9 Johnson’s reparametrized SU distribution
The reparametrized Johnson SU distribution, discussed in Rigby and Stasinopoulos (2005), is a four-parameter distribution, denoted JSU(µ, σ, ν, τ), with mean µ and standard deviation σ for all values of the skew and shape parameters ν and τ respectively. In the original Johnson (1949) parametrization, with location ξ, scale λ, skew γ and shape δ, the probability density function is given by

f(x) = \frac{\delta}{\lambda\sqrt{2\pi}}\, \frac{1}{\sqrt{1+z^2}}\, \exp\left[-\frac{1}{2}\left(\gamma + \delta \sinh^{-1} z\right)^2\right], \qquad z = \frac{x-\xi}{\lambda}  (29)
6. RESULTS

This research mainly focused on exploring GARCH techniques with different distribution models. The GARCH technique was tested with 9 different distribution models on the same data set to forecast 10 days in advance. The 10 forecasted values are tabulated in Table 1 and compared across models. The actual volatility values are calculated from the original stock data by considering 10 more days of SP 500 closing prices than were used for forecasting with the GARCH techniques. The actual volatility values and the volatility forecasted with the different GARCH distribution models are tabulated in Table 1. The accuracy of the forecasting techniques is measured using the mean square error (MSE); the MSE of each forecasting model is tabulated in Table 2. The GARCH distribution model with the lowest MSE values across the 10 forecasted values is considered the most accurate forecasting model. From Table 2 it is evident that the GARCH technique with the GED distribution has the lowest MSE values. Hence the GARCH technique with the GED model can be considered the most accurate technique compared to the other models.
Table 1. Actual and forecasted volatility values for 10 days

Day  | Actual | Std    | Norm   | SNorm  | GED    | SSTD   | SGED   | NIG    | GHYP   | JSU
T+1  | 0.1146 | 0.1140 | 0.1163 | 0.1165 | 0.1129 | 0.1142 | 0.1131 | 0.1136 | 0.1137 | 0.1137
T+2  | 0.1143 | 0.1149 | 0.1181 | 0.1185 | 0.1131 | 0.1152 | 0.1135 | 0.1143 | 0.1145 | 0.1144
T+3  | 0.1023 | 0.1157 | 0.1198 | 0.1204 | 0.1132 | 0.1162 | 0.1138 | 0.1149 | 0.1152 | 0.1151
T+4  | 0.1061 | 0.1165 | 0.1214 | 0.1221 | 0.1134 | 0.1172 | 0.1142 | 0.1155 | 0.1159 | 0.1158
T+5  | 0.1063 | 0.1173 | 0.1229 | 0.1239 | 0.1136 | 0.1182 | 0.1146 | 0.1161 | 0.1166 | 0.1165
T+6  | 0.0897 | 0.1181 | 0.1244 | 0.1255 | 0.1137 | 0.1191 | 0.1149 | 0.1167 | 0.1173 | 0.1172
T+7  | 0.0841 | 0.1189 | 0.1257 | 0.1270 | 0.1139 | 0.1201 | 0.1153 | 0.1173 | 0.1179 | 0.1178
T+8  | 0.0793 | 0.1196 | 0.1270 | 0.1285 | 0.1141 | 0.1210 | 0.1157 | 0.1179 | 0.1186 | 0.1185
T+9  | 0.0611 | 0.1204 | 0.1283 | 0.1299 | 0.1142 | 0.1218 | 0.1160 | 0.1185 | 0.1193 | 0.1191
T+10 | 0.0649 | 0.1211 | 0.1295 | 0.1312 | 0.1144 | 0.1227 | 0.1164 | 0.1190 | 0.1199 | 0.1197
Table 2. Mean square error of forecasted values

Day  | Std     | Norm    | SNorm   | GED     | SSTD    | SGED    | NIG     | GHYP    | JSU
T+1  | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000
T+2  | 0.00000 | 0.00001 | 0.00002 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000
T+3  | 0.00018 | 0.00031 | 0.00033 | 0.00012 | 0.00020 | 0.00013 | 0.00016 | 0.00017 | 0.00017
T+4  | 0.00011 | 0.00023 | 0.00026 | 0.00005 | 0.00012 | 0.00007 | 0.00009 | 0.00010 | 0.00009
T+5  | 0.00012 | 0.00028 | 0.00031 | 0.00005 | 0.00014 | 0.00007 | 0.00010 | 0.00011 | 0.00010
T+6  | 0.00081 | 0.00120 | 0.00128 | 0.00058 | 0.00087 | 0.00064 | 0.00073 | 0.00076 | 0.00076
T+7  | 0.00121 | 0.00173 | 0.00184 | 0.00089 | 0.00129 | 0.00097 | 0.00110 | 0.00114 | 0.00114
T+8  | 0.00163 | 0.00228 | 0.00242 | 0.00121 | 0.00173 | 0.00132 | 0.00149 | 0.00154 | 0.00153
T+9  | 0.00351 | 0.00452 | 0.00474 | 0.00282 | 0.00369 | 0.00302 | 0.00329 | 0.00338 | 0.00337
T+10 | 0.00316 | 0.00418 | 0.00440 | 0.00245 | 0.00334 | 0.00265 | 0.00293 | 0.00303 | 0.00301
6.1 Results Comparison
There has been tremendous research on GARCH models in volatility forecasting; these studies are based on different GARCH techniques such as EGARCH, TGARCH and GARCH with maximum likelihood estimation. According to Ghulam Ali [2013][16], results on GARCH models suggest that with wider-tailed distributions the TGARCH model is reasonable for explaining the data, and that GARCH with GED distributions has a comparative advantage over GARCH with the normal distribution. The paper of Yan Gao [2012][17] compares various GARCH techniques; the observations of that paper are as follows: the GED-GARCH model is better than t-GARCH, and t-GARCH is better than N-GARCH. Yiannis Dendramis [5], examining GARCH model performance on stocks, suggests that the skewed Student-t and GED distributions constitute excellent tools for modelling the distributional features of asset returns. Abu Hassan [2009][18] suggests that, among GARCH models with normal, skew-normal and Student-t distributions, the Student-t outperforms the other distribution models.

The results in this paper also suggest that the GARCH model with the GED distribution has minimal mean square errors and outperforms the other GARCH distribution models.
7. SUMMARY

The aim of this research work is to forecast volatility with high accuracy using different distributions in GARCH techniques. This paper uses SP 500 end-of-day stock market data over a period of 10 years for volatility forecasting. The volatility was calculated as the standard deviation of returns over a period of time and given as input to GARCH techniques with different distribution parameters. The research work uses 9 GARCH distribution models to forecast the volatility: Std, Norm, SNorm, GED, SSTD, SGED, NIG, GHYP and JSU. Future values of volatility are forecasted 10 days in advance and the values are compared with the actual values to find the best distribution model. Based on the results obtained, it has been observed that GARCH with the GED distribution model predicts volatility with the least error compared to the other models. Future work may apply hybrid distribution models, combining two or more techniques, to improve the forecasts.
Table 1 contains the actual volatility values for the next 10 days, recorded after those values became available, alongside the forecasts. Table 2 contains the MSE of each forecasting technique against the actual values; the results suggest that the GARCH model with the GED distribution predicts volatility more accurately than the other techniques.
REFERENCES

[1] Francesco Audrino and Yujia Hu, "Volatility Forecasting: Downside Risk, Jumps and Leverage Effect", Econometrics Journal, Vol 4, Issue 8, 23 February 2016.
[2] Momtchil Dojarliev and Wolfgang Polasek, "Volatility Forecasts and Value at Risk Evaluation for the MSCI North American Index", CMUP, 2014.
[3] Karunanithy Banumathy and Ramachandran Azhagaiah, "Modelling Stock Market Volatility: Evidence from India", Managing Global Transitions, Vol 13, Issue 1, pp. 27-42, 2015.
[4] Amadeus Wennström, "Volatility Forecasting Performance: Evaluation of GARCH type volatility models on Nordic equity indices", June 11, 2014.
[5] Yiannis Dendramis, Giles E. Spungin and Elias Tzavalis, "Forecasting VaR models under different volatility processes and distributions of return innovations", November 2012.
[6] Chaiwat Kosapattarapim, Yan-Xia Lin and Michael McCrae, "Evaluating the volatility forecasting performance of best fitting GARCH models in emerging Asian stock markets", Centre for Statistical & Survey Methodology, Faculty of Engineering and Information Sciences, 2011.
[7] Anders Wilhelmsson, "GARCH Forecasting Performance under Different Distribution Assumptions", Journal of Forecasting, Vol 25, pp. 561-578, 2006.
[8] Pengfei Sun and Chen Zhou, "Diagnosing the Distribution of GARCH Innovations", online journal, 07 May 2013.
[9] Fernando Perez-Cruz, Julio A. Afonso-Rodriguez and Javier Giner, "Estimating GARCH models using support vector machines", Quantitative Finance, Vol 3, pp. 1-10, Institute of Physics Publishing, 2003.
[10] Jean-Philippe Peters, "Estimating and forecasting volatility of stock indices using asymmetric GARCH models and (Skewed) Student-t densities", Université de Liège, Boulevard du Rectorat 7, B-4000 Liège, Belgium.
[11] Timo Teräsvirta, "An Introduction to Univariate GARCH Models", Economics and Finance, No. 646, December 7, 2006.
[12] Peter R. Hansen and Asger Lunde, "A Forecast Comparison of Volatility Models: Does Anything Beat a GARCH(1,1)?", Journal of Applied Econometrics, Vol 20, pp. 873-889, 2005.
[13] Robert Engle, "GARCH 101: The Use of ARCH/GARCH Models in Applied Econometrics", Journal of Economic Perspectives, Vol 15, No 4, pp. 157-168, 2001.
[14] Luc Bauwens, Sebastien Laurent and Jeroen V. K. Rombouts, "Multivariate GARCH Models: A Survey", Journal of Applied Econometrics, Vol 21, pp. 79-109, 2006.
[15] Aric LaBarr, "Volatility Estimation through ARCH/GARCH Modeling", Institute for Advanced Analytics, North Carolina State University, 2014.
[16] Ghulam Ali, "EGARCH, GJR-GARCH, TGARCH, AVGARCH, NGARCH, IGARCH and APARCH Models for Pathogens at Marine Recreational Sites", Journal of Statistical and Econometric Methods, Vol 2, No 3, pp. 57-73, 2013.
[17] Yan Gao, Chengjun Zhang and Liyan Zhang, "Comparison of GARCH Models based on Different Distributions", Journal of Computers, Vol 7, No 8, pp. 1967-1973, August 2012.
[18] Abu Hassan Shaari Mohd Nor, Ahmad Shamiri and Zaidi Isa, "Comparing the Accuracy of Density Forecasts from Competing GARCH Models", Sains Malaysiana, Vol 38, No 1, pp. 109-118, 2009.
[19] David Ardia and Lennart Hoogerheide, "GARCH Models for Daily Stock Returns: Impact of Estimation Frequency on Value-at-Risk and Expected Shortfall Forecasts", Tinbergen Institute Discussion Paper, Vol 3, p. 47, 2013.
[20] Jin-Chuan Duan, "The GARCH Option Pricing Model", Mathematical Finance, Vol 5, No 1, pp. 13-32, 1995.
[21] Da Huang, Hansheng Wang and Qiwei Yao, "Estimating GARCH models: when to use what?", Econometrics Journal, Vol 11, pp. 27-38, 2008.
[22] Ivo Jánský and Milan Rippel, "Value at Risk forecasting with the ARMA-GARCH family of models in times of increased volatility", Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, 2007.
[23] Christian Schittenkopf, Georg Dorffner and Engelbert J. Dockner, "Forecasting Time-dependent Conditional Densities: A Semi-nonparametric Neural Network Approach", Journal of Forecasting, Vol 8, pp. 244-263, 1996.