This document summarizes a study that empirically models the monthly Treasury bill rates in Ghana from 1998 to 2012. Specifically, it models the rates of the 91-day and 182-day Treasury bills using ARIMA models. For the 91-day bills, the ARIMA(3,1,1) model provided the best fit with a log-likelihood value of -328.58. For the 182-day bills, the ARIMA(1,1,0) model fit best with a log-likelihood value of -356.50. Residual tests on both models showed the residuals were free from heteroscedasticity and serial correlation. The study aims to determine appropriate time series models for predicting and forecasting future Treasury bill rates in Ghana.
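As a rough illustration of the model-selection step summarized above, here is a minimal Python sketch using statsmodels; the series is simulated and the candidate orders are the ones named in the study (this is not the study's own code or data).

```python
# Minimal sketch: fit candidate ARIMA models to a monthly T-bill rate
# series and compare log-likelihoods. `rates_91` is hypothetical data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
rates_91 = pd.Series(20 + np.cumsum(rng.normal(0, 0.5, 180)),
                     index=pd.date_range("1998-01", periods=180, freq="MS"))

for order in [(1, 1, 0), (1, 1, 1), (3, 1, 1)]:
    res = ARIMA(rates_91, order=order).fit()
    print(order, "log-likelihood:", round(res.llf, 2), "AIC:", round(res.aic, 2))
```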
This document discusses the evolution of research on the Efficient Market Hypothesis (EMH) in finance. It begins by outlining the three forms of market efficiency put forth in EMH. It then describes how Mandelbrot and others challenged EMH by finding long-term dependence and non-linear relationships in asset price movements, contrary to EMH assumptions of randomness. The document outlines Mandelbrot's rescaled range statistical technique and discusses how later researchers used non-linearity tests to further analyze non-random patterns in markets. It questions the validity of conventional linear tests used to support EMH and argues considering non-linearity is crucial to understanding market efficiency.
Improving Returns from the Markowitz Model using GA - An Empirical Validation o... (idescitation)
Portfolio optimization is the task of allocating an investor's capital among different assets in such a way that returns are maximized while, at the same time, risk is minimized. The traditional model for portfolio optimization is the Markowitz model [1], [2], [3]. In the ideal case of linear constraints, the Markowitz model can be solved using quadratic programming. In real-life scenarios, however, the presence of nonlinear constraints, such as limits on the number of assets in the portfolio, constraints on the budgetary allocation to each asset class, transaction costs, and limits on the maximum weight that can be assigned to each asset, makes the problem computationally hard to solve (NP-hard). Hence, soft-computing-based approaches seem best suited for solving such a problem. This study attempts to use a soft computing technique, specifically Genetic Algorithms (GA), to overcome this issue: a GA is used to optimize the parameters of the Markowitz model such that overall portfolio returns are maximized while the standard deviation of the returns is simultaneously minimized. The proposed system is validated by testing its ability to generate optimal stock portfolios with high returns and low standard deviations, with the assets drawn from stocks traded on the Bombay Stock Exchange (BSE). Results show that the proposed system generates much better portfolios than the traditional Markowitz model.
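To make the GA idea concrete, the following is a toy sketch, not the authors' implementation: weight vectors are evolved to maximize a return-per-risk fitness on a hypothetical return matrix, with blend crossover and Gaussian mutation as illustrative operator choices.

```python
# Toy genetic algorithm over long-only portfolio weights: maximize mean
# return per unit standard deviation. Population size, mutation scale and
# the return matrix are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(0.001, 0.02, size=(500, 8))   # hypothetical daily returns, 8 assets

def fitness(w):
    port = returns @ w
    return port.mean() / (port.std() + 1e-12)

def normalize(w):
    w = np.clip(w, 0, None)          # long-only constraint
    return w / w.sum()

pop = [normalize(rng.random(8)) for _ in range(50)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                              # elitist selection
    children = []
    for _ in range(40):
        a, b = rng.choice(10, 2, replace=False)
        child = (parents[a] + parents[b]) / 2       # crossover: blend weights
        child = normalize(child + rng.normal(0, 0.05, 8))  # Gaussian mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print("best weights:", np.round(best, 3), "fitness:", round(fitness(best), 4))
```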
VOLATILITY FORECASTING - A PERFORMANCE MEASURE OF GARCH TECHNIQUES WITH DIFFE... (ijscmcj)
Volatility forecasting is an interesting and challenging topic in current financial instruments, as volatility is directly associated with profits: there are many risks and rewards tied to it. Hence, forecasting volatility has become an indispensable topic in finance. GARCH innovation distributions play an important role in risk measurement and option pricing. The main motive of this paper is to measure the performance of GARCH techniques for forecasting volatility using different distribution models. Nine distribution variants are used to forecast the volatility of a stock entity: Std, Norm, SNorm, GED, SSTD, SGED, NIG, GHYP and JSU. Volatility is forecasted 10 days in advance and the values are compared with the actual values to find the best distribution model for the volatility forecast. From the results obtained, the GARCH model with the GED distribution outperformed all other models.
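A minimal sketch of this kind of comparison, with the caveat that Python's arch package supports only a subset of the distributions named above (Norm, Std, SSTD, GED); the return series is simulated.

```python
# Sketch: fit GARCH(1,1) under several innovation distributions and produce
# a 10-day-ahead variance forecast with the `arch` package.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(2)
returns = rng.standard_t(df=5, size=2000)          # hypothetical % returns

for dist in ["normal", "t", "skewt", "ged"]:
    model = arch_model(returns, vol="GARCH", p=1, q=1, dist=dist)
    res = model.fit(disp="off")
    fcast = res.forecast(horizon=10)
    print(dist, "LL:", round(res.loglikelihood, 2),
          "first 3 of 10-day var forecast:",
          np.round(fcast.variance.iloc[-1].values[:3], 3))
```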
This binary classification model uses logistic regression to predict customer credit risk and maximize profit. The dataset was merged and cleaned. The dependent variable was transformed into a binary variable indicating good or bad credit risk. Some variables had coded non-numeric values that were replaced. The model was developed using variable transformations and selections to create a profitable clustering of customers.
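A minimal sketch of such a profit-oriented logistic regression, with hypothetical features and assumed per-decision profit and cost figures.

```python
# Sketch of a binary credit-risk classifier with a profit-maximizing cutoff.
# Features, labels and the +100/-500 payoff figures are all hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(2000, 5))                       # hypothetical applicant features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)  # 1 = good risk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
p_good = clf.predict_proba(X_te)[:, 1]

# Choose the cutoff that maximizes expected profit rather than accuracy:
# assume +100 for accepting a good customer, -500 for accepting a bad one.
cutoffs = np.linspace(0.05, 0.95, 19)
profits = [(100 * ((p_good > c) & (y_te == 1)).sum()
            - 500 * ((p_good > c) & (y_te == 0)).sum()) for c in cutoffs]
best = cutoffs[int(np.argmax(profits))]
print("profit-maximizing cutoff:", round(best, 2), "profit:", max(profits))
```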
This document discusses methods for clustering time series data in a way that allows the cluster structure to change over time. It begins by introducing the problem and defining relevant terms. It then provides spectral clustering as a preliminary benchmark approach before exploring an alternative method using triangular potentials within a graphical model framework. The document presents the proposed method and provides illustrative examples and discussion of extensions.
The document discusses supervised versus unsupervised discretization methods for transforming variables in cluster analysis models. It finds that unsupervised, or SAS-defined, transformations generally result in more profitable models compared to supervised, or user-defined, transformations. However, the most profitable transformations can be complex and difficult to explain. There is a tradeoff between profitability and interpretability, known as the "cost of simplicity." The document analyzes different variable transformations applied to a credit risk prediction model to determine which balance of profit and explanation is most appropriate.
The document presents a model for estimating exposure at default (EAD) for contingent credit lines (CCLs) at the portfolio level. It models each CCL as a portfolio of put options, with the exercise of each put following a Poisson process. The model convolves the usage distributions of individual obligors, sub-segments, and segments to estimate the portfolio-level EAD distribution. The authors test the model using data from Moody's and find near-Gaussian results. They discuss future work to refine the model and make it more practical for banks to estimate regulatory capital requirements.
This document discusses portfolio optimization and different algorithms used to solve portfolio optimization problems. It begins by formulating the unconstrained and constrained portfolio optimization problems. For the unconstrained problem, it uses quadratic programming to generate the efficient frontier. For the constrained problem, it uses mixed integer quadratic programming and heuristic algorithms like genetic algorithm, tabu search and simulated annealing. It compares the results of these different algorithms and concludes some perform better than others in terms of accuracy and time complexity for portfolio optimization problems with constraints.
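As an illustration of the unconstrained case, here is a sketch that traces the efficient frontier by solving the quadratic program at several target returns with scipy; the heuristic solvers compared in the document (GA, tabu search, simulated annealing) are not reproduced, and the expected returns and covariance are simulated.

```python
# Sketch of the long-only mean-variance frontier via quadratic programming:
# minimize portfolio variance subject to a target expected return.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
mu = rng.uniform(0.02, 0.12, 6)                      # hypothetical expected returns
A = rng.normal(size=(6, 6))
cov = A @ A.T / 6 * 0.01                             # hypothetical covariance matrix

def frontier_point(target):
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1},      # fully invested
            {"type": "eq", "fun": lambda w: w @ mu - target}]  # hit target return
    res = minimize(lambda w: w @ cov @ w, np.ones(n) / n,
                   bounds=[(0, 1)] * n, constraints=cons, method="SLSQP")
    return np.sqrt(res.fun)                          # minimum risk at this return

for t in np.linspace(mu.min(), mu.max(), 5):
    print(f"target return {t:.3f} -> min risk {frontier_point(t):.4f}")
```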
Bid and Ask Prices Tailored to Traders' Risk Aversion and Gain Propension: a ... (Waqas Tariq)
Risky asset bid and ask prices “tailored” to the risk aversion and gain propension of the traders are set up. They are calculated through the principle of the Extended Gini premium, a standard method used in non-life insurance. Explicit formulae are derived for the most common stochastic distributions of risky returns. Sufficient and necessary conditions for successful trading are also discussed.
The document summarizes several improved algorithms that aim to address the drawbacks of the Apriori algorithm for association rule mining. It discusses six different approaches: 1) An intersection and record filter approach that counts candidate support only in transactions of sufficient length and uses set intersection; 2) An approach using set size and frequency to prune insignificant candidates; 3) An approach that reduces the candidate set and memory usage by only searching frequent itemsets once to delete candidates; 4) A partitioning approach that divides the database; 5) An approach using vertical data format to reduce database scans; and 6) A distributed approach to parallelize the algorithm across machines.
This document summarizes a research paper that proposes an improved genetic algorithm called IGACRA to solve the winner determination problem in combinatorial reverse auctions more efficiently than the authors' previously proposed GACRA algorithm. IGACRA uses only one repair function to repair infeasible chromosomes, whereas GACRA used two repair functions. The authors conducted experiments comparing IGACRA and GACRA, finding that IGACRA achieved winner determination with less processing time and lower procurement costs.
GARCH Models in Value-at-Risk Estimation for REIT (IJERDJOURNAL)
Abstract: In this study we investigate volatility forecasting for REIT, from January 03, 2007 to November 18, 2016, using four GARCH models (GARCH, EGARCH, GARCH-GJR and APARCH). We examine the performance of these GARCH-type models and conduct backtesting procedures to assess model adequacy. The empirical results show that, for estimating REIT volatility, the EGARCH, GARCH-GJR, and APARCH models are all adequate; among them, the GARCH-GJR model outperforms the others.
This document introduces stochastic processes that can help model risk factors in risk management. It discusses processes for modeling fat tails and mean reversion, including geometric Brownian motion, GARCH models, jump diffusion models, variance gamma processes, Vasicek models, CIR models, and combinations of mean reversion and fat tails. The document provides an overview of these processes and guidance on selecting an appropriate initial model based on analyzing a time series for stationarity, autoregressive properties, and fat tails. The goal is to select a simple practical model to begin quantitative risk analysis.
A Deterministic Inventory Model for Perishable Items with Price Sensitive Qua... (IJMERJOURNAL)
ABSTRACT: An inventory system is considered with trended demand, assumed to be a function of price with a time-dependent quadratic component. To minimize the total cost of the inventory system, the deterioration rate is assumed constant and the supplier offers the retailer a credit period to settle the account for the procured units. To solve the model it is further assumed that shortages are not allowed. Salvage value is also considered and its effect on the total cost is observed. A numerical example is given to test the strength of the model. Critical parameters are identified by studying the sensitivity of the system.
This document summarizes key concepts from the book "Active Portfolio Management" by Richard C. Grinold and Ronald N. Kahn.
It introduces the foundations of active portfolio management including risk, expected returns, benchmarks, value added, and the information ratio. The information ratio measures the expected level of annual residual return per unit of annual residual risk and defines the opportunities available to the active manager. Higher information ratios indicate greater potential for adding value through active management.
It also discusses concepts like consensus expected returns as defined by the CAPM model, decomposing returns into market, residual and exceptional components, and managing total risk versus focusing on active and residual risk relative to a benchmark. The goal of active management is to maximize value added: the residual return earned, net of a risk-aversion penalty on the residual risk taken.
This document provides a critical review of the 1996 paper "The Conditional CAPM and the Cross-Section of Expected Returns" by Jagannathan and Wang. The review summarizes the key findings of the original paper, which showed that conditional CAPM can explain the cross-sectional variation in stock returns better than static CAPM. However, the review also notes some limitations in the assumptions around time-varying betas and use of R-squared. Overall, it evaluates the original paper as influential but also discusses subsequent research that built on its findings or identified weaknesses.
This document discusses forecasting covariance matrices using the Dynamic Conditional Correlation (DCC) GARCH model. It begins with an overview of univariate GARCH models and the GARCH(1,1) specification. It then introduces the DCC model, which models the conditional covariance matrix indirectly through the conditional correlation matrix. The document evaluates how forecasts from the DCC model perform compared to a covariance matrix based only on historical data. It presents an empirical application comparing the two approaches using different datasets. The conclusion discusses how the DCC model tends to outperform the historical covariance matrix in the short-run but the reverse is true in the long-run.
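A toy illustration of the DCC(1,1) correlation recursion on standardized residuals; the parameters a and b below are assumed rather than estimated (a full DCC fit would estimate them by quasi-maximum likelihood).

```python
# DCC(1,1) recursion: Q_t = (1 - a - b) * Qbar + a * e_{t-1} e_{t-1}' + b * Q_{t-1},
# applied to hypothetical standardized residuals e_t.
import numpy as np

rng = np.random.default_rng(5)
e = rng.normal(size=(1000, 3))          # hypothetical standardized residuals
Qbar = np.cov(e.T)                      # unconditional correlation target
a, b = 0.05, 0.90                       # assumed DCC parameters (a + b < 1)

Q = Qbar.copy()
for t in range(1, len(e)):
    Q = (1 - a - b) * Qbar + a * np.outer(e[t - 1], e[t - 1]) + b * Q

d = np.sqrt(np.diag(Q))
R = Q / np.outer(d, d)                  # rescale Q_t into a correlation matrix
print("final conditional correlation matrix:\n", np.round(R, 3))
```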
This document provides an overview of statistical arbitrage (SA) strategies and their application to 130/30 products using synthetic equity index swaps. It begins by discussing the efficient market hypothesis and how SA circumvents some of its limitations. It then describes various SA trading strategies, including pairs trading, stochastic spread approaches, cointegration approaches, high frequency strategies, and behavioral strategies. The document applies some of these SA strategies to 130/30 products using synthetic index swaps and finds they can improve the risk-return profile of active equity management.
Fuzzy Inventory Model for Constantly Deteriorating Items with Power Demand an... (iosrjce)
IOSR Journal of Mathematics (IOSR-JM) is a double-blind peer-reviewed international journal that provides rapid publication (within a month) of articles in all areas of mathematics and its applications. The journal welcomes publication of high-quality papers on theoretical developments and practical applications in mathematics. Original research papers, state-of-the-art reviews, and high-quality technical notes are invited for publication.
The Use of ARCH and GARCH Models for Estimating and Forecasting Volatility-ru... (Ismet Kale)
This document discusses volatility modeling using ARCH and GARCH models. It first provides background on ARCH and GARCH models, noting they were developed to model characteristics of financial time series data like volatility clustering and fat tails. It then describes the specific ARCH and GARCH models that will be used in the study, including the ARCH, GARCH, EGARCH, GJR, APARCH, IGARCH, FIGARCH and FIAPARCH models. The document aims to apply these models to daily stock index data from the IMKB 100 to analyze and forecast volatility, and better understand risk in the Turkish market.
Statistical arbitrage strategies attempt to profit from short-term price discrepancies between similar securities. Common statistical arbitrage strategies used by hedge funds include pairs trading, which involves buying an underperforming stock in a pair and short selling the overperforming stock, and multi-factor models that select stocks based on correlations to identified market factors. Other strategies include mean reversion trading, which bets that stock prices will revert to their average value, and cointegration, which tracks indexes and uses optimized portfolios to generate returns from spreads between enhanced and basic indexes.
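A minimal sketch of the cointegration screen that underlies such pairs strategies, using the Engle-Granger test from statsmodels on simulated prices that share a common stochastic trend.

```python
# Cointegration screen for a candidate pair: Engle-Granger test on two
# hypothetical price series driven by the same random-walk trend.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(6)
trend = np.cumsum(rng.normal(size=1000))
p1 = 50 + trend + rng.normal(0, 0.5, 1000)
p2 = 30 + 0.8 * trend + rng.normal(0, 0.5, 1000)

t_stat, p_value, _ = coint(p1, p2)
print(f"Engle-Granger t-stat {t_stat:.2f}, p-value {p_value:.4f}")
# A small p-value suggests the pair is cointegrated, i.e. the spread is
# mean-reverting and the pair is a candidate for a pairs trade.
```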
IRJET - Financial Analysis using Data Mining (IRJET Journal)
The document discusses using three machine learning algorithms - K-nearest neighbors (KNN), rule-based classification, and deep learning - to predict if the NASDAQ stock market will increase in a given month. KNN achieved an accuracy of 71.28%, rule-based classification achieved 74.49% accuracy, and deep learning achieved the highest accuracy of 76.03%. Therefore, the document concludes deep learning is best suited for this stock market prediction task.
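A minimal sketch of the direction-prediction setup with KNN, using three lagged monthly returns as features; the data and the resulting accuracy are illustrative, not the paper's.

```python
# Classify whether next month's index return is positive from lagged
# monthly returns, using K-nearest neighbors on hypothetical data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
monthly_ret = rng.normal(0.005, 0.04, 400)            # hypothetical index returns
X = np.column_stack([monthly_ret[i:-(3 - i)] for i in range(3)])  # 3 lags
y = (monthly_ret[3:] > 0).astype(int)                 # 1 = market up next month

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
knn = KNeighborsClassifier(n_neighbors=15).fit(X_tr, y_tr)
print("KNN accuracy:", round(knn.score(X_te, y_te), 3))
```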
1) This paper proposes an adaptive quasi-maximum likelihood estimation approach for GARCH models when the distribution of volatility data is unspecified or heavy-tailed.
2) The approach works by using a scale parameter ηf to identify the discrepancy between the wrongly specified innovation density and the true innovation density.
3) Simulation studies and an application show that the adaptive approach gains better efficiency compared to other methods, especially when the innovation error is heavy-tailed.
This document discusses a case study that analyzed over 6,400 rules for trading the S&P 500 using data mining techniques. It describes how data mining bias can lead to overstating a rule's expected future performance. The case study used statistical inference tests like White's reality check and Masters' Monte-Carlo permutation method to minimize this bias. It details the various rule types analyzed, including trends, extremes/transitions, and divergence. Input data series included raw time series, indicators, and other preprocessed data. The goal was to identify rules with genuine predictive power and evaluate their statistical and practical significance.
This document summarizes a case study analyzing rules for mining data from the S&P 500 stock market index. It discusses potential biases in backtesting rules to select superior performers and statistical methods to minimize these biases. Specific topics covered include data mining biases, techniques to avoid data snooping bias by splitting samples, defining the case study statistically, transforming data series into market positions with rules, constructing technical analysis indicators from price and volume data, and categories of rules examined including trends, extremes/transitions, and divergence.
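A toy sketch of the Monte-Carlo permutation idea mentioned above: compare the backtested mean return of a rule's positions against the distribution obtained by randomly permuting those positions; both the data and the moving-average rule are hypothetical, not the case study's.

```python
# Permutation test for data-mining bias: does the rule's mean return beat
# what random position sequences with the same composition would earn?
import numpy as np

rng = np.random.default_rng(9)
market_ret = rng.normal(0.0003, 0.01, 2500)            # hypothetical daily returns
ma = np.convolve(market_ret, np.ones(50) / 50)[:len(market_ret)]  # trailing MA
positions = np.roll((ma > 0).astype(float), 1)         # trade on yesterday's signal
positions[0] = 0.0
rule_mean = (positions * market_ret).mean()            # backtested rule performance

perm_means = np.array([
    (rng.permutation(positions) * market_ret).mean() for _ in range(5000)
])
p_value = (perm_means >= rule_mean).mean()             # fraction of permutations
print(f"rule mean {rule_mean:.6f}, permutation p-value {p_value:.3f}")
```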
A predictive model for monthly currency in circulation in Ghana (Alexander Decker)
This document presents a predictive model for monthly currency in circulation in Ghana. The researchers used seasonal autoregressive integrated moving average (SARIMA) modeling to analyze secondary data on Ghana's monthly currency in circulation from 2000 to 2011. They found that a SARIMA(0,1,1)(0,1,1)[12] model provided the best fit for the data based on having the lowest AIC, AICc, and BIC values. Diagnostic tests confirmed the model adequately represented the data and was free of autocorrelation and heteroscedasticity. The researchers therefore proposed this SARIMA model for predicting currency in circulation in Ghana in the future.
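A minimal sketch of fitting that SARIMA specification with statsmodels' SARIMAX, applied to a simulated monthly series rather than the study's data.

```python
# Fit SARIMA(0,1,1)(0,1,1)[12] to a hypothetical monthly series and
# produce a 12-month-ahead forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(8)
idx = pd.date_range("2000-01", periods=144, freq="MS")
seasonal = np.tile(np.sin(np.arange(12) * 2 * np.pi / 12) * 5, 12)
cic = pd.Series(100 + np.cumsum(rng.normal(0.5, 1, 144)) + seasonal, index=idx)

model = SARIMAX(cic, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
res = model.fit(disp=False)
print("AIC:", round(res.aic, 2), "BIC:", round(res.bic, 2))
print(res.forecast(steps=12).round(2).head())
```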
A Fuzzy Arithmetic Approach for Perishable Items in Discounted Entropic Order... (Waqas Tariq)
This paper uses a fuzzy arithmetic approach to the system cost for perishable items with instant deterioration in the discounted entropic order quantity (EnOQ) model. The traditional crisp system cost overlooks that some costs may belong to uncertain factors, so it is necessary to extend the system cost to also treat vague costs. We introduce a new concept, which we call entropy, and show that the total payoff satisfies the optimization property. We show how special cases of this problem reduce to perfect results, and how the post-deterioration discounted entropic order quantity model generalizes the optimization. The model is demonstrated by analysis, which reveals important characteristics of the discounted structure. Further numerical experiments are conducted to evaluate the relative performance of the fuzzy and crisp cases in the EnOQ and EOQ models separately.
This document proposes a stock market forecasting system that uses both a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) model and a decision tree algorithm. The GARCH model is used to predict stock prices and their volatility over time. A decision tree algorithm is then applied to optimize the GARCH model by reducing errors and false predictions. The decision tree assigns weights to parameters like earnings per share, sales revenue, and trading volume to classify the quality of the input data. This combined GARCH and decision tree approach aims to more accurately forecast stock market movements and prices.
Measuring the volatility in Ghana's gross domestic product (GDP) rate using t... (Alexander Decker)
This document summarizes a study that analyzed volatility in Ghana's GDP growth rate using GARCH models. The study found that GDP volatility exhibited characteristics like clustering and leverage effects. A GARCH(1,1) model provided a reasonably good fit to quarterly GDP data. Volatility and leverage effects were found to have significantly increased. The best fitting models for GDP volatility were ARIMA(1,1,1)(0,0,1)12 and ARIMA(1,1,2)(0,0,1)12 models.
Statistical Arbitrage
Pairs Trading, Long-Short Strategy
Cyrille BEN LEMRID

Contents
1 Pairs Trading Model
1.1 General discussion
1.2 Cointegration
1.3 Spread dynamics
2 State of the art and model overview
2.1 Stochastic Dependencies in Financial Time Series
2.2 Cointegration-based trading strategies
2.3 Formulation as a Stochastic Control Problem
2.4 Fundamental analysis
3 Strategies Analysis
3.1 Roadmap for strategy design
3.2 Identification of potential pairs
3.3 Testing cointegration
3.4 Risk control and feasibility
4 Results

Introduction
This report presents my research work carried out at Credit Suisse from May to September 2012. This study has been pursued in collaboration with the Global Arbitrage Strategies team.
Quantitative analysis strategy developers use sophisticated statistical and optimization techniques to discover and construct new algorithms. These algorithms take advantage of short-term deviations of securities' prices from their "fair" values. Pairs trading is one such quantitative strategy: it is the process of identifying securities that generally move together but are currently "drifting away" from each other.
Pairs trading is a common strategy among many hedge funds and banks. However, there is not a significant amount of academic literature devoted to it, due to its proprietary nature. For a review of some of the existing academic models, see [6], [8], [11].
Our focus in this analysis is the study of two quantitative approaches to pairs trading: the first uses the properties of cointegrated financial time series as the basis for a trading strategy; in the second, we model the log-relationship between a pair of stock prices as an Ornstein-Uhlenbeck process and use this to formulate a stochastic control problem for portfolio optimization.
This study was performed to show that under certain assumptions the two approaches are equivalent.
Practitioners most often use a fundamentally driven approach, analyzing the performance of stocks around a market event and implementing strategies using back-tested trading levels.
We also study an example of a fundamentally driven strategy, using the market reaction to a stock being dropped from or added to the MSCI World Standard index as a signal for a pairs trading strategy on those stocks once their inclusion or exclusion has become effective.
This report is organized as follows. Section 1 provides some background on the pairs trading strategy. The theoretical results are described in Section 2. Section 3 analyzes the strategies, and Section 4 presents the results.
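As a sketch of the second approach's core step, the following fits an Ornstein-Uhlenbeck process to a simulated spread via its AR(1) discretization and derives a z-score trading signal; the thresholds and data are illustrative assumptions, not the report's calibration.

```python
# Fit an Ornstein-Uhlenbeck process to a spread via the AR(1) regression
#   X_{t+1} = a + b * X_t + noise,
# where theta = (1 - b) / dt, mu = a / (1 - b), then trade on the z-score.
import numpy as np

rng = np.random.default_rng(10)
x = np.zeros(1000)                               # simulated mean-reverting spread
for t in range(999):
    x[t + 1] = x[t] + 2.0 * (0.0 - x[t]) / 252 + 0.02 * rng.normal()

b, a = np.polyfit(x[:-1], x[1:], 1)              # AR(1) slope and intercept
dt = 1 / 252
theta = (1 - b) / dt                             # mean-reversion speed
mu = a / (1 - b)                                 # long-run mean
sigma = (x[1:] - (a + b * x[:-1])).std() / np.sqrt(dt)

z = (x - mu) / x.std()
signal = np.where(z > 1.5, -1, np.where(z < -1.5, 1, 0))  # short rich, long cheap
print(f"theta={theta:.2f}, mu={mu:.4f}, sigma={sigma:.4f}, "
      f"position changes={int(np.abs(np.diff(signal)).sum())}")
```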
Prediction & analysis of volatility patterns v1.0 (Anirban Dey)
The document summarizes a capstone project analyzing volatility patterns of stock prices. It discusses:
1) The team conducting the project and their industry collaboration with Agrud, a FinTech company.
2) The dataset used, which contains OHLC data from 2012-2017 for Apple, Amazon, Google, and American Airlines.
3) Tools and techniques used in the analysis, including ARIMA, GARCH models, and Excel, R, and SAS software.
4) Key findings that GARCH more accurately predicted volatility over the next month compared to ARIMA.
Time series data are observations collected over time on one or more variables. Time series data can be used to analyze problems involving changes over time, such as stock prices, GDP, and exchange rates. Time series data must be stationary, meaning that its statistical properties like mean and variance do not change over time, to avoid spurious regressions. Non-stationary time series can be transformed to become stationary through differencing, removing trends, or taking logs. Common time series models like ARIMA rely on stationary data.
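A minimal sketch of the stationarity check and differencing step described above, using the augmented Dickey-Fuller test from statsmodels on a simulated random-walk-like series.

```python
# ADF test on price levels vs. log-differences: levels of a random walk
# should look non-stationary, differences stationary.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(11)
prices = np.exp(np.cumsum(rng.normal(0.0005, 0.01, 1000)))  # hypothetical prices

for name, series in [("levels", prices),
                     ("log-differences", np.diff(np.log(prices)))]:
    stat, pvalue = adfuller(series)[:2]
    verdict = "stationary" if pvalue < 0.05 else "non-stationary"
    print(f"{name}: ADF p-value {pvalue:.4f} -> {verdict}")
```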
PERFORMANCE ANALYSIS OF HYBRID FORECASTING MODEL IN STOCK MARKET FORECASTING (IJMIT JOURNAL)
This document describes a study that analyzed the performance of a hybrid forecasting model for stock markets. The hybrid model uses measures of concordance like Kendall's Tau to identify patterns in past stock market data that resemble present patterns. Genetic programming is then used to match past trends to present trends and estimate future trends. The model was tested on S&P 500 and NASDAQ index data and found to more accurately forecast prices and outperform an ARIMA model based on lower error metrics like MAPE and RMSE. The hybrid model also achieved better results than another previously proposed model when applied to Apple, IBM, and Dell stock data.
An Enhanced PSO-Based Approach for Solving Economical Ordered Quantity (EOQ) P... (IJMER)
Meta-heuristic approaches can provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information, and at lower computational complexity for numerical solutions. This paper presents an enhanced PSO (Particle Swarm Optimization) technique for solving the EOQ problem. Though PSO performs well, it may require more iterations to converge, or many repetitions for complex problems. To overcome this, an enhanced PSO is presented that uses double chaotic maps to perform irregular velocity updates, forcing the particles to search a larger space for the best global solution. Finally, the two algorithms are compared on the EOQ problem considering deteriorating items, shortages and partial backlogging. The simulation results show that the proposed enhanced PSO converges more quickly and finds a solution much closer to the optimum than standard PSO.
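A toy sketch in the spirit of the enhanced PSO described above, with a logistic chaotic map perturbing the velocity coefficients; the paper's exact update rule and EOQ cost function are not reproduced, so a sphere function stands in as the objective.

```python
# PSO with logistic-map chaotic coefficients in the velocity update.
import numpy as np

rng = np.random.default_rng(12)
def objective(x):                      # stand-in cost function (sphere)
    return np.sum(x ** 2)

n, dim = 30, 5
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = np.apply_along_axis(objective, 1, pos)
gbest = pbest[pbest_val.argmin()].copy()
chaos = rng.uniform(0.1, 0.9, (n, dim))          # logistic-map state

for it in range(200):
    chaos = 4.0 * chaos * (1.0 - chaos)          # logistic map: x <- 4x(1-x)
    vel = (0.7 * vel
           + 1.5 * chaos * (pbest - pos)         # chaotic coefficients replace
           + 1.5 * (1 - chaos) * (gbest - pos))  # the usual uniform draws
    pos = pos + vel
    vals = np.apply_along_axis(objective, 1, pos)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best cost:", round(float(pbest_val.min()), 6))
```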
Single period inventory model with stochastic demand and partial backlogging (IAEME Publication)
This document summarizes an article from the International Journal of Management that discusses inventory models with stochastic demand and partial backlogging. It presents three key points:
1) It develops an approximate closed-form solution for the expected total cost of a single period inventory model where demand is stochastic and follows the SCBZ property.
2) It examines the model through three cases: the first considers holding cost without salvage value; the second varies only the holding cost satisfying the SCBZ property; the third incorporates partial backlogging with time-dependent and time-independent shortage costs.
3) Optimal solutions are derived for each case and numerical examples are provided to validate the models.
This document discusses stochastic volatility analysis and time series models for financial data. It introduces GARCH (Generalized Autoregressive Conditional Heteroskedasticity) models, which allow for time-varying volatility in time series analysis. The document outlines properties of financial time series data like volatility clustering and fat tails. It then discusses earlier linear models like ARMA as well as nonlinear ARCH models before focusing on the GARCH family of models for capturing characteristics of financial data like asymmetric volatility responses. Parameter estimation and forecasting with GARCH models is also summarized.
The International Journal of Soft Computing, Mathematics and Control (IJSCMC) is a quarterly peer-reviewed and refereed open-access journal that publishes articles contributing new results in all areas of soft computing, pure, applied and numerical mathematics, and control. The journal focuses on theoretical and numerical methods in soft computing, mathematics and control theory, with applications in science and industry. Its goal is to bring together researchers and practitioners from academia and industry to focus on the latest topics in soft computing, pure, applied and numerical mathematics and control engineering, and to establish new collaborations in these areas.
Authors are solicited to contribute to this journal by submitting articles that illustrate new algorithms, theorems, modeling results, research results, projects, surveying works and industrial experiences that describe significant advances in Soft Computing, Mathematics and Control Engineering
This document discusses quality assurance project management. It provides resources on quality assurance project management forms, tools, and strategies. It also lists quality management KPIs, job descriptions, and interview questions. The document discusses reasons for project failures such as unclear requirements and lack of issue escalation. It then describes quality management tools including check sheets, control charts, Pareto charts, scatter plots, Ishikawa diagrams, and histograms. Finally, it lists additional quality assurance topics such as quality management systems and standards.
Histogram, Pareto Diagram, Ishikawa Diagram, and Control Chart (Nicola Ergo)
The document provides information on various quality control tools including histograms, Pareto diagrams, Ishikawa diagrams, and control charts. Histograms show the distribution of numerical data by frequency. Pareto diagrams highlight the most important factors by showing variables in descending order. Ishikawa diagrams show causes of a problem in a branching diagram format. Control charts graph process data over time to determine if a process is stable or unpredictable through the use of control limits.
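A minimal sketch of the control-chart logic, flagging points outside mean ± 3σ control limits on simulated process measurements.

```python
# Individuals control chart: flag measurements beyond mean +/- 3 sigma.
import numpy as np

rng = np.random.default_rng(13)
measurements = rng.normal(10.0, 0.2, 100)   # hypothetical process data
measurements[60] = 11.2                     # injected out-of-control point

center = measurements.mean()
sigma = measurements.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out = np.where((measurements > ucl) | (measurements < lcl))[0]
print(f"CL={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}, out-of-control at {out}")
```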
The document provides an overview of time series analysis. It discusses key concepts like components of a time series, stationarity, autocorrelation functions, and various forecasting models including AR, MA, ARMA, and ARIMA. It also covers exponential smoothing and how to decompose, validate, and test the accuracy of forecasting models. Examples are given of different time series patterns and how to make non-stationary data stationary.
This document provides an overview of time series analysis and the Box-Jenkins methodology. Time series analysis attempts to model observations over time and identify patterns. The goals are to identify the structure of the time series and forecast future values. The Box-Jenkins methodology involves conditioning the data, selecting a model, estimating parameters, and assessing the model. Autocorrelation (ACF) and partial autocorrelation (PACF) plots are used to identify autoregressive (AR), moving average (MA), and autoregressive integrated moving average (ARIMA) models.
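A minimal sketch of the Box-Jenkins identification step: generate an AR(2) series and inspect its ACF and PACF; a PACF that cuts off after lag 2 points to an AR(2) model.

```python
# Identify model order from ACF/PACF of a simulated AR(2) process.
import numpy as np
from statsmodels.tsa.stattools import acf, pacf
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1, -0.5, -0.25])      # AR(2): (1 - 0.5L - 0.25L^2) x_t = e_t
series = ArmaProcess(ar, np.array([1])).generate_sample(nsample=1000)

print("ACF :", np.round(acf(series, nlags=5), 2))   # tails off gradually
print("PACF:", np.round(pacf(series, nlags=5), 2))  # cuts off after lag 2
```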
Similar to Modelling the Rate of Treasury Bills in Ghana
Abnormalities of hormones and inflammatory cytokines in women affected with p... (Alexander Decker)
Women with polycystic ovary syndrome (PCOS) have elevated levels of hormones like luteinizing hormone and testosterone, as well as higher levels of insulin and insulin resistance compared to healthy women. They also have increased levels of inflammatory markers like C-reactive protein, interleukin-6, and leptin. This study found these abnormalities in the hormones and inflammatory cytokines of women with PCOS ages 23-40, indicating that hormone imbalances associated with insulin resistance and elevated inflammatory markers may worsen infertility in women with PCOS.
A usability evaluation framework for B2C e-commerce websites (Alexander Decker)
This document presents a framework for evaluating the usability of B2C e-commerce websites. It involves user testing methods like usability testing and interviews to identify usability problems in areas like navigation, design, purchasing processes, and customer service. The framework specifies goals for the evaluation, determines which website aspects to evaluate, and identifies target users. It then describes collecting data through user testing and analyzing the results to identify usability problems and suggest improvements.
A universal model for managing the marketing executives in Nigerian banks (Alexander Decker)
This document discusses a study that aimed to synthesize motivation theories into a universal model for managing marketing executives in Nigerian banks. The study was guided by Maslow and McGregor's theories. A sample of 303 marketing executives was used. The results showed that managers will be most effective at motivating marketing executives if they consider individual needs and create challenging but attainable goals. The emerged model suggests managers should provide job satisfaction by tailoring assignments to abilities and monitoring performance with feedback. This addresses confusion faced by Nigerian bank managers in determining effective motivation strategies.
A unique common fixed point theorems in generalized dAlexander Decker
This document presents definitions and properties related to generalized D*-metric spaces and establishes some common fixed point theorems for contractive type mappings in these spaces. It begins by introducing D*-metric spaces and generalized D*-metric spaces, defines concepts like convergence and Cauchy sequences. It presents lemmas showing the uniqueness of limits in these spaces and the equivalence of different definitions of convergence. The goal of the paper is then stated as obtaining a unique common fixed point theorem for generalized D*-metric spaces.
A trends of salmonella and antibiotic resistanceAlexander Decker
This document provides a review of trends in Salmonella and antibiotic resistance. It begins with an introduction to Salmonella as a facultative anaerobe that causes nontyphoidal salmonellosis. The emergence of antimicrobial-resistant Salmonella is then discussed. The document proceeds to cover the historical perspective and classification of Salmonella, definitions of antimicrobials and antibiotic resistance, and mechanisms of antibiotic resistance in Salmonella including modification or destruction of antimicrobial agents, efflux pumps, modification of antibiotic targets, and decreased membrane permeability. Specific resistance mechanisms are discussed for several classes of antimicrobials.
A transformational generative approach towards understanding al-istifhamAlexander Decker
This document discusses a transformational-generative approach to understanding Al-Istifham, which refers to interrogative sentences in Arabic. It begins with an introduction to the origin and development of Arabic grammar. The paper then explains the theoretical framework of transformational-generative grammar that is used. Basic linguistic concepts and terms related to Arabic grammar are defined. The document analyzes how interrogative sentences in Arabic can be derived and transformed via tools from transformational-generative grammar, categorizing Al-Istifham into linguistic and literary questions.
A time series analysis of the determinants of savings in namibiaAlexander Decker
This document summarizes a study on the determinants of savings in Namibia from 1991 to 2012. It reviews previous literature on savings determinants in developing countries. The study uses time series analysis including unit root tests, cointegration, and error correction models to analyze the relationship between savings and variables like income, inflation, population growth, deposit rates, and financial deepening in Namibia. The results found inflation and income have a positive impact on savings, while population growth negatively impacts savings. Deposit rates and financial deepening were found to have no significant impact. The study reinforces previous work and emphasizes the importance of improving income levels to achieve higher savings rates in Namibia.
A therapy for physical and mental fitness of school childrenAlexander Decker
This document summarizes a study on the importance of exercise in maintaining physical and mental fitness for school children. It discusses how physical and mental fitness are developed through participation in regular physical exercises and cannot be achieved solely through classroom learning. The document outlines different types and components of fitness and argues that developing fitness should be a key objective of education systems. It recommends that schools ensure pupils engage in graded physical activities and exercises to support their overall development.
A theory of efficiency for managing the marketing executives in nigerian banksAlexander Decker
This document summarizes a study examining efficiency in managing marketing executives in Nigerian banks. The study was examined through the lenses of Kaizen theory (continuous improvement) and efficiency theory. A survey of 303 marketing executives from Nigerian banks found that management plays a key role in identifying and implementing efficiency improvements. The document recommends adopting a "3H grand strategy" to improve the heads, hearts, and hands of management and marketing executives by enhancing their knowledge, attitudes, and tools.
This document discusses evaluating the link budget for effective 900MHz GSM communication. It describes the basic parameters needed for a high-level link budget calculation, including transmitter power, antenna gains, path loss, and propagation models. Common propagation models for 900MHz that are described include Okumura model for urban areas and Hata model for urban, suburban, and open areas. Rain attenuation is also incorporated using the updated ITU model to improve communication during rainfall.
A synthetic review of contraceptive supplies in punjabAlexander Decker
This document discusses contraceptive use in Punjab, Pakistan. It begins by providing background on the benefits of family planning and contraceptive use for maternal and child health. It then analyzes contraceptive commodity data from Punjab, finding that use is still low despite efforts to improve access. The document concludes by emphasizing the need for strategies to bridge gaps and meet the unmet need for effective and affordable contraceptive methods and supplies in Punjab in order to improve health outcomes.
A synthesis of taylor’s and fayol’s management approaches for managing market...Alexander Decker
1) The document discusses synthesizing Taylor's scientific management approach and Fayol's process management approach to identify an effective way to manage marketing executives in Nigerian banks.
2) It reviews Taylor's emphasis on efficiency and breaking tasks into small parts, and Fayol's focus on developing general management principles.
3) The study administered a survey to 303 marketing executives in Nigerian banks to test if combining elements of Taylor and Fayol's approaches would help manage their performance through clear roles, accountability, and motivation. Statistical analysis supported combining the two approaches.
A survey paper on sequence pattern mining with incrementalAlexander Decker
This document summarizes four algorithms for sequential pattern mining: GSP, ISM, FreeSpan, and PrefixSpan. GSP is an Apriori-based algorithm that incorporates time constraints. ISM extends SPADE to incrementally update patterns after database changes. FreeSpan uses frequent items to recursively project databases and grow subsequences. PrefixSpan also uses projection but claims to not require candidate generation. It recursively projects databases based on short prefix patterns. The document concludes by stating the goal was to find an efficient scheme for extracting sequential patterns from transactional datasets.
A survey on live virtual machine migrations and its techniquesAlexander Decker
This document summarizes several techniques for live virtual machine migration in cloud computing. It discusses works that have proposed affinity-aware migration models to improve resource utilization, energy efficient migration approaches using storage migration and live VM migration, and a dynamic consolidation technique using migration control to avoid unnecessary migrations. The document also summarizes works that have designed methods to minimize migration downtime and network traffic, proposed a resource reservation framework for efficient migration of multiple VMs, and addressed real-time issues in live migration. Finally, it provides a table summarizing the techniques, tools used, and potential future work or gaps identified for each discussed work.
A survey on data mining and analysis in hadoop and mongo dbAlexander Decker
This document discusses data mining of big data using Hadoop and MongoDB. It provides an overview of Hadoop and MongoDB and their uses in big data analysis. Specifically, it proposes using Hadoop for distributed processing and MongoDB for data storage and input. The document reviews several related works that discuss big data analysis using these tools, as well as their capabilities for scalable data storage and mining. It aims to improve computational time and fault tolerance for big data analysis by mining data stored in Hadoop using MongoDB and MapReduce.
1. The document discusses several challenges for integrating media with cloud computing including media content convergence, scalability and expandability, finding appropriate applications, and reliability.
2. Media content convergence challenges include dealing with the heterogeneity of media types, services, networks, devices, and quality of service requirements as well as integrating technologies used by media providers and consumers.
3. Scalability and expandability challenges involve adapting to the increasing volume of media content and being able to support new media formats and outlets over time.
This document surveys trust architectures that leverage provenance in wireless sensor networks. It begins with background on provenance, which refers to the documented history or derivation of data. Provenance can be used to assess trust by providing metadata about how data was processed. The document then discusses challenges for using provenance to establish trust in wireless sensor networks, which have constraints on energy and computation. Finally, it provides background on trust, which is the subjective probability that a node will behave dependably. Trust architectures need to be lightweight to account for the constraints of wireless sensor networks.
This document discusses private equity investments in Kenya. It provides background on private equity and discusses trends in various regions. The objectives of the study discussed are to establish the extent of private equity adoption in Kenya, identify common forms of private equity utilized, and determine typical exit strategies. Private equity can involve venture capital, leveraged buyouts, or mezzanine financing. Exits allow recycling of capital into new opportunities. The document provides context on private equity globally and in developing markets like Africa to frame the goals of the study.
This document discusses a study that analyzes the financial health of the Indian logistics industry from 2005-2012 using Altman's Z-score model. The study finds that the average Z-score for selected logistics firms was in the healthy to very healthy range during the study period. The average Z-score increased from 2006 to 2010 when the Indian economy was hit by the global recession, indicating the overall performance of the Indian logistics industry was good. The document reviews previous literature on measuring financial performance and distress using ratios and Z-scores, and outlines the objectives and methodology used in the current study.
Fabular Frames and the Four Ratio ProblemMajid Iqbal
Digital, interactive art showing the struggle of a society in providing for its present population while also saving planetary resources for future generations. Spread across several frames, the art is actually the rendering of real and speculative data. The stereographic projections change shape in response to prompts and provocations. Visitors interact with the model through speculative statements about how to increase savings across communities, regions, ecosystems and environments. Their fabulations combined with random noise, i.e. factors beyond control, have a dramatic effect on the societal transition. Things get better. Things get worse. The aim is to give visitors a new grasp and feel of the ongoing struggles in democracies around the world.
Stunning art in the small multiples format brings out the spatiotemporal nature of societal transitions, against backdrop issues such as energy, housing, waste, farmland and forest. In each frame we see hopeful and frightful interplays between spending and saving. Problems emerge when one of the two parts of the existential anaglyph rapidly shrinks like Arctic ice, as factors cross thresholds. Ecological wealth and intergenerational equity areFour at stake. Not enough spending could mean economic stress, social unrest and political conflict. Not enough saving and there will be climate breakdown and ‘bankruptcy’. So where does speculative design start and the gambling and betting end? Behind each fabular frame is a four ratio problem. Each ratio reflects the level of sacrifice and self-restraint a society is willing to accept, against promises of prosperity and freedom. Some values seem to stabilise a frame while others cause collapse. Get the ratios right and we can have it all. Get them wrong and things get more desperate.
An accounting information system (AIS) refers to tools and systems designed for the collection and display of accounting information so accountants and executives can make informed decisions.
Optimizing Net Interest Margin (NIM) in the Financial Sector (With Examples).pdfshruti1menon2
NIM is calculated as the difference between interest income earned and interest expenses paid, divided by interest-earning assets.
Importance: NIM serves as a critical measure of a financial institution's profitability and operational efficiency. It reflects how effectively the institution is utilizing its interest-earning assets to generate income while managing interest costs.
Every business, big or small, deals with outgoing payments. Whether it’s to suppliers for inventory, to employees for salaries, or to vendors for services rendered, keeping track of these expenses is crucial. This is where payment vouchers come in – the unsung heroes of the accounting world.
[4:55 p.m.] Bryan Oates
OJPs are becoming a critical resource for policy-makers and researchers who study the labour market. LMIC continues to work with Vicinity Jobs’ data on OJPs, which can be explored in our Canadian Job Trends Dashboard. Valuable insights have been gained through our analysis of OJP data, including LMIC research lead
Suzanne Spiteri’s recent report on improving the quality and accessibility of job postings to reduce employment barriers for neurodivergent people.
Decoding job postings: Improving accessibility for neurodivergent job seekers
Improving the quality and accessibility of job postings is one way to reduce employment barriers for neurodivergent people.
The Impact of Generative AI and 4th Industrial RevolutionPaolo Maresca
This infographic explores the transformative power of Generative AI, a key driver of the 4th Industrial Revolution. Discover how Generative AI is revolutionizing industries, accelerating innovation, and shaping the future of work.
OJP data from firms like Vicinity Jobs have emerged as a complement to traditional sources of labour demand data, such as the Job Vacancy and Wages Survey (JVWS). Ibrahim Abuallail, PhD Candidate, University of Ottawa, presented research relating to bias in OJPs and a proposed approach to effectively adjust OJP data to complement existing official data (such as from the JVWS) and improve the measurement of labour demand.
STREETONOMICS: Exploring the Uncharted Territories of Informal Markets throug...sameer shah
Delve into the world of STREETONOMICS, where a team of 7 enthusiasts embarks on a journey to understand unorganized markets. By engaging with a coffee street vendor and crafting questionnaires, this project uncovers valuable insights into consumer behavior and market dynamics in informal settings."
Discover the Future of Dogecoin with Our Comprehensive Guidance36 Crypto
Learn in-depth about Dogecoin's trajectory and stay informed with 36crypto's essential and up-to-date information about the crypto space.
Our presentation delves into Dogecoin's potential future, exploring whether it's destined to skyrocket to the moon or face a downward spiral. In addition, it highlights invaluable insights. Don't miss out on this opportunity to enhance your crypto understanding!
https://36crypto.com/the-future-of-dogecoin-how-high-can-this-cryptocurrency-reach/
Abhay Bhutada, the Managing Director of Poonawalla Fincorp Limited, is an accomplished leader with over 15 years of experience in commercial and retail lending. A Qualified Chartered Accountant, he has been pivotal in leveraging technology to enhance financial services. Starting his career at Bank of India, he later founded TAB Capital Limited and co-founded Poonawalla Finance Private Limited, emphasizing digital lending. Under his leadership, Poonawalla Fincorp achieved a 'AAA' credit rating, integrating acquisitions and emphasizing corporate governance. Actively involved in industry forums and CSR initiatives, Abhay has been recognized with awards like "Young Entrepreneur of India 2017" and "40 under 40 Most Influential Leader for 2020-21." Personally, he values mindfulness, enjoys gardening, yoga, and sees every day as an opportunity for growth and improvement.
Mathematical Theory and Modeling
ISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)
Vol.3, No.11, 2013
www.iiste.org
Modelling the Rate of Treasury Bills in Ghana
Ida Anuwoje Logubayom*, Suleman Nasiru, Albert Luguterah
Department of Statistics, University for Development Studies, P. O. Box 24 Navrongo, Ghana
* Corresponding Author's Email: idalogubayom@yahoo.com
Abstract
The Treasury bill is a preeminent default-risk-free asset in Ghana's money market whose rate can affect the purchasing power of other assets in the security market. The Bank of Ghana sells its Bills to mop up excess liquidity and buys Bank of Ghana Bills to inject liquidity into the system. This paper empirically models the monthly rates of two short-term Treasury bills (91-day and 182-day) from 1998 to 2012, using data from the BoG and ARIMA models. From the results, the ARIMA(3, 1, 1) model was found appropriate for modelling the 91-day Treasury bill rate, with a log-likelihood of -328.58 and the least AIC (667.17), AICc (667.52) and BIC (683.05) values. The ARIMA(1, 1, 0) model best fits the 182-day Treasury bill rate, with a log-likelihood of -356.50, AIC of 717.00, AICc of 717.06 and the least BIC of 723.35. ARCH-LM and Ljung-Box tests on the residuals of the models revealed that the residuals are free from heteroscedasticity and serial correlation respectively.
Keywords: Treasury bills, Ghana, Asset, Empirical, Short term.
1. Introduction
The acceptance of financial risk is inherent to the business of banking and insurance in their roles as financial intermediaries. To meet the demands of customers and communities and to execute business strategies, financial institutions make loans, purchase securities and take deposits with different maturities and interest rates. A treasury bill, one of the most common securities purchased in many societies, is a default-risk-free short-term bond that matures within one year or less of its issuance. Treasury bills are sold with maturities of four weeks (1 month), 13 weeks (3 months, the 91-day bill), 26 weeks (6 months, the 182-day bill) and 52 weeks (12 months, the 360-day bill), more commonly referred to as the one-, three-, six- and 12-month T-bills respectively. Like a zero-coupon bond, Treasury bills are sold at a discount to par, "par" being the value at which all T-bills mature. Treasury bills are issued to finance government deficits, and the Bank of Ghana sells Bank of Ghana Bills to mop up excess liquidity and buys them back to inject liquidity into the system (Brilliant, 2011). Treasury bills have thus gained high appeal among the population as securities with high returns and virtually no default risk.
Jacoby et al. (2000) provided theoretical arguments to show how treasury bills impact stock market prices. Jones (2001) showed that stock prices predicted expected returns in the time series. As more data have become available, recent work has shifted focus to studying the time-series properties of risk in equity markets as well as in Treasury bills. Huang et al. (2001) related risk to return volatility, while Brandt (2002) studied the relationship between liquidity, order flow and the yield curve.
Aboagye et al. (2008), in studying the performance of stocks in Ghana using an investment of the same amount in treasury bills and shares over the period 1991 to 2001, found that investors in exchange-traded shares earned on average 54% per annum, whereas treasury bill investors earned 36.3%. The changing rate relationship across the spectrum of maturities is analysed by some researchers through yield curve risk: a yield curve is the graph of required interest rates against time to maturity.
Ghanaians are so impressed with the observable high rates of return on treasury bills that many believe treasury bills offer the chance to earn higher returns than can be earned on other financial securities. Owing to this prominence of treasury bills in the financial market, much research on the Ghanaian economy has focused on comparing the performance of treasury bills with other stock investments, but none in this country has concentrated on modelling Treasury bills themselves. This study therefore focuses on modelling the 91-day and 182-day Treasury bills, to determine appropriate time series models for predicting these Treasury bill rates and forecasting future outcomes. The modelling of Treasury bills is useful to investors in their choice of Treasury bill to invest in. It will also be useful to the financial markets and the security agencies in controlling the different degrees of liquidity in their securities.
2. Materials and Methods of Analysis
We obtained monthly data on the 91-day and 182-day Treasury bill rates from the Bank of Ghana (BoG) from December 1998 to October 2012. The rates were modelled using Autoregressive Integrated Moving Average (ARIMA) models. The ARIMA(p, d, q) model is a modified form of the Autoregressive Moving Average model, ARMA(p, q), for the case where the time series variable is non-stationary. An ARMA(p, q) model is the combination of an autoregressive process and a moving average process in a compact form that reduces the number of parameters. For an ARMA(p, q) model, p is the order of the autoregressive process and q is the order of the moving average process. An ARMA model is used only if the time series variable is weakly stationary. If the time series variable is non-stationary (that is, has a unit root), the ARMA(p, q) model is extended to an ARIMA(p, d, q) model, where d is the order of integration of the series (the number of times the series is differenced to make it stationary).
The general form of the ARMA(p, q) model is:

$$X_t = c + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \dots - \theta_q \varepsilon_{t-q}$$

while an ARIMA(p, d, q) model is represented with the backward shift operator $B$ as:

$$\left(1 - \phi_1 B - \phi_2 B^2 - \dots - \phi_p B^p\right)(1 - B)^d X_t = \left(1 - \theta_1 B - \theta_2 B^2 - \dots - \theta_q B^q\right)\varepsilon_t$$

where $(1 - B)^d = \Delta^d$ is the non-seasonal difference filter, $\varepsilon_t$ is a white noise series, and $p$ and $q$ are positive integers. $p$ is determined from the PACF plot while $q$ is determined from the ACF plot.
The modelling of an ARIMA(p, d, q) model as outlined by Box-Jenkins consists of model identification, parameter estimation and diagnostics of the selected model.
Model Identification: The order of the autoregressive component (p), the order of the moving average component (q) and the order of integration (d) were obtained through a model identification process. Generally, for an ARIMA(p, d, q) model, q is identified from the autocorrelation function (ACF) plot and p is taken from the partial autocorrelation function (PACF) plot: the theoretical PACF of such a process has non-zero partial autocorrelations at lags 1, 2, ..., p and zero partial autocorrelations at all higher lags, while the theoretical ACF has non-zero autocorrelations at lags 1, 2, ..., q and zero autocorrelations at all higher lags. In this study, the non-zero lags were taken as the p and q for model estimation, as illustrated in the sketch below.
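Since the paper's plots are not reproduced here, the following is a minimal sketch of this identification step in Python with statsmodels; the file name tbill_91day.csv and column name rate are hypothetical, not from the paper:

    # Hypothetical sketch of the ACF/PACF identification step (not the authors' code).
    import pandas as pd
    import matplotlib.pyplot as plt
    from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

    # Hypothetical file and column names; any monthly rate series would do.
    rates = pd.read_csv("tbill_91day.csv")["rate"]

    fig, axes = plt.subplots(2, 1, figsize=(8, 6))
    plot_acf(rates, ax=axes[0], lags=24)   # slow decay here signals non-stationarity
    plot_pacf(rates, ax=axes[1], lags=24)  # significant spikes suggest the AR order p
    plt.tight_layout()
    plt.show()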
Before the orders were identified, the following tests were carried out on the data.
Ljung-Box test: We performed the Ljung-Box test to test jointly whether or not several autocorrelations $\hat{\rho}_k$ of the measured time series variable were zero. With sample size $T$, the Ljung-Box statistic is

$$Q(m) = T(T+2)\sum_{k=1}^{m}\frac{\hat{\rho}_k^2}{T-k},$$

which is approximately chi-square distributed with $m$ degrees of freedom. We reject $H_0$ and conclude there is serial correlation when the p-value < 0.05.
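A hedged sketch of this test using the statsmodels implementation of the statistic above; the series rates is the hypothetical one loaded earlier, and the lag choice is illustrative:

    # Ljung-Box test; H0 is "no serial correlation up to the given lag".
    from statsmodels.stats.diagnostic import acorr_ljungbox

    lb = acorr_ljungbox(rates, lags=[12])  # DataFrame with lb_stat and lb_pvalue
    print(lb)  # reject H0 (conclude serial correlation) when lb_pvalue < 0.05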
Augmented Dickey-Fuller (ADF) Unit Root Test: We used the Augmented Dickey-Fuller (ADF) test to determine whether the time series has a unit root (is non-stationary) or is weakly stationary. The test is based on the assumption that the series follows a random walk with model

$$X_t = \rho X_{t-1} + \varepsilon_t$$

and hypotheses

$$H_0: \rho = 1 \ \text{(non-stationary)} \quad \text{vs.} \quad H_1: \rho < 1 \ \text{(stationary)},$$

where $\rho$ is the characteristic root of the AR polynomial and $\varepsilon_t$ is a white noise series. An autocorrelation plot of the series shows no serial correlation, and randomness, if all sample autocorrelations fall within the two standard error limits.
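A minimal sketch of the unit root check, again with statsmodels and the hypothetical series from above, applied to the level series and its first difference:

    # ADF unit root test before and after first differencing.
    from statsmodels.tsa.stattools import adfuller

    adf_stat, pvalue, *_ = adfuller(rates)
    print(f"level series:     ADF = {adf_stat:.4f}, p = {pvalue:.4f}")  # p > 0.05 -> unit root

    adf_stat, pvalue, *_ = adfuller(rates.diff().dropna())
    print(f"first difference: ADF = {adf_stat:.4f}, p = {pvalue:.4f}")  # p < 0.05 -> stationary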
Estimation of Parameters: We selected the best model among all candidate models using the Akaike Information Criterion (AIC), the corrected Akaike Information Criterion (AICc), the normalized Bayesian Information Criterion (BIC) and the log-likelihood. The best model is the one with the maximum log-likelihood and the least AIC, AICc and BIC values.
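The paper applies these criteria without restating them; for reference, the standard definitions, with $k$ estimated parameters, $n$ observations and maximized likelihood $\hat{L}$, are:

$$\mathrm{AIC} = -2\ln\hat{L} + 2k, \qquad \mathrm{AICc} = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1}, \qquad \mathrm{BIC} = -2\ln\hat{L} + k\ln n.$$

The BIC's $k \ln n$ penalty grows with the sample size, which is why it tends to favour models with fewer parameters than the AIC, a point the authors rely on when selecting the 182-day model.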
Model Diagnostics: The selected model was checked to determine whether or not it appropriately represented the data set. Diagnostic checks were performed on the residuals of the fitted model to verify that they are a white noise series: an ACF plot of the residuals, a Ljung-Box test and an ARCH-LM test on the residuals of the best model were used to determine, respectively, whether the residuals are random and whether their variance is homoscedastic (constant) or heteroscedastic.
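A hedged sketch of this diagnostic step, assuming a candidate model fitted with statsmodels; the order (3, 1, 1) is chosen here only to mirror the paper's final 91-day model:

    # Fit a candidate model, then test its residuals for serial correlation
    # (Ljung-Box) and ARCH effects (Engle's LM test).
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.stats.diagnostic import acorr_ljungbox, het_arch

    fit = ARIMA(rates, order=(3, 1, 1)).fit()
    resid = fit.resid

    print(acorr_ljungbox(resid, lags=[12]))  # p > 0.05 -> no serial correlation
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_arch(resid)
    print(f"ARCH-LM: stat = {lm_stat:.4f}, p = {lm_pvalue:.4f}")  # p > 0.05 -> homoscedastic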
3. Results and Discussion
Modelling of 91-Day Treasury Bill
The time series plot of the 91-day Treasury bill rate in Figure 1 shows that the series does not fluctuate about a fixed point, indicating non-stationarity. This is also seen from the ACF plot of the series, which shows a slow decay, and from the PACF plot, which has a very significant spike at lag 1. The Augmented Dickey-Fuller test further confirms this assertion: the test is insignificant at the 0.05 significance level, showing that the series has a unit root and hence is not stationary. The series was therefore first differenced and tested for stationarity with the Augmented Dickey-Fuller test: the first difference was enough to make the series stationary, as shown by the test.
Table 2 shows the different models fitted to the series; ARIMA(3, 1, 1) appears to be the best model as it has the least AIC, AICc and BIC values and the maximum log-likelihood. The estimates of the parameters of the model, shown in Table 3, indicate that the AR(1), AR(2) and MA(1) coefficients are significant at the 0.05 significance level, while AR(3) is significant at the 0.10 level of significance. Our diagnostic checking of the ARIMA(3, 1, 1) model revealed that the model was adequate for the series. The ARCH-LM test showed that there was no ARCH effect; hence the residuals have a constant variance. The Ljung-Box p-values (> 0.05) showed that there is no serial correlation in the residuals of the model. The ACF plot of the residuals also shows that the residuals are a white noise series.
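The comparison in Table 2 can be reproduced in outline with a small grid search over candidate orders; this is a hypothetical sketch, not the authors' code, using the candidate orders listed in the table and the hypothetical rates series from the Methods section:

    # Fit each candidate ARIMA(p, 1, q) order and compare information criteria.
    from statsmodels.tsa.arima.model import ARIMA

    candidates = [(1, 1, 1), (1, 1, 2), (1, 1, 3), (1, 1, 4),
                  (3, 1, 1), (3, 1, 2), (3, 1, 3), (3, 1, 4)]
    scores = {}
    for order in candidates:
        res = ARIMA(rates, order=order).fit()
        scores[order] = res.aic
        print(order, f"AIC = {res.aic:.2f}  BIC = {res.bic:.2f}  logLik = {res.llf:.2f}")

    best = min(scores, key=scores.get)  # smallest AIC, as in the 91-day selection
    print("selected order:", best)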
Modelling of 182-Day Treasury Bill
A plot of the 182-day Treasury bill rate gave an indication that the series was not weakly stationary, as shown in Figure 6. This was also seen in the ACF and PACF plots in Figure 7: the ACF plot of the series had a slow decay and the PACF plot showed a very significant spike at lag 1. The Augmented Dickey-Fuller test, which was also insignificant at the 0.05 level of significance, further confirmed the non-stationarity of the series. The series was stationarized after first differencing; the Augmented Dickey-Fuller test on the differenced series was significant, indicating that the differenced series was stationary.
From Table 7, the ARIMA(1, 1, 0) model was selected as the best among the different ARIMA(p, 1, q) models fitted for the 182-day series since it has the smallest BIC value. Even though the AIC and AICc values of the ARIMA(1, 1, 0) model were larger than those of other fitted models, the BIC criterion was used for selecting the best model because BIC is a consistent criterion and tends to select models with fewer parameters than AIC. The parameter estimate of ARIMA(1, 1, 0), shown in Table 8, indicates that AR(1) is significant at the 0.05 significance level. Our diagnostics of the ARIMA(1, 1, 0) model showed that the model best fits the series. There was no ARCH effect in the residuals of the selected model, as indicated by an insignificant ARCH-LM test statistic; hence the residuals are homoscedastic. The Ljung-Box statistic was also not significant, indicating no serial correlation among the residuals of the selected model at the 0.05 level of significance. The ACF plot of the residuals further showed that the residuals were a white noise series. These tests revealed that the ARIMA(1, 1, 0) model was adequate in representing the 182-day Treasury bill rate.
4. Conclusion
This study used time series methods to model treasury bills in Ghana using data from the Bank of Ghana (BoG) from 1998 to 2012. The modelling of the treasury bills was done mainly with ARIMA models. The study revealed that the 91-day Treasury bill rate is best modelled with ARIMA(3, 1, 1), while the 182-day Treasury bill rate is best modelled by ARIMA(1, 1, 0). The diagnostics of these two models showed that they adequately fit the two series and hence are adequate for forecasting Treasury bill rates in Ghana.
References
Aboagye, Q. Q., Akoena, S.K., Antwi-Asare, T.O. and Gockel, F.A. (2008): Explaining Interest Rate Spreads in Ghana. African Development Review, 20(3): 378-399.
Annual Reports, (2008): Securities and Exchange Commission, Ghana
Brandt, M., and Kavajecz, K. (2002). Price discovery in the U.S. Treasury Market: The impact of order flow and
liquidity on the yield curve, working paper, University of Pennsylvania, Philadelphia, PA.
Box, G.E. and Jenkins, G.M. (1976): Time Series Analysis: Forecasting and Control.
Doko, B.D. (2012): Comparative Analysis of Performance of Equities (stock) and Treasury bills in Ghana.
Kwame Nkrumah University of Science and Technology.
Huang, R., Cai, J. and Wang, X., (2001): Inventory risk-sharing and public information-based trading in the
Treasury Note interdealer broker market, working paper, University of Notre Dame, Notre Dame, IN.
Jacoby, G., Fowler, D., and Gottesman, A. (2000): The capital asset pricing model and the liquidity effect, A
theoretical approach, Journal of Financial Markets, 3: 69-81.
Jones, C., (2001): A century of stock market liquidity and trading costs, working paper, Columbia University,
New York, NY.
Ljung, G.M. and Box, G.E. (1978): On a measure of lack of fit in time series models. Biometrika, 65(2): 297-303.
APPENDIX
Tables and Figures of 91-day Treasury bill
Figure 1: Time series plot of 91-day Treasury bill rate
Figure 2: ACF and PACF of undifferenced monthly 91-day T-bill rate

Table 1: Augmented Dickey-Fuller and Ljung-Box test statistics

Differencing Order | ADF statistic | p-value | Ljung-Box statistic | p-value
0 | -1.5037 | 0.1355 | 1288.621 | 0.000
1 | -3.6914 | 0.01 | 51.8194 | 0.000

Figure 3: ACF and PACF of first differenced series
Table 2: Different ARIMA(p, 1, q) models fitted

model type | AIC | AICc | BIC | Log likelihood
ARIMA(1,1,1) | 674.81 | 674.95 | 684.34 | -334.41
ARIMA(1,1,2) | 673.60 | 673.83 | 686.30 | -332.80
ARIMA(1,1,3) | 668.30 | 668.65 | 684.18 | -329.15
ARIMA(1,1,4) | 670.01 | 670.50 | 689.06 | -329.00
ARIMA(3,1,1)* | 667.17* | 667.52* | 683.05* | -328.58*
ARIMA(3,1,2) | 668.44 | 668.94 | 687.50 | -328.22
ARIMA(3,1,3) | 669.78 | 670.44 | 692.01 | -327.89
ARIMA(3,1,4) | 671.22 | 672.08 | 696.63 | -327.61
*: Model selected
Table 3: Estimates of ARIMA(3, 1, 1) model

Type | coefficient | standard error | t-statistic | p-value
AR(1) | -0.3817 | 0.0905 | -4.06 | 0.000
AR(2) | 0.2794 | 0.0831 | 3.280 | 0.001
AR(3) | 0.1278 | 0.0784 | 1.640 | 0.103
MA(1) | 0.8828 | 0.0555 | 4.730 | 0.000

Model Diagnostics

Table 4: Diagnostic test statistics

Test | statistic | p-value
ARCH LM | 8.3945 | 0.7536
Ljung Box | 5.696 | 0.9306

Figure 4: Diagnostic plot of residuals of ARIMA(3, 1, 1) model
Tables and Figures of 182-day Treasury bill
Figure 6: Time series plot of 182-day Treasury bill rate
Figure 7: ACF and PACF of undifferenced monthly 182-day T-bill rate

Table 6: Augmented Dickey-Fuller and Ljung-Box test statistics

Differencing Order | ADF statistic | p-value | Ljung-Box statistic | p-value
0 | -1.6229 | 0.09883 | 170.2105 | 0.000
1 | -6.8901 | 0.01 | 25.3658 | 0.000

Figure 8: ACF and PACF of differenced monthly 182-day T-bill rate
Table 7: Different ARIMA(p, 1, q) models fitted for 182-day T-bill

model type | AIC | AICc | BIC | Log likelihood
ARIMA(1,1,0)* | 717.00* | 717.06* | 723.35* | -356.50*
ARIMA(1,1,1) | 717.15 | 717.29 | 726.68 | -355.58
ARIMA(1,1,2) | 718.36 | 718.59 | 731.06 | -355.18
ARIMA(1,1,3) | 717.14 | 717.49 | 733.02 | -353.57
ARIMA(1,1,4) | 718.35 | 718.84 | 737.40 | -353.17
ARIMA(3,1,0) | 715.00 | 716.17 | 728.65 | -353.97
ARIMA(3,1,1) | 717.93 | 718.28 | 733.81 | -353.96
ARIMA(3,1,2) | 715.64 | 716.14 | 734.70 | -351.82
*: Model selected
Table 8: Estimates of ARIMA(1, 1, 0) model

type | coefficient | standard error | t-statistic | p-value
AR(1) | 0.3763 | 0.0693 | 5.42 | 0.000

Model Diagnostics

Table 9: Diagnostic test statistics

Test | statistic | p-value
ARCH LM | 2.2595 | 0.536
Ljung Box | 10.2595 | 0.9989

Figure 9: Diagnostic plot of residuals of ARIMA(1, 1, 0) model