The term asymmetric volatility refers to the observation that volatility (and hence risk) tends to be higher during market downturns than during upturns. The factor most commonly cited as contributing to this behavior is the increase in market leverage produced by a negative shock; other factors, however, such as the perceived risk/reward balance at different stages of the market cycle, also play a role.
The document discusses analyzing multivariate time series of five energy futures (crude oil, ethanol, gasoline, heating oil, natural gas) using vector autoregressive (VAR) and vector error correction (VEC) models. It finds the futures are cointegrated using Johansen and Engle-Granger tests, indicating they share a common stochastic trend. A VAR(1) model is estimated and found stable. The VEC model captures the error correction behavior as futures return to their long-run equilibrium. Forecasts are generated and limitations of the Engle-Granger approach discussed.
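The cointegration logic behind the VAR/VEC workflow above can be sketched with the Engle-Granger two-step idea on simulated data. This is a minimal numpy illustration, not the document's own estimation: the two series, their parameters, and the shared trend are all hypothetical stand-ins for cointegrated futures prices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
trend = np.cumsum(rng.normal(size=n))          # shared stochastic trend
y1 = trend + rng.normal(scale=0.5, size=n)     # two "futures" driven by it
y2 = 0.8 * trend + rng.normal(scale=0.5, size=n)

# Step 1: cointegrating regression y1 = a + b*y2 + u by OLS
X = np.column_stack([np.ones(n), y2])
a, b = np.linalg.lstsq(X, y1, rcond=None)[0]
resid = y1 - (a + b * y2)

# Step 2: AR(1) fit on the residuals; a slope well below 1 suggests
# mean-reverting (stationary) residuals, i.e. cointegration
phi = np.polyfit(resid[:-1], resid[1:], 1)[0]
print(round(b, 2), round(phi, 2))
```

In practice one would use a proper unit-root test on the residuals (and the Johansen procedure for more than two series), for example via statsmodels, rather than this bare AR(1) slope.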
5th International Disaster and Risk Conference IDRC 2014: Integrative Risk Management - The role of science, technology & practice, 24-28 August 2014, Davos, Switzerland
The document discusses static hedging of binary options using a portfolio of vanilla options. Specifically, it examines hedging a binary call option with a strike of 100 using a short call with a strike of 90 and a long call with a strike of 110. The analysis considers uncertain volatility, inhomogeneous maturity between the options, and incorporating bid-ask spreads to maximize the value of the binary option for both long and short positions. Finite difference methods are used to numerically evaluate the option prices under different volatility assumptions and jump conditions.
This document provides instructions for modeling stock return volatility using daily stock price data from Hong Kong, Japan, and Singapore markets from 1990 to 2005. It outlines steps to estimate Threshold GARCH and GARCH-in-mean models to examine the volatility and asymmetry of returns in the Singapore market. Specifically, it describes how to: 1) Estimate a TGARCH model to analyze asymmetry in volatility; 2) Estimate a GARCH-in-mean model to investigate the return-risk relationship; and 3) Estimate a TGARCH-in-mean model and compare the results.
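The asymmetry that a TGARCH (GJR-type) specification captures can be illustrated with a short numpy simulation of the conditional-variance recursion. The parameter values below are purely illustrative, not estimates from the Singapore data described above.

```python
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, gamma, beta = 0.05, 0.05, 0.10, 0.85  # illustrative values

n = 5000
r = np.empty(n)
sig2 = np.empty(n)
sig2[0] = omega / (1 - alpha - gamma / 2 - beta)    # unconditional variance
r[0] = np.sqrt(sig2[0]) * rng.standard_normal()
for t in range(1, n):
    neg = 1.0 if r[t - 1] < 0 else 0.0              # leverage indicator
    sig2[t] = omega + (alpha + gamma * neg) * r[t - 1] ** 2 + beta * sig2[t - 1]
    r[t] = np.sqrt(sig2[t]) * rng.standard_normal()

# Asymmetry check: average next-period variance after negative vs positive shocks
after_neg = sig2[1:][r[:-1] < 0].mean()
after_pos = sig2[1:][r[:-1] >= 0].mean()
print(after_neg > after_pos)
```

Because gamma enters only after negative shocks, variance following a negative return is systematically higher, which is exactly the asymmetry the TGARCH estimation step is meant to detect.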
Impact of Derivative Trading on the Volatility of the Underlying Asset, by Prasanna Ramamurthy
The aim of the study is to analyze the impact of the additional information generated by the introduction of individual stock derivative instruments on the volatility of returns on the underlying stock.
1) The document discusses modeling volatility in financial time series using autoregressive conditional heteroscedasticity (ARCH) and generalized autoregressive conditional heteroscedasticity (GARCH) models. These models account for time-varying volatility or variance in the data.
2) As an example, an ARCH(1) model is fitted to monthly changes in the US-UK exchange rate from 1971-2007 which shows evidence of volatility clustering.
3) Similarly, fitting an ARCH(1) model to monthly percentage changes in the NYSE stock index from 1966-2002 also demonstrates volatility clustering in financial returns.
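The volatility clustering that an ARCH(1) model captures can be demonstrated with simulated data: returns themselves are serially uncorrelated, but squared returns are not. This is a hypothetical simulation with illustrative parameters, not the exchange-rate or NYSE series discussed above.

```python
import numpy as np

rng = np.random.default_rng(2)
omega, alpha = 0.5, 0.4                  # illustrative ARCH(1) parameters
n = 10000
r = np.zeros(n)
for t in range(1, n):
    sig2 = omega + alpha * r[t - 1] ** 2  # conditional variance
    r[t] = np.sqrt(sig2) * rng.standard_normal()

def acf1(x):
    """Sample lag-1 autocorrelation."""
    x = x - x.mean()
    return (x[:-1] * x[1:]).mean() / x.var()

print(round(acf1(r), 2), round(acf1(r ** 2), 2))
```

The near-zero autocorrelation of returns alongside the clearly positive autocorrelation of squared returns is the signature of volatility clustering that motivates ARCH/GARCH modeling.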
This study examines extreme co-movements in stock prices. Daily prices for the first 100 stocks were analyzed to calculate log returns and identify extreme jumps. A GARCH model was used to extract conditional volatility. Pseudo-observations were generated by dividing returns by volatility. A generalized extreme value distribution was fitted to exceedances above a threshold to determine tail properties. Fréchet scales were calculated and ranked. The number of joint exceedances above percentiles over time lags were counted to estimate conditional probabilities of extreme co-movements in stock decreases.
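The tail-fitting step above can be sketched with scipy. The study fits an extreme value distribution to threshold exceedances; the closely related peaks-over-threshold variant shown here fits a generalized Pareto distribution to exceedances of standardized losses. The data are hypothetical Student-t draws standing in for returns divided by conditional volatility.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Hypothetical standardized returns (returns divided by conditional volatility)
z = rng.standard_t(df=4, size=20000)
losses = -z[z < 0]                       # focus on the loss tail

u = np.quantile(losses, 0.95)            # high threshold
exceed = losses[losses > u] - u
shape, loc, scale = genpareto.fit(exceed, floc=0)
print(round(shape, 2))                   # shape > 0 indicates a heavy tail
```

For heavy-tailed inputs like the t-distribution used here, the fitted shape parameter is typically positive, which is the tail property the study's Fréchet-scale ranking relies on.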
Express Measurement of Market Volatility Using the Ergodicity Concept, by Jack Sarkissian
Don't we want to base our trading decisions on current market conditions? Then why should we rely on time averages merely because they are simple to compute? We can obtain current market volatility much faster by applying the ergodicity concept to financial markets. Ensemble averaging makes it possible to measure market volatility quickly, based on only two points in time, and is as relevant to volatility measurement as the traditional measures.
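The ensemble-averaging idea can be sketched in a few lines: instead of averaging one asset's returns over a long time window, take the cross-sectional dispersion of returns across an ensemble of comparable instruments between just two points in time. The data below are hypothetical and the sketch glosses over how the ensemble is constructed in practice.

```python
import numpy as np

rng = np.random.default_rng(4)
true_vol = 0.02
# Prices of an ensemble of 500 comparable instruments at two times
p0 = 100 * np.exp(rng.normal(0, 0.1, size=500))
p1 = p0 * np.exp(rng.normal(0, true_vol, size=500))

returns = np.log(p1 / p0)                # one return per ensemble member
ensemble_vol = returns.std(ddof=1)       # cross-sectional, not time, average
print(round(ensemble_vol, 3))
```

Only two snapshots are needed, which is what makes the ensemble estimate "express" relative to a rolling time average.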
- The document analyzes forecasting volatility for the MSCI Emerging Markets Index using a Stochastic Volatility model solved with Kalman Filtering. It derives the Stochastic Differential Equations for the model and puts them into State Space form solved with a Kalman Filter.
- Descriptive statistics on the daily returns of the MSCI Emerging Markets Index ETF from 2011-2016 show a mean close to 0, standard deviation of 0.01428, negative skewness, and kurtosis close to a normal distribution. The model will be evaluated against a GARCH model.
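The state-space/Kalman approach described above can be sketched with the standard linearization of a stochastic volatility model: log squared returns equal the latent log-variance plus log chi-squared noise, giving a linear filtering problem. All parameters below are illustrative, and the data are simulated rather than the MSCI series.

```python
import numpy as np

rng = np.random.default_rng(5)
# Simulate a basic stochastic-volatility process (illustrative parameters)
n, mu, phi, q = 2000, -9.0, 0.97, 0.05
h = np.empty(n); h[0] = mu
for t in range(1, n):
    h[t] = mu + phi * (h[t - 1] - mu) + np.sqrt(q) * rng.standard_normal()
r = np.exp(h / 2) * rng.standard_normal(n)

# Linearize: y_t = log r_t^2 = h_t + noise (offset/variance of log chi^2_1)
y = np.log(r ** 2 + 1e-12)
c, R = -1.27, np.pi ** 2 / 2             # mean and variance of log chi^2_1

# Scalar Kalman filter on the latent log-variance h_t
h_f, P = mu, 1.0
filt = np.empty(n)
for t in range(n):
    h_pred = mu + phi * (h_f - mu)       # predict
    P_pred = phi ** 2 * P + q
    K = P_pred / (P_pred + R)            # gain for the update with y_t - c
    h_f = h_pred + K * (y[t] - c - h_pred)
    P = (1 - K) * P_pred
    filt[t] = h_f

corr = np.corrcoef(filt, h)[0, 1]
print(round(corr, 2))
```

Despite the noisy measurement equation, the filtered state tracks the true log-variance reasonably well, which is the basis for comparing such a model against a GARCH benchmark.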
This document summarizes the results of an econometrics analysis examining the relationship between macroeconomic variables in the US and Italy. It tests for unit roots and cointegration, estimates vector autoregression models in levels and first differences, and analyzes impulse response functions and variance decompositions. The key findings are: 1) some variables are stationary while others have unit roots; 2) there are two cointegrating relationships; 3) monetary shocks have a significant positive effect on GDP for several quarters in the levels model; 4) variance decompositions show monetary shocks do not explain significant portions of GDP variance.
This document summarizes a study that models crude oil prices using a Lévy process. The study finds that a MA(8) model best fits the time series properties of oil price returns. However, there is also evidence of GARCH effects. Therefore, the best overall model is a GARCH(1,1) with errors modeled by a Johnson SU distribution. This hybrid Lévy-GARCH process captures the temporal, spectral and distributional properties of the crude oil price data set.
1. The document discusses time series forecasting using Holt-Winters exponential smoothing methods. It focuses on analyzing seasonal time series data.
2. There are two Holt-Winters models - the multiplicative seasonal model and the additive seasonal model. The models account for trend, seasonal variations, and error in time series data.
3. Exponential smoothing assigns decreasing weights to older observations to generate forecasts. There are methods for single, double, and triple exponential smoothing to handle different patterns in time series like trend and seasonality.
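The additive Holt-Winters recursions described above can be sketched directly in numpy. This is a minimal hand-rolled version with simple initializations and illustrative smoothing constants; production code would typically use a library implementation.

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Minimal additive Holt-Winters: level + trend + seasonal components."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] - level)
    for t in range(m, len(y)):
        s = season[t - m]
        new_level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season.append(gamma * (y[t] - new_level) + (1 - gamma) * s)
        level = new_level
    # Forecast h+1 steps ahead, reusing the last full cycle of seasonals
    return np.array([level + (h + 1) * trend + season[len(y) - m + (h % m)]
                     for h in range(horizon)])

# Synthetic monthly series: linear trend plus annual seasonality
t = np.arange(120)
y = 10 + 0.1 * t + 5 * np.sin(2 * np.pi * t / 12)
fc = holt_winters_additive(y, m=12)
print(np.round(fc[:3], 1))
```

The multiplicative variant replaces the additive seasonal terms with ratios, which suits series whose seasonal swings grow with the level.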
1) The document discusses calibrating the Libor Forward Market Model (LFM) to Australian dollar market data using the approach of Pedersen.
2) Pedersen employs a non-parametric approach using a piecewise constant volatility grid to calibrate the LFM deterministically to swaption and cap prices. He formulates a cost function balancing fit to market prices and volatility surface smoothness.
3) Caplet and swaption prices can be approximated in closed form under the LFM, allowing calibration by minimizing differences between model and market prices of these instruments.
GARCH Models in Value-at-Risk Estimation for REITs, IJERD Journal
Abstract: In this study we investigate volatility forecasting for REITs from January 3, 2007 to November 18, 2016, using four GARCH models (GARCH, EGARCH, GARCH-GJR, and APARCH). We examine the performance of each of these GARCH-type models and conduct backtesting procedures to assess model adequacy. The empirical results show that, for estimating REIT volatility, the EGARCH, GARCH-GJR, and APARCH models are all adequate; among them, the GARCH-GJR model outperforms the others.
This document discusses volatility modeling using GARCH models. It provides an overview of the basic GARCH specification and the steps involved in GARCH modeling, including descriptive statistics, testing for ARCH effects, GARCH specification, estimation, evaluation, and inferences. Specific GARCH models discussed include GARCH(1,1), TARCH, and EGARCH. The goal of GARCH modeling is to characterize volatility for applications such as risk analysis and portfolio selection.
"Correlated Volatility Shocks" by Dr. Xiao Qiao, Researcher at SummerHaven In... (Quantopian)
Commonality in idiosyncratic volatility cannot be completely explained by time-varying volatility. After removing the effects of time-varying volatility, idiosyncratic volatility innovations are still positively correlated. This result suggests correlated volatility shocks contribute to the comovement in idiosyncratic volatility.
Motivated by this fact, we propose the Dynamic Factor Correlation (DFC) model, which fits the data well and captures the cross-sectional correlations in idiosyncratic volatility innovations. We decompose the common factor in idiosyncratic volatility (CIV) of Herskovic et al. (2016) into the volatility innovation factor (VIN) and time-varying volatility factor (TVV). Whereas VIN is associated with strong variation in average returns, TVV is only weakly priced in the cross section.
A strategy that takes a long position in the portfolio with the lowest VIN and TVV betas, and a short position in the portfolio with the highest VIN and TVV betas earns average returns of 8.0% per year.
The Vasicek model is one of the earliest stochastic models for modeling the term structure of interest rates. It represents the movement of interest rates as a function of market risk, time, and the equilibrium value the rate tends to revert to. This document discusses parameter estimation techniques for the Vasicek one-factor model using least squares regression and maximum likelihood estimation on historical interest rate data. It also covers simulating the term structure and pricing zero-coupon bonds under the Vasicek model. The two-factor Vasicek model is introduced as an extension of the one-factor model.
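The estimation and pricing steps described above can be sketched for the one-factor model: simulate the short rate, recover the parameters from the AR(1) regression of the discretized dynamics, and price a zero-coupon bond with the closed-form Vasicek formula. Parameters and data here are hypothetical, and least squares is used in place of maximum likelihood for brevity.

```python
import numpy as np

rng = np.random.default_rng(6)
kappa, theta, sigma, dt = 2.0, 0.05, 0.01, 1 / 252

# Euler simulation of dr = kappa*(theta - r)dt + sigma dW
n = 50000
r = np.empty(n); r[0] = 0.03
for t in range(1, n):
    r[t] = r[t - 1] + kappa * (theta - r[t - 1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# Least squares on the AR(1) form r_{t+1} = a + b r_t + eps
b, a = np.polyfit(r[:-1], r[1:], 1)
kappa_hat = (1 - b) / dt                 # mean-reversion speed
theta_hat = a / (1 - b)                  # long-run level

# Closed-form zero-coupon bond price P(0, T) under the one-factor model
T, r0 = 5.0, 0.03
B = (1 - np.exp(-kappa * T)) / kappa
A = np.exp((theta - sigma**2 / (2 * kappa**2)) * (B - T)
           - sigma**2 * B**2 / (4 * kappa))
P = A * np.exp(-B * r0)
print(round(theta_hat, 3), round(P, 3))
```

Maximum likelihood uses the exact Gaussian transition density rather than the Euler step, but for small dt the two estimators give very similar results.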
This document presents a sequential approach to loss forecasting that produces a minimum variance estimate. It begins by estimating loss rates for individual historical policy periods. It then weights these rates sequentially based on their variances to incorporate all historical information, with more weight given to periods with lower variance. This approach is applied both to incurred and paid losses to develop two forecasts, which are then combined based on their covariance. The document also explores modeling claims using a Gamma distribution and applying Bayesian updating to forecast losses.
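The minimum-variance weighting at the heart of the approach above is classical inverse-variance weighting, sketched here with entirely hypothetical loss-rate estimates.

```python
import numpy as np

# Hypothetical loss-rate estimates from three policy periods and their variances
rates = np.array([0.042, 0.055, 0.048])
variances = np.array([0.0004, 0.0009, 0.0001])

w = (1 / variances) / (1 / variances).sum()   # inverse-variance weights
blended = (w * rates).sum()
blended_var = 1 / (1 / variances).sum()       # variance of the combined estimate
print(round(blended, 4), blended_var < variances.min())
```

Giving the lowest-variance period the most weight yields a combined estimate whose variance is smaller than any individual period's, which is the point of the sequential scheme.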
This document presents a time series model for the exchange rate between the Euro (EUR) and the Egyptian Pound (EGP) using a GARCH model. The author analyzes the time series data of the exchange rate for 2008 and finds that it exhibits volatility clustering where large changes tend to follow large changes. An ARCH or GARCH model is needed to capture the changing conditional variances over time. The author estimates several GARCH models and selects the GARCH(1,2) model based on statistical significance of coefficients and AIC values. Diagnostic tests show that the GARCH(1,2) model adequately captures the heteroskedasticity in the data. The fitted model is then used to predict future exchange rates.
This document provides an overview and comparison of two models for forecasting and trading volatility: the Markov-Switching GARCH (MS-GARCH) model and the Markov-Switching Multifractal (MSM) model. The key findings are: 1) MSM outperforms MS-GARCH for out-of-sample forecasts at horizons of 10-50 days but performs similarly at 1-day horizons; 2) MS-GARCH generates inaccurate forecasts in volatile and low volatility periods while MSM better captures volatility characteristics; 3) MS-GARCH yields higher trading profits than MSM for intra-day and monthly variance swap trading but this may be due to mispricing of implied volatilities.
This document discusses valuing and hedging the extrinsic value of a natural gas storage facility using a basket-of-options approach. It presents a formula for calculating the intrinsic value of the storage by maximizing the spread between purchase and sale prices of gas over time. The storage value includes both the intrinsic value and an extrinsic value based on future opportunities. It models the storage as a portfolio of options on spreads between monthly gas prices. Delta hedging with these options provides a lower bound for the storage value and a way to monetize the extrinsic value. The methodology is tested on a six-month period using daily gas price data.
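The intrinsic-value calculation described above reduces, in its simplest form, to finding the best buy-then-sell calendar spread on the forward curve net of storage costs. The forward curve and cost below are hypothetical, and real valuations add injection/withdrawal rate limits and capacity constraints.

```python
import numpy as np

# Hypothetical monthly forward curve ($/MMBtu) and per-unit round-trip cost
forwards = np.array([2.10, 2.05, 2.30, 2.80, 3.10, 2.60])
cost = 0.15                                   # injection + withdrawal + carry

# Intrinsic value: best buy-in-month-i, sell-in-month-j spread, net of cost
best = max(forwards[j] - forwards[i] - cost
           for i in range(len(forwards)) for j in range(i + 1, len(forwards)))
intrinsic = max(best, 0.0)
print(round(intrinsic, 2))
```

The extrinsic value on top of this comes from re-optimizing as prices move, which is why the document models storage as a basket of options on monthly spreads.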
This document introduces Value at Risk (VaR) through defining it, describing different VaR methods, and providing examples of VaR calculations. VaR measures the worst expected loss over a given time period at a given confidence level. The document focuses on the parametric VaR method, which assumes returns are normally distributed, and provides examples of calculating VaR for single-asset and multi-asset portfolios using the normal distribution. It also briefly discusses VaR for derivative portfolios using delta approximation.
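The parametric (variance-covariance) calculation described above is a one-liner for a single asset: under normally distributed returns, VaR is the loss at the chosen quantile of the return distribution. The portfolio value and return parameters below are hypothetical.

```python
from scipy.stats import norm

value = 1_000_000          # portfolio value (hypothetical)
mu, sigma = 0.0005, 0.012  # daily mean and volatility of returns (hypothetical)
conf = 0.99

z = norm.ppf(1 - conf)     # 1% quantile of the standard normal
var_1d = -(mu + z * sigma) * value
print(round(var_1d))       # one-day 99% VaR in currency units
```

For a multi-asset portfolio, sigma is replaced by the portfolio standard deviation computed from the covariance matrix of asset returns and the position weights.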
This document provides an overview of a research project analyzing the relationship between economic growth and environmental quality in the United States over time. The author explores this relationship using time series models with GDP as the dependent variable and factors like carbon dioxide emissions, oil consumption, coal consumption, and municipal solid waste as independent variables. Different model specifications are estimated and tested, including linear, threshold autoregressive, and vector autoregressive models. The author aims to determine how economic growth has impacted various environmental indicators in the US and vice versa.
Affine Term Structure Model with Stochastic Market Price of Risk, by Swati Mital
- The document proposes a new affine term structure model that combines principal components analysis with a stochastic market price of risk.
- Principal components provide useful information about yield curves and only three components explain over 95% of yield variation.
- Previous models linked risk premium deterministically to return-predicting factors like slope, but this could result in unrealistic risk premium levels.
- The new model introduces an additional state variable to capture the stochastic market price of risk and break the deterministic link between risk premium and return-predicting factors.
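The claim that three principal components explain over 95% of yield variation can be illustrated on a synthetic yield-curve panel driven by level, slope, and curvature factors. The factor loadings and noise level below are hypothetical, chosen only to mimic that structure.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical yield-curve panel: level, slope, curvature factors + small noise
maturities = np.linspace(0.25, 30, 12)
n_obs = 500
level = rng.normal(0, 1.0, (n_obs, 1)) * np.ones((1, 12))
slope = rng.normal(0, 0.5, (n_obs, 1)) * (maturities / 30)
curve = rng.normal(0, 0.2, (n_obs, 1)) * np.exp(-((maturities - 5) ** 2) / 20)
yields = 3.0 + level + slope + curve + rng.normal(0, 0.02, (n_obs, 12))

# Principal components of the centered yield panel via SVD
X = yields - yields.mean(axis=0)
_, s, _ = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
print(round(explained[:3].sum(), 3))     # share explained by the first three PCs
```

Because the panel is built from exactly three factors plus small noise, the first three components capture nearly all the variance, mirroring the empirical regularity the model exploits.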
This document provides an overview of macroeconomic aggregates and fiscal accounts and analysis. It defines the main economic sectors and aggregates such as GDP, GNP, consumption, investment, etc. It also discusses approaches to determining GDP, nominal vs real GDP, and inflation. The document then covers fiscal accounting methods and concepts such as the budget deficit, government saving-investment gap, and methods of financing the deficit such as borrowing from the central bank, other banks, or non-bank sources.
Contents
• What are Energy Markets?
• Oil Markets – Oil Supply – Oil Demand – Oil Prices and Other Oil Products
• Natural Gas Markets
• Electricity Markets
• Coal Markets
• Renewable Energy Markets
• Economics and Energy Markets
This report identifies an outstanding issue in my organization: the lack of a proper risk management department. As the newly appointed Risk Manager, I prepared an action plan, which I present below. I first identify the problem in my organization, then present a solution and the steps toward its implementation. Furthermore, I discuss management's involvement in the process. Finally, I discuss the expected results.
Iceland has a BBB- credit rating from Moody's, S&P, and Fitch. While it has a highly skilled workforce and strong institutions, it also has high debt levels, capital controls, and economic dependence on a few commodity exports. Recent macroeconomic performance has improved with GDP growth resuming and inflation declining, although public and external debt remain elevated. While the outlook is positive if Europe avoids deeper crisis, delays in investment or uncertainty could slow Iceland's recovery. The document recommends that Alcoa proceed with its planned project, given Iceland's favorable economic outlook.
Managers use a short-term horizon to maximize their utility function. Short-term profitability of banking institutions is one of the most important determinants of bonus packages, so managers are motivated to produce the highest possible returns on equity by lowering equity buffers to the lowest possible level. The framing-effects approach shows that managers engage in risk-seeking behavior in order to avoid a sure loss (that is, to guarantee that they receive a higher bonus), even though risk-averse behavior would be the preferred choice. The lessons learned from the financial crisis are the importance of introducing behavioral finance concepts into daily banking activities, increasing information transparency, and finding alternative measures of managers' efficiency: measures that would encourage the adoption of long-term value functions.
The document discusses operational risk within the financial services industry and the Bank's operational risk management framework. It describes how the Bank identifies and manages operational risks through processes like risk control self-assessments and collecting operational loss event data. It provides an influence diagram showing factors that could lead to human errors in the Bank's risk management department. It also briefly compares sources of operational risk across different industries like transportation, focusing on factors like reputational, financial, and legal losses.
In the paper we test the new Phillips curve for Central and Eastern European EU accession countries for the period from 1990 to 2002 and use it to compare the efficiency of the traditional Phillips curve. More specifically, we want to see whether real marginal cost, which includes labor productivity and real wage components, can account for inflation dynamics in the observed sample. Surprisingly, when observing all eight selected countries, the relation between real marginal cost and inflation is opposite than expected. On the other hand, inflation in Baltic States and Slovenia seems to be influenced by real marginal cost. The elasticity coefficient of real wages on inflation for Slovenia shows that inflation was quite responsive to movement in wages during the total period, however, inflation became quite inelastic with respect to wages after 2000. Thus, economic policies that were introduced in Slovenia after 2000 were quite efficient in wage regulation, although the real effect will be observed in a more advanced period.
In this paper we try to estimate effects of financial deepness and capital account liberalization on economic growth, investment and the total factor productivity (TFP) in Slovenia from 1993 to the second quarter of 2001. We find out that the only positive effect of capital account liberalization was increased credits to private sector. On the other hand, financial depth has a positive and significant effect on economic growth and investment, but not on the TFP growth. Moreover, it is not likely that also capital account liberalization positively affects above specified choice variables. Namely, financial deepening is achieved through development of adequate institutions and sustainable macroeconomic policies. Once financial system is set in the country, capital account liberalization takes place.
One of the biggest drawbacks in the subprime crisis was a wrong fit of risk measurements and tools to the firm’s portfolio allocation strategies.1 Crouhy (2009) and Stulz (2009) among others point out what went wrong in the risk management practices during the current and other recent financial crisis:
(a) Inadequate use of risk metrics. Daily VaR (Value at Risk) is widely used in financial institutions to assess the trading activities risk. However, VaR measures the minimum worst loss expected (at 99% or 95% confidence level, depending on the distribution used) and not the expected worst loss (Stulz, 2009). Furthermore, VaR does not tell us anything about distribution of the losses BEYOND the minimum worst loss and even worse, it is not sure whether VaR can capture low probability catastrophic events.
The fund invests in insurance-linked bonds referred to as cat bonds. These are high-yield debt instruments with the purpose of raising money in the catastrophe events, usually natural disasters. Cat bonds are issued by insurance and reinsurance companies. Their main attraction for issuers is that in case of a catastrophe event, the issuers’ obligation to pay interest and principal is either deferred or forgiven.
Few empirical studies have looked specifically at the contribution of financial sector development to transition economies' growth, although developed financial markets have been generally assumed to be crucial to supporting growth performance. An empirical exercise that relates GDP growth to a range of variables finds some support for the proposition that financial sector development—in particular the role of foreign-owned banks—had a significant positive impact on transition economies' growth during the past decade.
The results of elections held in eastern Europe this year do not indicate any clear shifts in regional trends or direction. Bulgaria and Romania are set to join the EU on January 1st 2007 (at most there could have been a one-year delay until January 2008), but this will be under the strictest conditions ever applied to new members. Acrimonious negotiations on the final status of Kosovo appear to be grinding towards an impasse and possible crisis. It is difficult to predict the effects on developments in the wider region—not only in restive and resentful Serbia, but also in Bosnia and Hercegovina (BiH), Macedonia and possibly further afield. This is occurring when the EU's most effective instrument for influencing developments in the region—the offer of EU membership—has been seriously weakened by the anti-enlargement mood sweeping western Europe.
Systemic Risk Safeguards for Central Clearing CounterpartiesHELIOSPADILLAMAYER
During the financial crisis, the advantages of exchange-traded and centrally cleared derivatives became visible and an increased use of central counterparties (CCPs) was advocated amid their market safety. CCPs were seen as mitigation agents of counterparty, liquidity and operational risk, entities that are able to address information asymmetries, reduce trading complexity and increase operational efficiency and transparency.
The CCP is designed to reduce and assist with managing credit risk, also known as counterparty risk, in the derivative clearing process through a series of financial safeguards or layers of protection (also referred to as the CCP risk waterfall), where each safeguard handles a particular set of risks the CCP faces during its normal clearing activity or when it faces a default event amid a failure of one or several clearing members. Safeguard measures are constructed in such a way that they prevent a negative spillover effect to other members and to the financial markets.
The aim of the paper is to show that the risk waterfall processes used by CCPs can withstand an extended period of stressed market conditions. We design a theoretical framework in order to simulate a CCP risk waterfall and create a hypothetical CCP to empirically test its ability to perform under crisis conditions.
Our empirical study includes a baseline with a 30-day period and scenario tests, which assume that CCP clearing members suffer capital shortfall due to the systemic risk. Results show that while in the baseline scenario all participating clearing members meet capital requirements, several members become undercapitalized in stress tests and are therefore excluded from trading. Furthermore, in situations in which the defaulter’s guaranty fund was not sufficient to cover a shortfall, the CCP first-loss capital and other financial resources are quickly used up and the default management process moves to a further step. This requires mutualization of the loss by non-defaulting clearing members. Another important observation is that CCPs need timely information about the history of a member’s trading behavior cleared by the CCP, their positions in international markets as well as their overall financial health in order to correctly handle vulnerabilities that arise from their undercapitalization.
Despite credit market turbulence and slowing activity in many major advanced economies, oil prices have been reaching record highs in recent months. Besides oil-specific factors, such as geopolitical risks and speculations, the current price boom is driven by demand and supply forces that reinforce each other amid supportive financial conditions. This paper aims to a link macroeconomic variables together with oil prices in order to provide complement decision tools used by commercial and investment banks when optimizing their investment portfolios. For that reason, we apply financial programming model with incorporated oil price variable. We show that oil prices affect private consumption, gross domestic product, inflation, and imports. On the other hand, we also investigate effects of macroeconomic variables on oil market equilibrium. A decrease in oil supply as well as depreciation of the US$ lead to higher oil prices, which in turn decrease private consumption and output, but as well stimulate inflationary pressures. Empirical test is performed on the basis of quarterly US data from 2001 to 2007. Although financial programming models are subject to limitations and empirical implications are difficult to apply, some general relations between selected macroeconomic variables and oil price can be determined.
Topics "Volatility"
Helios Padilla Mayer
February 22, 2012
1. (a) The term "asymmetric volatility" arises from the observation that volatilities (risk) are higher during market downturns than during market upturns. The most commonly cited factor contributing to this risk behavior is the increase in market leverage produced by a negative shock; however, there are also other factors, such as the perceived risk/reward balance in different stages of market behavior. As explained in the V-Lab documentation, the plain GARCH model¹ cannot take into account the stronger impact of negative shocks at time t-1 on the variance at time t relative to positive shocks – the asymmetric impact of a negative shock. For that reason, the GARCH model has been augmented into models such as the Threshold GARCH (TGARCH), the Asymmetric GARCH (AGARCH) and the Exponential GARCH (EGARCH). The V-Lab documentation and the task in this exam indicate that GJR GARCH should be used to introduce asymmetry into the variance analysis, so my answers will be based on the GJR GARCH model.² I also assume that p = 1 and q = 1, so I will be talking about GJR GARCH(1,1) and GARCH(1,1).

We assume that the return time series takes the following form:

r_t = μ + ε_t,

where μ is the expected return and ε_t is white noise, a zero-mean error term. The assumption is that ε_t is serially uncorrelated, but not necessarily serially independent – it can exhibit conditional heteroskedasticity, which the GJR GARCH model takes into consideration. The error term is then split into a stochastic component (z_t) and a time-dependent standard deviation (σ_t):

ε_t = σ_t z_t,

where z_t is assumed to be i.i.d. and drawn from a Gaussian distribution (mean = 0, variance = 1). The variance is then formulated as

σ_t² = ω + (α + γ I_{t-1}) ε_{t-1}² + β σ_{t-1}²,  (1)

where

I_{t-1} = 1 if r_{t-1} < μ, and I_{t-1} = 0 if r_{t-1} ≥ μ.

So GJR GARCH models asymmetry within the ARCH framework: like the basic ARCH model it includes (a) the error term and (b) a conditional variance term (which extends the ARCH model to GARCH), plus (c) an innovation term that accounts for the asymmetry – negative shocks at time t-1 have a stronger impact on the variance than positive shocks. The indicator I_{t-1} therefore takes the value 1 when the return at time t-1 is lower than the expected return, and the value 0 when the return at time t-1 is higher than the expected return.

¹ GARCH stands for generalized autoregressive (past observations are incorporated into the present situation) conditional (the variance is conditioned on time) heteroskedasticity (a time-varying variance).
² I am also using a formulation of GJR GARCH consistent with V-Lab in order to be able to interpret the results correctly.
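The recursion in (1) is easy to state in code. The sketch below filters a return series through the GJR-GARCH(1,1) variance equation; the parameter values and the sample-variance initialization are illustrative assumptions, not estimates from this exam:

```python
import numpy as np

def gjr_garch_variance(returns, mu, omega, alpha, gamma, beta):
    """Filter a return series through the GJR-GARCH(1,1) recursion, eq. (1):
    sigma2[t] = omega + (alpha + gamma * I[t-1]) * eps[t-1]**2 + beta * sigma2[t-1],
    with I[t-1] = 1 when the time t-1 shock is negative (r < mu), else 0."""
    eps = np.asarray(returns, dtype=float) - mu   # demeaned returns (shocks)
    sigma2 = np.empty(len(eps))
    sigma2[0] = eps.var()                         # illustrative initialization
    for t in range(1, len(eps)):
        indicator = 1.0 if eps[t - 1] < 0 else 0.0
        sigma2[t] = (omega
                     + (alpha + gamma * indicator) * eps[t - 1] ** 2
                     + beta * sigma2[t - 1])
    return sigma2

# Toy usage with made-up parameters: the same-sized shock raises next-period
# variance more when it is negative, because gamma only loads on bad news.
s2_neg = gjr_garch_variance([-1.0, 0.0], 0.0, 0.05, 0.02, 0.08, 0.9)
s2_pos = gjr_garch_variance([+1.0, 0.0], 0.0, 0.05, 0.02, 0.08, 0.9)
```

Here s2_neg[1] = 0.05 + (0.02 + 0.08)·1 + 0.9·0.25 = 0.375 versus s2_pos[1] = 0.295, which is exactly the asymmetry described above.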
As explained in V-Lab, the effective coefficient associated with a negative shock is α + γ. In financial time series, we generally find that γ is statistically significant. The coefficients are estimated in V-Lab simultaneously by maximizing the log likelihood (MLE). GJR GARCH also captures volatility clustering: if volatility was high at time t-1, it will also be high at time t; equivalently, a shock at time t-1 will also impact the variance at time t. The coefficients are subject to the following constraints:

ω ≥ 0, α ≥ 0, β ≥ 0, α + γ ≥ 0.

If in addition α + β + γ/2 < 1 holds, the volatility is mean reverting and fluctuates around its long-run level σ. If this is true, then we can write the unconditional variance as

var(r_t) = σ² = ω / (1 − α − β − γ/2).  (2)
In our case, we observe Carnival Corp volatility data over the range from 02/21/2010 to 02/21/2012. The annual GJR-GARCH volatility is plotted below, along with the volatility summary table and the estimated parameters.

[Figure: 1-year volatility prediction with the GJR-GARCH model]

Volatility summary table
Price: 30.74  Return: -0.75
1 Week Pred: 42.9  Avg Week Vol: 43.84  Avg Month Vol: 49.89
1 Month Pred: 43.5  Min Vol: 16.32  Max Vol: 189.91
6 Months Pred: 47.2  Avg Vol: 39.57  Vol of Vol: 45.17
1 Year Pred: 50.8
Parameter Estimates
Parameter  Estimate  t-stat
ω  0.03981  8.693
α  0.01802  6.273
β  0.94189  393.603
γ  0.07686  19.875
The t-stats show that all parameter estimates are statistically significant at 1%; thus all restrictions on the coefficients are fulfilled and the process does follow a GJR-GARCH model. The Carnival Corp GJR-GARCH model can then be written as

σ_t² = 0.03981 + (0.01802 + 0.07686 I_{t-1}) ε_{t-1}² + 0.94189 σ_{t-1}².  (3)

The constant in the model shows that if the error term and the variance at time t-1 were 0 and the return at time t-1 was higher than the expected return, the variance for Carnival Corp would be 0.03981. Secondly, a 1% change in the variance at time t-1 would increase the variance at time t by 0.94189%. Third, a 1% negative shock to returns at time t-1 would impact the variance at time t by (0.01802 + 0.07686 =) 0.09488%. As α + β + γ/2 = 0.99834 < 1, the volatility is mean reverting and the unconditional variance is var(r_t) = 23.98193.
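This mean-reversion check can be reproduced directly from the table of estimates. A quick numerical verification of my own, using the V-Lab persistence formula α + β + γ/2:

```python
# GJR-GARCH(1,1) estimates for Carnival Corp, copied from the table above
omega, alpha, beta, gamma = 0.03981, 0.01802, 0.94189, 0.07686

persistence = alpha + beta + gamma / 2   # V-Lab GJR persistence: 0.99834
uncond_var = omega / (1 - persistence)   # unconditional variance, eq. (2)
```

Both values match the text: the persistence is 0.99834 < 1 (mean reverting) and the unconditional variance is about 23.98.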
(b) Below I plot the volatilities over the last 3 months for the GARCH and GJR-GARCH models. Given the definition of the GJR-GARCH model, the asymmetric impact of negative shocks is captured (unlike in the GARCH model), and therefore the volatility prediction based on the GJR-GARCH model is higher than the volatility prediction based on the GARCH model. This is especially noticeable in the case of the biggest negative shock to returns, on 14 January 2012: the immediate spike in volatility after this date is significantly higher for the GJR-GARCH model than for the GARCH model.
[Figure: 3-month volatility prediction with the GARCH model]

Volatility summary table
Price: 30.74  Return: -0.75
1 Week Pred: 40.19  Avg Week Vol: 40.66  Avg Month Vol: 43.23
1 Month Pred: 40.38  Min Vol: 17.33  Max Vol: 107.53
6 Months Pred: 41.68  Avg Vol: 38.77  Vol of Vol: 36.61
1 Year Pred: 43.04

Parameter Estimates
Parameter  Estimate  t-stat
ω  0.01141  6.306
α  0.02821  18.816
β  0.97128  906.887
[Figure: 3-month volatility prediction with the GJR-GARCH model]

Volatility summary table
Price: 30.74  Return: -0.75
1 Week Pred: 42.9  Avg Week Vol: 43.84  Avg Month Vol: 49.89
1 Month Pred: 43.5  Min Vol: 16.32  Max Vol: 189.91
6 Months Pred: 47.2  Avg Vol: 39.57  Vol of Vol: 45.17
1 Year Pred: 50.8

Parameter Estimates
Parameter  Estimate  t-stat
ω  0.03981  8.693
α  0.01802  6.273
β  0.94189  393.603
γ  0.07686  19.875
(c) In both cases, whether volatility is estimated with the GARCH or with the GJR-GARCH model, the volatility predictions are higher over the long-term horizon (1 year) than over the short-term horizon (1 week) – see the plots from V-Lab below. When we take longer horizons into consideration, we implicitly "tag along" the period t-k impacts for a larger number of k. For example, if we observe 1 week of data, we would be considering 5 lagged-period effects; when we observe 1 year of data, we would be considering 250 lagged-period effects. The volatility impact therefore multiplies through the periods, and longer-term forecasts result in higher volatilities than short-term forecasts.

[Figure: Annualized volatility predictions with the GARCH model]
[Figure: Annualized volatility predictions with the GJR-GARCH model]
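The horizon effect can also be seen through the standard GARCH(1,1) multi-step forecast, E[σ²_{t+k}] = σ̄² + (α + β)^k (σ²_t − σ̄²): when the current variance sits below the long-run level, the forecast rises with the horizon k. A sketch using the GARCH estimates from the table above (the starting variance is an illustrative assumption):

```python
# GARCH(1,1) estimates from the table above; sigma2_now is an assumption.
omega, alpha, beta = 0.01141, 0.02821, 0.97128
vbar = omega / (1 - alpha - beta)        # long-run (unconditional) variance

def forecast_var(sigma2_now, k):
    """k-step-ahead GARCH(1,1) variance forecast:
    vbar + (alpha + beta)**k * (sigma2_now - vbar)."""
    return vbar + (alpha + beta) ** k * (sigma2_now - vbar)

sigma2_now = 0.5 * vbar                  # start below the long-run level
avg_week = sum(forecast_var(sigma2_now, k) for k in range(1, 6)) / 5
avg_year = sum(forecast_var(sigma2_now, k) for k in range(1, 251)) / 250
```

avg_year exceeds avg_week: each extra step moves the forecast further toward vbar, mirroring the pattern of 1-week versus 1-year predictions in the tables.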
2. The VIX index measures the short-term implied volatility of the S&P 500 index and has become a benchmark for volatility in equity markets. Volatility is calculated as the standard deviation of returns. However, if we want to talk about the implied or realized volatility of the VIX itself, we are actually talking about the volatility of volatility. The volatility of volatility is calculated as the standard deviation of the percentage change in the VIX, and it tells us how fast volatility changes.
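The definition can be sketched directly: vol of vol is the standard deviation of the percentage changes of a volatility series. The VIX paths below are made up for illustration:

```python
import numpy as np

def vol_of_vol(levels):
    """Standard deviation of the percentage change of a volatility series."""
    v = np.asarray(levels, dtype=float)
    pct_change = np.diff(v) / v[:-1]     # period-over-period % change
    return pct_change.std()

# Made-up VIX paths: a calm market versus a sudden volatility spike
calm  = [15.0, 15.2, 14.9, 15.1, 15.0]
spiky = [15.0, 14.8, 22.0, 30.5, 19.0]
```

The spiky path has a far higher vol of vol even though both start near the same level: it is the speed of change in volatility that the measure captures.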
Prior to and during the financial crisis, the standard risk measures, such as VaR, were based on short-term risk measurements – 1- to 10-day horizons. However, investors normally hold their positions for longer than 10 days. As short-term risk measures during the low-risk environment did not signal any possible increase in risk, everyone who accepted short-term risk assessments was confident that their long-term positions were safe. However, no one actually considered the possibility (and velocity) of change in market risk and volatility. During the period of low risk, low volatility and low interest rates, everyone tried to increase their leverage on the back of cheap assets, attractive structured products on offer, and a variety of insurance instruments available to insure against any possible risk event. The problem arose when market volatility started increasing: investors were now holding highly risky positions that they were able to sell only at deep discounts, and insurers were not capitalized enough to provide payouts. Thus, the major problem was that future volatility (which could be observed through the VoV chart) was not incorporated at all in risk assessment models, and no one was prepared for what came next.
Short-term risk assessment models produce short-term results and observations. If one compares the 10-day, 30-day, 60-day or 1-year volatility index (VIX) and its implied volatility, short-term volatilities will be lower than long-term volatilities. The reason behind this is that long-term volatility depends on macroeconomic policies (monetary, fiscal, balance of payments) and the success of their implementation. If investors are aware that long-term volatility is higher than short-term volatility, they can decide either to engage in short-term investments that are less volatile and accept lower returns, or to try to incorporate the higher long-term volatility into their expected returns. Alternatively, long-term investments with higher volatility can also be hedged with proper instruments that sufficiently account for the higher volatility in the future. Long-term investments carry higher volatility because at the time of the investment decision it is not clear whether policymakers will be successful at mitigating future risks (for example, the probability of a Greek default, a global economic slowdown, the impact of Iran's nuclear policy on the oil supply, etc.). Furthermore, new macroeconomic risks will arise during the investment period and more uncertainties will have to be taken into consideration.
3. Correlation is a measure of the relation between two variables or series. If the variables move in the same direction, the correlation is positive (up to +1, which indicates perfect positive correlation). When the variables move in opposite directions, the correlation is negative (down to -1, which means perfect negative correlation). Engle (2000) defines the unconditional correlation between two variables r₁ and r₂, each with mean zero, as

ρ_{1,2} = E(r₁ r₂) / √(E(r₁²) E(r₂²)),  (1)

where E(r₁ r₂) is the covariance between r₁ and r₂, and E(r₁²) and E(r₂²) are the variances of r₁ and r₂, respectively. This formula does not include a time component, and it is therefore assumed that the correlation is not based on information known from previous periods. However, we know that correlations are sensitive to time. This time sensitivity is taken into consideration in the conditional correlation, where both the covariances and the variances are based on information known in the previous period:

ρ_{1,2,t} = E_{t-1}(r_{1,t} r_{2,t}) / √(E_{t-1}(r_{1,t}²) E_{t-1}(r_{2,t}²)).  (2)
In his work, Engle (2000) also shows that the conditional correlation can be interpreted as the conditional covariance between standardized disturbances. For that reason, he writes the returns r_{i,t} as the conditional standard deviation √(h_{i,t}), with h_{i,t} = E_{t-1}(r_{i,t}²), times a standardized disturbance ε_{i,t} with mean zero and variance 1:

r_{i,t} = √(h_{i,t}) ε_{i,t},  i = 1, 2.  (3)

If we substitute (3) into (2), we get (since E_{t-1}(ε_{i,t}²) = 1)

ρ_{1,2,t} = E_{t-1}(ε_{1,t} ε_{2,t}) / √(E_{t-1}(ε_{1,t}²) E_{t-1}(ε_{2,t}²)) = E_{t-1}(ε_{1,t} ε_{2,t}).  (4)
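Equations (3) and (4) say that the conditional correlation is the expected cross product of the standardized disturbances. A small simulation sketch; the rolling-window variance proxy for h_{i,t} and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, window = 1000, 50
common = rng.normal(size=n)
r1 = common + 0.5 * rng.normal(size=n)   # two mean-zero series, true corr 0.8
r2 = common + 0.5 * rng.normal(size=n)

def standardize(r):
    """eps_t = r_t / sqrt(h_t), where h_t is proxied by the variance of the
    previous `window` observations (information known before time t)."""
    return np.array([r[t] / np.sqrt(r[t - window:t].var())
                     for t in range(window, len(r))])

e1, e2 = standardize(r1), standardize(r2)
# eq. (4): correlation as the normalized cross product of the disturbances
rho = np.mean(e1 * e2) / np.sqrt(np.mean(e1 ** 2) * np.mean(e2 ** 2))
```

rho lands close to the true value 1 / (1 + 0.5²) = 0.8.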
There are many ways to estimate conditional correlations, and I will describe two based on multivariate GARCH models. GARCH models assume that volatilities and correlations are functions of lagged returns.
Bollerslev
(1990)
specified
the
constant
conditional
)()(
)(
2
2
2
1
21
2,1
rErE
rrE
=r
)( 21rrE )( 2
1rE )( 2
2rE
)()(
)(
2
,2
2
,11
,2,11
,2,1
ttt
ttt
t
rErE
rrE
-
-
=r
tih ,
ti,e
)( 2
,1, titti rEh -= 2,1,,,, == ihr tititi e
)(
)()(
)(
,3,11
2
,2
2
,11
,2,11
,2,1 ttt
ttt
ttt
t E
EE
E
ee
ee
ee
r -
-
-
==
7. correlation
(CCC) multivariate GARCH specification, in which conditional covariances and variances are time-varying but conditional correlations are constant. Conditional variances are modelled by univariate GARCH models, and the correlation matrix is estimated by maximum likelihood. The assumption of constant correlations allows for comparison between periods. The model is described as follows.
The multivariate GARCH model assumes that the distribution of returns r_t from n assets has mean zero and covariance matrix H_t (Engle, Sheppard, 2001):

r_t \sim N(0, H_t),    (5)

where H_t = [h_{ij,t}]. The conditional covariance matrix H_t is then decomposed into an n x n diagonal matrix R_t = \mathrm{diag}\{\sqrt{h_{i,t}}\} and the correlation matrix D:

H_t = R_t D R_t,    (6)

where D = [\rho_{ij}] is the conditional correlation matrix with constant correlations \rho_{ij} and \rho_{ii} = 1, i = 1, \dots, N.
Thus, the time variation in the covariance matrix H_t is explained only by the time-varying conditional variances of the returns r_t. The off-diagonal elements of the conditional covariance matrix can then be written as (Silvennoinen, Terasvirta, 2008)

[H_t]_{ij} = \sqrt{h_{i,t}\,h_{j,t}}\,\rho_{ij}, \qquad i \neq j, \quad 1 \leq i, j \leq N    (7)
While the advantage of this model is that it is easy to estimate (we only need nonlinear estimates of n univariate GARCH models), the problem is that correlations are not always constant.
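A minimal sketch of the CCC construction in (5)-(7) is given below. It is illustrative only: an EWMA variance recursion stands in for fully fitted univariate GARCH models, and the constant correlation matrix D is taken as the sample correlation of the standardized residuals rather than being estimated by MLE.

```python
import numpy as np

def ccc_covariances(returns, lam=0.94):
    """Sketch of the CCC idea: EWMA conditional variances (a stand-in
    for univariate GARCH fits) plus a constant correlation matrix D
    estimated from the standardized residuals."""
    T, n = returns.shape
    h = np.empty((T, n))
    h[0] = np.var(returns, axis=0)
    for t in range(1, T):                     # EWMA variance recursion
        h[t] = lam * h[t - 1] + (1 - lam) * returns[t - 1] ** 2
    eps = returns / np.sqrt(h)                # standardized residuals
    D = np.corrcoef(eps.T)                    # constant correlation matrix
    # Equation (7): [H_t]_{ij} = sqrt(h_{i,t} * h_{j,t}) * rho_{ij}
    H = np.sqrt(h[:, :, None] * h[:, None, :]) * D
    return h, D, H

rng = np.random.default_rng(2)
rets = rng.standard_normal((1000, 3)) * 0.01  # synthetic returns
h, D, H = ccc_covariances(rets)
print(D)  # constant correlations; unit diagonal
```

Each H[t] is a valid covariance matrix whose time variation comes only from the conditional variances, exactly as the CCC model requires.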
Engle (2000, 2001) therefore suggested an alternative estimation of conditional correlations, the dynamic conditional correlation (DCC) multivariate GARCH specification, which is derived from Bollerslev's CCC model but allows the conditional correlations to be time-varying. The conditional covariance matrix is then written as

H_t = R_t D_t R_t    (8)
Conditional correlations are then estimated on the basis of exponential smoothing:

\rho_{i,j,t} = \frac{\sum_{s=1}^{\infty} \lambda^{s}\,\varepsilon_{i,t-s}\varepsilon_{j,t-s}}{\sqrt{\left(\sum_{s=1}^{\infty} \lambda^{s}\,\varepsilon_{i,t-s}^{2}\right)\left(\sum_{s=1}^{\infty} \lambda^{s}\,\varepsilon_{j,t-s}^{2}\right)}},    (9)

where the smoothing is introduced through the recursion
q_{i,j,t} = (1 - \lambda)\,\varepsilon_{i,t-1}\varepsilon_{j,t-1} + \lambda\,q_{i,j,t-1}    (10)
and the conditional correlation estimator is then

\rho_{i,j,t} = \frac{q_{i,j,t}}{\sqrt{q_{i,i,t}\,q_{j,j,t}}}    (11)

This process can also be estimated using a GARCH(1,1)-type model:
q_{i,j,t} = \bar{\rho}_{ij} + \alpha\left(\varepsilon_{i,t-1}\varepsilon_{j,t-1} - \bar{\rho}_{ij}\right) + \beta\left(q_{i,j,t-1} - \bar{\rho}_{ij}\right)    (12)
The unconditional expectation of the cross product \varepsilon_{i,t}\varepsilon_{j,t} is \bar{\rho}_{ij}, and the unconditional variances equal 1. The conditional correlation estimator is written as in (11). The model is mean-reverting as long as \alpha + \beta < 1; when the sum equals 1, the model reduces to the exponential smoother explained in (10) and (11).
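The DCC recursions (11)-(12) can be sketched as follows. The alpha and beta values are illustrative only; in practice they are estimated by maximum likelihood, and setting alpha + beta = 1 collapses the recursion to the exponential smoother in (10).

```python
import numpy as np

def dcc_correlations(eps, alpha=0.05, beta=0.90):
    """Sketch of the DCC(1,1) recursion, equations (11)-(12):
    q_{ij,t} = rho_bar + alpha*(eps_{i,t-1}*eps_{j,t-1} - rho_bar)
                       + beta*(q_{ij,t-1} - rho_bar),
    then rho_{ij,t} = q_{ij,t} / sqrt(q_{ii,t} * q_{jj,t}).
    eps: (T, n) array of standardized residuals."""
    T, n = eps.shape
    rho_bar = np.corrcoef(eps.T)              # unconditional correlation
    Q = np.empty((T, n, n))
    R = np.empty((T, n, n))
    Q[0] = rho_bar
    R[0] = rho_bar
    for t in range(1, T):
        outer = np.outer(eps[t - 1], eps[t - 1])
        # Equation (12), mean-reverting since alpha + beta < 1
        Q[t] = rho_bar + alpha * (outer - rho_bar) + beta * (Q[t - 1] - rho_bar)
        d = np.sqrt(np.diag(Q[t]))
        R[t] = Q[t] / np.outer(d, d)          # equation (11)
    return R

rng = np.random.default_rng(3)
eps = rng.standard_normal((500, 2))           # synthetic standardized residuals
R = dcc_correlations(eps)
print(R[-1, 0, 1])  # time-varying conditional correlation at the last date
```

Because each Q[t] is a nonnegative combination of positive semi-definite matrices, the implied correlations stay within [-1, 1] and the diagonal of R[t] is always 1.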
References:

Asai, Manabu, and Michael McAleer, 2005, "Dynamic Correlations in Symmetric Multivariate SV Models," in: MODSIM 2005 International Congress on Modelling and Simulation, Modelling and Simulation Society of Australia and New Zealand, Zerger A. and Argent R., eds.

Bollerslev, Tim, 1990, "Modelling the Coherence in Short-Run Nominal Exchange Rates: A Multivariate Generalized ARCH Model," The Review of Economics and Statistics, Vol. 72, No. 3 (Aug. 1990), pp. 498-505.

Engle, Robert F., 2000, "Dynamic Conditional Correlation - A Simple Class of Multivariate GARCH Models," July 1999 (revised May 2000), research supported by NSF grant SBR-9730062 and NBER, 27 pp.

Engle, Robert, 2001, "GARCH 101: The Use of ARCH/GARCH Models in Applied Econometrics," Journal of Economic Perspectives, Vol. 15, No. 4 (Fall 2001), pp. 157-168.

Engle, Robert F., and Kevin Sheppard, 2001, "Theoretical and Empirical Properties of Dynamic Conditional Correlation Multivariate GARCH," 46 pp.

Hafner, Christian M., and Philip Hans Franses, 2009, "A Generalized Dynamic Conditional Correlation Model: Simulation and Application to Many Assets," Institut de Statistique, Universite Catholique de Louvain, Working Paper 0904, January 14, 2009, 25 pp.

Silvennoinen, Annastiina, and Timo Terasvirta, 2008, "Multivariate GARCH Models," SSE/EFI Working Paper Series in Economics and Finance, No. 669, January 2008, 25 pp.