Stochastic modelling of the loss given default (LGD) for non-defaulted assets – GRATeam
In the Basel framework of credit risk estimation, banks seek to develop precise and stable internal models to limit their capital charge. Following recent changes in regulatory requirements (Basel regulation, definition of the Downturn, etc.), it is prudent to consider innovative methods for estimating the credit risk parameters under the constraints of model stability, robustness, and sensitivity to economic cycles.
This paper introduces a different recovery forecasting methodology for the LGD (loss given default) parameter. The goal is to model the recovery dynamic by assuming that each maturity in default has a specific behaviour and that the recovery rate depends on the change of default generation. The model focuses on recovery rate time series in which the time period is the default generation. The estimation of upcoming recoveries therefore uses vertical diffusions, where the triangle's columns are completed one by one through stochastic processes. This model is suggested as a replacement for classical horizontal forecasting with Chain-Ladder methods.
First, a definition of the LGD parameter and the regulatory modelling requirements are provided, together with a presentation of the data set used and the construction of the recovery triangle. Second, the stochastic forecasting is introduced, with details of how to calibrate the model. Third, three classical methods of recovery forecasting based on Chain-Ladder are presented to challenge the stochastic methodology. Finally, a regulatory calibration of the LGD for non-defaulted assets is proposed to include Downturn effects and margins of prudence.
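The classical horizontal baseline that the paper contests can be sketched on a toy cumulative recovery triangle. The figures below are illustrative, not from the paper's data set; the function implements the standard volume-weighted Chain-Ladder development factors.

```python
# Toy cumulative recovery triangle: rows = default generations,
# columns = periods in default. None marks future (unobserved) cells.
triangle = [
    [0.20, 0.35, 0.45, 0.50],
    [0.22, 0.38, 0.47, None],
    [0.18, 0.33, None, None],
    [0.25, None, None, None],
]

def chain_ladder_fill(tri):
    """Fill the lower-right part of a cumulative triangle using
    volume-weighted Chain-Ladder development factors."""
    tri = [row[:] for row in tri]  # do not mutate the input
    n = len(tri)
    for j in range(1, n):
        # factor from column j-1 to j, estimated on observed pairs only
        num = sum(r[j] for r in tri if r[j] is not None)
        den = sum(r[j - 1] for r in tri if r[j] is not None)
        f = num / den
        for r in tri:
            if r[j] is None and r[j - 1] is not None:
                r[j] = r[j - 1] * f
    return tri

completed = chain_ladder_fill(triangle)
ultimate = [row[-1] for row in completed]  # projected ultimate recovery rates
lgd = [1 - u for u in ultimate]            # LGD = 1 - ultimate recovery
```

The paper's stochastic alternative would instead diffuse each column (maturity in default) forward over default generations; this block only shows the horizontal benchmark it is compared against.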
It is not difficult to find situations in which marked changes in variables, combined with unpredictable event risk, imply estimation problems. For example, during the subprime crisis of 2007/8, credit spreads and volatility rose to levels that could never have been forecast from previous history, and debtor behaviour (delinquency patterns) shifted. Likewise, estimating volatility from data in a calm (turbulent) period implies under- (over-) estimation of future realised volatility.
International Journal of Engineering and Mathematical Modelling, Vol. 1, No. 1 (2015) – IJEMM
Default risk has always been a matter of importance for financial managers and scholars. In this paper we apply an intensity-based approach for default estimation with a software simulation of the Cox-Ingersoll-Ross model. We analyze the possibilities and effects of a non-linear dependence between economic and financial state variables and the default density, as specified by the theoretical model. Then we perform a test for verifying how simulation techniques can improve the analysis of such complex relations when closed-form solutions are either not available or hard to come by.
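The simulation described above can be sketched with an Euler discretisation of the Cox-Ingersoll-Ross intensity process. The parameter values below are illustrative assumptions, not the paper's calibration; a simulated path of the intensity also yields a path-wise survival probability via the integrated intensity.

```python
import math
import random

def simulate_cir(lam0, kappa, theta, sigma, T=1.0, n_steps=250, seed=42):
    """Euler discretisation of the CIR default-intensity process
    d(lambda) = kappa*(theta - lambda)*dt + sigma*sqrt(lambda)*dW,
    with the intensity floored at zero (full truncation)."""
    rng = random.Random(seed)
    dt = T / n_steps
    lam, path = lam0, [lam0]
    integral = 0.0  # integrated intensity, for the survival probability
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        lam = lam + kappa * (theta - lam) * dt + sigma * math.sqrt(max(lam, 0.0)) * dw
        lam = max(lam, 0.0)
        integral += lam * dt
        path.append(lam)
    survival = math.exp(-integral)  # P(no default by T | this intensity path)
    return path, survival

path, surv = simulate_cir(lam0=0.02, kappa=0.5, theta=0.03, sigma=0.1)
```

Averaging `surv` over many seeds estimates the unconditional survival probability; the point of the paper's experiment is that such simulation remains feasible when a non-linear link to state variables removes the closed-form solution.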
IFRS 9 Implementation: Using the Z-score approach as a KRI to identify adverse credit deterioration for stage transitions from Stage 1 to Stages 2/3 in IFRS 9 modelling
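The title above names the Z-score approach as a KRI for IFRS 9 staging. A minimal sketch: the classic Altman (1968) Z-score with its published zone cut-offs (2.99 and 1.81), mapped to IFRS 9 stages. The zone-to-stage mapping is an illustrative assumption, not taken from the deck.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_equity, sales, total_assets, total_liabilities):
    """Classic Altman (1968) Z-score for public manufacturing firms."""
    ta = total_assets
    return (1.2 * working_capital / ta
            + 1.4 * retained_earnings / ta
            + 3.3 * ebit / ta
            + 0.6 * market_equity / total_liabilities
            + 1.0 * sales / ta)

def ifrs9_stage(z):
    """Illustrative mapping of Altman zones to IFRS 9 stages
    (the staging rule itself is an assumption for this sketch)."""
    if z > 2.99:   # safe zone -> Stage 1 (performing)
        return 1
    if z > 1.81:   # grey zone -> Stage 2 (significant increase in credit risk)
        return 2
    return 3       # distress zone -> Stage 3 (credit-impaired)

z = altman_z(50, 100, 60, 400, 500, 500, 250)  # illustrative balance sheet
```

In practice the KRI would be the *deterioration* of Z over time rather than its level alone, but the level-based rule shows the mechanics.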
Skiera, Bernd / Bermes, Manuel / Horn, Lutz (2011), "Customer Equity Sustainability Ratio: A New Metric for Assessing a Firm’s Future Orientation", Journal of Marketing, Vol. 75 (May), 118-131
“Over” and “Under” Valued Financial Institutions: Evidence from a “Fair-Value... – Ilias Lekkos
The study presents an approach for evaluating the relative over- and under-valuation of financial institutions, based on the distance between their market-based price-to-book ratios and our estimated “fair-value” P/Bs.
EAD Parameter: A stochastic way to model the Credit Conversion Factor – Genest Benoit
This white paper aims at estimating credit risk by modelling the Credit Conversion Factor (CCF) parameter related to the Exposure at Default (EAD). The estimation is performed using stochastic processes rather than the usual statistical methodologies (such as classification trees or GLMs).
Our paper focuses on two types of model: the Ornstein-Uhlenbeck (OU) model – related to ARMA-type models – and the Geometric Brownian Motion (GBM) model. First, we describe, implement, and calibrate each model to ensure the relevance and robustness of our results. We then focus on the GBM model to model the CCF.
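The two candidate processes can be simulated with their exact discretisations. Parameter values below are illustrative assumptions, not the paper's calibration; the point is only the mechanics of each diffusion.

```python
import math
import random

def simulate_ou(x0, theta, mu, sigma, dt, n, rng):
    """Exact-discretisation Ornstein-Uhlenbeck path:
    dX = theta*(mu - X)*dt + sigma*dW (continuous-time analogue of AR(1))."""
    path = [x0]
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1 - a * a) / (2 * theta))
    for _ in range(n):
        path.append(mu + a * (path[-1] - mu) + sd * rng.gauss(0, 1))
    return path

def simulate_gbm(s0, mu, sigma, dt, n, rng):
    """Geometric Brownian Motion path via the exact log-normal step,
    so the simulated value stays strictly positive."""
    path = [s0]
    for _ in range(n):
        z = rng.gauss(0, 1)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

rng = random.Random(0)
ccf_gbm = simulate_gbm(s0=0.4, mu=0.05, sigma=0.2, dt=1 / 12, n=12, rng=rng)
ccf_ou = simulate_ou(x0=0.4, theta=2.0, mu=0.5, sigma=0.1, dt=1 / 12, n=12, rng=rng)
```

A practical contrast the paper exploits: OU is mean-reverting (natural for a bounded ratio like the CCF), while GBM is positive and non-reverting; note that neither process confines the simulated CCF to [0, 1] without further treatment.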
Regulatory capital requirements pose a major challenge for financial institutions today.
As the Asian financial crisis of 1997 and the rapid development of credit risk management revealed many shortcomings and loopholes in measuring capital charges under Basel I, Basel II was issued in 2004 with the sole intent of improving the international convergence of capital measurement and capital standards.
This paper introduces Basel II, the construction of risk weight functions and their limits in two sections:
In the first, basic fundamentals are presented to better understand these prerequisites: the likelihood of losses, expected and unexpected loss, Value at Risk, and regulatory capital. Then we discuss the founding principles of the regulatory formula for risk weight functions and how it works.
The second section is dedicated to studying the parameters of the risk weight functions, in order to discuss their limits, modifications, and impact on the regulatory capital charge coefficient.
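The regulatory formula discussed above can be made concrete. This is the published Basel II IRB risk-weight function for corporate exposures (asset correlation interpolation, conditional PD at the 99.9th percentile, and maturity adjustment), implemented with the standard-library normal distribution.

```python
from math import exp, log, sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal: N.cdf / N.inv_cdf

def irb_corporate_capital(pd, lgd, maturity=2.5):
    """Basel II IRB capital requirement K (per unit of EAD) for
    corporate exposures, following the published risk-weight formula."""
    pd = max(pd, 1e-9)
    # asset correlation, interpolated between 12% and 24% as PD rises
    w = (1 - exp(-50 * pd)) / (1 - exp(-50))
    r = 0.12 * w + 0.24 * (1 - w)
    # conditional PD at the 99.9th percentile of the systematic factor
    cond_pd = N.cdf((N.inv_cdf(pd) + sqrt(r) * N.inv_cdf(0.999)) / sqrt(1 - r))
    # maturity adjustment
    b = (0.11852 - 0.05478 * log(pd)) ** 2
    ma = (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)
    # unexpected loss only: expected loss (pd * lgd) is carried by provisions
    return lgd * (cond_pd - pd) * ma

k = irb_corporate_capital(pd=0.01, lgd=0.45)
risk_weight = k * 12.5  # risk-weighted assets per unit of EAD
```

For PD = 1%, LGD = 45%, maturity 2.5 years, this yields a risk weight of roughly 92%, the often-quoted corporate benchmark.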
Weighted Average Cost of Capital (WACC) is often used for company valuation and for setting hurdle rates for project planning. With the recent increase in issuance of hybrid securities it is important to have a robust methodology for including hybrid capital in WACC.
Many analysts use a method based on rating agency equity credit. Unfortunately this can lead to misleading results.
In this paper we describe the correct methodology for computing WACC with hybrids, as well as a shortcut method that can be used if the hybrid is used only to repurchase debt or equity or both (rather than being used to fund company projects or acquisitions).
For situations in which the hybrid is being used to repurchase debt or equity or both, we recommend the shortcut method. Otherwise, we recommend the full WACC and Capital Asset Pricing Model (CAPM) approach.
We also note that hybrids reduce the cost of capital in many situations. However, since they typically form only a moderate proportion of a firm's capital structure, expectations of reduction in WACC from hybrid issuance must be kept realistic.
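The "full" approach sketched below treats the hybrid as its own capital component at its own cost, rather than splitting it by rating-agency equity credit. All figures and the tax treatment of the hybrid coupon are illustrative assumptions, not the paper's numbers.

```python
def wacc(components, tax_rate):
    """After-tax WACC over an arbitrary set of capital components.
    Each component is (market_value, pre_tax_cost, tax_deductible)."""
    total = sum(v for v, _, _ in components)
    out = 0.0
    for value, cost, deductible in components:
        after_tax = cost * (1 - tax_rate) if deductible else cost
        out += (value / total) * after_tax
    return out

# Illustrative capital structure: equity 60 at 10%, straight debt at 5%.
base = wacc([(60, 0.10, False), (40, 0.05, True)], tax_rate=0.30)

# Hybrid of 10 replaces part of the debt, treated as a third component
# at its own (assumed tax-deductible) 7% cost.
with_hybrid = wacc([(60, 0.10, False), (30, 0.05, True), (10, 0.07, True)],
                   tax_rate=0.30)
```

With these assumed inputs the hybrid barely moves the WACC, which illustrates the paper's closing caution: since hybrids are typically a moderate share of the capital structure, expectations of WACC reduction should stay realistic.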
This presentation will survey and discuss various quantitative considerations in liquidity risk for a financial institution. This includes the concept of liquidity-at-risk (LaR) as a determinant of buffers, as well as how one defines and quantifies such buffers. We will also examine issues such as limit-related input for liquidity policy and transfer pricing as an alternative concept. Two stylized models of liquidity risk are presented and analyzed.
In this presentation Gopalkrishna Rajagopal discusses what a financial company is, with examples of who they are and what they do. He then goes through the key sectors and the business model in place at the Williams Capital Group.
Assessing the impact of a disruption: Building an effective business impact a... – Bryghtpath LLC
Many organizations have adopted the ISO 22301 standard for their business continuity management systems. Recently, ISO has released the new ISO 22317 Standard for Business Impact Analysis. In this webinar, learn about several different strategies to build an effective BIA that will help you advance your business continuity strategies.
The instructor for this webinar is Bryan Strawser, Founder and CEO of Bryghtpath LLC, a strategic advisory firm specializing in crisis management, business continuity, global risk, crisis communications, and public affairs.
Expert Judgement Credit Rating for SME & Commercial Customers – Mike Coates
A high-level presentation from GBRW Consulting on some of the key issues relevant to developing and then implementing a sound credit scoring and rating system for Small- to Medium-sized Enterprises (SMEs) and commercial banking customers. It focuses on the implementation of an 'expert judgement' approach to credit rating as an alternative to statistical approaches where data is inadequate. It is particularly relevant for emerging-market or start-up banks where historical financial statement analysis may not be easily accessible or reliable.
In spite of the large volumes of Contingent Credit Lines (CCL) in all commercial banks, the paucity of Exposure at Default (EAD) models, the unsuitability of external data, and inconsistent internal data with partial draw-downs have been a major challenge for risk managers as well as regulators in managing CCL portfolios. This paper is an attempt to build an easy-to-implement, pragmatic, and parsimonious yet accurate model to determine the exposure distribution of a CCL portfolio. Each credit line in a portfolio is modeled as a portfolio of a large number of option instruments that can be exercised by the borrower, determining the level of usage. Using an algorithm similar to the basic CreditRisk+ and Fourier transforms, we arrive at a portfolio-level probability distribution of usage. We perform a simulation experiment using data from Moody's Default Risk Service, historical draw-down rates estimated from the history of defaulted CCLs, and a current rated portfolio of such lines.
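The paper's CreditRisk+/Fourier machinery is beyond a short sketch, but the object it produces (the portfolio usage/EAD distribution) can be illustrated with a plain Monte Carlo stand-in: each line defaults with its PD and draws down a random fraction of the undrawn portion. All parameters are hypothetical.

```python
import random

def usage_distribution(lines, n_sims=10000, seed=1):
    """Monte Carlo sketch of portfolio exposure at default for a set of
    contingent credit lines. Each line: (limit, current_drawn, pd,
    mean_additional_drawdown_rate). A simplified stand-in for the
    paper's CreditRisk+/Fourier-transform approach."""
    rng = random.Random(seed)
    sims = []
    for _ in range(n_sims):
        ead = 0.0
        for limit, drawn, pd, mean_dd in lines:
            if rng.random() < pd:
                # random additional draw-down of the undrawn portion at default,
                # clipped to [0, 1]
                dd = min(1.0, max(0.0, rng.gauss(mean_dd, 0.15)))
                ead += drawn + dd * (limit - drawn)
        sims.append(ead)
    sims.sort()
    return sims

# Hypothetical three-line portfolio: (limit, drawn, pd, mean draw-down rate)
lines = [(100, 40, 0.02, 0.5), (200, 50, 0.01, 0.4), (150, 90, 0.03, 0.6)]
sims = usage_distribution(lines)
q999 = sims[int(0.999 * len(sims))]  # tail quantile of portfolio EAD
```

The sorted simulation output plays the role of the probability distribution of usage that the paper obtains analytically.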
Using Cross Asset Information To Improve Portfolio Risk Estimation – yamanote
There are obvious relationships between the various securities of a given firm that impact our expectations of risk. For example, if fixed income investors expect a corporate bond of a company to default, there must be a related bankruptcy event that would negatively impact shareholders in that firm. In this presentation, Nick will describe how to use data from bond and option markets to improve risk estimation for equity portfolios, and how to use information from the equity markets to improve estimation of credit risk in fixed income securities. The goal of the process is to create holistic risk estimation where all expectations of risk are mutually consistent across the entire capital structure of a firm, and related derivatives.
Hedge Trackers reviews how FASB Exposure Draft on Financial Instruments - Der... – HedgeTrackers
Hedge Trackers highlights how the new exposure draft, "Accounting for Financial Instruments and Revisions to the Accounting for Derivative Instruments and Hedging Activities: Financial Instruments (Topic 825) and Derivatives and Hedging (Topic 815)" will impact your interest rate hedge program. We are seeing many opportunities to lighten the hedge accounting load, but it doesn’t come without a price. Learn if your program will benefit or lose under the proposed changes. For more information visit www.hedgetrackers.com.
In this paper, we construct a Credit Default Swap pricing model for default recovery rates under distributional uncertainty, based on a structured pricing model and distributional uncertainty theory. The model is algorithmically transformed into a solvable semi-definite programming problem using the Lagrangian dual method, and the solution of the model is given using the projection interior point method. Finally, an empirical analysis is conducted, and the results show that the model constructed in this paper is reasonable and efficient.
Evaluation of Capital Needs in Insurance – kylemrotek
Presentation on capital adequacy analysis for property casualty insurance companies, as presented to Milliman's 2008 Casualty Consultants Forum in Denver.
We use a GARCH model to calculate the probability of default.
Our innovation is to use a two-dimensional GARCH model through a formula that combines both the firm's risk and the market risk.
The method calculates the total risk by taking into account the different influences of the firm's and the market's risk, i.e. Beta, using different weights for each one.
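A minimal sketch of the combination step: estimate a GARCH(1,1) conditional variance for the firm and for the market separately, then weight the market component by beta squared. The GARCH parameters and the weight `w` are illustrative assumptions, not the authors' calibrated values.

```python
def garch11_var(returns, omega, alpha, beta):
    """One-step-ahead conditional variance from a GARCH(1,1) recursion:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = omega / (1 - alpha - beta)  # start at the unconditional variance
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return sigma2

def total_risk(firm_returns, market_returns, beta_coef, w):
    """Illustrative two-dimensional combination: weight the firm-specific
    GARCH variance by w and the beta-scaled market GARCH variance by 1-w
    (the weighting scheme here is an assumption for the sketch)."""
    v_firm = garch11_var(firm_returns, 1e-6, 0.08, 0.90)
    v_mkt = garch11_var(market_returns, 1e-6, 0.08, 0.90)
    return w * v_firm + (1 - w) * beta_coef ** 2 * v_mkt

# Small synthetic return histories (illustrative only)
firm = [0.01, -0.02, 0.015, -0.005, 0.02]
mkt = [0.005, -0.01, 0.008, -0.003, 0.012]
risk = total_risk(firm, mkt, beta_coef=1.2, w=0.5)
```

The combined variance would then feed a distance-to-default or similar mapping to obtain the probability of default.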
In this study we survey practices and supervisory expectations for stress testing (ST) in a credit risk framework for banking book exposures. We introduce and motivate ST, and discuss its function, supervisory requirements and expectations, the credit risk parameters, and the interpretation of results with respect to ST. This includes a typology of ST (uniform testing, risk factor sensitivities, scenario analysis; and historical, statistical, and hypothetical scenarios) and procedures for conducting ST. We conclude with two simple and practical stress testing examples: one a ratings-migration-based approach, the other a top-down ARIMA modeling approach.
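The first of the two examples, a ratings-migration stress, can be sketched as a simple shock to a transition matrix: shift part of each rating's stay-probability toward the next-worse state and compare default probabilities. The matrix and the shift size are illustrative, not the study's calibrated scenario.

```python
def stress_migration(matrix, shift):
    """Shift a fraction `shift` of each row's stay-probability into the
    next-worse state. The last state (default) is absorbing and untouched.
    Illustrative stress, not a calibrated supervisory scenario."""
    n = len(matrix)
    out = [row[:] for row in matrix]
    for i in range(n - 1):
        moved = out[i][i] * shift
        out[i][i] -= moved
        out[i][i + 1] += moved
    return out

# Toy 3-state annual matrix: investment grade, speculative, default
base = [
    [0.95, 0.04, 0.01],
    [0.05, 0.85, 0.10],
    [0.00, 0.00, 1.00],
]
stressed = stress_migration(base, shift=0.10)

base_pd = base[1][2]          # one-year PD of a speculative-grade obligor
stressed_pd = stressed[1][2]  # same PD under the stressed matrix
```

Each row still sums to one, and the speculative-grade PD rises under stress; applied to a rated portfolio, the stressed matrix translates directly into stressed expected losses.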
Empirical Analysis of Bank Capital and New Regulatory Requirements for Risks ... – Michael Jacobs, Jr.
We examine the impact of new supervisory standards for bank trading portfolios, namely the additional capital requirements for liquidity risk and credit risk (the Incremental Risk Charge) introduced under Basel 2.5. We estimate risk measures under alternative assumptions on portfolio dynamics (constant level of risk vs. constant positions) and rating systems (through-the-cycle vs. point-in-time), for different sectors (asset classes and industry groups), under alternative credit risk frameworks (alternative dependency structures or factor models), and with an extension to a Bayesian framework. We find a potentially material increase in capital requirements, above and beyond that concluded in the far-ranging impact studies conducted by the international supervisors with the participation of a large sample of banks. Results indicate that capital charges are in general higher for either point-in-time ratings or constant portfolio dynamics, with this effect accentuated for financial or sovereign as compared to industrial sectors, and that regulatory capital is larger than economic capital for the latter, but not for the former, sectors. A comparison of the single-factor to a multi-factor credit model shows that capital estimates are larger in the latter (for the financial/sovereign sectors by orders of magnitude vs. industrial sectors or the Basel II model), and that there is less sensitivity of results across sectors and rating systems as compared with the single-factor model. Furthermore, in a Bayesian experiment we find that the new requirements may introduce added uncertainty into risk measures as compared to existing approaches.
Modern credit risk modeling (e.g., Merton, 1974) increasingly relies on advanced mathematical, statistical, and numerical techniques to measure and manage risk in credit portfolios.
This gives rise to model risk (OCC 2011-16) and the possibility of understating inherent dangers stemming from very rare yet plausible occurrences perhaps not in our reference data-sets. International supervisors have recognized the importance of stress testing credit risk in the Basel framework (BCBS, 2009).
It can and has been argued that the art and science of stress testing has lagged in the domain of credit vs. other types of risk (e.g., market), and our objective is to help fill this vacuum.
We aim to present classifications and established techniques that will help practitioners formulate robust credit risk stress tests.
Dodd-Frank and Basel III: Post-Financial Crisis Developments and New Expectations in Regulatory Capital. Following the recent global financial crisis, financial regulators have responded with arrays of proposals to revise existing risk frameworks for financial institutions, with the objective of further strengthening and improving bank models. In this meeting, Dr. Michael Jacobs will discuss new developments and expectations in regulatory capital, with particular reference to the definition of the capital base, counterparty credit risk, procyclicality of capital, liquidity risk management, and sound compensation practices. He will also explain the implications of the Dodd-Frank Act for financial institutions and will conclude by presenting the implementation schedule for Basel III.
This study provides a practical way to anticipate systematic LGD risk. It introduces an LGD function that requires no parameters other than PD, expected LGD, and correlation. This function survives testing against more-elaborate models of corporate credit loss that allow either greater or less LGD risk. Unless a significant improvement were discovered, the LGD function presented here can be used to anticipate systematic LGD risk within a credit loss model or to quantify downturn LGD.
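An LGD function parameterised only by PD, expected LGD, and correlation matches the Frye-Jacobs specification, where conditional LGD follows from the conditional default rate through a single shift parameter k. The sketch below implements that functional form; the input values are illustrative.

```python
from math import sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal: N.cdf / N.inv_cdf

def conditional_lgd(cdr, pd, elgd, rho):
    """Frye-Jacobs-style LGD function: conditional LGD as a function of
    the conditional default rate cdr, using only PD, expected LGD (elgd),
    and the asset correlation rho.
    cLGD = Phi(Phi^-1(cDR) - k) / cDR,
    k    = (Phi^-1(PD) - Phi^-1(EL)) / sqrt(1 - rho), EL = PD * ELGD."""
    el = pd * elgd  # unconditional expected loss rate
    k = (N.inv_cdf(pd) - N.inv_cdf(el)) / sqrt(1 - rho)
    return N.cdf(N.inv_cdf(cdr) - k) / cdr

# In a downturn the realized default rate exceeds PD and the
# conditional LGD rises above typical levels (illustrative inputs):
downturn_lgd = conditional_lgd(cdr=0.08, pd=0.02, elgd=0.4, rho=0.15)
```

Because conditional LGD increases with the conditional default rate, evaluating the function at a stressed default rate gives a direct quantification of downturn LGD, which is exactly the use the abstract proposes.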
An Option Theoretic Model for Ultimate Loss-Given-Default with Systematic Recovery Risk and Stochastic Returns on Defaulted Debt. Michael Jacobs, Ph.D., CFA, Senior Financial Economist, Credit Risk Analysis Division, Office of the Comptroller of the Currency, October 2010. The views expressed herein are those of the author and do not necessarily represent the views of the Office of the Comptroller of the Currency or the Department of the Treasury.
The LGD-at-default approach does not address the discount rate question ("implicit discounting").
- Broad definition of default ("quasi-Basel" according to Moody's); exceptions: trade payables & other off-balance-sheet obligations
- Debt type, seniority ranking, debt above / below, collateral type
- Obligor / capital structure: industry, proportion bank / secured debt, number of creditor classes / number of instruments
- Defaults: amounts (EAD, AI), default type, coupon, dates / durations
- Recovery / LGD measures: prices of pre-petition (or received in settlement) instruments at emergence or restructuring
- Sub-set: prices of traded debt at around default (30-45 day avg.)
Many OC’s had restructuring dates very near or after default trading dates