Bag & Jacobs EAD Model for CCLs (IRMC, 6/10)

In spite of the large volumes of Contingent Credit Lines (CCL) in all commercial banks, the paucity of Exposure at Default (EAD) models, the unsuitability of external data, and inconsistent internal data on partial draw-downs have been major challenges for risk managers as well as regulators in managing CCL portfolios. This paper is an attempt to build an easy-to-implement, pragmatic, and parsimonious yet accurate model to determine the exposure distribution of a CCL portfolio. Each credit line in a portfolio is modeled as a portfolio of a large number of option instruments that can be exercised by the borrower, determining the level of usage. Using an algorithm similar to the basic CreditRisk+ and Fourier transforms, we arrive at a portfolio-level probability distribution of usage. We perform a simulation experiment using data from Moody's Default Risk Service, historical draw-down rates estimated from the history of defaulted CCLs, and a current rated portfolio of such lines.



  1. May 30, 2010. An Exposure at Default Model for Contingent Credit Lines. Pinaki Bag (Union National Bank, United Arab Emirates) and Michael Jacobs, Jr. (Credit Risk Analysis Division, U.S. Office of the Comptroller of the Currency). The views expressed herein are those of the authors and do not necessarily represent the views of either Union National Bank, UAE or of the U.S. Office of the Comptroller of the Currency.
  2. EAD Modeling
     • Vital building block for economic capital or regulatory (Basel II) capital
     • Attempt to develop a parsimonious theoretical model with inputs from empirical study or expert opinion
  3. Outline
     1 Introduction - Motivation
     2 Review of the Literature
     3 The Model
     4 Numerical Experiment
     5 Conclusions
  4. Introduction - Motivation
     • Why is this important?
     • What have been the challenges?
  5. Why Model EAD?
     • Basel II - 101: regulatory capital is a function of Probability of Default (PD), Loss Given Default (LGD), and Exposure at Default (EAD) - but EAD and LGD have potentially larger impacts than PD
     • Contingent Credit Lines (CCL) are modeled using the Basel II suggested Credit Conversion Factor (CCF) for capital calculation
     • AIRB allows banks to compute their own estimates of EAD for CCL, provided these can be supported empirically
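As context for how these three parameters interact, a minimal sketch of the standard expected-loss identity EL = PD x LGD x EAD that EAD feeds into. The facility figures, the PD/LGD values, and the 50% conversion of the unused limit below are all hypothetical, chosen only for illustration.

```python
def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Expected loss of a single facility under EL = PD * LGD * EAD."""
    return pd * lgd * ead

# Hypothetical CCL: $40MM drawn plus a $60MM unused limit, assuming
# 50% of the unused limit converts to exposure at default (a made-up CCF):
ead = 40.0 + 0.5 * 60.0                 # EAD = 70 ($MM)
el = expected_loss(pd=0.02, lgd=0.45, ead=ead)
print(el)                               # approximately 0.63 ($MM)
```

Because EAD multiplies the whole expression, a misestimated conversion factor scales the loss estimate one-for-one, which is the sense in which EAD can matter more than PD.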
  6. Why Model EAD?
     • The FDIC as of 9-09 reports close to 80% of all C&I loans are CCLs, with outstandings close to $1.9 Trillion
     • Popularity of CCLs attributed to financial flexibility (Avery & Berger, 1991), hedging (Kanatas, 1987), and management of working capital (Hawkins, 1982)
     • Challenges (FSA, 2007):
       - Paucity of models
       - Unsuitability of external data
       - Inconsistency of internal data
  7. What We Did
     • Each CCL modeled as a portfolio of put options, aggregated from unused obligor limits through portfolio segments to segment-level usage
     • Basic CreditRisk+ algorithm combined with a Fast Fourier Transform
     • Data: Moody's DRS Database (current sample portfolio); Moody's MURD Database & Compustat (reference data for CCF estimates)
  8. Review of the Literature
     • What has been done?
     • How have they been applied?
  9. Review of the Literature
     • Thakor et al. (1981): option-theoretic CCL pricing as puts written by the bank & measure the sensitivity to interest rates
     • Kaplan and Zingales (1997) & Gatev and Strahan (2003): empirical evidence that drawdowns on CCLs increase when firms are more liquidity constrained or the CP-Tbill rate spread rises
     • Jones and Wu (2009): model credit quality as a jump-diffusion process, with draw-down & pricing functions of the difference between an opportunity rate & the marginal cost of line borrowings
  10. Review of the Literature
     • Empirical literature on additional partial draw-downs prior to default finds a decline as credit quality worsens
       - Asarnow & Marker (1995): Citibank agency-rated firms 1987-1992
       - Jacobs & Araten (2001): JPMC internally rated firms 1995-2000
       - Agarwal et al. (2005): HELC in the U.S. market
       - Jacobs (2009): Agency rated & marketable debt 1987-2008
       - Jiménez et al. (2009): All C&I loans in Spain 1984-2005
     • Martin and Santomero (1997) study CCL pricing from the demand side of firms & show CCL usage depends on the firm's business growth potential & the uncertainty of such
  11. Review of the Literature
     • Moral (2006) examines modeling issues from a supervisory point of view & analyzes different EAD risk measures
     • Sufi (2008) reports that firms with low cash flow or high cash flow volatility rely more heavily on cash rather than credit lines
     • Jacobs (2009) finds that utilization is a stronger inverse driver than rating & that EAD risk may be counter-cyclical
     • But Jiménez et al. (2009) report higher utilization for defaulting vs. non-defaulting firms up to 3 years prior to default
     • Qi (2009) examines credit card usage in the U.S. & finds borrowers are more active than lenders in this game of "race to default"
     • Several of these studies report the importance of macro factors, size of credit line, borrower financials, collateralization, etc.
  12. The Model
     • What we did
  13. Model Overview (individual obligors → sub-segment → segment → portfolio)
     • Each obligor's CCL (its unused limit) is modeled as a portfolio of a large number of put options to determine usage
     • Obligors with similar put sizes are grouped into sub-segments
     • Sub-segments with similar expected usage are combined to determine segment-level usage
     • FFT is used to convolute the segments into the overall portfolio usage distribution
  14. Obligor Level Partial Draw-downs
     • Assume obligor A, with a CCL having unused limit L_A, has a very large number (n) of put options to exercise, which determines the level of partial draw-down. The size of each put is then
       Q_A = L_A / n (4)
     • The amount of partial draw-down is r x Q_A, where r is the number of puts exercised by A in the time horizon, from which it follows that the probability generating function (PGF) of r is defined as
       G_A(z) = Σ_r P(r) z^r (5)
     • Assume the expected usage of the CCL is α_A L_A, so the average number of puts used by A is
       μ_A = α_A L_A / Q_A = n α_A (6)
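The obligor-level set-up above can be sketched numerically. The snippet below assumes, as the slide states, that the unused limit is split into n equal puts, that the number r of puts exercised is Poisson with mean n·α_A (so the expected draw-down is α_A·L_A), and that the draw-down amount is r·Q_A; the limit, n, and α values are illustrative.

```python
import math

def drawdown_pmf(limit: float, n: int, alpha: float, r_max: int):
    """Return [(drawdown_amount, probability)] for r = 0..r_max puts,
    with r ~ Poisson(alpha * n) and each put of size limit / n."""
    q = limit / n                 # put size Q_A = L_A / n
    mu = alpha * n                # mean number of puts exercised
    pmf = []
    p = math.exp(-mu)             # P(r = 0); then p_{r+1} = p_r * mu/(r+1)
    for r in range(r_max + 1):
        pmf.append((r * q, p))
        p *= mu / (r + 1)
    return pmf

# Illustrative obligor: $50MM unused limit, 100 puts, 40% expected usage.
pmf = drawdown_pmf(limit=50.0, n=100, alpha=0.4, r_max=300)
mean_draw = sum(x * p for x, p in pmf)
# mean_draw recovers the expected draw-down alpha * limit = 20 ($MM)
```

The check at the end confirms the calibration in equation (6): setting the Poisson mean to n·α makes the expected draw-down come out to α·L_A.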
  15. Sub-Segment Level Partial Draw-downs
     • A Poisson process for the exercise of each option makes the PGF
       G_A(z) = exp[μ_A (z - 1)] (7)
     • Assuming independence of the m ≤ N obligors in the portfolio having put size Q', the PGF for the number r exercised is
       G(z) = Π_A exp[μ_A (z - 1)] = exp[μ (z - 1)], μ = Σ_{A=1}^m μ_A (8)
     • Assume the overall expected additional usage on the unused limits in the segment is α, and the unused limits of the m obligors are L_A:
       μ = α Σ_{A=1}^m L_A / Q' (9)
  16. Sub-Segment Level Partial Draw-downs (continued)
     • Let ε_A = α L_A / Q, and hence μ = Σ_A ε_A, as this sub-segment has all put sizes equal to Q
       (11)
     • The Poisson assumption implies the PGF for each sub-segment i is
       G_i(z) = exp[μ_i (z^{Q_i} - 1)] (15)
  17. Segment Level Partial Draw-downs
     • To find the overall segment usage distribution we convolute the t sub-segments; assuming independence of each, the PGF is
       G(z) = Π_{i=1}^t G_i(z) = exp[Σ_{i=1}^t μ_i (z^{Q_i} - 1)] (16)
     • The segment exposure distribution follows from Taylor's theorem,
       W_k = (1/k!) d^k G(z)/dz^k |_{z=0} (17)
       and Leibniz's nth-order differentiation rule, noting that the leading factor e^{-Σ_i μ_i} is constant.
  18. Portfolio Level Partial Draw-downs
     • Hence, letting W_k denote the probability of segment usage k, after a few algebraic manipulations we have the recursion
       W_k = (1/k) Σ_{i: Q_i ≤ k} μ_i Q_i W_{k - Q_i} (24)
     • We can solve the above equation iteratively, noting that W_0 = e^{-Σ_i μ_i}
     • Each segment-level usage distribution is then convoluted using a standard Fast Fourier Transform to arrive at the portfolio-level usage distribution
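Both computational steps on this slide can be sketched compactly. The snippet below is a simplified illustration, not the authors' Genius/Octave implementation: put sizes live on an integer grid, the per-size Poisson intensities are hypothetical, each segment's usage pmf comes from the CreditRisk+-style recursion, and segments are convoluted with an FFT.

```python
import numpy as np

def segment_distribution(put_sizes, mus, grid_len):
    """Compound-Poisson usage pmf on {0,...,grid_len-1} via the recursion
    W_k = (1/k) * sum_j j * mu_j * W_{k-j}, with W_0 = exp(-sum_j mu_j),
    where mu_j is the Poisson intensity of puts of (integer) size j."""
    w = np.zeros(grid_len)
    w[0] = np.exp(-sum(mus))
    rate = dict(zip(put_sizes, mus))
    for k in range(1, grid_len):
        w[k] = sum(j * m * w[k - j] for j, m in rate.items() if j <= k) / k
    return w

def convolute_fft(segments, grid_len):
    """Portfolio usage pmf as the FFT convolution of segment pmfs
    (zero-padded to grid_len to avoid circular wrap-around)."""
    out = np.fft.fft(segments[0], grid_len)
    for seg in segments[1:]:
        out *= np.fft.fft(seg, grid_len)
    return np.real(np.fft.ifft(out))

seg1 = segment_distribution([1], [2.0], 64)   # puts of size 1, mean count 2
seg2 = segment_distribution([2], [1.5], 64)   # puts of size 2, mean count 1.5
portfolio = convolute_fft([seg1, seg2], 128)
# Portfolio mean usage = 2*1 + 1.5*2 = 5 grid units
```

With a single put size the recursion reduces to the plain Poisson pmf, which is an easy way to sanity-check an implementation before running it on a real segmented portfolio.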
  19. Portfolio Segmentation
     • This is a vital step for apt implementation of the discussed algorithm
     • It may be done in various ways, depending upon rating, product criteria, industry, etc.
     • A bank may segregate borrowers by keeping high commitment fees and low service fees in one contract, and low commitment fees and high service fees in another (Thakor and Udell, 1987)
     • Contract choice may not always be that simple, since it may also depend upon the structure of the borrower's industry (Maksimovic, 1990)
  20. Numerical Experiment
     • Experiment with Moody's data on Contingent Credit Lines
  21. Numerical Experiment with Moody's Data
     • For a typical CCL portfolio ΣS_i may be quite large, and we are trying to assign a probability to each dollar of usage, so calculating a negative exponential of this sum leads to precision issues
     • I.e., the double-precision arithmetic of common software applications under default settings approximates W_0 as zero, upon which the derivation of the usage distribution depends
     • Many alternatives exist to circumvent the problem, such as libraries which can handle very high precision calculations; a detailed discussion is beyond our scope
     • Herein we use Linux-based Genius 1.0.7 as an arbitrary-precision calculator & Linux-based Octave for the FFT and final distribution evaluation
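The underflow described above can be reproduced and worked around in a few lines. This sketch uses plain double precision with a log-space recursion rather than the arbitrary-precision Genius/Octave set-up of the paper; the intensity of 2000 is an arbitrary illustrative value.

```python
import math

def poisson_pmf_logspace(mu: float, k_max: int):
    """Poisson pmf for large mu without ever forming exp(-mu) directly:
    carry log p_k = -mu + k*log(mu) - log(k!) and exponentiate only
    where the probability is representable in double precision."""
    log_p = -mu                       # log P(0)
    pmf = []
    for k in range(k_max + 1):
        pmf.append(math.exp(log_p) if log_p > -700.0 else 0.0)
        log_p += math.log(mu) - math.log(k + 1)
    return pmf

# The naive W_0 underflows to exactly zero in double precision...
assert math.exp(-2000.0) == 0.0
# ...yet the log-space recursion recovers the distribution around the mean.
pmf = poisson_pmf_logspace(2000.0, 2500)
```

The same idea (track logarithms, or rescale the recursion periodically) applies to the W_k recursion itself, which is one of the "alternatives" the slide alludes to besides high-precision libraries.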
  22. Numerical Experiment with Moody's Data
     • To illustrate, we chose 2 sample segments of 13 obligors each, with α = 65% & 40% for investment & junk grade, respectively
     • Taken randomly from Moody's Default Risk Service (DRS™) database of CCLs rated as of 12/31/2009
     • Limits of each obligor vary from $25 MM to $235 MM
     • The values of α are from Jacobs (2009), based upon estimated additional drawdowns on unused limits (or "LEQ" factors)
       - Moody's rated CCLs 1987-2009 defaulting within a 1-year horizon in Moody's Ultimate Recovery Database (MURD™)
       - CCL usage prior to default traced in COMPUSTAT and Edgar SEC filings
     • Each obligor's limit is divided into 1,000 puts
       - E.g., a CCL with a $50MM limit becomes 1,000 puts with a strike of $50,000 each
  23. Numerical Experiment with Moody's Data
  24. Numerical Experiment with Moody's Data: Results
     • The convoluted distribution has both a higher mean and a higher standard deviation than either of the segments
     • However, the distributional statistics reveal these to be near-Gaussian, which we would like to overcome in future extensions of the model
  25. Numerical Experiment with Hypothetical Portfolio: Sensitivity Analysis
     • The standard deviation of the usage distribution decreases as we increase the number of puts used
       - This may be explained by our assuming a known value of α in the model
       - The mean remains relatively stable but the extreme points converge
     • A higher additional usage rate also increases the volatility of the exposure distribution
     • To incorporate volatility in the model we can also use a mixed Poisson process
       - Commonly used mixing distributions include the Gamma, resulting in a negative binomial
       - An argument against this is that it induces a second set of assumptions into our model
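The gamma-mixed Poisson mentioned in the last bullet can be made concrete: if the Poisson intensity is itself Gamma(shape a, scale θ), the number of puts exercised is negative binomial, with variance a·θ·(1+θ) exceeding the pure-Poisson variance a·θ. The parameter values below are illustrative only.

```python
def negbin_pmf(a: float, theta: float, k_max: int):
    """Negative binomial pmf arising from the gamma-Poisson mixture,
    built with the ratio recursion
    P(k+1) = P(k) * (a + k)/(k + 1) * theta/(1 + theta)."""
    p = (1.0 + theta) ** (-a)         # P(0)
    pmf = []
    for k in range(k_max + 1):
        pmf.append(p)
        p *= (a + k) / (k + 1) * theta / (1.0 + theta)
    return pmf

# Illustrative mixture: mean a*theta = 10 puts, variance 10*(1+2.5) = 35,
# versus variance 10 for an unmixed Poisson with the same mean.
pmf = negbin_pmf(a=4.0, theta=2.5, k_max=400)
```

The extra variance is exactly the over-dispersion the slide is after; the trade-off, as noted, is a second layer of distributional assumptions to justify.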
  26. Conclusions
     • So what?
     • Where do we go from here?
  27. Conclusions and Directions for Future Research
     • We formulated a parsimonious model for the estimation of portfolio-level EAD in a typical CCL portfolio, treating each line as a portfolio of option instruments
     • Exercise of each put is modeled as a standard Poisson process where the average additional usage α is assumed to be known
     • Previous literature indicates α probably depends on obligor credit quality, the "race to default", pricing, utilization, etc.
     • Our algorithm accommodates different values of α to model this correlation, as the portfolio may be segmented by criteria of the bank
       - Various methods for estimating α have been outlined in the literature; this is likely to work best for banks if based upon internal research
       - Further work may also be needed so that stable distribution parameters can be determined which are not affected by the choice of the number of puts used
     • Most current credit risk models use a constant EAD as the economic credit VaR input, and stochastic exposures make a notable difference in capital
  28. Conclusions and Directions for Future Research (continued)
     • Accurate EAD calculation is fundamental for liquidity risk management, which poses a challenge to risk managers
       - E.g., HELC portfolios where all the accounts are undrawn but committed lines
       - This algorithm may prove helpful in providing insight into the problem
     • The other implication of the algorithm is EAD estimation for Basel II: compared to PD, there has been limited research into this
       - It can provide a foundation for banks under the AIRB approach to Basel II
     • The algorithm may also be used in stress testing for worst-case liquidity scenarios for the portfolio, as we have the complete distribution of usage
       - We can get a good estimate of worst-case scenarios from the 99th or 99.9th percentile, depending upon the risk appetite of the bank
     • Further work is needed to improve the algorithm so it can be used in standard software applications with minimal hardware requirements
     • Finally, the Poisson assumption could be generalized in order to model non-normality of the exposure distribution
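Reading a stress figure off the full usage distribution, as the stress-testing bullet suggests, is a one-pass scan: the 99.9th percentile is the smallest usage level whose cumulative probability reaches 99.9%. The pmf below is a made-up toy distribution for illustration.

```python
def percentile(pmf, q: float) -> int:
    """Smallest grid index whose cumulative probability is >= q."""
    cum = 0.0
    for k, p in enumerate(pmf):
        cum += p
        if cum >= q:
            return k
    return len(pmf) - 1

# Toy usage pmf on grid units 0..6 (cumulative: .5, .8, .95, .99, .998, .9995, 1)
toy_pmf = [0.5, 0.3, 0.15, 0.04, 0.008, 0.0015, 0.0005]
print(percentile(toy_pmf, 0.999))   # -> 5
```

In practice the grid index would be multiplied back by the put size to convert the percentile into a dollar usage figure for the chosen risk appetite.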
