Stochastic Modeling for Valuation and Risk Management


Presentation delivered at Fiserv Conference in May 2011.



  1. 1. Stochastic Modeling for Valuation and Risk Management<br />Roderick Powell, FRM<br />Principal, Chicago Risk Advisors, LLC<br />
  2. 2. 2<br />Outline of Presentation<br />Preliminaries<br />Simple Random Sampling<br />Stratified Sampling<br />Application to Valuation<br />Application to Risk Management<br />Q & A<br />
  3. 3. 3<br />Preliminaries<br />
  4. 4. 4<br />Preliminaries<br />Definition and Uses of Stochastic Modeling<br />The term “stochastic” means random. Stochastic Modeling refers <br />to using a model designed to derive 1) the probability of realizing<br />a specific random outcome or 2) a probability-based (expected) value. <br />Stochastic Modeling can be used to facilitate the valuation of <br />options and option-embedded products, such as whole loans and <br />mortgage-backed securities. Stochastic Modeling can also be used to <br />derive “probabilistic” measures of risk, e.g., the probability that Net <br />Interest Income will drop below X dollars due to future market interest <br />rate changes, Value-at-Risk (“VaR”), and so on.<br />
  5. 5. 5<br />Preliminaries<br />Probability Experiment – a chance process that leads to specific outcomes. This can range from something as simple as a coin-tossing experiment to something as complex as simulating thousands of random interest rate paths to dynamically value a mortgage-backed security. <br />Sample Space – all possible outcomes of an experiment. For example, if you flip a fair coin there are two possible outcomes, i.e., heads (H) or tails (T). Therefore the sample space (S) = {H,T}.<br />Trial – one iteration in a probability experiment. For example, flipping a fair coin one time is a single trial. The outcome will be either H or T.<br />Event – a subset of the sample space. For example, an event could be getting a T from flipping a coin.<br />Probability – the chance of realizing a particular outcome over very many trials. Probabilities must be between 0 (never occurs) and 1 (always occurs). For example, if you flip a fair coin, there is a .5 or 50% probability that you will get T and a .5 or 50% probability that you will get H “over the long run.” There are two main types of probabilities: Classical/Empirical (based on equally likely outcomes or observed data) and Subjective/Personal (based on judgment).<br />
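The “over the long run” interpretation of probability on this slide can be sketched with a short simulation. The seed and trial counts below are illustrative choices, not from the presentation.

```python
import random

random.seed(42)

def tail_frequency(n_trials):
    # Flip a fair coin n_trials times and return the observed
    # fraction of tails (each flip is tails with probability 0.5).
    tails = sum(random.random() < 0.5 for _ in range(n_trials))
    return tails / n_trials

# By the law of large numbers, the observed frequency approaches
# the true probability of 0.5 as the number of trials grows.
for n in (10, 1_000, 100_000):
    print(n, tail_frequency(n))
```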
  6. 6. 6<br />Preliminaries<br />Probability Distribution – values of a random variable and their corresponding probabilities. Probabilities for all possible outcomes must sum to 1.0 or 100%. Probability distributions can be expressed in tabular or graphical form. One of the most well-known probability distributions is the normal distribution. <br />Normal Distribution – a distribution that is bell-shaped and symmetric about the mean, μ. The mean is the average. The distance away from the mean in either direction is expressed in standard deviations, σ. See depiction below.<br />
  7. 7. 7<br />Preliminaries<br />Expectation – the long-run average (mean) numerical outcome of a probability experiment. For example, assume that ABC stock may trade a year from now at $50, $40, or $10. You would like to derive the expected value (E) of ABC stock in one year. The first step is to analyze the events that could affect ABC stock over the next year and assign subjective probabilities corresponding to each potential stock price:<br />Stock Price (S) / Probability<br />$50 P ($50) = 30%<br />$40 P ($40) = 20%<br />$10 P ($10) = 50%<br />E (S) = ($50 x .30) + ($40 x .20) + ($10 x .50) = $28<br />
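The expectation arithmetic above is easy to check in a few lines; this is just the slide’s numbers restated in code.

```python
# Expected value of ABC stock in one year, using the slide's
# subjective probabilities for each price outcome.
outcomes = {50.0: 0.30, 40.0: 0.20, 10.0: 0.50}

# Probabilities over all possible outcomes must sum to 1.0.
assert abs(sum(outcomes.values()) - 1.0) < 1e-9

expected = sum(price * prob for price, prob in outcomes.items())
print(expected)  # 28.0
```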
  8. 8. 8<br />Simple Random Sampling<br />
  9. 9. 9<br />Simple Random Sampling<br />Monte Carlo Simulation – a methodology used to repeatedly simulate a random or stochastic process for the variable of interest, covering a wide range of possible outcomes. The random draws are often taken from a pre-specified probability distribution, such as the normal distribution.<br />Steps to Performing “Pure” Monte Carlo Simulation<br />List all possible outcomes of the probability experiment, i.e., the sample space.<br />Determine the probability of each outcome.<br />Set up a mapping of random numbers to outcomes.<br />Generate a set of random numbers. <br />Record the set of outcomes corresponding to the random numbers.<br />Repeat steps 4 and 5 as many times as necessary, i.e., trials.<br />Calculate statistics based on the simulation results.<br />[NOTE: Steps 1 and 2 involve specifying a probability distribution or model.]<br />
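The seven steps above can be sketched for a fair six-sided die, which anticipates the stress-event example on the next slides. The seed and trial count are arbitrary choices for illustration.

```python
import random
from collections import Counter

random.seed(0)

# Steps 1-3: sample space, probabilities, and the mapping of
# random numbers to outcomes for a fair six-sided die.
sample_space = [1, 2, 3, 4, 5, 6]                 # step 1
prob = {face: 1 / 6 for face in sample_space}     # step 2 (uniform)

# Steps 4-6: generate random numbers, record outcomes, repeat.
n_trials = 60_000
results = Counter(random.choice(sample_space) for _ in range(n_trials))

# Step 7: calculate statistics. Each face should appear about
# 1/6 of the time over many trials.
for face in sample_space:
    print(face, results[face] / n_trials)
```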
  10. 10. 10<br />Simple Random Sampling<br />Simple Example of Monte Carlo Simulation <br />The President of a Bank wants his or her management team to analyze the financial impact on the Bank from a mock stress event twice a year. The President comes up with six stress events that are assumed to be equally likely to occur, i.e., the probability of each event is 1/6. This is a discrete uniform distribution.<br />Each possible event is assigned a random number.<br />Number on Die / Stress Event<br />1 Housing Market Collapse <br />2 Terrorist Attack<br />3 Stock Market Crash<br />4 Bond Market Crash<br />5 Oil Price Shock<br />6 Earthquake<br />
  11. 11. 11<br />Simple Random Sampling<br />Example of Monte Carlo Simulation - continued <br />It is time to perform the stress test. The President reaches into his desk drawer, pulls out a fair six-sided die, and tosses it. A “3” ends up on the face of the die. The President looks at his mapping scheme and sees that a random number of “3” equates to a Stock Market Crash. Management is asked to assess the impact of this stress event on the Bank’s financial performance.<br />Number on Die / Stress Event<br />1 Housing Market Collapse <br />2 Terrorist Attack<br />3 Stock Market Crash<br />4 Bond Market Crash<br />5 Oil Price Shock<br />6 Earthquake<br />
  12. 12. 12<br />Simple Random Sampling<br />Monte Carlo Simulation – From Simple to Complex<br />Probability Distribution: Simple = Subjective; Complex = Classical/Empirical<br />Random Numbers: Simple = Roll Dice or Flip Coin; Complex = Use computer software to generate random numbers *<br />Sample Space: Simple = Several Outcomes; Complex = Thousands of Outcomes <br />Number of Trials: Simple = Several Trials; Complex = Thousands of Trials<br />Sampling Framework: Simple = Distribution; Complex = Binomial Lattice/Tree <br />* Technical Note: Computer-generated random numbers are really “pseudo” random because they are generated from an algorithm using a predefined rule. Truly random numbers would need to be derived from a source such as radioactive decay. Since it is impractical to use truly random numbers, it is common practice to use “pseudo” random numbers. However, it is critical to use a well-designed algorithm that can generate “pseudo” random numbers that appear independent over time.<br />
  13. 13. 13<br />Stratified Sampling<br />
  14. 14. 14<br />Stratified Sampling<br />Stratified Sampling–a way of sampling from a pre-specified probability <br />distribution that involves dividing the distribution into ranges and sampling each <br />range according to its probability. The major benefit of stratified sampling over <br />simple random sampling is that the former reduces the number of trials or <br />simulations necessary to obtain a representative sample of outcomes. <br />
  15. 15. 15<br />Stratified Sampling<br />Example of Stratified Sampling– Assume that a Portfolio Manager wants to <br />randomly create a portfolio of 5 stocks from a group of 12 different stocks in <br />three different sectors. The Manager believes that there is a 60% chance that <br />oil stocks will outperform the market, a 20% chance that tech stocks will <br />outperform the market, and a 20% chance that retail stocks will outperform the <br />market. The Manager would like the random sample of stocks to reflect his or <br />her opinion on sector outperformance. <br />SectorProbability of Outperformance<br />Oil P (Oil) = 60%<br />Tech P (Tech) =20%<br />Retail P (Retail) = 20%<br />
  16. 16. 16<br />Stratified Sampling<br />Example of Stratified Sampling - continued<br />Probability of Outperformance<br />Oil Stock 1 (O1) P(O1) = 15%<br />Oil Stock 2 (O2) P(O2) = 15%<br />Oil Stock 3 (O3) P(O3) = 15%<br />Oil Stock 4 (O4) P(O4) = 15% <br />Tech Stock 1 (T1) P(T1) = 5%<br />Tech Stock 2 (T2) P(T2) = 5%<br />Tech Stock 3 (T3) P(T3) = 5%<br />Tech Stock 4 (T4) P(T4) = 5%<br />Retail Stock 1 (R1) P(R1) = 5% <br />Retail Stock 2 (R2) P(R2) = 5%<br />Retail Stock 3 (R3) P(R3) = 5%<br />Retail Stock 4 (R4) P(R4) = 5%<br />
  17. 17. 17<br />Stratified Sampling<br />Example of Stratified Sampling - continued<br />Simple Random Sample – Place the 12 stocks into a hat, shake the hat, and pick 5 stocks. The problem with this approach is that all 12 stocks have the same chance of being picked. Therefore, you could end up randomly picking 5 retail or tech stocks and no oil stocks. For example, you could end up with the following sample: R1, R2, R3, T1, T4. This portfolio would not be a representative sample based on the probabilities of sector outperformance.<br />Stratified Sample – First place the 4 oil stocks into a hat, shake the hat, and pick 3 stocks. Then place the 8 tech and retail stocks into a hat, shake the hat, and pick 2 stocks. In this case, you could end up with the following sample: O1, O2, O4, R1, R2. This portfolio would be a representative sample based on the probabilities of sector outperformance. <br />
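A minimal sketch of the two-hat procedure above, using the slide’s ticker labels O1–R4. Every stratified draw contains exactly three oil stocks, which a simple random sample cannot guarantee.

```python
import random

random.seed(1)

oil = ["O1", "O2", "O3", "O4"]
tech = ["T1", "T2", "T3", "T4"]
retail = ["R1", "R2", "R3", "R4"]

def stratified_portfolio():
    # Sample each stratum separately: 3 of the 5 picks come from oil
    # (matching the 60% outperformance view), and 2 come from the
    # combined tech/retail pool (20% + 20%).
    return random.sample(oil, 3) + random.sample(tech + retail, 2)

portfolio = stratified_portfolio()
print(portfolio)
```

By construction the oil allocation is fixed at 3 of 5; only the identities of the picks within each stratum are random.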
  18. 18. 18<br />Application to Valuation<br />
  19. 19. 19<br />Application to Valuation<br />Option-adjusted Valuation (“OAV”)– approach used to value fixed-income <br />securities with embedded options. Embedded options may include interest rate <br />caps, floors, or prepayment options. OAV is known as “dynamic” valuation <br />because it considers the fact that principal cash flows of these instruments may <br />change when interest rates change. In contrast, “static” valuation assumes that <br />principal cash flows do not change when interest rates change. The OAV or <br />dynamic valuation approach can be used to derive the value of whole loans, <br />mortgage-backed securities (“MBS”), and mortgage servicing rights (“MSRs”). <br />OAV Equation<br />
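The equation graphic on this slide did not survive extraction. Based on the OAV steps described later in the deck (average over N sampled rate paths of cash flows discounted at path-specific rates plus a constant spread), the pricing relation takes the standard form:

```latex
P_{\text{model}} \;=\; \frac{1}{N}\sum_{n=1}^{N}\sum_{t=1}^{T}
\frac{CF_t(n)}{\prod_{j=1}^{t}\bigl(1 + r_j(n) + \text{OAS}\bigr)}
```

Here CF_t(n) is the (possibly prepayment-dependent) cash flow at time t on path n, r_j(n) is the path-specific short rate for period j, and OAS is the constant option-adjusted spread. This reconstruction is inferred from the slides that follow, not copied from the lost graphic.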
  20. 20. 20<br />Application to Valuation<br />Toolkit for Robust MBS Valuation<br />Prepayment Model<br />Deal Cash Flow Model <br />Basis Model<br />Term Structure Model<br />Sampling Technique<br />Option-adjusted Spread (“OAS”)<br />
  21. 21. 21<br />Application to Valuation<br />Prepayment Model – the call option embedded in an MBS is referred to as a prepayment option. This is because the borrower has an option to prepay his or her mortgage in full (prepayment) or in part (curtailment) at any time, usually without penalty. In order to derive the value of an MBS, you need to predict if and when the borrower will exercise his or her option to prepay the loan. The speed of mortgage prepayments, i.e., conditional prepayment rate (“CPR”), is primarily driven by changes in mortgage rates. When current mortgage rates drop relative to the existing rate on loans, borrowers tend to prepay their mortgages, i.e., refi incentive. (See below.) There are also non-rate related factors that drive prepayments, such as burnout, seasonality, and seasoning. <br />
  22. 22. 22<br />Application to Valuation<br />Deal Cash Flow Model – a model used to determine the sequence of tranche payments for Collateralized Mortgage Obligations (“CMOs”) and other structured MBS products.<br />Basis Model – a model that describes the relationship between the mortgage rate and a market benchmark rate, such as the LIBOR/Swap rate, over the life of the mortgage. For example, the 30 year mortgage rate is usually modeled as a spread to the 10 year swap rate. The spread may be modeled to stay constant or vary based on specific factors through time.<br />
  23. 23. 23<br />Application to Valuation<br />Term Structure Model– a model that describes how the short rate or <br />forward rates evolve over time. Examples of popular short rate models include <br />the Hull-White (normal) and Black-Karasinski (lognormal) models. These <br />models are calibrated to be consistent with current market rates and option <br />volatilities. One-factor term structure models of the short rate are often <br />implemented as recombining binomial lattices and encompass 360 future <br />months or time steps. At each time step the short rate can either go up or down.<br />See example below.<br />
  24. 24. 24<br />Application to Valuation<br />Sampling Process <br />A binomial lattice with 360 discrete time steps has a sample space of 2^360, or roughly 2.35E+108, possible rate paths! In general, the more paths that are sampled, the greater the precision of MBS valuation. In statistical terms, the standard error (“SE”) of the estimate decreases at a decreasing rate as the number of rate paths (n) sampled increases. See below.<br />It would be computationally prohibitive to consider the entire rate path sample space in valuation calculations. Various methodologies known as variance reduction techniques or acceleration methods can be used to reduce the number of paths that must be sampled in order for the “mean” value to converge to the “expected” value. <br />
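The SE chart referenced above was lost in extraction, but the relationship is easy to demonstrate: SE = s/√n, so quadrupling the number of sampled paths roughly halves the error. The distribution parameters below are illustrative only, standing in for simulated path values.

```python
import random
import statistics

random.seed(7)

def standard_error(n_paths):
    # Standard error of the mean of n_paths simulated path values.
    # Values are drawn from a normal distribution purely to illustrate
    # the 1/sqrt(n) decay; a real model would use discounted path PVs.
    values = [random.gauss(100, 10) for _ in range(n_paths)]
    return statistics.stdev(values) / n_paths ** 0.5

# SE falls at a decreasing rate as n grows.
for n in (100, 400, 1_600):
    print(n, standard_error(n))
```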
  25. 25. 25<br />Application to Valuation<br />Sampling Process - Variance Reduction Techniques<br />Antithetic Variable Technique – consists of flipping the signs on all the random samples. This effectively doubles the sample size without much additional computation time. This method is appropriate when the distribution is symmetric about the mean (not skewed right or left), e.g., the normal distribution. This is one of the easiest methods to implement.<br />Control Variate Technique – consists of first calculating the price of certain interest rate options using closed-form solutions, such as the Black Model. You can then compare benchmark prices derived via closed-form solutions to prices derived via simulation. The prices derived using closed-form solutions are assumed to be correct, and prices derived using simulation will converge to these prices if enough iterations are run. The difference between the two prices can be used to adjust the price of similar instruments.<br />Stratified Sampling – consists of partitioning the sample space into groupings according to probability of occurrence and sampling the groupings. This reduces the number of trials necessary to reach convergence. This is the concept behind the Linear Path Space process in Fiserv’s AL Manager solution.<br />
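A toy illustration of the antithetic variable technique: estimating E[exp(Z)] for standard normal Z by pairing each draw z with its sign-flipped counterpart −z. The target function and sample size are my choices for illustration, not from the presentation.

```python
import math
import random

random.seed(3)

def antithetic_estimate(n_pairs):
    # Estimate E[exp(Z)], Z ~ N(0,1), whose true value is exp(0.5).
    # Each random draw z is paired with its antithetic partner -z,
    # doubling the effective sample size at negligible extra cost.
    total = 0.0
    for _ in range(n_pairs):
        z = random.gauss(0.0, 1.0)
        total += 0.5 * (math.exp(z) + math.exp(-z))  # average the pair
    return total / n_pairs

print(antithetic_estimate(50_000))  # close to exp(0.5) ~ 1.6487
```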
  26. 26. 26<br />Application to Valuation<br />Sampling Process - Variance Reduction Techniques<br />Importance Sampling – consists of sampling paths that are most important to the problem at hand. <br />Advanced Techniques - advanced variance reduction techniques include moment matching (quadratic re-sampling) and quasi-random sequences. See John Hull’s book “Options, Futures, and Other Derivatives” for an overview of these advanced variance reduction techniques.<br />Using a Combination of Variance Reduction Techniques<br />Variance reduction techniques are not mutually exclusive. They are often used in combination to produce better results than if used individually. <br />
  27. 27. 27<br />Application to Valuation<br />Option-adjusted Spread (“OAS”) – the one constant spread over all LIBOR rate paths that results in the “model” price equaling the observed “market” price. The OAS reflects credit risk, liquidity risk, and model risk. The OAS can be used to assess relative value among similar instruments.<br />Steps in the OAV Process for an MBS<br />Derive the instrument’s principal and interest cash flows under various randomly selected interest rate scenarios drawn from a binomial lattice. <br />Discount the instrument’s cash flows (“CFs”) by path-specific LIBOR rates to derive the present value (“PV”) for each path.<br />Average the PVs for all sampled rate paths to derive the “model” price.<br />Derive an OAS so that the average of the various path values for the instrument forces the “model” price to equal the observed “market” price.<br />If the instrument’s market price is not observable, then the OAS for a similar product may be used to derive the target instrument’s value.<br />
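The OAS search in the last step can be sketched as a bisection on a toy two-path example. The cash flows, rates, and market price below are invented for illustration; a production model would use hundreds of simulated LIBOR paths and prepayment-dependent cash flows.

```python
def path_pv(cash_flows, rates, oas):
    # PV of one path: discount each cash flow by the path-specific
    # short rates plus a constant spread (the OAS).
    pv, discount = 0.0, 1.0
    for cf, r in zip(cash_flows, rates):
        discount *= 1.0 + r + oas
        pv += cf / discount
    return pv

def model_price(paths, oas):
    # "Model" price = average PV across all sampled rate paths.
    return sum(path_pv(cfs, rs, oas) for cfs, rs in paths) / len(paths)

def solve_oas(paths, market_price, lo=-0.05, hi=0.05, tol=1e-8):
    # Model price falls as the spread rises, so bisect for the constant
    # spread that reproduces the observed market price.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if model_price(paths, mid) > market_price:
            lo = mid  # price still too high -> need a wider spread
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy example: two 3-period rate paths for a bond paying 5, 5, 105.
paths = [
    ([5.0, 5.0, 105.0], [0.030, 0.035, 0.040]),
    ([5.0, 5.0, 105.0], [0.030, 0.025, 0.020]),
]
oas = solve_oas(paths, market_price=103.0)
print(oas)
```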
  28. 28. 28<br />Application to Valuation<br />Visual Depiction of MBS OAV (assumes 100 paths) <br />
  29. 29. 29<br />Application to Valuation<br />Methods to Test Robustness of MBS Valuation<br />The question often arises about how to determine the appropriate number of random rate paths to sample. Is it 100, 1,000, or 10,000? The good news is that there are ways to determine if you are using enough paths to produce an accurate MBS price. The bad news is that the “correct” number of paths to produce an accurate MBS price may vary according to the characteristics and complexity of the particular MBS. <br />Price Convergence Tests<br />Run the model with 100 paths and calculate the standard error as described earlier. Re-run the model with increasingly more paths until you reach the point where the standard error is within the MBS bid-ask spread. *<br />Run the model with 100 paths and derive the OAS. Use the same OAS as a starting point, change the random seed, and re-run the model with the same number of paths. Note the difference in price between the two runs. The price difference should decrease as the number of paths tested increases. Run the test with an increasing number of paths until you reach the point where the difference in price with different seeds is within the MBS bid-ask spread. *<br />* Note: The tolerance for MBS price error is within the bid-ask spread for some MBS trading desks. However, the MBS price error tolerance may be less strict for asset/liability management purposes. <br />
  30. 30. 30<br />Application to Valuation<br />OAV/Dynamic Valuation Approach<br />Advantages<br />Provides more accurate prices for option-embedded products, such as MBS, relative to “static” valuation approach.<br />“Best practice” approach to pricing MBS and related instruments.<br />Approach produces an OAS which can be used to assess relative value among various securities for investment purposes.<br />Approach facilitates derivation of more accurate price risk measures for MBS, such as OAS duration and convexity, than “static” valuation approach.<br />Enables calculation of vega or volatility duration. These risk measures cannot be produced using the “static” approach.<br />Disadvantages<br />May require investments in human capital and systems development.<br />Requires more computer power and run time than “static” valuation approach.<br />Less transparent than “static” valuation approach.<br />Exposes user to more model risk relative to “static” valuation approach.<br />
  31. 31. 31<br />Application to Valuation<br />Example of MBS Analysis – as of 3/29/11<br />Instrument / Metrics <br />FNMA 30 OAS = 25 bps <br />Price 101.5 Static Spread = 62 bps<br />Coupon 4.5% Option Cost = 37 bps<br />WAC 5.0% OAS Duration = 5.8<br />WAM 355 months OAS Convexity = -1.1<br />Interest Rate Risk Analysis *<br />A 50 bps upward parallel rate shock leads to a price drop of 3.08<br />[2.94 of the price drop is due to duration risk]<br />[0.14 of the price drop is due to convexity risk]<br />* Based on OAS Duration and OAS Convexity estimates.<br />
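The shock arithmetic on this slide can be reproduced under one common scaling convention (shock expressed in percent, convexity contribution halved and scaled by the squared shock). The convention is my assumption, chosen because it matches the slide’s figures exactly.

```python
# Rate-shock price impact from OAS duration and convexity.
# Assumed convention: shock in percent units (0.50 = 50 bps).
price = 101.5
oas_duration = 5.8
oas_convexity = -1.1
shock_pct = 0.50  # 50 bps upward parallel shock

duration_effect = -oas_duration * (shock_pct / 100) * price
convexity_effect = 0.5 * oas_convexity * (shock_pct ** 2 / 100) * price

print(round(-duration_effect, 2))                       # 2.94 drop
print(round(-convexity_effect, 2))                      # 0.14 drop
print(round(-(duration_effect + convexity_effect), 2))  # 3.08 total drop
```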
  32. 32. 32<br />Application to Valuation<br />Example of MBS Analysis – as of 3/29/11<br />
  33. 33. 33<br />Application to Risk Management<br />
  34. 34. 34<br />Application to Risk Management<br />Value-at-Risk (“VaR”)<br />VaR is a probabilistic measure of market risk that can be used for the Trading Book or the Banking Book. The goal is to be able to make a statement such as “There is an X% chance that the loss on the portfolio will exceed $Y in a day.” VaR is usually defined as the 95th or 99th percentile dollar loss and can be based on any horizon, such as one day, ten days, or one year.<br />There are various approaches used to calculate VaR, such as Historical Simulation (HS), variance-covariance, principal components analysis, and Monte Carlo simulation. VaR methods are either 1) parametric or 2) non-parametric. Parametric approaches to calculating VaR assume a certain probability distribution while non-parametric approaches such as HS do not.<br />
  35. 35. 35<br />Application to Risk Management<br />Monte Carlo-based VaR <br />Monte Carlo-based VaR is based on a large number of random draws of risk factor(s), e.g., interest rate changes, from a given probability distribution. It is a parametric VaR approach. Just as with MBS valuation, a random set of rate paths sampled from a binomial lattice implementation of a one-factor short rate term structure model can be used. <br />The portfolio is revalued numerous times based on a large number of randomly generated risk factor(s). The sample of random portfolio gains or losses is then ordered from largest gain to largest loss. The loss that corresponds to a certain percentile, e.g., the 99th, is the VaR.<br />The accuracy of the VaR approach depends on the number of simulations run. The more simulations, the more accurate the resulting VaR. Variance reduction techniques can be employed when deriving VaR to cut down on the number of simulations needed to obtain accurate results. <br />
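A stripped-down Monte Carlo VaR sketch for a single risk factor. The normal distribution, DV01-style exposure, and volatility are illustrative assumptions; a full implementation would revalue the portfolio along simulated lattice paths as described above.

```python
import random

random.seed(11)

def monte_carlo_var(n_sims=100_000, confidence=0.99):
    # Simulate one-day P&L of a hypothetical portfolio exposed to a
    # single risk factor: a daily rate change drawn from N(0, 5 bp).
    position_dv01 = 50_000  # assumed $ loss per 1 bp rate rise
    pnl = []
    for _ in range(n_sims):
        rate_change_bp = random.gauss(0.0, 5.0)
        pnl.append(-position_dv01 * rate_change_bp)
    # Order the simulated P&L and read off the loss at the chosen
    # percentile; report VaR as a positive dollar amount.
    pnl.sort()
    index = int((1 - confidence) * n_sims)
    return -pnl[index]

print(monte_carlo_var())
```

With these assumptions the 99% one-day VaR should land near 2.33 x $50,000 x 5 bp, i.e., roughly $580,000.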
  36. 36. 36<br />Application to Risk Management<br />Monte Carlo VaR Approach<br />Advantages<br />More flexible and powerful than other approaches.<br />Can handle securities with very complex payoffs, e.g., path-dependent, non-linear, or dependent on multiple risk factors.<br />Can fully revalue positions and does not necessarily require the use of estimated values based on the duration/convexity metrics.<br />Disadvantages<br />May require investments in human capital and systems development.<br />Need more computer power and time to run relative to other approaches. <br />Less transparent than other less complex approaches.<br />Exposes user to more model risk relative to other less complex approaches.<br />VaR results are highly dependent on assumed probability distribution.<br />Need robust pricing functions for all instruments.<br />
  37. 37. 37<br />Application to Risk Management<br />“Stochastic” Earnings-at-Risk (“EaR”) <br />Stochastic EaR is a probabilistic measurement of interest rate risk from an earnings perspective. The metric allows the user to make a statement such as “There is an X% chance that net interest income will fall by more than $Y over the next N months.” The approach randomly simulates various interest rate scenarios over a specific horizon, e.g., 12 months, and calculates earnings, i.e., net interest income (“NII”), corresponding to each scenario.<br />Asset/Liability Managers traditionally rely on “deterministic” as opposed to “stochastic” rate scenarios to measure interest rate risk. An example of a deterministic scenario is a 200 basis point parallel rate shock. NII under this particular deterministic scenario could be calculated.<br />The use of stochastic EaR creates an expanded set of NII metrics to help management more fully understand rate risk. For example, a distribution of simulated NII figures enables the user to calculate the mean/expected value, the range, the min/max, and the standard deviation of NII over a given horizon. In addition, the probability of each NII outcome can be obtained. <br />
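A toy stochastic EaR simulation along the lines described above. The balance-sheet figures, rates, and normal shock distribution are all invented for illustration; the point is that a distribution of NII outcomes yields a richer set of metrics than one deterministic run.

```python
import random
import statistics

random.seed(5)

def simulate_nii(n_scenarios=10_000):
    # Toy 12-month NII simulation: asset yields reprice with a random
    # parallel rate move, liability costs are assumed fixed.
    assets, liabilities = 1_000.0, 900.0   # $ millions (illustrative)
    base_rate, funding_rate = 0.04, 0.02   # illustrative rates
    nii = []
    for _ in range(n_scenarios):
        rate_shock = random.gauss(0.0, 0.01)  # random rate scenario
        income = assets * (base_rate + rate_shock)
        expense = liabilities * funding_rate
        nii.append(income - expense)
    return nii

nii = simulate_nii()
# Distribution-based metrics: mean/expected NII, min/max, and stdev.
print(statistics.mean(nii), min(nii), max(nii), statistics.stdev(nii))
```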
  38. 38. 38<br />Application to Risk Management<br />“Stochastic” Earnings-at-Risk (“EaR”) <br />Advantages<br />Derive the likelihood of a particular earnings forecast.<br />Derive a richer set of risk metrics than deterministic EaR.<br />Gain fuller understanding of earnings risk exposure to interest rate changes.<br />Disadvantages<br />May require investments in human capital and systems development.<br />Need more computer power and time to run relative to deterministic EaR. <br />Less transparent than deterministic EaR.<br />Relatively more model risk than deterministic EaR.<br />Results highly dependent on probability distribution assumption.<br />Risk is typically measured over a short-term horizon, i.e., one or two years. The measure does not give an indication of risk beyond the specified horizon.<br />
  39. 39. 39<br />Application to Risk Management<br />Example of “Stochastic” EaR Report <br />
  40. 40. 40<br />Application to Risk Management<br />“Stochastic” Market Value of Equity (“MVE”)<br />“Stochastic” MVE follows the same approach as “Stochastic” EaR. The <br />major difference is that the variable of interest is MVE and not NII.<br />“Stochastic” MVE can be viewed as a long-term measure of risk since it <br />considers all principal and interest cash flows until the end of an instrument’s <br />life. This is in contrast to “Stochastic” EaR which is typically measured over a <br />short-term horizon. To get a fuller understanding of an institution’s short-term <br />and long-term exposure to interest rate risk both measures can be calculated.<br />“Monte Carlo” VaR versus “Stochastic” MVE<br />VaR distills market risk down to a single metric, i.e., the potential loss in value <br />of a trading book or banking book over a specific horizon at a specific <br />significance level. On the other hand, “Stochastic” MVE is a broader measure <br />of risk. There can conceivably be a large gap between the “worst” case MVE <br />produced by a “Stochastic” MVE analysis and the VaR.<br />
  41. 41. 41<br />Q & A<br />