The document provides an overview of stochastic modeling in the financial reporting world. It discusses (1) the definition and purpose of stochastic modeling, including its use for product design, forecasting, and risk management; (2) a generic modeling framework that involves economic scenario generation, random number generation, and stochastic modeling of assets and liabilities; and (3) an example of stochastic modeling of a guaranteed minimum income benefit rider to calculate reserve and capital requirements under different economic scenarios.
"Portfolio Optimisation When You Don’t Know the Future (or the Past)" by Rob...Quantopian
We generally assume the past is a good guide to the future, but how well do we even know the past? What effect does this uncertainty in the estimated inputs have on the notoriously unstable algorithms for portfolio optimization?
I explore this issue, look at some commonly used solutions, and also introduce some alternative methods.
Individuals' asset-class choices in their individual pension fund accounts appear consistent with the use of naive learning rules. Preliminary results from joint work with Felix Villatoro, Olga Fuentes and Pamela Searle.
MLX 2018 - Marcos López de Prado, Lawrence Berkeley National Laboratory Comp...Mehdi Merai Ph.D.(c)
Presented by: Marcos López de Prado, Lawrence Berkeley National Laboratory Computational Research Division
MLX FinTech Conference II, Toronto, May 2018.
More info at: https://www.machinelearningx.net
"Enhancing Statistical Significance of Backtests" by Dr. Ernest Chan, Managin...Quantopian
Insufficient historical data is a major hurdle in building a trading model free from data snooping bias. Dr. Chan's talk will discuss several techniques, some borrowed from machine learning, that can alleviate overfitting and enhance the statistical significance of a backtest.
In All About Factors, we cover the basics of what factors are, where we expect them to derive their excess returns from, their advantages and disadvantages, and whether there is indeed any merit to this approach or it is just another Wall Street marketing gimmick.
After covering the commonly accepted factors basics, we discuss expectations for factor investing, the theory as to why short-term pain must be present for long-term return, and some key considerations in moving from the academic research to creating investible portfolios.
Also explored is the current on-going debate between industry titans Rob Arnott (Research Affiliates) and Cliff Asness (AQR) as to the efficacy of using valuation-based spreads to time factor exposures.
Lastly, we look at different methods by which a retail investor can utilize smart-beta investing, highlighting some current industry techniques for diversifying factor exposures and building a multi-factor portfolio.
Algorithmic Finance Meetup: Starmine Short Interest Talk Quantopian
With the commoditization of such basic quant factors as value and momentum, in recent years systematic investors have turned more and more to sentiment based alpha signals. Aggregated open short interest level provides a profitable, low turnover signal rooted in buy-side sentiment, aka "the smart money." Dr. Stauth will cover the basics of short selling and data availability and will review the research and proprietary formulation of the StarMine short interest model as well as covering a range of sample trading strategies.
"Build Effective Risk Management on Top of Your Trading Strategy" by Danielle...Quantopian
Presented at QuantCon Singapore 2016, Quantopian's quantitative finance and algorithmic trading conference, November 11th.
Risk management is an essential but often overlooked prerequisite to success in trading. No one wants to see the substantial profits generated over a lifetime of trading vanish over a few bad trades.
In this talk, Danielle will discuss a quantitative understanding of risk. She will then share a few techniques in risk management, with a case study to show how a proper risk management system helps improve the overall performance of trading strategies.
"Trading Without Regret" by Dr. Michael Kearns, Professor at the Computer and...Quantopian
No-regret learning is a collection of tools designed to give provable performance guarantees in the absence of any statistical or other assumptions on the data (!), and thus stands in stark contrast to most classical modeling approaches. With origins stretching back to the 1950s, the field has yielded a rich body of algorithms and analyses covering problems ranging from forecasting from expert advice to online convex optimization.
Dr. Kearns will survey the field, with special emphasis on applications to quantitative finance problems, including portfolio construction and inventory risk.
"From Alpha Discovery to Portfolio Construction: Pitfalls and Solutions" by D...Quantopian
From QuantCon 2017: Implementation is the efficient translation of alpha research into portfolios. It includes portfolio construction and trading. It is a vital step in the quant equity workflow, as poor implementation can ruin even the best alpha ideas. Two crucial challenges must be solved: how to construct a portfolio that most efficiently captures a given alpha signal; and, in the presence of multiple signals, how to optimally combine them into a single composite alpha factor.
This talk addresses these challenges, examines common pitfalls in the implementation of quantitative strategies, and highlights good practices to avoid them. A common theme is striking the right balance between factor signal purity and investability. We look at how factor models and optimisation techniques help professional investors answer three key questions:
· What risks should your risk model be cognisant of?
· What objective function should you use?
· What effect do investability constraints have on your portfolio?
"Snake Oil, Swamp Land, and Factor-Based Investing" by Gary Antonacci, author...Quantopian
BlackRock forecasts smart beta investing oriented toward size, value, quality, momentum, and low volatility to reach $1 trillion by 2020 and $2.4 trillion by 2025. Gary’s talk will show that this growth may not be justified due to these factors' lack of robustness, consistency, persistence, intuitiveness, and investability. Gary will also show that the success attributed to these factors would be better directed toward macro momentum and the short interest ratio.
"A Framework-Based Approach to Building Quantitative Trading Systems" by Dr. ...Quantopian
Contrary to popular wisdom, the difference between a retail quant trader and a professional portfolio manager is not in "having better trade entry and exit rules". Rather, it is the difference in how each approaches portfolio optimisation and risk management.
Both of these topics are synonymous with heavy math, which can be off-putting for beginner retail systematic traders. Hence, it can be extremely daunting for those without institutional experience to know how to turn a set of trading rules into a robust portfolio and risk management system.
In this talk, Mike will discuss how to take a typical retail quant strategy and place it in a professional quantitative trading framework, with proper position sizing and risk assessment, without resorting to pages of formulas or the need to have a PhD in statistics!
Lamar Van Dusen | Optimal Turnover Revisited : Levels Corresponding to Highes...Lamar Van Dusen
Lamar Van Dusen explains "Optimal Turnover Revisited: Levels Corresponding to the Highest Net Return."
"Quantum Hierarchical Risk Parity - A Quantum-Inspired Approach to Portfolio ...Quantopian
Maxwell will present the methodologies and results behind the algorithm that has been developed by 1QBit, named Quantum Hierarchical Risk Parity, or QHRP.
This is an extension of the work done by Marcos López de Prado on Hierarchical Risk Parity in his paper "Building Diversified Portfolios that Outperform Out-of-Sample."
QHRP tackles the problem of minimizing the risk of a portfolio of assets using a quantum-inspired approach. Although the ideas surrounding this go back to Markowitz’s mean-variance portfolio optimization of 1952’s Portfolio Selection, we have applied recent quantum-ready machine learning tools to the problem to demonstrate strong performance in terms of a variety of risk measures and lower susceptibility to inaccuracies in the input data.
The quantum-ready approach to portfolio optimization is based on an optimization problem that can be solved using a quantum annealer. The algorithm utilizes a hierarchical clustering tree that is based on the covariance matrix of the asset returns. The results of benchmarking this approach on real market data against other common portfolio optimization methods will be shared in this presentation.
View the White Paper: https://bit.ly/2k5xTxW.
Predictive Model for Loan Approval Process using SAS 9.3_M1Akanksha Jain
This is a predictive model that uses logistic regression to help a German bank make statistically better loan approval decisions in the future. It uses a historical credit data set with 1,000 data points and 20 variables.
Tool used:
SAS 9.3_M1
Steps Involved are:
- Data Quality check using Correlations and VIF Tests
- Analysis of different Variable Selection Methods such as Forward, Backward and Stepwise
- Variable Selection on the basis of Parameter Estimates and Odds Ratio
- Outlier Analysis to identify the outliers and improve the model
- Final Model Selection Decision based on ROC curve, Percent Concordant, PROC Rank and Hosmer Lemeshow Test
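The workflow above (the original uses SAS 9.3) could be mirrored in Python as a rough sketch on synthetic stand-in data; the data layout, coefficients, and any thresholds below are illustrative assumptions, not the bank's actual model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the 1,000-row, 20-variable German credit data set
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
true_beta = np.zeros(20)
true_beta[:5] = [1.2, -0.8, 0.5, 0.9, -1.1]       # only 5 variables matter
y = (rng.random(1000) < 1 / (1 + np.exp(-(X @ true_beta)))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Odds ratios per variable (exp of the coefficients), as in the SAS output
odds_ratios = np.exp(model.coef_[0])

# Model discrimination summarized by the area under the ROC curve
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(auc > 0.7)
```

In practice the selection steps listed above (forward/backward/stepwise search, VIF checks, Hosmer-Lemeshow) would sit around this core fit.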
"A Framework for Developing Trading Models Based on Machine Learning" by Kris...Quantopian
Presented at QuantCon Singapore 2016, Quantopian's quantitative finance and algorithmic trading conference, November 11th.
Machine learning is improving facets of our lives as diverse as health screening, transportation and even our entertainment choices. It stands to reason that machine learning can also improve trading performance; however, the practical application is fraught with pitfalls and obstacles that nullify the benefits and present a high barrier to entry. Building on background information and introductory material, Kris will propose a framework for efficient and robust experimentation with machine learning methods for algorithmic trading. The framework's objective is to arrive at parsimonious models whose positive past performance is unlikely to be due to chance. The framework is demonstrated via practical examples of various machine learning models for algorithmic trading.
Stop Flying Blind! Quantifying Risk with Monte Carlo SimulationSam McAfee
Product development is inherently risky. While lean and agile methods are praised for supporting rapid feedback from customers through experiments and continuous iteration, teams could do a lot better at prioritizing using basic modeling techniques from finance. This talk will focus on quantitative risk modeling when developing new products or services that do not have a well understood product/market fit scenario. Using modeling approaches like Monte Carlo simulations and Cost of Delay scenarios, combined with qualitative tools like the Lean Canvas and Value Dynamics, we will explore how lean innovation teams can bring scientific rigor back into their process.
Presented 25-Sep-2013 for Borsa İstanbul's Vadeli İşlem ve Opsiyon Piyasası (VİOP; Futures and Options Market)
- popular strategies concentrate strikes & cause some skew
- implied probabilities and conditional payoffs are model-free
- gamma trading shows dynamic hedge issues
- volatility is not a normal "asset class"
- market maker's priorities for hedging jumps
- key hidden assumptions causing model risk
- important portfolio mismatch risks
- spotting real options & non-economic options
http://borsaistanbul.com/en/news/2013/09/26/borsa-istanbul-organizes-the-first-of-futures-and-options-market-seminar-series
Empowering Innovation Portfolio Decision-Making through SimulationSopheon
New product development is a complex, high-risk endeavor for any organization. In order to execute a game-changing innovation program, leaders must be willing to engage the unknowns around future markets and the technologies that will serve them.
This webinar discusses how simulation and specialized business processes can provide a risk-free proving ground to challenge and compare innovation strategies, thereby empowering analysts and executives to confidently make difficult investment decisions.
To view this webinar, go to http://budurl.com/zgs5
In this study we survey practices and supervisory expectations for stress testing (ST) in a credit risk framework for banking book exposures. We introduce and motivate ST, and discuss its function, supervisory requirements and expectations, credit risk parameters, and the interpretation of results with respect to ST. This includes a typology of ST (uniform testing, risk factor sensitivities, scenario analysis; and historical, statistical and hypothetical scenarios) and procedures for conducting ST. We conclude with two simple and practical stress testing examples, one a ratings-migration-based approach, and the other a top-down ARIMA modeling approach.
Objectives
• Know that standard NPV analysis does not account for real options
• Basic understanding of option pricing
– Black-Scholes formula
– Binomial model
• Know different types of real options and their implications
– Option to Expand
– Option to Wait
• Improve your ability to recognize valuable real options to make good business decisions
In this Spark session Ravi Saraogi talks about why estimating default risk in fund structures can be a challenging task. He presents on how this process has evolved over the years and the current methodologies for assessing such risks.
Discounted Cash Flow Methodology for Banks and Credit UnionsLibby Bierman
As institutions prepare for the CECL or current expected credit loss model for the allowance for loan and lease losses (ALLL), institutions are prudently learning the various methodologies available to them. Discounted Cash Flow or DCF is one proposed methodology. This session presents best practices and use cases for the ALLL methodology. See the recording: http://web.sageworks.com/dcf-webinar/
Stochastic Modeling
In The Financial Reporting World
Ron Harasym
AVP Financial Risk Management
TS 68
Society of Actuaries
2003 Washington DC Spring Meeting
Presentation Outline
I. Overview of Stochastic Modeling
II. A Generic Modeling Framework
III. Random Number Generation
IV. Economic Scenario Generation
V. Stochastic Modeling of a GMIB Rider
VI. Model Results & Sensitivity Testing
VII. Reserve & Capital Relief
VIII. Final Thoughts
I. Overview of Stochastic Modeling
Stochastic Modeling - Definition
• Stochastic [Greek stokhastikos, from stokhastēs, diviner, from stokhazesthai, to guess at, from stokhos, aim, goal.]
• A stochastic model by definition has at least one random variable and deals explicitly with time-variable interaction.
• A stochastic simulation uses a statistical sampling of multiple replicates, repeated simulations, of the same model.
• Such simulations are also sometimes referred to as Monte Carlo simulations because of their use of random variables.
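As a concrete illustration of "statistical sampling of multiple replicates", a minimal Monte Carlo sketch (the toy model and its parameters are illustrative assumptions, not from the presentation):

```python
import numpy as np

def simulate_fund_value(rng, years=10, mu=0.06, sigma=0.15, start=100.0):
    """One replicate: project a fund value with normal annual log-returns."""
    log_returns = rng.normal(mu, sigma, size=years)
    return start * np.exp(log_returns.sum())

rng = np.random.default_rng(seed=42)
replicates = np.array([simulate_fund_value(rng) for _ in range(10_000)])

# Statistical estimates, not exact results: mean and spread across replicates
print(round(replicates.mean(), 1), round(replicates.std(), 1))
```

Each replicate reruns the same model with fresh random draws; the distribution across replicates, not any single run, is the output.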
Stochastic Modeling - What it is
• A stochastic model is an imitation of a real-world system. It is an imprecise technique that provides only statistical estimates, not exact results.
• Stochastic modeling serves as a tool in a company’s risk measurement toolkit to provide assistance in:
• Product Design & Pricing
• Forecasting
• Financial Reporting
• Risk Management
• Simulations are used when the systems being modeled are too complex to be described by a set of mathematical equations for which a closed-form analytic solution is readily attainable.
• Part art, part science, part judgement, part common sense.
Stochastic Modeling - And What it isn’t
• Not a magical solution!
• Need to perform reality checks.
• Need to understand model limitations.
Advantages of Stochastic Modeling
• Systems with long time frames can be studied in compressed time.
• Able to assist in decision making and to quantify future outcomes arising from different actions/strategies before implementation.
• Can attempt to better understand properties of real world systems such as policyholder behavior.
• Potential reserve and regulatory capital relief.
• Pick-up on diversification benefits.
• You can watch your company fail over and over again!
Limitations of Stochastic Modeling
• Requires a considerable investment of time and expertise.
• Technically challenging, computationally demanding.
• Reliance on a few “good” people.
• For any given set of inputs, each scenario gives only estimates of the model’s outputs.
• May create a false sense of confidence - a false sense of precision.
• Relies heavily on data inputs and the identification of variable interactions.
• It is not possible to include all future events in a model.
• Results may be difficult to interpret.
• Effective communication of results may be even harder.
• Garbage in, Garbage out!
Stochastic Modeling is Preferred over Deterministic Modeling When:
• Risks are dependent.
• Dealing with skewed and/or discontinuous distributions/cost functions.
• There is significant volatility in the underlying variables.
• Outcomes are sensitive to initial conditions.
• There is path dependence.
• Volatility or skewness of underlying variables is likely to change over time.
• There are real economic incentives, such as reserve or capital relief, to perform stochastic modeling.
II. A Generic Modeling Framework
Is There Really A Starting and Ending Point? … No!
[Flowchart: the generic modeling framework. Historical economic data and a random number generator feed the economic scenario generator (ESG), which is governed by stochastic ESG parameters & assumptions and checked through data validation & ESG calibration. The resulting economic scenarios, together with policyholder input data (drawn from historical policyholder data and subject to liability data validation) and deterministic & stochastic asset and liability assumptions, drive the stochastic asset/liability models. Outputs flow through result tabulation, validation, & review into reported financial results and risk management measures.]
Where does one Start? Key Steps Are ...
• Identify the key objectives, key issues, and potential roadblocks before considering ways of solving the problem.
• Describe the process/model in general terms before proceeding to the specific.
• Develop the model: assumptions, input parameters, data, output.
• Fit the model: gather and analyze data, estimate input parameters.
• Implement the model.
• Analyze and test sensitivity of the model results.
• Communicate the results.
Points to Keep in Mind.
• Stochastic modeling is an evolutionary process.
• Learn to “walk” before you “run”.
• Recognize that no one model fits all solutions.
• Be careful of becoming married to the method, rather than the objective.
• Keep it simple, keep it practical, keep it understandable.
• Keep performing validation and reality checks throughout all modeling steps.
• Strive towards the production of actionable information.
III. Random Number Generation
Random Number Generator (RNG)
• Objective:
• To produce random numbers between 0 and 1
• Issues:
• The RNG is a foundation building block
• Critical, but often ignored/forgotten about!
• A poor RNG can compromise all subsequent modeling sophistication.
• Many RNGs to choose from.
• Desirable Characteristics to check for:
• Robustness independent of the seed number
• Periodicity
• Independence
• Fast, efficient, & effective algorithm
• Other statistical tests
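The checks listed above can be sketched as a few quick sanity tests on a uniform stream. This is an illustrative fragment, not a full test battery such as Diehard or TestU01; the generator choice, seed, and tolerances are my assumptions.

```python
import numpy as np

# Illustrative sanity checks on a uniform RNG stream.
# PCG64 is NumPy's default bit generator; seed and tolerances are arbitrary.
rng = np.random.default_rng(seed=12345)
u = rng.uniform(0.0, 1.0, size=100_000)

# Uniform(0,1) has mean 1/2 and variance 1/12.
assert abs(u.mean() - 0.5) < 0.01
assert abs(u.var() - 1.0 / 12.0) < 0.01

# Crude independence check: lag-1 autocorrelation should be near zero.
lag1 = np.corrcoef(u[:-1], u[1:])[0, 1]
assert abs(lag1) < 0.02
```

Robustness to the seed can be probed the same way: rerun the checks across many seeds and confirm none of them fail systematically.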
IV. Economic Scenario Generation
Economic Scenario Generator
• Objective:
• To produce capital market or economic scenarios
• Issues:
• Outputs determined by end requirements.
• Economic vs. Statistical model
• Arbitrage-Free vs. Equilibrium
• Calibration.
• Is the focus on the mean, median, or tail events?
• Many Economic Scenario Generators to choose from.
• Desirable Characteristics to check for:
• Integrated model (equity, interest rate, inflation, currency)
• Incorporates the principle of parsimony.
• Flexible. A component approach with variable modes.
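A toy sketch of the "integrated model" idea: correlated shocks for two components (say, equity and interest rates) can be built from independent normals via a Cholesky factor. The 0.3 correlation and sample size are purely illustrative assumptions.

```python
import numpy as np

# Toy sketch: correlated monthly shocks for two ESG components
# (equity and interest rates). The 0.3 correlation is illustrative.
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])
chol = np.linalg.cholesky(corr)

rng = np.random.default_rng(0)
z = rng.standard_normal((2, 120_000))  # independent standard normals
shocks = chol @ z                      # rows are now correlated at ~0.3

sample_corr = np.corrcoef(shocks)[0, 1]
```

The same construction extends to more components (inflation, currency) by enlarging the correlation matrix, which is what makes a component approach flexible.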
VI. Stochastic Modeling of a GMIB Rider
A Practical Example
• Product:
• Guaranteed Minimum Income Benefit Rider
• Objective:
• Produce Measures for Financial Reporting
• Calculate Total Balance Sheet Requirement (TBSR)
• Calculate Reserve & Capital Requirements
• Nature of the Situation:
• GMIB Guaranteed Account Value of $1.4B
• Market Account Value of $1.0B
• 5% Roll-up rate per annum
• Conservative interest and mortality assumptions in pricing
Economic Scenario Generation
• Economic Scenario Generator:
• Equity returns modeled using RSLN2 model
• Fixed income returns modeled using Cox-Ingersoll-Ross model
• Calibration Method:
• Maximum Likelihood Estimation
• Calibration Issues:
• Data is limited and often inconsistent/incorrect.
• Data validation is often given insufficient effort.
• Requires complex methods
• Historical data period vs. forecast horizon
• Frequency of re-calibration
• Simulation:
• 1000 scenarios, monthly frequency, 35 year projection horizon
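The fixed-income piece can be sketched as Euler-discretized Cox-Ingersoll-Ross short-rate paths at the simulation settings quoted above (1,000 scenarios, monthly steps, 35-year horizon). The parameter values below are illustrative placeholders, not the presenter's MLE-calibrated estimates.

```python
import numpy as np

# Sketch: Euler discretization of the Cox-Ingersoll-Ross short-rate model
#   dr = kappa * (theta - r) dt + sigma * sqrt(r) dW.
# All parameter values are illustrative, not calibrated.
def simulate_cir(r0=0.04, kappa=0.2, theta=0.06, sigma=0.08,
                 n_scenarios=1000, n_months=420, seed=42):
    dt = 1.0 / 12.0  # monthly steps; 420 months = 35 years
    rng = np.random.default_rng(seed)
    rates = np.empty((n_scenarios, n_months + 1))
    rates[:, 0] = r0
    for t in range(n_months):
        r = np.maximum(rates[:, t], 0.0)  # full truncation keeps sqrt real
        dw = rng.normal(0.0, np.sqrt(dt), size=n_scenarios)
        rates[:, t + 1] = (rates[:, t] + kappa * (theta - r) * dt
                           + sigma * np.sqrt(r) * dw)
    return rates
```

The RSLN2 equity component would be simulated analogously, switching between two lognormal regimes according to a fitted transition matrix.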
VII. Model Results & Sensitivity Testing
Conditional Tail Expectation: CTE(%)
• CTE is a conditional expected value based on downside risk.
• CTE can be defined as the average of the outcomes beyond a
specified percentile.
• The CTE(Q%) is calculated as the weighted-average of the worst
(100-Q)% results of the stochastic simulation.
• CTE is considered to be a more robust measure than percentiles.
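The definition above can be written as a minimal sketch; the function name, interface, and simple sorting approach are mine, not the presenter's.

```python
import numpy as np

# Minimal sketch of CTE(Q%): the average of the worst (100 - Q)% of
# simulated outcomes, with losses expressed as positive numbers.
def cte(losses, q):
    ordered = np.sort(np.asarray(losses, dtype=float))[::-1]  # worst first
    n_tail = max(1, int(round(len(ordered) * (100 - q) / 100.0)))
    return ordered[:n_tail].mean()
```

For example, on ten outcomes `cte(range(1, 11), 80)` averages the worst two values, and `cte(..., 0)` reduces to the plain mean of all scenarios.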
Stochastic Simulation Results:
• Recall
• GMIB Guaranteed Account Value of $1.4B
• Market Account Value of $1.0B
CTE     GMIB ($ millions)
95%     $204.3
90%     $177.2
80%     $145.8
75%     $133.9
70%     $123.8
65%     $114.9
60%     $106.9
 0%      $43.4
[Charts: (Negative) PV of GMIB Cash Flow by CTE ($0 to $450M vs. CTE 0%–100%) and PV of GMIB Cash Flow by Percentile (-$300M to $50M vs. percentile 0%–100%); series: Base Case, Equity Return = 6%, Lapse Rate ×0.5]
Present Value vs. Average Interest Rate per Scenario Scatter Plot
Stochastic Base Case: Target Equity Return = 8%, Target Interest Rate = 6%
[Scatter plot: average interest rate over the projection horizon (2%–12%) vs. PV of GMIB cash flow (-$300M to $100M)]
Present Value vs. Average Equity Return per Scenario Scatter Plot
Stochastic Base Case: Target Equity Return = 8%, Target Interest Rate = 6%
[Scatter plot: average equity return over the projection horizon (-5%–25%) vs. PV of GMIB cash flow (-$300M to $100M)]
Sensitivity Testing
• Quantifies the impact of an immediate change in an assumption or
variable.
• Useful for validating the model; a check on the modeled variable
interactions.
• Allows one to identify, and thereby direct more effort toward, key
assumptions or variables.
• GMIB Observations:
• Results are highly sensitive to the lapse and annuitization
assumptions.
• Results are moderately sensitive to the interest rate and the
equity return assumptions.
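Mechanically, a sensitivity run is just the base simulation repeated with one assumption shocked and the CTE ladder recomputed. The sketch below uses a toy stand-in for the full GMIB projection, so every function name and number in it is illustrative.

```python
import numpy as np

def cte(losses, q):
    # Average of the worst (100 - q)% of outcomes (losses positive).
    ordered = np.sort(np.asarray(losses, dtype=float))[::-1]
    n_tail = max(1, int(round(len(ordered) * (100 - q) / 100.0)))
    return ordered[:n_tail].mean()

def run_model(lapse_mult=1.0, seed=7):
    # Toy stand-in for the GMIB projection: PV of claims in $millions.
    # Halving lapses doubles the cost here purely for illustration.
    rng = np.random.default_rng(seed)
    pv_claims = rng.normal(120.0, 40.0, size=1000)
    return pv_claims / lapse_mult

base = run_model()
shocked = run_model(lapse_mult=0.5)  # "Lapse Rate x0.5" sensitivity
for q in (95, 90, 80, 70, 60, 0):
    print(f"CTE({q}%): base {cte(base, q):6.1f}, lapse x0.5 {cte(shocked, q):6.1f}")
```

Each sensitivity reuses the same scenarios (same seed), so differences in the CTE ladder reflect the shocked assumption rather than simulation noise.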
GMIB CTE Measures: Liability Assumption Sensitivity Testing
[Bar chart: CTE(95%), CTE(90%), CTE(80%), CTE(70%), CTE(60%), and CTE(0%) measures ($0–$300M) for the Base Case and sensitivities: Rider Charge -10bps, Current Pricing Spread -10bps, Pre-Ann Mort Decr 10%, Post-Ann Mort Decr 10%, Lapse Rate ×2, Lapse Rate ×0.5, Annuitization Rate ×2, Annuitization Rate ×0.5]
GMIB CTE Measures: Investment Assumption Sensitivity Testing
[Bar chart: CTE(95%), CTE(90%), CTE(80%), CTE(70%), CTE(60%), and CTE(0%) measures ($0–$300M) for the Base Case and sensitivities: Equity Return = 10%, 9%, 7%, 6%; LT Yield = 8%, 7%, 5%, 4%]
VIII. Reserve & Capital Relief
Why Perform Stochastic Modeling?
• AAA capital recommendations and MMMM promote the use of
stochastic approaches.
• Proposed changes to US GAAP reserving for GMDB and GMIB
benefits also promote the use of stochastic approaches.
• Canadian MCCSR requirements favor the use of stochastic
approaches.
IX. Final Comments & Other Issues
Recommended Practices
• Keep focused on the business objectives.
• No one model fits all. Best to understand fundamentals.
• Cultivate “best practices”. Keep it simple and practical.
• Don’t use a sledgehammer to crack a walnut.
• Focus on accuracy first, precision second.
• Add complexity on a cost/benefit basis.
• Perform reality checks.
• Don’t ignore model and data validation procedures.
• Avoid the creation of “black boxes”.
• Constantly loop back through the process.
Other Issues to Wrestle With
• Some models generate more volatility in results than others. How
do we choose between them?
• How do we perform calibration and parameter estimation?
• How do we model fixed-income returns?
• How do we capture the correlations between markets?
• How many scenarios do we use?
• How do we model policyholder behavior?
• How do we incorporate hedging in the model?