The document discusses stochastic mortality models for modelling longevity risk. It compares deterministic versus stochastic models and describes various stochastic mortality models, including Lee-Carter and CBD models. It then discusses how to apply stochastic models by calibrating models with historical data to generate simulated mortality rates and cash flows, and compares the two-factor CBD model to the Lee-Carter model based on statistical tests. The two-factor CBD model predicts a smoother distribution of pension liabilities and is identified as the most appropriate model.
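To make the comparison concrete, here is a minimal simulation sketch of a two-factor CBD-style model, in which logit q(x,t) = κ1(t) + κ2(t)(x − x̄) and the two period factors follow a bivariate random walk with drift. All parameter values are hypothetical, chosen only to illustrate the mechanics, and are not taken from the document.

```python
import numpy as np

def simulate_cbd(kappa1_0, kappa2_0, drift, cov, ages, n_years, seed=0):
    """Simulate mortality rates q(x, t) under a two-factor CBD-style model."""
    rng = np.random.default_rng(seed)
    ages = np.asarray(ages, dtype=float)
    xbar = ages.mean()
    k = np.array([kappa1_0, kappa2_0], dtype=float)
    q_paths = np.empty((n_years, ages.size))
    for t in range(n_years):
        # Bivariate random walk with drift for the two period factors
        k = k + np.asarray(drift) + rng.multivariate_normal([0.0, 0.0], cov)
        logit_q = k[0] + k[1] * (ages - xbar)
        q_paths[t] = 1.0 / (1.0 + np.exp(-logit_q))  # inverse logit
    return q_paths

ages = np.arange(60, 90)
q = simulate_cbd(kappa1_0=-3.0, kappa2_0=0.1, drift=[-0.02, 0.0005],
                 cov=[[1e-4, 0.0], [0.0, 1e-8]], ages=ages, n_years=25)
```

Each simulated path of q feeds a cash-flow projection; repeating the simulation many times yields the liability distribution whose smoothness the document compares across models.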
Coherent mortality forecasting using functional time series models – Rob Hyndman
The document discusses coherent mortality forecasting using functional time series models. It describes modeling mortality rates over time as functional time series, where the rates are modeled as the sum of mean and deviation functions plus error. Mortality rates for different groups like males and females are expected to behave similarly over time. The model decomposes the rates into principal components to obtain scores that can be forecast individually with univariate time series models. This allows forecasting future mortality rates coherently across groups so the forecasts do not diverge over time. Existing functional models do not impose coherence across groups.
This document discusses demographic forecasting using functional data analysis. It presents a functional linear model to model and forecast age-specific demographic rates like mortality and fertility over time. The model represents rates as curves that vary annually based on common age patterns, principal components of variation, and residuals. The document outlines how the model can be used to analyze outliers, produce functional forecasts, forecast groups of populations, and generate population forecasts.
Modeling and forecasting age-specific mortality: Lee-Carter method vs. Functi... – hanshang
The document discusses four topics: 1) the Lee-Carter model for modeling and forecasting age-specific mortality rates, 2) nonparametric smoothing of functional data, 3) functional principal component analysis (FPCA) as a dimension reduction technique, and 4) functional time series forecasting. FPCA decomposes the variability in functional data into orthogonal principal components to extract the most important patterns in the data with few dimensions.
A value at risk framework for longevity risk printversion 0 – Okuda Boniface
This document presents a framework for determining how much the value of a longevity liability could change over one year based on new information. It discusses three existing approaches - the stressed-trend method, mortality-shock method, and a value-at-risk proposal. The paper then proposes a new general framework that can work with various stochastic mortality projection models to estimate the one-year change in longevity liability. It describes components of longevity risk and only addresses the trend risk component within this framework. The framework avoids nested simulations and allows practitioners to explore the impact of model risk.
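As a rough illustration of the one-year idea (not the paper's actual framework), the sketch below simulates one year of new information as a shock to a mortality-improvement trend and then revalues an annuity deterministically conditional on each shocked trend, which is what lets the calculation avoid nested simulation. The model, parameters and shock distribution are all simplified stand-ins.

```python
import numpy as np

def annuity_value(q65, improvement, n_years=40, rate=0.03):
    """Present value of a whole-life annuity of 1 p.a. from age 65."""
    value, survival, q = 0.0, 1.0, q65
    for t in range(1, n_years + 1):
        q = q * (1.0 - improvement)        # mortality improves each year
        survival *= (1.0 - min(q, 1.0))
        value += survival / (1.0 + rate) ** t
    return value

rng = np.random.default_rng(2)
base_improvement = 0.02
base_value = annuity_value(0.01, base_improvement)
# One year of new information: simulate shocks to the improvement trend,
# then revalue deterministically conditional on each shocked trend.
shocked = base_improvement + rng.normal(0.0, 0.005, size=10_000)
values = np.array([annuity_value(0.01, imp) for imp in shocked])
var_99_5 = np.quantile(values - base_value, 0.995)
```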
Mortality Product Development Symposium 2008 – Yuhong Xue
The document summarizes a presentation on modeling annuitant mortality using generalized linear models (GLMs). It discusses measuring current mortality experience, developing mortality improvement trends, and examples of GLM analyses showing the effects of factors like annuity amount, calendar year, joint life status, and birth cohort on mortality. Predictive modeling techniques from property and casualty are applied to better understand the "true" influence of multiple risk factors on mortality simultaneously.
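A mortality GLM of this kind is commonly a Poisson regression for death counts with a log-exposure offset. The sketch below fits such a model by iteratively reweighted least squares in plain NumPy; the data and the Gompertz-like parameters are synthetic, invented for illustration.

```python
import numpy as np

def fit_poisson_glm(X, deaths, exposure, n_iter=25):
    """IRLS for a Poisson GLM with log link and offset log(exposure)."""
    offset = np.log(exposure)
    # Crude starting values from a log-rate regression
    beta = np.linalg.lstsq(X, np.log((deaths + 0.5) / exposure), rcond=None)[0]
    for _ in range(n_iter):
        eta = X @ beta + offset
        mu = np.exp(eta)                       # expected death counts
        z = eta - offset + (deaths - mu) / mu  # working response
        W = mu                                 # working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(3)
ages = rng.uniform(60, 90, size=2000)
exposure = rng.uniform(50, 500, size=2000)
X = np.column_stack([np.ones_like(ages), ages])
true_beta = np.array([-10.0, 0.09])            # hypothetical Gompertz-like slope
deaths = rng.poisson(np.exp(X @ true_beta) * exposure)
beta_hat = fit_poisson_glm(X, deaths, exposure)
```

Extra rating factors (annuity amount band, calendar year, cohort) enter as further columns of X, which is how a GLM exposes the influence of each factor with the others held fixed.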
Harnessing Data to Improve Health Equity - Dr. Ali Mokdad – Lauren Johnson
1) The document discusses methods used by the Institute for Health Metrics and Evaluation (IHME) to conduct comprehensive analyses of global, national, and subnational disease burden through their Global Burden of Disease (GBD) study.
2) Key methods discussed include garbage code redistribution to reassign unspecified causes of death, Bayesian meta-regression to estimate incidence and prevalence, and small area statistical models that borrow strength across space, time, and covariates to produce estimates of disease burden for locations with limited data.
3) The GBD study aims to quantify health loss from major diseases, injuries, and risk factors globally and over time in order to help identify and address the world's most pressing health challenges.
Multivariate Regression using Skull Structures – Justin Pierce
The study aimed to develop a method for predicting the age of human remains based on measurements of the occipital condyles. Measurements of length, width, and height were taken from 68 juvenile specimens and used to generate linear regression models. The best model predicted age based on right condyle length and width, explaining 15% of variation, though neither model met the accuracy standard for legal admissibility. Limitations included a small sample size and age range, suggesting a larger, more diverse sample could improve predictive power.
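The study's modeling step is ordinary least squares with R² as the share of age variation explained. A minimal sketch on synthetic measurements (not the study's data), constructed so the fit is deliberately weak:

```python
import numpy as np

def ols_r_squared(X, y):
    """Fit y = b0 + X @ b by least squares; return (coefficients, R^2)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return beta, 1.0 - np.sum(resid**2) / np.sum((y - y.mean())**2)

rng = np.random.default_rng(4)
n = 68                                   # same sample size as the study
length = rng.normal(20.0, 2.0, n)        # hypothetical condyle length (mm)
width = rng.normal(11.0, 1.5, n)         # hypothetical condyle width (mm)
# Relationship made deliberately weak, echoing the study's low R^2
age = 2.0 + 0.3 * length + 0.2 * width + rng.normal(0.0, 3.0, n)
beta, r2 = ols_r_squared(np.column_stack([length, width]), age)
```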
Prediction, Big Data, and AI: Steyerberg, Basel Nov 1, 2019 – Ewout Steyerberg
Title: "Clinical prediction models in the age of artificial intelligence and big data", presented at the Basel Biometrics Society seminar, Nov 1, 2019, Basel, by Ewout Steyerberg, with substantial input from Maarten van Smeden and Ben van Calster
Clinical Trials Versus Health Outcomes Research: SAS/STAT Versus SAS Enterpri... – cambridgeWD
Clinical trials and health outcomes research differ in important ways that impact statistical modeling approaches. Clinical trials typically use homogeneous samples and focus on a single endpoint, while health outcomes data is heterogeneous with multiple endpoints. Predictive modeling techniques used in health outcomes research, like those in SAS Enterprise Miner, are better suited than traditional methods as they can handle complex real-world data without strong assumptions and more accurately predict rare events. Validation of models on separate test data is also important for generalizing results.
Clinical Trials Versus Health Outcomes Research: SAS/STAT Versus SAS Enterpri... – cambridgeWD
This document discusses the differences between clinical trials and health outcomes research. Clinical trials use homogeneous samples, surrogate endpoints, and focus on a single outcome. They are also typically underpowered for rare events. Health outcomes research uses heterogeneous data from the general population to examine multiple real endpoints simultaneously. It has larger samples and data that allow analysis of rare occurrences. Predictive modeling is better suited than traditional statistical methods for analyzing heterogeneous health outcomes data due to relaxed assumptions like normality.
Presentation by U. Devrim Demirel, CBO's Fiscal Policy Studies Unit Chief, and James Otterson at the 28th International Conference of The Society for Computational Economics.
This document is a dissertation submitted by Jekaterina Pasecnika for their MSc in Actuarial Science. It evaluates the goodness-of-fit and forecast accuracy of four stochastic mortality models - Lee-Carter, Lee-Miller, Booth-Maindonald-Smith and Hyndman-Ullah - using population data from five countries. The Booth-Maindonald-Smith model had the best fit according to the Bayesian information criterion for all countries except Latvia, where the Hyndman-Ullah model was best. However, residual and observed vs fitted plots indicated the Hyndman-Ullah model fitted the data better overall. The models generally underestimated future male mortality except in
This document presents a Bayesian approach to the Munich Chain-Ladder (MCL) method for claims reserving. The MCL method aims to optimize the use of both paid and incurred claims data in estimating ultimate claims, as these two data sources often produce different results. The document provides background on the MCL method and its goal of reconciling estimates from paid and incurred claims triangles. It then describes applying a Bayesian formulation to the MCL method, using MCMC simulations. This allows evaluating the results from a stochastic perspective while maintaining the structure of the deterministic MCL method.
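For orientation, the snippet below sketches the plain deterministic chain-ladder step that the Munich method builds on: volume-weighted development factors estimated from one cumulative triangle and used to project each accident year to ultimate. The MCL method adds the paid-versus-incurred reconciliation, and the Bayesian version wraps this in MCMC; neither is reproduced here. The triangle is illustrative.

```python
import numpy as np

# Cumulative paid claims; rows = accident years, np.nan = not yet observed
triangle = np.array([
    [100.0, 160.0, 190.0, 200.0],
    [110.0, 170.0, 205.0, np.nan],
    [120.0, 185.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

def development_factors(tri):
    """Volume-weighted link ratios f_j = sum C_{i,j+1} / sum C_{i,j}."""
    factors = []
    for j in range(tri.shape[1] - 1):
        mask = ~np.isnan(tri[:, j]) & ~np.isnan(tri[:, j + 1])
        factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())
    return np.array(factors)

def complete_triangle(tri, factors):
    """Project each row to ultimate with the chain-ladder factors."""
    out = tri.copy()
    for i in range(out.shape[0]):
        for j in range(out.shape[1] - 1):
            if np.isnan(out[i, j + 1]):
                out[i, j + 1] = out[i, j] * factors[j]
    return out

f = development_factors(triangle)
ultimate = complete_triangle(triangle, f)[:, -1]
```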
Longevity Risk in Fair Valuing Level-Three Assets in Securitized Portfolios -... – Peter Mazonas
This document presents a methodology for valuing level 3 assets, such as life settlements, that incorporates longevity predictive modeling and Bayesian statistics. It addresses challenges in valuing assets where the underlying value is determined by an individual's mortality, including variability between individual life expectancy estimates. The methodology aims to reconcile differences in estimates to establish ongoing fair valuation of portfolios, consistent with accounting and auditing standards. It involves using a "Longevity Cost Calculator" to value individual policies and entire portfolios.
Due to advancements in various data acquisition and storage technologies, different disciplines have attained the ability not only to accumulate a wide variety of data but also to monitor observations over longer time periods. In many real-world applications, the primary objective of monitoring these observations is to estimate when a particular event of interest will occur in the future. One of the major difficulties in handling such problems is the presence of censoring: the event of interest is unobservable in some instances, either because of time limitations or loss to follow-up. Due to censoring, standard statistical and machine learning based predictive models cannot readily be applied to analyze the data. An important subfield of statistics called survival analysis provides different mechanisms to handle such censored data problems. In addition to the presence of censoring, such time-to-event data also presents several other research challenges, such as instance/feature correlations, high dimensionality, temporal dependencies, and the difficulty of acquiring sufficient event data in a reasonable amount of time. To tackle such practical concerns, the data mining and machine learning communities have started to develop more sophisticated and effective algorithms that either complement or compete with the traditional statistical methods in survival analysis. In spite of the importance of this problem and its relevance to real-world applications, this research topic is scattered across various disciplines. In this tutorial, we will provide a comprehensive and structured overview of both statistical and machine learning based survival analysis methods along with different applications. We will also discuss the commonly used evaluation metrics and other related topics. The material will be coherently organized and presented to help the audience get a clear picture of both the fundamentals and the state-of-the-art techniques.
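The way censoring is handled can be seen in the Kaplan-Meier estimator, the most basic survival-analysis tool: censored subjects leave the risk set without being counted as events. A from-the-definition sketch on illustrative data:

```python
import numpy as np

def kaplan_meier(times, observed):
    """Return (distinct event times, S(t) just after each event time)."""
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    event_times = np.unique(times[observed])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(times >= t)          # still under observation at t
        deaths = np.sum((times == t) & observed)
        s *= 1.0 - deaths / at_risk           # product-limit update
        surv.append(s)
    return event_times, np.array(surv)

# 1 = event observed, 0 = censored (lost to follow-up or study ended)
times    = [2, 3, 3, 5, 6, 7, 8, 8]
observed = [1, 1, 0, 1, 0, 1, 0, 0]
t, s = kaplan_meier(times, observed)
```

Note how the subject censored at time 3 still counts in the risk set at time 3 but never contributes a death, which is exactly what standard regression models cannot express.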
Modelling Credit Risk of Portfolios of Consumer Loans – Madhur Malik
1) The document develops a Markov chain model to assess credit risk in portfolios of consumer loans based on consumer credit ratings known as behavioral scores.
2) Behavioral scores are calculated monthly and act as proxies for creditworthiness, similar to corporate credit ratings. The model uses historical behavioral score data to construct transition matrices showing how scores change over time.
3) By modeling behavioral score migrations, the framework generates multi-period default rate forecasts and assesses long-term portfolio risk, helping lenders with decisions like capital requirements.
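The transition-matrix idea in points 2) and 3) can be sketched in a few lines: count month-to-month moves between coarse score bands, normalize rows into probabilities, and read multi-period default probabilities off powers of the matrix. The bands and histories below are hypothetical, with default ("D") treated as absorbing.

```python
import numpy as np

states = ["high", "medium", "low", "D"]

def estimate_transition_matrix(histories, states):
    """Count observed monthly moves and normalize rows to probabilities."""
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for h in histories:
        for a, b in zip(h, h[1:]):
            counts[idx[a], idx[b]] += 1
    counts[idx["D"], idx["D"]] += 1.0         # keep default absorbing
    return counts / counts.sum(axis=1, keepdims=True)

histories = [
    ["high", "high", "medium", "high"],
    ["medium", "low", "low", "D"],
    ["high", "medium", "medium", "low"],
    ["low", "low", "D", "D"],
]
P = estimate_transition_matrix(histories, states)
# Probability of default within 12 months, by starting score band
p_default_12m = np.linalg.matrix_power(P, 12)[:, states.index("D")]
```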
This document outlines a methodology for modeling multi-population longevity risk across Canadian provinces. It begins with a literature review on single- and multi-population longevity modeling. Next, it describes retrieving Lee-Carter mortality indices for 9 Canadian provinces and testing for cointegration among the indices. Finally, it discusses estimating vector autoregression and vector error correction models to forecast mortality and evaluate their ability to price annuities for different cohorts across provinces.
IRJET- Analysis of Automated Detection of WBC Cancer Diseases in Biomedical P... – IRJET Journal
This document discusses the automated detection of white blood cell (WBC) cancer diseases like leukemia and myeloma using machine learning techniques. It proposes using a random forest classifier for the final diagnosis decision. The methodology aims to reduce misdiagnosis by learning disease parameters from tissue samples, evaluating texture features, and reducing image noise. Experimental results show improved mean accuracy and texture feature values, with the noise reduction improving the final results.
The effectiveness of various analytical formulas for estimating R² shrinkage in multiple regression analysis was investigated. Two categories of formulas were identified: estimators of the squared population multiple correlation coefficient (ρ²) and estimators of the squared population cross-validity coefficient (ρ²c). The authors compared the effectiveness of the analytical formulas for determining R² shrinkage against the squared population multiple correlation coefficient and the number of predictors; after examining all combinations among the variables, the maximum correlation was selected to compute both categories of formulas. The results indicated that, among the 6 analytical formulas designed to estimate the population ρ², the Olkin & Pratt formula-1 for six variables performed best, followed by the Burket formula and Lord formula-2 among the 9 analytical formulas; these were found to be the most stable and satisfactory.
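To make the two categories concrete, here are two of the classical formulas involved: Wherry's adjusted R² (an estimator of the squared population multiple correlation ρ²) and Lord's formula (an estimator of the squared cross-validity coefficient ρ²c). Only these two are sketched; the remaining formulas follow the same pattern of deflating the sample R² by the sample size n and number of predictors k.

```python
def wherry_adjusted_r2(r2, n, k):
    """Wherry's formula: estimates the squared population multiple correlation."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

def lord_cross_validity(r2, n, k):
    """Lord's formula: estimates the squared population cross-validity coefficient."""
    return 1.0 - (1.0 - r2) * (n + k + 1) / (n - k - 1)

# Example: R^2 = 0.50 observed with n = 100 cases and k = 6 predictors
r2, n, k = 0.50, 100, 6
adj = wherry_adjusted_r2(r2, n, k)
cv = lord_cross_validity(r2, n, k)
```

Cross-validity shrinks more than the adjusted value (cv < adj < r2), reflecting the extra penalty for applying estimated weights to new data.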
Developing and validating statistical models for clinical prediction and prog... – Evangelos Kritsotakis
Talk on clinical prediction models presented at the Joint Seminar Series in Translational and Clinical Medicine organised by the University of Crete Medical School, the Institute of Molecular Biology and Biotechnology of the Foundation for Research and Technology Hellas (IMBB-FORTH), and the University of Crete Research Center (UCRC), Heraklion [online], Greece, April 7, 2021.
Presentation to CCG - Capita Health Freakononics v3 – Mike Thorogood
This document discusses using econometric modeling and statistical analysis to understand factors that influence weight gain and loss. It presents an initial model that links weight to food consumption. The model is then developed to also account for exercise and different activities. The document outlines testing the model by examining the overall fit and significance of individual variables. It also discusses checking for issues like collinearity between variables and establishing causality. Further tests are described to identify patterns in the residuals and improve model specification. Applications of similar modeling for targeted health interventions and estimating cost savings are briefly mentioned.
Selection of Research Material relating to RiskMetrics Group CDO Manager – quantfinance
This document provides a summary of four approaches to modeling default rates over multiple time periods for use in analyzing collateralized debt obligations (CDOs). The models are calibrated to the same input data and their resulting default distributions are examined. Significant differences are found between the models, attributed to their structural differences and distributional assumptions. For single-period models, previous studies found models produce similar results when calibrated to the same data, but model choice is more important for multi-period modeling of CDO structures. The impacts of model structures and assumptions require more research for analyzing CDO performance over time.
Estimating ambiguity preferences and perceptions in multiple prior models: Ev... – Nicha Tatsaneeyapan
The document presents research estimating ambiguity preferences and perceptions using a multiple prior model. Key findings include:
1) The α-MaxMin model best explains choices under ambiguity, with one parameter (α) measuring ambiguity aversion and another (δ) quantifying perceived ambiguity.
2) On average, Americans are slightly ambiguity averse (α=0.56), but perceptions of ambiguity vary (δ=0.40).
3) Ambiguity aversion is more common for moderate-to-high likelihood gains, but ambiguity seeking prevails for low likelihoods and losses.
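The α-MaxMin rule itself is simple enough to state in a few lines: the value of an ambiguous act is a weighted mix of its worst-case and best-case expected utilities over a set of priors, with α the weight on the worst case. The payoffs and prior set below are illustrative; only the mean α = 0.56 is taken from the findings.

```python
import numpy as np

def alpha_maxmin_value(payoffs, priors, alpha):
    """alpha * worst-case EU + (1 - alpha) * best-case EU over the priors."""
    eus = np.array([np.dot(p, payoffs) for p in priors])
    return alpha * eus.min() + (1.0 - alpha) * eus.max()

payoffs = np.array([100.0, 0.0])             # win / lose
# Perceived ambiguity: win probability somewhere between 0.3 and 0.7
priors = [np.array([0.3, 0.7]), np.array([0.5, 0.5]), np.array([0.7, 0.3])]
v_averse  = alpha_maxmin_value(payoffs, priors, alpha=0.56)  # mean alpha found
v_neutral = alpha_maxmin_value(payoffs, priors, alpha=0.5)
```

With α above 0.5 the worst case gets more weight, so the ambiguity-averse value sits below the neutral one.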
The study focused on developing and validating a multi-state model to predict multimorbidity of cardiovascular disease, type 2 diabetes, and chronic kidney disease. The presentation walks through the complete process: acquiring, filtering, and splitting the data; developing the prediction model on the training data; validating the generated model on the testing data; and comparing its accuracy.
The following GitHub repository contains the R scripts required to complete this investigation.
https://github.com/jmanali1996/Multimorbidity-Multistate-Model.git
MIRA Risk Review: Multivariate metabolic risk calculator – Munich Re
Worldwide, cardiovascular diseases (CVDs) are among the most widespread causes of death. They also play an important role in morbidity and disability. Providing unprecedented detail and precision, the highly sophisticated multivariate metabolic risk calculator looks at all four main cardiovascular risk factors – excessive weight, blood pressure, blood lipids and blood glucose – as well as their correlation and interaction. This MIRA Risk Review paper looks at the multivariate metabolic risk calculator, a new methodology offering far greater accuracy.
Munich Re’s internet-based underwriting tool MIRA is a high-performance integrated solution that fits seamlessly into your workflow. MIRA gives you instant access to a vast and continuously evolving pool of rating recommendations as well as interactive support from Munich Re underwriting experts – and enables you to process and store the resulting documentation in your own data infrastructure.
For more information on MIRA, visit: http://bit.ly/MIRA-Risk-Review
This document summarizes a presentation on pricing mortality solutions for changing environments. The presentation discusses factors affecting life insurance pricing, trends in mortality rates, demographic trends, techniques for modeling mortality, pricing effects and risk management, and threats and opportunities. Specifically, it examines mortality trends in different regions, the impact of changing demographics on dependency ratios, methods for smoothing mortality data and fitting population tables, and how shifts in mortality assumptions affect pricing of term insurance, annuities, and pension funds. The presentation also explores opportunities for general pension funds, variable retirement benefits that share risk, and new product development.
ISCB 2023 Sources of uncertainty b.pptx – BenVanCalster
This document discusses sources of uncertainty in clinical prediction models. It identifies several types of uncertainty including aleatory uncertainty, epistemic uncertainty, approximation/estimation uncertainty, model uncertainty, data uncertainty, and population uncertainty. It illustrates these uncertainties using a model to predict ovarian cancer risk. Accounting for different sources of uncertainty, the predicted risks for individual patients can vary by over 50 percentage points. The document concludes that completely quantifying uncertainty is impossible and that transparency around uncertainty is important for clinical use and risk communication with patients.
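One of the listed sources, estimation uncertainty, is easy to visualize with a bootstrap: refit on resampled training data and watch how much a single predicted risk moves. The toy model below is just an event-rate estimate within one risk group, far simpler than the ovarian cancer model discussed, but the mechanism is the same.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic training outcomes for one risk group: 1 = event, 0 = no event
outcomes = rng.binomial(1, 0.2, size=150)

# Each bootstrap resample gives a slightly different "model" and hence a
# slightly different predicted risk for a patient in this group.
boot_risks = np.array([
    rng.choice(outcomes, size=outcomes.size, replace=True).mean()
    for _ in range(2000)
])
lo, hi = np.quantile(boot_risks, [0.025, 0.975])
spread = hi - lo          # width of the 95% uncertainty interval
```

Even this single-parameter model shows a non-trivial interval width from estimation uncertainty alone; richer models stack the other uncertainty sources on top, which is the document's point.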
This paper demonstrates to academics and the profession that the current method of computing retirement income arrives at a single solution applicable only to today; it does not model the future as currently interpreted. Our paper contrasts the difference between a calculation and a "multi-cast" simulation model.
Our research summary paper is published in the Journal of Financial Planning, Nov 2016. A link to the paper is available here "Combining Stochastic Simulations and Actuarial Withdrawals into One Model." ( http://bit.ly/2eLBUq9 )
Our working paper documenting our research project won the CFP® Board Best Research Paper Award at the 2016 Academy of Financial Services ( http://academyfinancial.org/ ) annual conference through an academic panel using a blind review process. "Certainty of Lifestyle: Contrasting a Simulation Over a Fixed Period versus Multiple Period Models" ( http://bit.ly/2dWtuNz )
In early Nov 2016, two blog posts will go into more insights from the research: Just where does the fear of outliving our money come from? Part I with link to Part II. ( http://wp.me/p2Oizj-H2 )
Presentation to CCG - Capita Health Freakononics v3Mike Thorogood
This document discusses using econometric modeling and statistical analysis to understand factors that influence weight gain and loss. It presents an initial model that links weight to food consumption. The model is then developed to also account for exercise and different activities. The document outlines testing the model by examining the overall fit and significance of individual variables. It also discusses checking for issues like collinearity between variables and establishing causality. Further tests are described to identify patterns in the residuals and improve model specification. Applications of similar modeling for targeted health interventions and estimating cost savings are briefly mentioned.
Selection of Research Material relating to RiskMetrics Group CDO Managerquantfinance
This document provides a summary of four approaches to modeling default rates over multiple time periods for use in analyzing collateralized debt obligations (CDOs). The models are calibrated to the same input data and their resulting default distributions are examined. Significant differences are found between the models, attributed to their structural differences and distributional assumptions. For single-period models, previous studies found models produce similar results when calibrated to the same data, but model choice is more important for multi-period modeling of CDO structures. The impacts of model structures and assumptions require more research for analyzing CDO performance over time.
Estimating ambiguity preferences and perceptions in multiple prior models: Ev...Nicha Tatsaneeyapan
The document presents research estimating ambiguity preferences and perceptions using a multiple prior model. Key findings include:
1) The α-MaxMin model best explains choices under ambiguity, with one parameter (α) measuring ambiguity aversion and another (δ) quantifying perceived ambiguity.
2) On average, Americans are slightly ambiguity averse (α=0.56), but perceptions of ambiguity vary (δ=0.40).
3) Ambiguity aversion is more common for moderate-high likelihood gains, but ambiguity seeking prevails for low likelihoods and losses.
The study focused on developing and validating a multi-state model to predict multimorbidity of cardiovascular disease, type 2 diabetes, and chronic kidney diseases. The presentation is a walk through the complete process starting from acquiring, filtering, splitting the data, developing the prediction model of the training data, validating the generated model on the testing data, and comparing its accuracy.
The following GitHub repository contains the R scripts required to complete this investigation.
https://github.com/jmanali1996/Multimorbidity-Multistate-Model.git
MIRA Risk Review: Multivariate metabolic risk calculatorMunich Re
Worldwide, cardiovascular diseases (CVDs) are among the most widespread causes of death. They also play an important role in morbidity and disability. Providing unprecedented detail and precision, the highly sophisticated multivariate metabolic risk calculator looks at all four main cardiovascular risk factors excessive weight, blood pressure, blood lipids and blood glucose – as well as their correlation and interaction. This MIRA Risk Review paper looks at the multivariate metabolic risk calculator, a new methodology that allows for unprecedented accuracy.
Munich Re’s internet-based underwriting tool MIRA is a high-performance integrated solution that fits seamlessly into your workflow. MIRA gives you instant access to a vast and continuously evolving pool of rating recommendations as well as interactive support from Munich Re underwriting experts – and enables you to process and store the resulting documentation in your own data infrastructure.
For more information on MIRA, visit: http://bit.ly/MIRA-Risk-Review
This document summarizes a presentation on pricing mortality solutions for changing environments. The presentation discusses factors affecting life insurance pricing, trends in mortality rates, demographic trends, techniques for modeling mortality, pricing effects and risk management, and threats and opportunities. Specifically, it examines mortality trends in different regions, the impact of changing demographics on dependency ratios, methods for smoothing mortality data and fitting population tables, and how shifts in mortality assumptions affect pricing of term insurance, annuities, and pension funds. The presentation also explores opportunities for general pension funds, variable retirement benefits that share risk, and new product development.
ISCB 2023 Sources of uncertainty b.pptxBenVanCalster
This document discusses sources of uncertainty in clinical prediction models. It identifies several types of uncertainty including aleatory uncertainty, epistemic uncertainty, approximation/estimation uncertainty, model uncertainty, data uncertainty, and population uncertainty. It illustrates these uncertainties using a model to predict ovarian cancer risk. Accounting for different sources of uncertainty, the predicted risks for individual patients can vary by over 50 percentage points. The document concludes that completely quantifying uncertainty is impossible and that transparency around uncertainty is important for clinical use and risk communication with patients.
This paper essentially demonstrates to academics and the profession that the current method of computing retirement income essentially arrives at a single solution applicable only to today; it does not model the future as currently interpreted. Our paper contrasts the difference between a calculation and a "multi-cast" simulation model.
Our research summary paper is published in the Journal of Financial Planning, Nov 2016. A link to the paper is available here "Combining Stochastic Simulations and Actuarial Withdrawals into One Model." ( http://bit.ly/2eLBUq9 )
Our working paper documenting our research project won the CFP® Board Best Research Paper Award at the 2016 Academy of Financial Services ( http://academyfinancial.org/ ) annual conference through an academic panel using a blind review process. "Certainty of Lifestyle: Contrasting a Simulation Over a Fixed Period versus Multiple Period Models" ( http://bit.ly/2dWtuNz )
In early Nov 2016, two blogs will post going into more insights from the research: Just where does the fear of outliving our money come from? Part I with link to Part II. ( http://wp.me/p2Oizj-H2 )
Living Longer At What Price - Mortality Modelling
2. Contents
Mortality Modelling

Deterministic Mortality Models
  Deterministic Models 3
Stochastic Mortality Models
  Why Stochastic Models? 6
  Stochastic Mortality Models 9
  Application Procedure and Model Comparison 15
Appendix 22
“Where there is a considerable range of possible outcomes, the FSA expects firms to use stochastic techniques to evaluate these risks. In time, for example, longevity risk, where this constitutes a significant risk for the firm, may fall into this category.”
- FSA’s Regulatory Guidance for Actuaries
4. Mortality Modelling
Deterministic Projections
Scenario Tests with Different Deterministic Mortality Tables
Different mortality assumptions imply different pension benefit cash flow structures for a specific pension scheme.
This research compares the impact of ten deterministic mortality projections produced by the CMI on a specific pension scheme’s cash flows, present values and durations. The projections include the original “92” Series projection, the cohort projections, and the cohort projections with 1% and 2% underpins.
5. The table below shows that a change in mortality assumptions can have a big impact on the PV and Duration of a pension scheme’s benefits.
*E.g. the PV for this specific scheme calculated with a long cohort / 2% underpin mortality table is 13% larger than the PV calculated with the “92” table, and the duration increases by 13.5%.
Scenario Tests with Different Deterministic Mortality Tables

PV in £ million; Duration in years.

Table    Active PV  Active Dur  Deferred PV  Deferred Dur  Pensioners PV  Pensioners Dur  Total PV  Total Dur
"92"     195        26.8        195          25.5          598            11.1            988       17.0
SC       196        26.9        197          25.6          605            11.2            998       17.1
SC 1%    201        27.5        201          26.1          613            11.4            1,015     17.5
SC 2%    216        29.3        218          27.8          641            12.3            1,075     18.9
MC       200        27.2        201          25.9          619            11.4            1,020     17.4
MC 1%    204        27.8        205          26.4          626            11.6            1,035     17.7
MC 2%    218        29.4        220          27.9          648            12.4            1,086     18.9
LC       209        28.0        210          26.6          648            12.0            1,067     18.0
LC 1%    212        28.5        214          27.1          652            12.2            1,078     18.3
LC 2%    223        29.8        225          28.3          666            12.7            1,115     19.3
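The quoted changes can be verified directly from the Total column of the table (a quick check, using only the table's own figures):

```python
# Total PV (GBP m) and duration (years) under the "92" table and the
# long cohort / 2% underpin (LC 2%) table, taken from the table above.
pv_92, dur_92 = 988, 17.0
pv_lc2, dur_lc2 = 1115, 19.3

pv_increase = (pv_lc2 / pv_92 - 1) * 100      # percentage increase in PV
dur_increase = (dur_lc2 / dur_92 - 1) * 100   # percentage increase in duration

print(f"PV increase: {pv_increase:.1f}%")        # ~12.9%, quoted as 13%
print(f"Duration increase: {dur_increase:.1f}%")  # 13.5%, as quoted
```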
[Figure labels: "Accounting basis"; "TPR new proposal"]
6. Mortality Modelling
Why Stochastic Models?
Projection vs. Experience: How projections can go wrong (CMI Lee-Carter Projection)
Source: CMI Male Assured Lives
Male Assured Lives data from 1947 to 1980 are used to project the mortality rate at age 65 from 1980 to 2004.
The graph shows that, by the end of 2004, the projected probability of death for males aged 65 is about 0.015, but the realised mortality rate is about 0.009, which is much lower than predicted.
What is worse, the projections continually overestimate the probability of death over the estimation period; in other words, they substantially underestimate life expectancy.
7. Graduation: Smoothing Raw Data to Remove Random Fluctuation
Source: CMI Male Assured Lives 1947-2005
[Figure panels: Raw Death Rate; Graduated Death Rate]
In order to calculate the PV of liabilities, an actuarial valuation requires best-estimate mortality rates as a starting point, even if a prudent margin is then added; these estimates also inform the contribution rate. But we can never know when mortality experience will depart from these “so-called” best estimates, or by how far. These best estimates are based on “smoothed” results, which means the randomness in mortality is largely ignored.
8. Compare Deterministic and Stochastic Mortality Models
Deterministic Projections:
• Assume fixed mortality rates
• Scenario tests cannot determine the probability of a specific scenario
• A single value of liability does not include all scenarios
• Do not show the full distribution of possible future mortality rates
Stochastic Projections:
• Allow for randomness in mortality rates
• Can estimate Longevity Value at Risk (LVaR)
• Monte-Carlo simulation can show the whole range of possible future mortality rates
• Give the probability of each scenario
• Produce the full distribution of future pension liabilities
10. Mortality Modelling
Stochastic Mortality Models
Eight Stochastic Mortality Models
Details of the model specifications are described in the Redington Longevity Technical paper, which is available upon request.
All models capture the age and period effects, but they vary in the modelling approach.
The Lee-Carter extension, APC, three-factor CBD, three-factor CBD extension and four-factor CBD models capture the cohort effect (2, 6, 7 and 8).
11. Two Major Model Families
CBD Models: The Cairns, Blake and Dowd (CBD) model was developed by three professors in the UK: Professor David Blake from Cass Business School, Professor Andrew Cairns from Heriot-Watt University and Professor Kevin Dowd from Nottingham University Business School. The CBD model was developed for and tested using mortality data from males living in England and Wales, and has yet to be tested with data from any other countries. However, the model has already been taken up widely by actuaries in Germany and is currently being investigated by the CMI (Pension Institute, 2007).
Lee-Carter Models: The Lee-Carter model was developed by Professors Ronald Lee and Lawrence Carter. This model has become the “leading statistical model of mortality forecasting in the demographic literature” in the United States (Deaton and Paxson, 2004). Lee and Carter originally calibrated their model using United States mortality data from 1933-1987. Girosi and King (2007) note that the model is “now being applied to all-cause and cause-specific mortality data from many countries and time periods, and all well beyond the application for which it was designed” (Girosi and King, 2007).
12. Two Major Model Families
Four models from the two major model families, CBD and Lee-Carter, were tested.
Lee-Carter Model: A simple one-factor model that assumes mortality improvements at different ages are perfectly correlated. As a result, the model allows less flexibility in age-specific volatilities and usually projects less volatile future mortality rates.
CBD Model: Accounts for more factors, allowing different improvements across different ages at different periods of time. As a result, the model allows more flexibility in age-specific volatilities and often projects more volatile future mortality rates than the Lee-Carter model.
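For reference, the standard specifications of the two baseline models, as given in the original papers (notation assumed here: m_{x,t} is the central death rate and q_{x,t} the one-year death probability at age x in year t):

```latex
% Lee-Carter (one factor): a single period index \kappa_t drives all ages,
% which is why improvements across ages are perfectly correlated.
\ln m_{x,t} = a_x + b_x \kappa_t + \varepsilon_{x,t}

% Two-factor CBD: a second period factor tilts the mortality curve with age,
% allowing improvements to differ across ages.
\operatorname{logit} q_{x,t} = \kappa_t^{(1)} + \kappa_t^{(2)} \, (x - \bar{x})
```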
13. Data Sample:
The data used in testing the four models was taken from the Human Mortality Database (HMD). In order to analyse the models’ sensitivities to the choice of data sample, the models were calibrated and tested against two samples:
• The first sample included population mortality data for ages 20 to 100 from 1920 to 1960;
• The second sample included population mortality data for ages 20 to 100 from 1963 to 1983.
The figures below show that the mortality rate data in the first sample is more volatile than the mortality rate data in the second sample.
14. Summary of Statistical Tests
Statistical tests: MSE, Sign and Outlier Tests
• The MSE and Sign Tests show that the forecasting accuracy of the CBD family of models is more sensitive to the choice of sample than that of the Lee-Carter family.
• The Outlier Test shows that both model families fail to capture the mortality “hump” at younger ages, and that the Lee-Carter models also systematically over-estimate mortality rates at older ages and fail to project statistically proper confidence intervals for those ages.
Conclusion
The two-factor CBD model is the most appropriate model for longevity risk analysis and management purposes, because it:
• Produces more stable projections of future mortality rates;
• Produces confidence intervals that cover most realised mortality rates, especially for older ages;
• Is easy to implement.
16. Application Procedures
1. Calibrate the mortality model with historical data
2. Generate simulations of future mortality rates with the calibrated model
3. Estimate benefit cash flows with the simulated mortality rates
4. Quantify longevity risk
5. Manage longevity risk in the overall LDI strategy
6. Evaluate possible solutions
17. Stochastic Simulations – Mortality Rate
• By calibrating stochastic mortality models with historical data, a large number of simulated mortality tables can be generated (e.g. 1,000 tables).
• These simulated mortality tables provide a range of possible future mortality rates by incorporating age, period and cohort effects.
• The two graphs below show the range of mortality rate simulations and a comparison with the realised mortality rate during that period.
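As an illustration of the simulation step, the sketch below simulates future paths of a Lee-Carter style period index kappa_t as a random walk with drift (the usual time-series assumption for kappa_t) and converts them to death rates. All parameter values (`a`, `b`, `drift`, `sigma`) are made-up placeholders, not calibrated figures from this research:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Lee-Carter parameters for a single age (e.g. age 65):
# log m_t = a + b * kappa_t. Values are illustrative, not calibrated.
a, b = -4.5, 0.05
drift, sigma = -0.5, 0.6    # random walk with drift for kappa_t
kappa_0 = 0.0

n_sims, n_years = 1000, 25  # e.g. 1,000 simulated mortality tables

# Simulate kappa paths: kappa_t = kappa_{t-1} + drift + sigma * Z_t
shocks = rng.normal(drift, sigma, size=(n_sims, n_years))
kappa = kappa_0 + np.cumsum(shocks, axis=1)

# Convert to central death rates m_t = exp(a + b * kappa_t)
m = np.exp(a + b * kappa)

print(m.shape)          # (1000, 25): one row per simulated path
print(m[:, -1].mean())  # average simulated death rate in the final year
```

Repeating this across all ages (and, for cohort-effect models, adding a cohort index) yields one full simulated mortality table per path.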
18. Stochastic Cash Flows
For a better understanding of the pension liability, cash flow analysis is widely adopted:
• Using the simulated future mortality rates from the previous step, the cash flow structure of a generic pension scheme is projected 1,000 times.
• This gives a range of possible cash flows implied by each of the two models, which forms the basis for quantifying longevity risk for a specific pension scheme.
[Figure panels: CBD Model; Lee-Carter Model]
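A minimal sketch of the cash-flow step: given one simulated path of one-year death probabilities for a cohort of pensioners, survival probabilities turn a flat annual pension into an expected cash flow per year. The cohort size, benefit level and mortality numbers below are invented for illustration, not taken from the research:

```python
import numpy as np

rng = np.random.default_rng(0)

n_years = 30
members = 1000          # hypothetical cohort of pensioners, all the same age
annual_benefit = 10.0   # hypothetical pension per member per year (GBP '000)

# One simulated path of one-year death probabilities q_t for this cohort
# (illustrative: rising from ~1% with noise, standing in for model output).
q = np.clip(0.01 * 1.08 ** np.arange(n_years)
            + rng.normal(0, 0.002, n_years), 0.0, 1.0)

# Probability of being alive at the start of each year:
# S_0 = 1, S_t = prod_{i<t} (1 - q_i)
survival = np.concatenate(([1.0], np.cumprod(1 - q)[:-1]))

# Expected benefit cash flow in each year
cashflow = members * annual_benefit * survival

print(cashflow[0])   # year 1: full cohort alive, 1000 * 10 = 10000.0
print(cashflow[-1])  # later years shrink as the cohort dies off
```

Running this once per simulated mortality table (e.g. 1,000 times) produces the fan of cash flows shown in the two panels.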
19. Model Comparison - Distributions
The cash flow structures are then discounted with a flat discount curve. The two graphs below show the distribution of pension liabilities under the different projections of the two models.
CBD Model: Predicts a smoother and more spread-out distribution of liabilities, implying more longevity risk in the tails.
Lee-Carter Model: Predicts a tighter distribution of pension liabilities, implying less longevity risk in the tails.
20. Model Comparison – Main Statistics

All figures in £’000,000 unless stated otherwise.

                         Two-Factor CBD  Lee-Carter  Short    Medium   Long
                         Model PV        Model PV    cohort   cohort   cohort
Mean                     1,002           998         998      1,020    1,067
S.d.                     32              16          n/a      n/a      n/a
95%                      1,054           1,026       n/a      n/a      n/a
5%                       954             973         n/a      n/a      n/a
VaLR                     52              27          n/a      n/a      n/a
As % of Total Liability  5.17%           2.73%       n/a      n/a      n/a
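The table's risk measure can be reproduced from the simulated PVs: take the 95th percentile of the liability distribution minus its mean (consistent with the CBD column, where 1,054 - 1,002 = 52). The sketch below uses dummy normal draws standing in for the 1,000 discounted simulation results, so the exact numbers are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend these are 1,000 simulated liability PVs (GBP m), already
# discounted with a flat curve; a normal sample stands in for model output.
pv = rng.normal(loc=1002, scale=32, size=1000)

mean = pv.mean()
sd = pv.std(ddof=1)
p95 = np.percentile(pv, 95)
p5 = np.percentile(pv, 5)

valr = p95 - mean              # VaLR: 95th percentile minus the mean
valr_pct = valr / mean * 100   # as a percentage of total liability

print(f"mean={mean:.0f}, sd={sd:.0f}, 95%={p95:.0f}, 5%={p5:.0f}")
print(f"VaLR={valr:.0f} ({valr_pct:.2f}% of liability)")
```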
21. Contacts
Dawid Konotey-Ahulu | Partner | Direct: +44 (0) 207 250 3415 | dawid@redingtonpartners.com
Robert Gardner | Partner | Direct: +44 (0) 207 250 3416 | robert.gardner@redingtonpartners.com
Redington Partners LLP
13-15 Mallow Street, London EC1Y 8RD
Telephone: +44 (0) 207 250 3331
www.redingtonpartners.com
THE DESTINATION FOR ASSET & LIABILITY MANAGEMENT
Disclaimer
For professional investors only. Not suitable for private customers.
The information herein was obtained from various sources. We do not guarantee every aspect of its accuracy. The information is for your private information and is for discussion
purposes only. A variety of market factors and assumptions may affect this analysis, and this analysis does not reflect all possible loss scenarios. There is no certainty that the
parameters and assumptions used in this analysis can be duplicated with actual trades. Any historical exchange rates, interest rates or other reference rates or prices which appear
above are not necessarily indicative of future exchange rates, interest rates, or other reference rates or prices. None of the information, recommendations or opinions expressed
herein constitutes an offer to buy or sell any securities, futures, options, or investment products on your behalf. Unless otherwise stated, any pricing information in this message is
indicative only, is subject to change and is not an offer to transact. Where relevant, the price quoted is exclusive of tax and delivery costs. Any reference to the terms of executed
transactions should be treated as preliminary and subject to further due diligence.
Please note, the accurate calculation of the liability profile used as the basis for implementing any capital markets transactions is the sole responsibility of the Trustees' actuarial
advisors. Redington Partners will estimate the liabilities if required but will not be held responsible for any loss or damage howsoever sustained as a result of inaccuracies in that
estimation. Additionally, the client recognizes that Redington Partners does not owe any party a duty of care in this respect.
Redington Partners are investment consultants regulated by the Financial Services Authority. We do not advise on all implications of the transactions described herein. This
information is for discussion purposes and prior to undertaking any trade, you should also discuss with your professional tax, accounting and / or other relevant advisers how
such particular trade(s) affect you. All analysis (whether in respect of tax, accounting, law or of any other nature), should be treated as illustrative only and not relied upon as
accurate.
22. Appendix
Model Testing
Evaluation Criteria
Mean Squared Error (MSE)
MSE is used to test the forecasting accuracy of the models and, for a given age x, is defined as:
\mathrm{MSE}_x = \frac{1}{T}\sum_{i=1}^{T}\left(\hat{m}_{x,i} - m_{x,i}\right)^{2}
where T is the total number of projected years, and \hat{m}_{x,i} and m_{x,i} are respectively the projected
and actual observations for age x at time i. The smaller the test statistic, the more accurate the projection.
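As a sketch, the per-age MSE described above translates directly into code (the function name `mse` is ours):

```python
import numpy as np

def mse(projected, actual):
    """Mean squared error for one age: the average, over the T projected
    years, of the squared difference between the projected and the
    actual mortality rate."""
    projected = np.asarray(projected, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.mean((projected - actual) ** 2))
```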
Sign Test
The sign test is used to test the hypothesis that the model residuals are unbiased. Mathematically, if m is the
number of positive residuals, then m should follow a binomial distribution with parameters n (the total
number of residuals) and 0.5. For the tests here, a normal approximation to this binomial distribution is used.
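A sketch of the sign test statistic under the normal approximation. Dropping exact-zero residuals is a common convention that the slides do not specify, so treat it as an assumption.

```python
import math

def sign_test_statistic(residuals):
    """Normal approximation to the sign test for unbiased residuals.

    Under the null hypothesis the number of positive residuals m follows
    Binomial(n, 0.5), so z = (m - n/2) / sqrt(n/4) is approximately
    standard normal; |z| above 1.96 rejects unbiasedness at the 5% level.
    """
    nonzero = [r for r in residuals if r != 0]  # drop exact zeros (assumption)
    n = len(nonzero)
    m = sum(1 for r in nonzero if r > 0)
    return (m - n / 2) / math.sqrt(n / 4)
```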
Outlier Test
The outlier test is used to study the model's ability to project confidence intervals that cover the
realized mortality rates at the expected frequency. For example, a 90% confidence interval is expected to
cover 90% of the observation points.
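The coverage check itself is simple to sketch; the interval bounds in the usage example are hypothetical numbers, not the models' actual intervals.

```python
import numpy as np

def outlier_counts(actual, lower, upper):
    """Count observations falling outside a projected confidence interval.

    For a 90% interval about 10% of points should be outliers, split
    roughly evenly between the two tails. Materially more outliers than
    expected suggests the interval is too narrow; materially fewer, that
    it is too wide.
    """
    actual = np.asarray(actual, dtype=float)
    below = int(np.sum(actual < np.asarray(lower, dtype=float)))
    above = int(np.sum(actual > np.asarray(upper, dtype=float)))
    return below, above
```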
23. MSE and Sign Test
The MSE test shows that, in the 40-year data sample, the Lee-Carter model provides the best
projection accuracy; in the 20-year data sample, however, the CBD models performed better than
the Lee-Carter models, with the three-factor CBD model giving the most accurate results.
Comparing the sign test results reveals that nearly all models, on both datasets, predict
biased mortality rates: the sign test statistics are all significantly larger than the
critical value of 1.96. The exception is the three-factor CBD model in the 40-year data
sample.
In general, the two tests show that the forecasting accuracy of the CBD family of models is
more sensitive to the choice of sample than that of the Lee-Carter family. The MSE test
statistics show that the CBD models can produce a better projection if the end-user selects
the sample space appropriately.
24. Outliers Test
The tables above show that, for both projection periods, the number of outliers is
greater than expected. The exception is the two-factor CBD model, which
produced significantly fewer outliers than expected.
25. Outliers Test
The orange area represents outliers in the right-hand tail, the red area represents outliers in the
left-hand tail, and the green area represents observations that fall within the confidence
interval.
The test results show that both models failed to capture the "hump" effect at
younger ages. The Lee-Carter models also systematically over-estimated mortality rates
at older ages and failed to project a proper confidence interval there.
[Charts: CBD Model | Lee-Carter Model]