Quantifying reserve risk with worked examples: modeling the IFRS 17 Risk Adjustment and the Solvency II Risk Margin. In this presentation, we cover the modeling of the IFRS 17 Risk Adjustment and compare it with the Solvency II Risk Margin. We also cover strategic considerations in quantifying reserve risk.
We review basic reserving methodologies for general insurance, such as lag analysis and the chain ladder. We then consider several stochastic loss reserving models in detail and show how they uncover more insight than the basic reserving models.
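The chain ladder mechanics mentioned above can be sketched in a few lines. The cumulative paid-loss triangle below is a hypothetical illustration, not data from the presentation:

```python
# Minimal chain-ladder sketch over a hypothetical cumulative paid-loss triangle.
# Development factors are volume-weighted link ratios across accident years.

triangle = [
    [100.0, 150.0, 165.0],  # accident year 1: development periods 0..2
    [110.0, 160.0],         # accident year 2
    [120.0],                # accident year 3
]

def development_factors(tri):
    """Volume-weighted link ratios f_k = sum(C_{i,k+1}) / sum(C_{i,k})."""
    n = len(tri)
    factors = []
    for k in range(n - 1):
        num = sum(row[k + 1] for row in tri if len(row) > k + 1)
        den = sum(row[k] for row in tri if len(row) > k + 1)
        factors.append(num / den)
    return factors

def project_ultimates(tri):
    """Roll each accident year's latest value forward with the link ratios."""
    factors = development_factors(tri)
    ultimates = []
    for row in tri:
        value = row[-1]
        for f in factors[len(row) - 1:]:
            value *= f
        ultimates.append(value)
    return ultimates

ults = project_ultimates(triangle)
reserves = [u - row[-1] for u, row in zip(ults, triangle)]
```

Stochastic variants such as Mack's model or bootstrapping start from these same link ratios but place a distribution around each projected reserve, which is what enables a quantile-based risk adjustment or margin.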
PECB Webinar: The importance of business impact analysis (PECB)
Business Impact Analysis (BIA) is the key element of effective disaster recovery, which lies at the heart of business continuity. To better explain the importance of the BIA, this webinar will cover the following areas:
• Why a Business Impact Analysis?
• Business Impact Analysis in the BCM Lifecycle
• New Standard ISO 22317 on the BIA
• BIA Approaches
• Challenges when doing a BIA
• Socrates Maps
• BIA Critical Success Factors
IFAC Senior Technical Manager Vincent Tophoff's presentation at the Institute of Chartered Accountants of Pakistan's CFO Conference 2013, CFO: Meeting Future Challenges! Mr. Tophoff discusses current trends and thinking in risk management and best practices.
Assessing the impact of a disruption: Building an effective business impact a... (Bryghtpath LLC)
Many organizations have adopted the ISO 22301 standard for their business continuity management systems. Recently, ISO has released the new ISO 22317 Standard for Business Impact Analysis. In this webinar, learn about several different strategies to build an effective BIA that will help you advance your business continuity strategies.
The instructor for this webinar is Bryan Strawser, Founder and CEO of Bryghtpath LLC, a strategic advisory firm specializing in crisis management, business continuity, global risk, crisis communications, and public affairs.
Operational Risk Management under BASEL era (Treat Risk)
Operational risk was long ignored by banks, which assumed that only credit and market risk could cause a catastrophe. But the history of misfortunes has taught different lessons. Controls and internal audit were long construed as sufficient guards, until the Basel II requirements forced banks to look deeper. This presentation explains the dimensions of ORM.
Enterprise Risk Management provides decision makers with a realistic picture of the likely outcomes of their strategic initiatives by integrating risk into the cost-benefit analysis of all strategic investments.
How does Operational Risk Management fit into an organization's Strategic Planning? This presentation attempts to provide a functional and implementable response.
Operational Risk Management - Understanding Your Risk Landscape (Eneni Oduwole)
This presentation provides insights on how the proper implementation of Operational Risk Management can lead to effective risk profiling, analysis and mitigation. It introduces operational risk as a bedrock for meaningful risk management irrespective of which industry an organization plays in.
A new report by the Business Continuity Institute, supported by certification body NQA, has shown that 6 out of 10 organizations adopt ISO 22301. Organizations with strong top management commitment to standardising business continuity practice are four times more likely to adopt ISO 22301 than those who do not.
There are many reasons why an organization would want to embrace ISO 22301; most notably, it provides assurance of continued service, with 61% of respondents identifying this as a significant reason. By certifying to the Standard, organizations can reassure their stakeholders that, in the event of a crisis, they will still be able to function.
Read the full survey report for more information on the business benefits of ISO 22301.
https://www.nqa.com/en-gb/resources/news/6-out-of-10-organizations-adopt-iso-22301
PECB Webinar: Introduction to ISO 22317 – Business Impact Analysis (BIA) (PECB)
We will cover:
• Importance of Business Impact Analysis (BIA)
• What does new standard ISO 22317 cover?
• Elaborating ISO 22317
Presenter:
This session will be hosted by our partner Dr. Wolfgang H. Mahr, M.Sc., MBCI, the Managing Director of governance & continuity gmbh with more than 20 years of experience.
Five lines of assurance: a new paradigm in internal audit & ERM (Dr. Zar Rdj)
• Boards are provided with a tangible vehicle to demonstrate they are actively overseeing the company’s “risk appetite framework” (“RAF”)
• The process is designed to fully integrate with strategic planning, new product/service initiatives, and M&A activities.
• The process provides a clear response to emerging expectations like the UK Governance Code, Canadian Securities Administrators, SEC, FSB, credit agencies, institutional investors and TSB.
• The main role of internal audit is to report on the effectiveness of the risk management processes and on the consolidated report on residual risk status that the board receives from the CEO or his/her designate, and to help the company build and maintain robust risk management processes.
Implementing Enterprise Risk Management with ISO 31000:2009 (Goutama Bachtiar)
These presentation slides are intended for the training-workshop lead as well as the participants.
Developed based on ISO 31000:2009 – Principles and Guidelines on Implementation, ISO/IEC 31010:2009 – Risk Assessment Techniques, ISO Guide 73:2009 – Vocabulary.
How Can You Drive Opportunity If You Cannot Manage Risk? (Lora Cecere)
Report Details: The research for this report was conducted via an online survey from March 12 - May 11, 2018. Surveys were conducted among 93 respondents -- a mix of business users (manufacturers, wholesalers/distributors/co-operatives, and third-party logistics providers, n=34), vendors (software providers and consultants, n=39), and others (academics, analysts, unemployed, and others, n=20).
Objective: To understand the current and expected future state of supply chain risk management, the biggest drivers of risk, and the impact on supply chain disruptions. NOTE: supply chain risk management is defined as the proactive identification and assessment of potential risks to the supply chain, as well as the development of strategies to avoid these risks.
Highlight: Nearly two-thirds of respondents believe that their company performs better today on risk management practices than five years ago, yet they averaged 3.5 disruptions last year. Managing risk requires a network approach. Today’s investments in the end-to-end supply chain are by and large not effective in risk mitigation. Only 37% have visibility of extended-tier suppliers, and most lack the solutions to manage global complexity.
Discussion- 11. How does efficient frontier analysis (EFA) dif.docx (madlynplamondon)
Discussion- 1
1. How does efficient frontier analysis (EFA) differ from other forms of complex risk assessment techniques?
Selecting risk management methods to support investment decision-making is one of the key issues in the management of portfolios. A major factor in the development and dissemination of these methods is that, following Markowitz's portfolio selection model, financial institutions began to measure portfolio risk systematically; the selection problem itself can now be solved with linear programming. The efficient frontier can be defined as the set of portfolios that provide the maximum return for each level of risk, or the minimum risk for each level of return. In addition, this measure brings important detail to the management of portfolios of financial instruments, because it considers the possibility of the investor's bankruptcy and may be regarded as a dynamic measure of risk (Bali, T. G.).
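As a concrete illustration of the frontier definition above, here is a minimal two-asset mean-variance sketch; all return, volatility, and correlation figures are hypothetical:

```python
# Two-asset mean-variance frontier sketch (illustrative numbers, not real data).
import math

mu = (0.08, 0.12)     # expected returns of assets A and B (assumed)
sigma = (0.10, 0.20)  # volatilities (assumed)
rho = 0.3             # correlation (assumed)

def portfolio(w):
    """Return (expected return, volatility) for weight w in asset A."""
    r = w * mu[0] + (1 - w) * mu[1]
    var = (w * sigma[0]) ** 2 + ((1 - w) * sigma[1]) ** 2 \
        + 2 * w * (1 - w) * rho * sigma[0] * sigma[1]
    return r, math.sqrt(var)

# Trace the frontier by sweeping the weight from 0 to 1; the efficient part
# is the upper branch (highest return for each level of risk).
frontier = [portfolio(w / 20) for w in range(21)]
min_vol = min(frontier, key=lambda p: p[1])
```

Note that the minimum-variance portfolio's volatility comes out below that of the less risky asset alone: the diversification benefit the frontier is meant to visualize.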
2. What limitations might an analyst encounter with EFA?
Efficient frontier models are the financial equivalent of racing cars: among the most touted, yet most misunderstood and misused, tools in the field of financial planning. One must understand the nature of an efficient frontier model and the assumptions on which it relies. As with a sophisticated racing car, a powerful tool in the wrong hands can be a very dangerous thing. For example, it's logical to believe that stocks will outperform bonds in the future, yet efficient frontier models rely on historical data and relationships to generate the "perfect" portfolio. In my experience, many investors who use efficient frontier models are unaware of their pitfalls. These models are marketed as solutions to the problem of portfolio construction, but they come without instructions.
3. How can efficient frontier analysis results be communicated and utilized with nonmathematical decision makers?
Communication is not a crank to be turned mindlessly, but a decision problem of its own. As we will see, there are many alternatives to consider. The analyst’s choices constitute the design of a communication plan. In ideal cases, the client is infinitely patient, unshakably invested in the problem, fully committed to finding the highest quality solutions, flexible about the process, and unwavering in confidence in the analyst’s work. In such cases, tight outlines or rambling jumbles may lead to the same outcome. Good quantitative analysis alone does not usually produce good decisions, because rarely does the analyst control all the resources required to decide and act. Decision makers and other players who influence the decision must assimilate the results of th ...
The Global Supply Chain Ups the Ante for Risk Management (Lora Cecere)
Executive Summary
Unfortunately, supply chain disruptions are a fact of life for today’s global multinational company. The reasons are many. A risk management event can be triggered by natural events, geopolitical shifts, economic uncertainty and demand/supply volatility.
Historically, the roots and genesis of risk management programs were based on attempts to reduce insurance costs. Today it is much, much more. The focus is on prevention, early sensing, and the execution of well-orchestrated plans to mitigate the impact of a disruption. Global supply chain leaders understand that designing and implementing a robust risk management practice is essential and fundamental to running a global business. The size of the bubble in Figure 2 indicates the relative level of risk today, and the colors correspond to the level of risk.
Figure 2. Comparison of Risk Drivers for the Past Five Years and Future Five Years
While product quality and supply chain visibility are declining as risk drivers but remain important, operations complexity and the definition of globalization infrastructure are increasing. Economic uncertainty, supplier reliability, and demand volatility remain ongoing risk factors.
Over time, as supply chains morphed from regional organizations into global multinationals, globalization and regulatory compliance increased. As a result, procurement has shifted from traditional programs focused solely on contract management, price and term negotiations, and supplier scorecards to include supplier development, managing product quality and multi-tier supplier relationships in and across value chain relationships.
Today's world is less certain than a decade ago. Geopolitical shifts, economic uncertainty, and demand/supply volatility are rising. In addition, to spur growth, companies are quick to add products to the item master but slow to rationalize the portfolio. The rising complexity of items sold decreases the organization's ability to forecast, and the longer lead times across multiple tiers of sourcing and supply increase the impact of the Bullwhip Effect (the distortion of the demand signal across multiple tiers of the value network). As a result, there is a greater need for supplier development and supplier sensing to reduce supply risk. Inventory management and supplier financial sensing grow in importance with the increase in uncertainty.
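The Bullwhip Effect described above can be illustrated with a toy simulation: each tier orders based on a naive forecast of the orders it receives, so variance compounds upstream. The demand numbers and the order-up-to policy below are hypothetical, not from the report:

```python
# Toy bullwhip simulation: order variance amplifies at each tier upstream.
import random
import statistics

random.seed(1)
demand = [100 + random.gauss(0, 5) for _ in range(200)]  # end-customer demand

def tier_orders(incoming, alpha=0.4, safety=1.5):
    """Order-up-to policy with exponential smoothing; over-reacts to surprises."""
    forecast, orders = incoming[0], []
    for d in incoming:
        forecast = alpha * d + (1 - alpha) * forecast
        # order covers the forecast plus a safety multiple of the last surprise
        orders.append(max(0.0, forecast + safety * (d - forecast)))
    return orders

retailer = tier_orders(demand)        # sees customer demand
wholesaler = tier_orders(retailer)    # sees only retailer orders
factory = tier_orders(wholesaler)     # sees only wholesaler orders

# Standard deviation grows as you move up the chain: the bullwhip effect.
stds = [round(statistics.pstdev(x), 1)
        for x in (demand, retailer, wholesaler, factory)]
```

The key point is that each tier sees only the orders placed by the tier below, never the original demand signal, so every tier's over-reaction compounds the noise from the one beneath it.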
Risk management is no longer narrowly focused: a technology, a response to a natural disaster, or improving supply chain visibility. Instead, it is more holistic with a focus on managing demand and supply variability cross-functionally and improving outcomes in an uncertain world.
In this report, we share insights on the current state of risk management programs while providing recommendations on what defines excellence.
Analytics, business cycles and disruptions (Mark Albala)
The digital economy is different. Built on platforms and a much more malleable set of methods for interacting with consumers, it exposes most market participants to an accelerated rate of disruptions that compromises their orderly business experience. A well-honed analytics program helps in understanding these accelerated disruptions. Both of these facts matter to analytics. First, platforms: platform-based activity is hard to decipher, not because it is more complex, but because the information needed to decipher the activity is not contained within your four walls.
Once deciphered, the next challenge facing organizations deciphering unexpected outcomes is a determination of whether the unexpected outcome is truly a disruptive event or simply a phase change in a regularly occurring business cycle. There are significant differences in the suitable reactions to disruptions and business cycle phase changes. Unfortunately, many organizations are ill equipped to discern between these two classes of unexpected business outcomes and consistently find their business plans fall victim to the actions of others within the marketplace.
Luckily, many of the activities of governmental and regulatory bodies focus on predicting phase changes in the business cycles likely to affect economic forces within the next fiscal year, and these bodies describe their economic policies and agendas in publicly available documents and analysis. Understanding where to find these documents, and how to use the published material to discern likely business cycle phase changes from true disruptions, is one of the vehicles available within your analytics arsenal, and it will reduce the chance of falling victim in the marketplace by misreading the clues in unexpected outcomes. This document addresses the sources most likely to assist and the actions to take to use the information obtained from them.
Risk Monitoring and Management Trends In Commodities (CTRM Center)
Commodity producers, traders, and industrial consumers are all facing a barrage of risks such as price exposure and cyber vulnerability, as well as legal, credit, operational and market risks. The risks associated with buying, selling, and moving commodities only seem to be increasing exponentially with greater regulatory oversight and a broadening of supply chain operational issues like traceability. Many of these risks can be business killers – the actions of rogue traders or the impact of counterparty business failures, for example – and lead to fatal damage such as an inability to access capital or damage to brands (via issues around sourcing commodities or producing substandard end-products). Other risks, such as ineffective price risk management, inefficient scheduling of transportation, or regulatory non-compliance can erode profitability and damage the company’s ability to execute on strategic plans and growth initiatives.
Of course, often where there is risk, there is also an opportunity to profit - but only when those risks are recognized, effectively managed, and properly mitigated. The rise in stakeholder scrutiny and regulatory oversight also means that being able to demonstrate effective risk management across the organization is certainly more important today than ever before.
Dashboard for Currency Risk Management & Hedging (Benjamin Koch)
Good FX management means providing CFOs with the Treasury reporting information they require. Benjamin Koch and Achim Kreuzer suggest a dashboard approach.
DISUSSION-1RE Chapter 15 Embedding ERM into Strategic Planning.docx (madlynplamondon)
DISCUSSION-1
RE: Chapter 15: Embedding ERM into Strategic Planning at the City of Edmonton
The two strategic processes
The two strategic processes which are tightly connected to ERM in the City of Edmonton's ERM implementation are results-based budgeting and performance measurement.
Results based budgeting (RBB):
ERM helps organizations allocate resources based on what is required to complete tasks and produce the desired output. RBB helps determine the funding allocations needed to fulfill the organization's strategic objectives. Budget formulation is performed against predefined criteria such as priority, resource availability, and expected results, where the expected results represent the outputs the organization needs in order to meet its strategic goals. In simple words, results-based budgeting is about emphasizing performance and accountability.
Performance measurement:
Continuous performance measurement helps organizations drive progress in risk mitigation and provides insight into where additional attention is required. Key performance indicators (KPIs) can be used to measure the effectiveness of risk management activities. Performance measurement in ERM sends the list of desired outcomes to RBB and receives the list of prioritized programs and costs, ensuring ERM works at its full potential (Fraser, Simkins, & Narvaez, 2015).
Two criteria must be balanced in a successful ERM model
The two criteria are model power and user-friendliness. A powerful model can provide a large amount of information and lets the organization compare results and risks, the effectiveness of the current program, and the impact of future initiatives. User-friendliness means information can be added easily, new features can be added, and the model is easy for the user to understand in simple steps. It also means that, if needed, unnecessary steps can be removed without losing model robustness (Fraser, Simkins, & Narvaez, 2015).
Thank you
References
Fraser, J., Simkins, B. J., & Narvaez, K. (2015). Implementing enterprise risk management: Case studies and best practices. Hoboken: Wiley.
DISCUSSION-2
1. What other strategic processes are closely tied to ERM?
A successful strategy is linked to command of risk and organizational understanding, and the selection of strategy is a high-stakes exercise. According to strategy and business magazine, roughly 80% of companies that underperformed their industry over the prior 10 years lost their way because of strategic blunders, yet the failure is often blamed on operational errors, external events, or compliance faults.
2. What three kinds of risks are identified within the City of Edmonton?
There are three kinds of responses, which may involve avoidance or risk termination, tolerance or acceptance of ...
Performing Strategic Risk Management with simulation models (Weibull AS)
“How can you be better than us to understand our business risk?"
This is a question we often hear, and the simple answer is that we don't! But by using our methods and models, we can capture your knowledge in such a way that it can be systematically measured, accumulated throughout the business, and presented to the management and board in easy-to-understand graphs.
The main reason for this lies in how we treat uncertainties in the variables, and in the ability to handle uncertainties stemming from variables from different departments simultaneously.
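The idea of aggregating uncertain variables from different departments into one management-ready distribution can be sketched with a small Monte Carlo model. All ranges below are hypothetical elicited estimates, not figures from the presentation:

```python
# Monte Carlo sketch: combine uncertain profit drivers from several
# departments into one distribution, read off as percentiles.
import random

random.seed(42)

def simulate_profit():
    # Each triangular range is a hypothetical (low, high, most-likely) estimate.
    volume = random.triangular(80_000, 120_000, 100_000)   # sales dept
    price = random.triangular(9.0, 12.0, 10.0)             # marketing
    unit_cost = random.triangular(6.0, 9.0, 7.0)           # operations
    fixed = random.triangular(150_000, 250_000, 200_000)   # finance
    return volume * (price - unit_cost) - fixed

runs = sorted(simulate_profit() for _ in range(10_000))
p5, p50, p95 = (runs[int(q * len(runs))] for q in (0.05, 0.50, 0.95))
loss_prob = sum(r < 0 for r in runs) / len(runs)
```

The output is exactly the kind of easy-to-read summary the text describes: a percentile band (p5, p50, p95) and a probability of loss, each driven by inputs owned by a different department.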
Proposed Guidance: COSO Internal Controls for Integrated Reporting (Workiva)
1: Describe how the proposed guidance applies to integrated reporting (<ir>), especially non-financial information.
2: Explain proposed guidance on how to apply COSO to <ir> and integrated thinking.
3: Recognize how to apply basic templates for mapping COSO to the <ir> framework.
Finance is the lifeblood and lifeline of any business entity, commercial or non-commercial. The survival, stability, and sustainability of a firm are highly associated with its financial wellness. This can be observed through its ability to repay short-term as well as long-term liabilities, meet its regular financial obligations, increase the value of the firm, and generate profit. Financial analysis, evaluation, and assessment help determine the financial position and financial strength of a firm. Among the many methods and tools available for assessing financial performance, ratio analysis is especially useful and meaningful. These ratios make it possible to analyze the evolution of a firm's financial situation (trend analysis) and to perform cross-sectional and comparative analysis.
201310 Risk Aggregation and Reporting. More than Just a Data IssueFrancisco Calzado
Many banks feel overwhelmed by the sheer volume of regulation that is coming their way. It is not surprising, therefore, that when the Basel Committee on Banking Supervision (BCBS) consultative paper, “Principles for effective risk data aggregation and risk reporting” was published in June 2012 it raised a number of concerns
Financial ratios are created with the use of numerical values taken from financial statements to gain meaningful information about a company. The numbers found on a company’s financial statements – balance sheet, income statement, and cash flow statement – are used to perform quantitative analysis and assess a company’s liquidity, leverage, growth, margins, profitability, rates of return, valuation, and more.
Modes of Expression of Ratios:
Ratios may be expressed in any one or more of the following ways:
(a) Proportion,
(b) Rate or times
(c) Percentage.
Advantages of Ratio Analysis:
The information shown in financial statements does not signify anything individually because the facts shown are inter-related. Hence it is necessary to establish relationships between various items to reveal significant details and throw light on all notable financial and operational aspects. Ratio analysis caters to the needs of various parties interested in financial statements. The basic objective of ratio analysis is to help management in interpretation of financial statements to enable it to perform the managerial functions efficiently.
Limitations of Ratio Analysis:
Ratios are precious tools in the hands of management but the utility lies in the proper utilisation of ratios. Mishandling or misuse of ratios and using them without proper context may lead the management to a wrong direction. The financial analyst should be well versed in computing ratios and proper utilization of ratios. Like all techniques of control, ratio analysis also suffers from several ‘ifs and buts’ and for proper computation and utilization of ratios the analyst should be aware of the limitations of ratio analysis.
Uses and Users of Financial Ratio Analysis
Analysis of financial ratios serves two main purposes:
1. Track company performance
Determining individual financial ratios per period and tracking the change in their values over time is done to spot trends that may be developing in a company. For example, an increasing debt-to-asset ratio may indicate that a company is overburdened with debt and may eventually be facing default risk.
2. Make comparative judgments regarding company performance
Comparing financial ratios with that of major competitors is done to identify whether a company is performing better or worse than the industry average. For example, comparing the return on assets between companies helps an analyst or investor to determine which company is making the most efficient use of its assets.
Users of financial ratios include parties external and internal to the company:
External users: Financial analysts, retail investors, creditors, competitors, tax authorities, regulatory authorities, and industry observers
Internal users: Management team, employees, and owners
Cloud Storage Gateway Market - Outlook (2017-21)ResearchFox
Cloud storage gateways enable organizations to leverage cloud storage while delivering on premise services to remote and branch offices thereby eliminating data silos, reducing reliance on traditional and expensive IT storage. Cloud storage gateways acts a bridge between cloud storage systems and enterprise applications and overcomes incompatibility between the protocols used for public cloud technologies and legacy storage systems. Besides, cloud storage gateway appliances provide data deduplication and compression capabilities to make use of available bandwidth efficiently and move data as quickly as possible. This report presents interpretative and easy-to-understand facts on how the current Cloud Storage Gateway market is segmented based on end users, industry verticals and geographies. It cuts through several facets of the cloud storage gateway market such as market size, market share for each segment, the drivers and constraints of cloud storage gateway marketplace. This report also provides information on the challenges and opportunities that lie ahead for these Cloud Storage Gateways.
1. Elemental Economics - Introduction to mining.pdfNeal Brewster
After this first you should: Understand the nature of mining; have an awareness of the industry’s boundaries, corporate structure and size; appreciation the complex motivations and objectives of the industries’ various participants; know how mineral reserves are defined and estimated, and how they evolve over time.
Yes of course, you can easily start mining pi network coin today and sell to legit pi vendors in the United States.
Here the what'sapp contact of my personal vendor.
+12349014282
#pi network #pi coins #legit #passive income
#US
how to sell pi coins in South Korea profitably.DOT TECH
Yes. You can sell your pi network coins in South Korea or any other country, by finding a verified pi merchant
What is a verified pi merchant?
Since pi network is not launched yet on any exchange, the only way you can sell pi coins is by selling to a verified pi merchant, and this is because pi network is not launched yet on any exchange and no pre-sale or ico offerings Is done on pi.
Since there is no pre-sale, the only way exchanges can get pi is by buying from miners. So a pi merchant facilitates these transactions by acting as a bridge for both transactions.
How can i find a pi vendor/merchant?
Well for those who haven't traded with a pi merchant or who don't already have one. I will leave the what'sapp number of my personal pi merchant who i trade pi with.
Message: +12349014282 VIA Whatsapp.
#pi #sell #nigeria #pinetwork #picoins #sellpi #Nigerian #tradepi #pinetworkcoins #sellmypi
how to sell pi coins in Hungary (simple guide)DOT TECH
If you are interested in selling your pi coins, i have a verified pi merchant, who buys pi coins and resell them to exchanges looking forward to hold till mainnet launch.
Because the core team has announced that pi network will not be doing any pre-sale. The only way exchanges like huobi, bitmart and hotbit can get pi is by buying from miners.
Now a merchant stands in between these exchanges and the miners. As a link to make transactions smooth. Because right now in the enclosed mainnet you can't sell pi coins your self. You need the help of a merchant,
i will leave the what'sapp contact of my personal pi merchant below. 👇
+12349014282
BONKMILLON Unleashes Its Bonkers Potential on Solana.pdfcoingabbar
Introducing BONKMILLON - The Most Bonkers Meme Coin Yet
Let's be real for a second – the world of meme coins can feel like a bit of a circus at times. Every other day, there's a new token promising to take you "to the moon" or offering some groundbreaking utility that'll change the game forever. But how many of them actually deliver on that hype?
Seminar: Gender Board Diversity through Ownership NetworksGRAPE
Seminar on gender diversity spillovers through ownership networks at FAME|GRAPE. Presenting novel research. Studies in economics and management using econometrics methods.
when will pi network coin be available on crypto exchange.DOT TECH
There is no set date for when Pi coins will enter the market.
However, the developers are working hard to get them released as soon as possible.
Once they are available, users will be able to exchange other cryptocurrencies for Pi coins on designated exchanges.
But for now the only way to sell your pi coins is through verified pi vendor.
Here is the what'sapp contact of my personal pi vendor
+12349014282
2. Elemental Economics - Mineral demand.pdfNeal Brewster
After this second you should be able to: Explain the main determinants of demand for any mineral product, and their relative importance; recognise and explain how demand for any product is likely to change with economic activity; recognise and explain the roles of technology and relative prices in influencing demand; be able to explain the differences between the rates of growth of demand for different products.
Tax System, Behaviour, Justice, and Voluntary Compliance Culture in Nigeria -...
IFRS17 Risk Adjustment modeling
1. IFRS17 RISK ADJUSTMENT FOR INSURANCE CONTRACTS ALONG WITH SOLVENCY 2 RISK MARGINS
SYED DANISH ALI
QUANTIFYING RESERVE RISK WITH WORKED EXAMPLES
2. CONTENTS
1 IFRS17 4 Phases
2 Overview of Main Points covered
3 IFRS17 Risk Adjustment Definition & Description
4 Regional Regulators' Review Points
3. CONTENTS
1 Data and Descriptive Analytics/Exploratory Data Analysis; 3) Sources of statistical uncertainty
2 Worked example modeling of Solvency 2 Risk Margin; 2) Cost of Capital: 1) Analytic 2) Simulated 3) VaR
3 Worked example modeling of IFRS17 Risk Adjustment calculations; Metrics: 1) VaR 2) TVaR 3) PHT
4 Final Notes
4. 1) IFRS17 4 PHASES
Implementing IFRS17 in a phased manner
5. IFRS17 4 Phases
Phase 1 – Gap Analysis
Phase 2 – Financial Impact Assessment (FIA)
Phase 3 – Systems Design and Methodology
Phase 4 – Implementation
IFRS17 explained simply in 3 minutes. Part 1: https://www.youtube.com/watch?v=9RAacCBTYc8 and Part 2: https://www.youtube.com/watch?v=LXziE9DqMxQ
IFRS4 was a patchwork that was never meant to remain the comprehensive IFRS for insurance in the first place, but it took two decades of consultation to arrive at IFRS17, the comprehensive standard for insurance contracts that finally meets the aspirations of the IAIS.
6. IFRS17 4 Phases
Different vested interests lead to different perceptions of why we are implementing IFRS17 in the first place. Many insurers delay IFRS17 work for as long as they can while still adhering to regulatory deadlines. Regulators also differ, and some are more proactive than others.
Even now, many people at insurers are dismissive or combative about IFRS17, arguing that it adds very little value and too many costs, and they try to apply the bare minimum (think of the very strong preference for PAA, no budget for software, and patched-up Excel-based actuarial and accounting work).
They have their own point of view, of course: they are already fighting so many fires (loss ratios increasing over time, the impact of COVID-19, increased digitization and product development, and a host of other challenges) that the last thing they needed was another regulation!
The market structure is also highly skewed. Beyond the top 5 insurers in a given country, the remaining 20-40 insurers fight over hardly 20%-30% of total market share; they are too small to have budgets for specialized skills, many functions are missing, they are simple and unsophisticated, and their focus is usually on selling clones of products at minimum prices to gain market share. Consultants, by contrast, are far more optimistic, because IFRS17 has opened up revenue streams of previously unthinkable magnitude (some of which trickles down to employees as better remuneration than if IFRS17 weren't here).
No one party is entirely right, and the views of all stakeholders should be respected, so let's see how we can reach a common middle ground.
9. 3) IFRS17 RISK ADJUSTMENT DEFINITION & DESCRIPTION
Current practice under IFRS4; detailed description of RA in IFRS17
18. SAMA Review of IFRS17 FIA Reports: point on RA
The lognormal distribution is the most commonly used claims distribution. Some insurance companies have varied their approaches between LRC and LIC calculations. It is obvious from the above that a range of approaches is being adopted in the sector to estimate the risk adjustment. The estimate requires input from both management and the actuarial function. SAMA expects the estimation approach to be refined over time as the actuarial profession grows in sophistication in the Kingdom. SAMA also expects management to provide active input and steer to ensure alignment between the selected confidence interval and the Company's risk appetite. As regards the companies that are yet to complete work in this regard, SAMA notes that this was against its expectations and it will follow up with those companies.
21. 5) DATA AND DESCRIPTIVE ANALYTICS/EXPLORATORY DATA ANALYSIS
1) Data used for both Solvency 2 Risk Margin and IFRS17 Risk Adjustment
2) Residuals' Analysis
3) Sources of Statistical Uncertainty
25. These are unscaled residuals. There are various ways to scale residuals, but the trends remain visible because 1) the variables need to be scaled, not only the residuals, and 2) the information compression of reducing everything to a 2D table, ignoring everything else, in the chain ladder makes us lose a lot of information. The chain ladder was first devised five decades or so ago, when regression or individual-level reserving was generally not possible, so its model diagnostics will be quite poor no matter what we do. Pricing has progressed far beyond reserving because pricing is treated as a core part of the business, whereas reserving is treated as a regulatory burden where management's only focus is on minimizing the reserve figures.
28. Sources: Macro Ops: Unparalleled Investing Research (macro-ops.com); Introduction to Endogeneity (ashutosh nayak, Towards Data Science); Investing Lessons from George Soros (Casey Research); https://datascienceplus.com/how-to-detect-heteroscedasticity-and-rectify-it/
29. More sources of statistical errors
Collinearity is a linear association between two predictors. Multicollinearity is a situation where two or more predictors are highly linearly related. In general, an absolute correlation coefficient of >0.7 among two or more predictors indicates the presence of multicollinearity.
'Predictors' is the point of focus here. Correlation between a predictor and the response is a good indication of better predictability, but correlation among the predictors is a problem that must be rectified to arrive at a reliable model.
For example, COVID-19 incidence is 5 times higher for those with a health insurance sum insured of SAR 30 thousand and above. Does that mean we should flag those with high sums insured? No, because most health business is group health, where large sum insured/high-benefit plans are reserved for senior employees (VIP Plan, Plan A, etc.). On these plans, ages are usually well above 50 (the higher the age, the higher the COVID-19 incidence), the members are predominantly men in senior posts (men have higher chances of catching COVID-19), and they may have greater awareness of health benefits and a more serious attitude toward taking care of their health. Those are the real drivers, not a sum insured above SAR 30 thousand.
Source: https://www.statisticshowto.com/multicollinearity/ https://blog.clairvoyantsoft.com/correlation-and-collinearity-how-they-can-make-or-break-a-model-9135fbe6936a
An intuitive understanding of what statistics does (what is cross-validation, what is a t-test, what does p less than 0.05 show, what is ANOVA practically?) plus some level of domain knowledge is far more important in reality than just coding and churning numbers. If you can't explain it simply enough, your understanding is lacking. This is the opposite attitude to throwing around only fancy big words. The best example is startups: they will market as if they have built AGI, but behind it all is a simple logistic regression by a junior employee who doesn't really know what he is doing.
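The regression-based multicollinearity check described above can be sketched numerically. Below is a minimal, illustrative variance inflation factor (VIF) computation on synthetic data; all variable names and figures are hypothetical, not taken from the deck. A VIF above roughly 5-10 is a common multicollinearity flag.

```python
import numpy as np

def variance_inflation_factors(X):
    """VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    column j of X on all remaining columns (plus an intercept).
    VIF above roughly 5-10 flags multicollinearity."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    vifs = []
    for j in range(p):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid @ resid / ((y - y.mean()) ** 2).sum()
        vifs.append(1.0 / (1.0 - r2) if r2 < 1.0 else np.inf)
    return np.array(vifs)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)                   # independent predictor
X = np.column_stack([x1, x2, x3])
print(np.round(variance_inflation_factors(X), 1))  # x1, x2 large; x3 near 1
```

Here the near-duplicate predictors x1 and x2 receive large VIFs while the independent x3 stays near 1, mirroring the sum-insured example: strong correlation among predictors, not with the response, is what signals trouble.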
31. Summary Tables – Analytical Method 1 Results
The first table on the left shows analytic results for the SD of the reserves over their lifetime, and the SD of the claims development result (CDR) over 1 year.
The two tables shown below show analytic results for the SD of the CDRs over a sequence of 1-year views. The results can be shown incrementally or cumulatively.
Notice that the square root of the sum of squares of the incremental SDs equals the SD of the reserves over their lifetime. This demonstrates how the traditional lifetime view of risk can be partitioned into a sequence of one-year views.
Table 1
Table 2
Table 3
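The partition property stated above can be verified with toy numbers (hypothetical SDs, not values from the tables):

```python
import math

incremental_sds = [3.0, 2.0, 2.0]    # hypothetical one-year CDR SDs
# lifetime SD = square root of the sum of squares of the incremental SDs
lifetime_sd = math.sqrt(sum(s ** 2 for s in incremental_sds))
print(round(lifetime_sd, 4))         # → 4.1231 (= sqrt(9 + 4 + 4))
```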
32. Summary Tables – Analytic Results
The Mack method is based upon the chain ladder method, which is probably the most popular method for the calculation of claim reserves. This technique measures the variability of the chain ladder estimates and uses it to develop a confidence interval for the estimated ultimate claim amounts and the claim reserves. The confidence interval matters because the actual ultimate amount will differ from the estimated ultimate amount; the confidence level is based on the entity's preference. The Mack method uses the weighted average link ratios, calculated by the chain ladder method, to determine the standard error of the estimates of the ultimate claim amounts. The method assumes that the age-to-age factors are independent for each accident year, but that the claim payments are correlated with the earlier payments for that year. For the calculation of the risk adjustment, we would have to assume a normal or lognormal distribution for the claim amounts and determine the parameters for the selected distribution.
In Table 1, the chain-ladder reserves are shown, together with the standard deviations of the forecasts (RMSEPs) from Mack's model, giving a coefficient of variation of the total reserves under the lifetime view of risk of 13.1%. In addition, the RMSEPs of the CDRs over 1 year using the formulae from Merz and Wuthrich are shown in Table 1. The RMSEPs divided by the expected reserves at the start of the year are also shown, giving 9.5% for the total CDR. This one-year measure of risk is lower than the traditional lifetime view.
Tables 2 and 3 show the RMSEPs (i.e., standard deviations) of the CDRs for each future calendar period (the "full picture") using the formulae from Merz and Wuthrich. The result of squaring the values (to give variances), adding up across all columns within each row, and taking the square root is shown in the final column. A comparison with Table 2 shows that the square root of the sum of squares of the CDRs gives the same result as the RMSEP from Mack's model over the lifetime of the liabilities. This demonstrates how the lifetime view of risk under Mack's model can be partitioned into a sequence of one-year views. It also shows that the one-year view of risk must always be lower than the lifetime view, since variances cannot be negative. This is an interesting result and links the lifetime view of risk with the one-year view of Solvency II using analytic approaches.
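Since the Mack method builds on the chain-ladder link ratios, a minimal deterministic chain-ladder sketch may help fix ideas. The 4×4 cumulative triangle below is entirely hypothetical; the volume-weighted factors are the "weighted average link ratios" described above.

```python
import numpy as np

def chain_ladder(triangle):
    """Volume-weighted link ratios, ultimates and reserves for a
    cumulative paid triangle; np.nan marks future (unobserved) cells."""
    tri = np.array(triangle, dtype=float)
    n = tri.shape[0]
    # f_k = sum of C_{i,k+1} over sum of C_{i,k}, using rows with both observed
    factors = []
    for k in range(n - 1):
        mask = ~np.isnan(tri[:, k + 1])
        factors.append(tri[mask, k + 1].sum() / tri[mask, k].sum())
    ultimates, reserves = [], []
    for i in range(n):
        last = n - 1 - i                 # latest observed development period
        ult = tri[i, last]
        for k in range(last, n - 1):
            ult *= factors[k]            # roll forward to ultimate
        ultimates.append(ult)
        reserves.append(ult - tri[i, last])
    return np.array(factors), np.array(ultimates), np.array(reserves)

triangle = [
    [100, 180, 200, 210],
    [110, 200, 230, np.nan],
    [120, 210, np.nan, np.nan],
    [130, np.nan, np.nan, np.nan],
]
f, ult, res = chain_ladder(triangle)
print(np.round(f, 4), round(res.sum(), 1))   # total reserve ≈ 197.2
```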
33. Summary Tables – Bootstrap Method 2 Results
Table 4 Table 5
The simulation assumes a 3% discount rate, which we can change. 1,000 simulations have been run; we should generally run from 1,000 to 10,000 simulations. The bootstrap method for simulation has been taken as Mack, although we could also run it on Overdispersed Poisson with a non-constant or constant scale.
Table 4 shows results analogous to Table 2, but using bootstrap simulation instead. Notice that the simulation results are very close to the analytic results, justifying the procedure. Table 4 shows the expected reserves, standard deviation (SD) (prediction error), and coefficient of variation from bootstrapping Mack's model using 1,000 simulations. Also shown are the standard deviations of the one-year-ahead CDRs using the re-reserving approach, and the standard deviations expressed as a proportion of the expected reserves at the start of the year. Comparison with Tables 2 and 3 shows that the expected reserves are very close to the chain-ladder reserves, and the standard deviations of the simulated reserves from bootstrapping Mack's model are very close to the analytic results given by Mack's model. In addition, the standard deviations of the one-year-ahead CDRs are very close to the analytic results given by the formulae from Merz and Wuthrich.
Table 5 shows a summary of results on a discounted basis. This highlights one of the benefits of the simulation approach: a full distribution of all cash-flows is available, which can be used to go beyond the analytic results.
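The slides use a full residual bootstrap of Mack's model (as implemented in reserving software); the toy sketch below only illustrates the resampling principle. A hypothetical set of individual link ratios is resampled with replacement to produce a simulated distribution of reserves, from which the SD and any percentile can be read off.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical individual link ratios observed for one development period
link_ratios = np.array([1.70, 1.82, 1.75, 1.90, 1.78, 1.85])
latest_paid = 130.0    # hypothetical latest cumulative paid

n_sims = 1000
ultimates = np.empty(n_sims)
for s in range(n_sims):
    # resample the observed link ratios with replacement (the bootstrap idea)
    resampled = rng.choice(link_ratios, size=link_ratios.size, replace=True)
    ultimates[s] = latest_paid * resampled.mean()

reserves = ultimates - latest_paid
print(f"mean {reserves.mean():.1f}, SD {reserves.std(ddof=1):.1f}, "
      f"VaR@99.5% {np.quantile(reserves, 0.995):.1f}")
```

The point is the shape of the procedure: a simulated distribution of reserves, rather than a single point estimate, is what lets us go beyond the analytic results.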
35. Summary Tables – Simulated CDR Method 3 Results
The forecast distribution has been assumed to be Gamma in the modeling process. We could also select a non-parametric option to chart the forecast distribution. Nonparametric methods are statistical methods that require fewer assumptions about a population or probability distribution and are applicable in a wider range of situations.
Table 6 shows the standard deviation (SD) of the CDRs over a sequence of 1-year views, using the simulation results. Notice that the simulation results are very close to the analytic results in Table 3. Table 6 is incremental whereas Table 7 is cumulative. A comparison with Table 4 shows that the square root of the sum of squares of the CDRs is very close to the standard deviations from bootstrapping Mack's model over the lifetime of the liabilities, again demonstrating how the lifetime view of risk under Mack's model can be partitioned into a sequence of one-year views.
Tables 8 and 9 show the value-at-risk @ 99.5% of the CDRs over a sequence of 1-year views, using the simulation results. Again, this highlights the benefits of a simulation approach, since any risk measure can be applied to the simulated distribution; the analytic approach only provides SDs.
The values in Tables 3 and 6 show remarkable similarity, validating the simulation approaches and connecting the lifetime and one-year views of risk for analytic and simulation-based approaches associated with Mack's model. Again, an advantage of the simulation-based approach is that a full predictive distribution is available, from which any risk measure can be obtained. For example, Tables 8 and 9 show the value-at-risk of the CDRs at 99.5% (where VaR at 99.5% is the negative of the 0.5th percentile of the distribution of the CDR).
For a statistical method to be classified as a nonparametric method, it must satisfy one of the following conditions: (1) the method is used with qualitative data, or (2) the method is used with quantitative data when no assumption can be made about the population probability distribution. In cases where both parametric and nonparametric methods are applicable, statisticians usually recommend using parametric methods because they tend to provide better precision. Nonparametric methods are useful, however, in situations where the assumptions required by parametric methods appear questionable.
Source: https://www.britannica.com/science/statistics/Residual-analysis
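The VaR definition quoted above (VaR at 99.5% is the negative of the 0.5th percentile of the CDR distribution) can be applied directly to any vector of simulated CDRs; the sketch below uses synthetic normal CDRs purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic one-year CDRs: 100,000 simulations, mean 0, SD 10
cdr = rng.normal(loc=0.0, scale=10.0, size=100_000)
# VaR@99.5% is the negative of the 0.5th percentile of the CDR distribution
var_995 = -np.quantile(cdr, 0.005)
print(round(var_995, 1))   # close to 2.576 * 10 for a normal distribution
```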
36. Summary Tables – Simulated CDR Method 3 Results
Bootstrapping Mack's model provides a way of simulating cumulative payments for all future calendar periods, and hence all incremental payments (by differencing the cumulative payments). For each origin period, we therefore have a way of simulating the payments that emerge over the next calendar period. All that remains is to estimate the outstanding liabilities at the end of the year conditional on what has emerged, for each simulation. This will depend not only on the payments made over the next year in origin period i, but on all other origin periods too.
To complete the process, it is necessary to augment the original payments triangle by the simulated payments that emerge over the next calendar period for each origin period. That is, the original payments triangle is augmented by one diagonal, since that is all an actuary sees over a one-year period. Conditional on the payments that emerge (for each simulation), it is then necessary to estimate the reserves at the end of the period. At this point, an automated reserving methodology is required that can be applied to the results for each simulation. An actuary in the computer is required, or an "actuary-in-the-box", as the procedure is known. To remain consistent with the underlying methodology described in this presentation, the standard chain-ladder method is adopted for this purpose. That is, for each new simulated triangle, the chain-ladder model is re-fitted conditional on the claims that have emerged in the year, giving the reserves at the end of the year. This automatic re-fitting of the reserving methodology has led to the "actuary-in-the-box" procedure also being known as "re-reserving".
The actuary-in-the-box is a general procedure for estimating one-year reserve risk. It assumes that we already have an algorithmic method for setting reserves, and then specifies a procedure for simulating the next year of claims development and re-applying the algorithm to get the reserves in one year's time. The method is:
1. Obtain the best estimate of the opening reserve. It is assumed that this is done according to a well-defined algorithm, and that it does not include any risk margin.
2. Extend the input data needed for the algorithm used in step 1 by simulating one further year of data.
3. Apply exactly the same algorithm as in step 1 to the extended data set generated in step 2 to produce a distribution of the closing claims reserve.
One fundamental limitation is that the method cannot adequately capture the judgement used by a real-world actuary in setting reserves, or many of the other subtle aspects of a complex reserving process. Another fundamental limitation is that the actuary-in-the-box method cannot make use of information not contained in the claims data used by the underlying model, which would likely be considered by a real-world actuary.
The output is a full distribution of the ultimate claims, which can be used to calculate any risk statistic desired. It can also be iterated to give an understanding of how the risk will emerge up until the whole triangle is fully run off.
We demonstrate that the standard deviation of the simulated distribution of the CDR using the re-reserving approach matches the analytic approach of Merz and Wuthrich, connecting the analytic and simulation-based approaches for the one-year view of risk, and then connect the one-year view of risk and the traditional lifetime view. We also demonstrate that the standard deviations of the simulated distributions of the incremental CDRs using the recursive re-reserving approach match the analytic results from the Merz and Wuthrich formulae, again connecting the analytic and simulation-based approaches, and connecting the one-year view of risk and the traditional lifetime view.
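The three steps can be sketched end-to-end. The toy below uses a hypothetical 4×4 triangle and crude lognormal noise in place of a proper Mack bootstrap for step 2, but the structure (fix the reserving algorithm, simulate one new diagonal, re-apply the same algorithm, record the CDR) follows the re-reserving procedure described above.

```python
import numpy as np

rng = np.random.default_rng(7)

def cl_reserve(tri):
    """Deterministic chain-ladder total reserve for a cumulative triangle.
    tri: 2D float array, np.nan marks unobserved (future) cells."""
    n, m = tri.shape
    f = []
    for k in range(m - 1):
        mask = ~np.isnan(tri[:, k]) & ~np.isnan(tri[:, k + 1])
        f.append(tri[mask, k + 1].sum() / tri[mask, k].sum())
    reserve = 0.0
    for i in range(n):
        last = int(np.nonzero(~np.isnan(tri[i]))[0].max())
        ult = tri[i, last]
        for k in range(last, m - 1):
            ult *= f[k]
        reserve += ult - tri[i, last]
    return reserve

tri = np.array([[100, 180, 200, 210],
                [110, 200, 230, np.nan],
                [120, 210, np.nan, np.nan],
                [130, np.nan, np.nan, np.nan]], dtype=float)

opening = cl_reserve(tri)            # step 1: algorithmic best-estimate reserve
cdrs = []
for _ in range(1000):
    sim = tri.copy()
    for i in range(1, 4):            # step 2: simulate the next diagonal
        k = 4 - i                    # next unobserved development period of row i
        mask = ~np.isnan(tri[:, k])
        f_k = tri[mask, k].sum() / tri[mask, k - 1].sum()
        sim[i, k] = sim[i, k - 1] * f_k * rng.lognormal(0.0, 0.05)
    paid = sum(sim[i, 4 - i] - tri[i, 3 - i] for i in range(1, 4))
    closing = cl_reserve(sim)        # step 3: re-apply the same algorithm
    cdrs.append(opening - paid - closing)

print(f"one-year CDR SD: {np.std(cdrs):.1f}")
```

Re-fitting `cl_reserve` on each augmented triangle is exactly the "actuary-in-the-box": the reserving algorithm, not an actuary's judgement, sets the closing reserve in every simulation.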
39. Histograms: Density Charts of Discounted Total Reserves obtained by Bootstrapping
The histograms give a graphical representation of the 1,000 simulations done at each origin point. The total histogram is approximately normal, but other origin points show left skew.
Since the bootstrap approach provides distributions of all future cash-flows (not just the reserves), it is straightforward to obtain a distribution of the discounted reserves. The histograms show the results of discounting the future cash-flows at 3% (assuming payments are made mid-way through the year).
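Mid-year discounting at 3% works as follows; the cash-flow vector is hypothetical:

```python
import numpy as np

cash_flows = np.array([50.0, 30.0, 15.0, 5.0])   # hypothetical future payments
t_mid = np.arange(len(cash_flows)) + 0.5          # payments assumed mid-year
discounted = cash_flows / 1.03 ** t_mid           # discount at 3% per annum
print(round(discounted.sum(), 2))                 # → 96.41
```

Applying this to every simulated cash-flow path yields the distribution of discounted reserves shown in the histograms.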
42. As we can see, later origin periods have fewer data items, and so have greater variability in their forecasts, because each successive row in the triangle has fewer data points.
43. Risk Margins: Solvency II Cost of Capital
The cost of capital is calculated by applying a cost-of-capital rate to this amount. This rate can be determined by several techniques, such as the Weighted Average Cost of Capital (WACC) and the Capital Asset Pricing Model (CAPM). For the IFRS 17 risk adjustment, the entity's cost-of-capital rate would be chosen to meet the specific measurement objectives, reflecting a rate of return consistent with the entity being indifferent between fulfilling an insurance contract liability with a range of possible outcomes and fulfilling a liability that will generate fixed cash flows with the same expected value of cash flows as the insurance contract. The amount of capital used to estimate the cost of capital will depend on the level of security desired, an assessment of the probabilities that unfavorable cash flow outcomes will consume some or all of the capital, and the entity's level of risk aversion regarding the uncertain, unfavorable outcomes.
Within the Solvency II regulatory regime in Europe, a risk margin is required in addition to considering reserving risk within internal capital models or when applying the Standard Formula. Whereas Solvency II considers risk over a one-year time horizon, IFRS 17 is based on the fulfilment cash flows over their lifetime. As such, the definitions of reserve risk are different, which needs to be recognized and understood. Solvency II includes all four elements needed to estimate capital requirements: 1. a risk profile (distribution of the basic own funds); 2. a risk measure (value-at-risk); 3. a risk tolerance criterion (99.5%); 4. a time horizon (one year).
Solvency II stipulates that risk margins must be calculated using a cost-of-capital approach. The mechanics of the approach are straightforward. Given capital requirements for each future year as the reserves run off, the risk margin is the sum of the discounted costs of capital, where the costs of capital are the capital requirements multiplied by the cost-of-capital rate.
In Tables 14-17, the columns are: 1) 'Disc Fut Res', the projected reserves: the reserves remaining in each future period, discounted to the start of that period at a 3% discount rate (assuming that payments occur half-way through each year) and evaluated using the cash-flows from the chain ladder model applied deterministically; 2) 'Capital', the projected capital requirements; 3) the capital profile, showing the capital requirements at each future period expressed as a percentage of the opening capital requirements; 4) the cost of capital, an assumed 6% rate multiplied by the capital requirements; and 5) the discounted cost of capital, assuming a 3% discount rate.
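A minimal numeric sketch of the cost-of-capital mechanics described above, with an illustrative capital profile (all figures hypothetical: opening capital 100, 6% cost-of-capital rate, 3% discount rate):

```python
# Hypothetical capital profile: fraction of the opening capital requirement
# still needed at the start of each future year as the reserves run off
capital_profile = [1.00, 0.65, 0.40, 0.20, 0.05]
opening_capital = 100.0
coc_rate = 0.06        # Solvency II cost-of-capital rate
disc = 0.03            # flat discount rate

# risk margin = sum of discounted (capital requirement x cost-of-capital rate)
risk_margin = sum(
    coc_rate * opening_capital * frac / (1 + disc) ** (t + 1)
    for t, frac in enumerate(capital_profile)
)
print(round(risk_margin, 2))   # → 13.02
```

Changing the capital profile (e.g. running capital off in proportion to the reserves rather than this assumed pattern) changes the risk margin, which is exactly what the capital-profile column in the tables explores.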
44. Risk Margin Calculations: Solvency II Cost of Capital
Table 10 allows cost-of-capital risk margins to be calculated. Capital amounts are calculated given an initial capital requirement and a 'capital profile'. The default initial capital requirement is taken from the value-at-risk @ 99.5% of the total CDR over 1 year (shown in Tables 8 and 9, incremental/cumulative). A variety of capital profiles can be selected.
The liabilities side of the opening Solvency II balance sheet contains an estimate of the expected outstanding liabilities. Each simulated balance sheet one year ahead also contains an estimate of the expected outstanding liabilities at that time, conditional on the payments that have emerged in the year. This introduces the concept of the profit or loss on the reserves, which is known as the claims development result (CDR) or simply the run-off result.
If, at the end of the year, the estimated ultimate cost of claims has gone up, there is a loss on the reserves, since CDR_i(n+1) < 0, which must be made up from capital. Similarly, if the estimated ultimate cost of claims at the end of the year has gone down, there is a profit on the reserves, since CDR_i(n+1) > 0. Under Solvency II, it is the change in the ultimate cost of claims over a one-year time horizon (the profit or loss over one year) that is important, and the Solvency II definition of reserve risk is in that context. The analogy on the assets side of the balance sheet is the change in the value of assets over one year. Clearly, the Solvency II definition of reserve risk is different from the traditional actuarial view of risk, which considers the outstanding payments over their lifetime.
45. Risk Margin Calculations: Solvency II Cost of Capital
Table 14: SD Discounted Reserves; Table 15: SD Undiscounted Reserves
Table 16: VaR Reserves @98.1%; Table 17: VaR Reserves @99.5%
46. Risk Margin Calculations Solvency II Cost of Capital
Analytic: SD (Reverse Sum CDRs)
Simulated: SD (Reverse Sum CDRs)
VaR (Reverse Sum CDRs) @97.1% VaR (Reverse Sum CDRs) @99.5%
Table 18 Table 19
Table 20 Table 21
47. Risk Margin Calculations Solvency II Cost of Capital
Reserves based on different capital profiles in each future year are shown in the graph on the left.
Reverse sum of CDRs: the reverse sum is simply the cumulative sum of the CDRs taken from the last future year backwards. For example, if the total reverse sum is 11 and the reverse sum one year ahead is 5, the current year's CDR is 6 (6 + 5 = 11).
47
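The reverse sum is a cumulative sum taken from the last future year backwards; a short Python sketch with hypothetical CDR values:

```python
# Hedged sketch: "reverse sum" = cumulative sum from the last
# future year backwards (CDR values are hypothetical).
from itertools import accumulate

cdrs = [6.0, 3.0, 2.0]  # CDR per future year
reverse_sum = list(accumulate(cdrs[::-1]))[::-1]
# reverse_sum == [11.0, 5.0, 2.0]: 11 = 6 + 5, matching the slide's example
```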
49. IFRS17 Risk Adjustments Calculations
According to IFRS 17: “An entity shall adjust the estimate of the present value of the future cash flows to reflect the compensation that the
entity requires for bearing the uncertainty about the amount and timing of the cash flows that arises from non-financial risk.”
IFRS 17 is more principles based than Solvency II, and does not specify the techniques for calculating the “risk adjustment”, which is just a risk
margin by another name. Although IFRS 17 does not specify the techniques that should be used, it does state that: “If the entity uses a
technique other than the confidence level technique for determining the risk adjustment for non-financial risk, it shall disclose the technique
used and the confidence level corresponding to the results of that technique.” The “confidence level” is the percentile level of a value-at-risk
measure, although the risk profile associated with the risk measure is not specified. We can infer from the IFRS 17 documentation that the
most appropriate risk profile is the distribution of the discounted fulfilment cash-flows over their lifetime. It is clear, therefore, that IFRS 17
takes the traditional actuarial lifetime view of reserve risk, not the one-year view of Solvency II.
The most obvious techniques to calculate a risk adjustment under IFRS 17 are therefore risk measures applied to the distribution of the
discounted fulfilment cash-flows. Several risk measures have been proposed, including:
1. VaR: Value-at-risk (“confidence level technique”)
2. TVaR: Tail value-at-risk (conditional tail estimation)
3. PHT: Proportional hazards transform
Clearly, there are other possibilities, including multiples of the standard deviation or variance. Given the choice of risk measure, the only
other input is the associated risk tolerance level (that is, percentile level for VaR or TVaR, and proportional hazards parameter for PHT). The
risk adjustment is then the risk measure evaluated at the selected risk tolerance level less the mean.
49
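The first two risk measures listed above can be sketched on a simulated distribution; the lognormal liabilities and the 75% tolerance level are illustrative assumptions, not figures from the tables:

```python
# Hedged sketch: VaR and TVaR risk adjustments on a simulated
# distribution of discounted liabilities (lognormal assumed here).
import random
import statistics

random.seed(1)
sims = sorted(random.lognormvariate(6.9, 0.15) for _ in range(10_000))
mean = statistics.fmean(sims)

def var(sorted_sims, level):
    """Value-at-risk: the empirical percentile at the given level."""
    return sorted_sims[int(level * len(sorted_sims)) - 1]

def tvar(sorted_sims, level):
    """Tail value-at-risk: the mean of all simulations above VaR."""
    return statistics.fmean(sorted_sims[int(level * len(sorted_sims)):])

# Risk adjustment = risk measure at the tolerance level less the mean
ra_var = var(sims, 0.75) - mean
ra_tvar = tvar(sims, 0.75) - mean
```

At the same percentile, TVaR always gives a larger adjustment than VaR, since it averages the whole tail above the VaR point.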
50. IFRS17 Risk Adjustments Calculations
A cost-of-capital approach is also likely to be popular given its use for Solvency II, although a cost-of-capital risk adjustment under IFRS 17 will be different
from a cost-of-capital risk margin under Solvency II. Although it is open to debate, we contend that the capital requirements in an IFRS 17 context will need
to consider the fulfilment cash-flows over their remaining lifetime at each period (not the one-year view of Solvency II), and the cost-of-capital rate and
discount rates will be entity specific.
Value-at-risk is easy to explain to a non-technical audience, and has the advantage of simplicity, but since it is based on a single simulated value (an
order statistic), it can be prone to simulation error (although there are techniques to mitigate this). It has a range from the minimum simulated value to
the maximum as the percentile level changes. It has been criticized because it does not adequately recognize skewness or extremes, and it is not a
coherent risk measure since it does not obey the sub-additivity property.
Tail value-at-risk is straightforward to calculate. It has a range from the mean to the maximum simulated value as the percentile level changes, and is better
at recognizing skewness and extremes since all values above a given percentile level are included in the calculation. It also has the advantage of being a
coherent risk measure, and can be used for allocations of risk to sub-groups, where distributions have been combined before the risk measure is applied.
The proportional hazards transform, introduced by Wang in the context of insurance, also has a range from the mean to the maximum simulated value as
the associated parameter increases from 1 to infinity. It could be argued that it is even better at recognizing skewness and extremes since the weights
increase as the simulation values increase, unlike TVaR where the weights are constant above a given percentile level. It is also a coherent risk measure, and
again can be used for allocations of risk to sub-groups.
One method of expressing risk preferences is to use a risk preference model to adjust the probability distribution. Such a model assigns lower preference-
adjusted probability values to more favorable outcomes, i.e., outcomes with lower cash-flow liabilities than the mean, and higher preference-adjusted
probability values to unfavorable (adverse) outcomes, i.e., outcomes with higher cash-flow liabilities than the mean. This class of risk preference models is
referred to as proportional hazards transforms. The Wang Transform provides a functional transformation that assigns higher probabilities to the more
severe outcomes by reducing the cumulative percentile associated with the less severe outcomes. This technique enables a probability-weighted
calculation of a risk-adjusted value of the uncertain liabilities. A risk preference parameter, lambda (λ), represents the compensation for bearing risk and
is applied to the entire probability distribution. Consequently, risk is measured as an adjustment to the expected value derived from a proportional
hazards transform of the probability distribution. Lambda is the key parameter when using this technique to estimate the risk adjustment: it indicates
how much the compensation increases when a measure of risk increases by one unit. This parameter is independent of the nature of the risk and is
closely related to the entity's overall risk tolerance. (Source: IAA IFRS17 book/monograph on Risk
Adjustment).
50
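The PHT described above distorts the survival function S(x) into S(x)^(1/λ). A minimal empirical sketch on simulated values (the exponential marginal and the λ values are illustrative assumptions):

```python
# Hedged sketch of the proportional hazards transform (PHT);
# the simulated liabilities and lambda values are assumptions.
import random

random.seed(7)
sims = sorted(random.expovariate(1 / 1000.0) for _ in range(10_000))

def pht_value(sorted_sims, lam):
    """Risk-adjusted mean under the PHT: survival probabilities are
    raised to the power 1/lambda, which shifts probability weight
    towards the most severe (largest) outcomes."""
    n = len(sorted_sims)

    def g(s):  # distortion applied to the survival function
        return s ** (1 / lam)

    # Weight on the i-th smallest value is the distorted probability
    # mass between successive empirical survival probabilities.
    return sum(
        x * (g((n - i) / n) - g((n - i - 1) / n))
        for i, x in enumerate(sorted_sims)
    )

mean = sum(sims) / len(sims)
risk_adjustment = pht_value(sims, 1.85) - mean  # positive for lambda > 1
```

With λ = 1 the weights are uniform and the PHT value reduces to the ordinary mean; as λ grows, the weights on the largest simulated values increase, unlike TVaR where tail weights are constant.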
51. IFRS17 Risk Adjustments Calculations – Risk Metrics
Risk Adjustments using VaR, TVaR and PHT
Equivalent Risk Tolerance Levels Required to
obtain 6% Cost of Capital
Table 22 shows IFRS 17 risk adjustments obtained by applying three
different risk measures to the distribution of discounted outstanding
liabilities over their lifetime. The distribution is summarized in Table 5 and in
the Histograms. TVaR at 40% is typically close to VaR at 75%, and PHT with
parameter 1.85 is also around VaR 75%. VaR 75% is a useful benchmark and is
also required under some regulations, such as in Australia and Hong Kong.
Table 23 shows the equivalent risk tolerance levels for the cost-of-capital risk
margin shown in Table 10. This is important since under the disclosure
requirements of IFRS 17, the 'equivalent confidence level' must be disclosed
if the 'confidence level technique' (i.e. value-at-risk) is not used. The risk
adjustments are higher in Table 22 than in Table 23. The cost-of-capital risk
margin (4.69% of the best estimate) corresponds to equivalent risk tolerance
levels of 21.20% and 65.39%, and 1.45 for the PHT parameter. If these
equivalent levels seem too low, a cost-of-capital rate of 11% instead of 6%
leads to a risk adjustment similar to VaR at 75%. An alternative view is that
the distribution given by Mack's model is too wide, and a narrower
distribution would give a higher equivalent confidence level.
Table 22
Table 23
51
52. IFRS17 Risk Adjustment Calculations - Cost of Capital
Expected value, standard deviation and value-at-risk of the discounted reserves at each
future period. Also, standard deviation of the undiscounted reserves. Cost-of-capital risk
adjustments are shown for each basis.
The square root of the reverse sum of the CDR MSEPs, together with the
standard deviation and VaR of the reverse sum of simulated CDRs. Cost-of-
capital risk margins are shown for each basis.
Table 24 Table 25
Tables 24 and 25 show different bases that could be used to obtain a risk profile in a cost-of-capital risk adjustment
under IFRS 17, if a lifetime view of risk is used for assessing an entity's capital requirements, instead of the one-year view
of Solvency II. If insurance entities use a cost-of-capital approach for IFRS 17, they will need to decide whether a one-year
view is acceptable for capital calculations under IFRS 17.
In Table 24, a distribution of the remaining discounted reserves at each future time point is used, conditional on
information currently available. In Table 25, a distribution of the reverse sum of CDRs is used at each future time point,
which is a prudent approximation to the distribution of the remaining discounted reserves at each future time point,
conditional on information available at that time.
52
53. IFRS17 Risk Adjustment Calculations - Cost of Capital
The risk tolerance level of VaR at 97.1% was selected such that the value in the first year is close to the opening capital
requirement in Table 14, again allowing corresponding risk adjustments to be compared.
It should also be noted that it is not clear what risk tolerance level is appropriate under a cost-of-capital risk adjustment for IFRS
17; the choice is entity specific and is not prescribed.
It should be noted that the recursive re-reserving approach is computationally expensive. Therefore, although it may be better to
use a capital profile obtained from a risk measure applied to a distribution of the reverse sum of CDRs for future capital
requirements under IFRS 17, using the distribution of the discounted outstanding future cash-flows given data up to calendar
period n may be expedient (with the risk tolerance level being used to control the level of prudence).
53
54. 8) FINAL NOTES
1) Ending Notes
2) Further areas to develop in RA modeling
3) Key takeaways
4) More Key points
5) Lessons to live by
6) Recap: what we covered in this presentation
55. Ending Notes of Presentation on RA
In this presentation, various concepts associated with the quantification of reserve risk have been connected. The analytic formula-based approaches of
Mack for the lifetime view of reserve risk, and Merz and Wuthrich for the one-year view of Solvency II, have been compared to simulation-based results
obtained by bootstrapping Mack’s model, supplemented with the re-reserving approach. Furthermore, the lifetime and one-year views were brought
together by considering a sequence of one-year views until the liabilities are extinguished. Again, this was considered analytically, using Merz and
Wuthrich, and using a simulation-based approach by applying re-reserving recursively.
IFRS 17 risk adjustments are also required on a gross and reinsurance basis. Clearly, it is the net position that is most relevant for the interpretation of an
insurance entity’s financial position, so it seems appropriate to estimate risk adjustments from distributions of gross and net discounted fulfilment cash-
flows, then taking the difference as the reinsurance risk adjustment. Reinsurance modelling to obtain an accurate distribution of the net discounted
fulfilment cashflows (together with an assessment of credit risk) could be complex. In particular, the current actuarial practice of applying an approximate
net-to-gross ratio looks increasingly inadequate (where non-proportional reinsurance treaties exist), and triangle methods for attritional claims may need
to be supplemented by individual claims modelling for large claims, with accurate reinsurance modelling. Furthermore, risk adjustments are required for
groups of contracts, not just at the aggregate entity level (or holding company level for a multinational group), which raises questions about allocation of
risk and diversification. A simulation framework can be used (with copulae to apply dependencies when aggregating), but the issues are complex.
If the cost-of-capital technique is used for IFRS 17 risk adjustments, it should be recognized that this will be different from a Solvency II risk margin.
Solvency II considers the one-year view of risk for capital requirements, whereas the lifetime view of risk is more appropriate under IFRS 17. A distribution
of the remaining total cash-flows at each future time period is more appropriate as a basis for estimating capital requirements (although as discussed in
section 6 and Appendix 3, the time perspective becomes important). Furthermore, cost-of-capital and discount rates are entity specific under IFRS 17 but
prescribed under Solvency II. The cost-of-capital technique is considerably more complex than simply applying a risk measure to a distribution of fulfilment
cash-flows, and requires more parameters to select and justify: an opening capital requirement, future capital requirements, a cost-of-capital
rate and a yield curve for discounting. Since the equivalent "confidence level" is required anyway under IFRS 17, this calls into question why the cost-of-capital
method would be used at all. A distribution of discounted fulfilment cash-flows is required for the equivalent confidence level, so it seems more
straightforward to calculate IFRS 17 risk adjustments simply from a risk measure applied to that distribution. Given the distribution, the only input to select
is the entity specific risk tolerance level.
55
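The gross/net split described above can be sketched as follows; the lognormal gross liabilities, a simple 40% quota share, and VaR at 75% as the risk measure are all illustrative assumptions:

```python
# Hedged sketch: reinsurance risk adjustment as the difference between
# risk adjustments on gross and net simulated distributions.
import random
import statistics

random.seed(11)
gross = [random.lognormvariate(7.0, 0.25) for _ in range(10_000)]
net = [0.6 * g for g in gross]  # 40% quota share ceded (hypothetical)

def ra_var75(sims):
    """Risk adjustment as VaR at 75% less the mean."""
    s = sorted(sims)
    return s[int(0.75 * len(s)) - 1] - statistics.fmean(s)

ra_gross = ra_var75(gross)
ra_net = ra_var75(net)
ra_reinsurance = ra_gross - ra_net  # RA attributed to reinsurance held
```

Real reinsurance programmes (especially non-proportional treaties) require far richer modelling than this proportional example, as the slide notes.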
56. Further areas to develop
Insurers need to develop new sets of KPIs, such as new ratios for quantitative performance analysis.
Diversification methods need to be worked on, such as copulas, a maximum allowance for diversification, and so on.
Modification of the RA for the LRC is needed. The IAA Book/Monograph on Risk Adjustment also contains many different methods for life and non-life lines of
business in its Chapter 10 case studies. IAN100 contains answers to many general queries on IFRS17 implementation, providing much-needed
clarification and benchmarking instead of relying only on market consensus, which may reflect many industry players and still be technically wrong.
A comprehensive RA model needs selection methods (select CoC, VaR, TVaR or PHT), claim intervals (select monthly, quarterly or annual triangles), and
then workings on those bases. It needs to be decided for which classes of business gross and reinsurance triangles (and net or not?) are worked; for
example, motor and medical have high-frequency, low-severity claims that are data intensive and conducive to credible modeling. Models need to be
comprehensive but not take too long to reach from data to final stage, otherwise implementation can become impractical. RA for long-term life insurance
is a different ballgame than described here, so separate models need to be developed to handle it.
RA workings need to accommodate RA at the reinsurance level. An entity needs to calculate the risk adjustment separately on a gross basis and a
reinsurance basis. It must be noted that the risk adjustment for non-financial risk on a reinsurance contract held is not the compensation that the
reinsurer requires for bearing the non-financial risk on the reinsurer's side. The reinsurer's risk adjustment depends on the reinsurer's risk appetite, and
its methodology has no direct bearing on the insurer's financials. The transfer of non-financial risk is to be reflected in the RA for non-financial risk on the
reinsurance basis, while the risk of non-performance created by the contract is to be reflected in the estimates of future cash flows. It is useful to see
whether the company is benefiting from the reinsurance arrangement, and reinsurance optimization exercises can help inform the reinsurance
considerations for the RA. Source: https://www.ifrs.org/content/dam/ifrs/supporting-implementation/ifrs-17/ifrs-17-pocket-guide-on-reinsurance-contracts-held.pdf
56
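Copula-based aggregation, mentioned above for diversification, can be sketched with a Gaussian copula; the correlation of 0.5 and the lognormal marginals are illustrative assumptions:

```python
# Hedged sketch: Gaussian copula to aggregate two lines of business
# with dependency; correlation and marginals are assumptions.
import math
import random

random.seed(3)
n, rho = 10_000, 0.5

# Marginal simulated liabilities for two hypothetical lines
line_a = sorted(random.lognormvariate(6.0, 0.3) for _ in range(n))
line_b = sorted(random.lognormvariate(5.5, 0.4) for _ in range(n))

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

total = []
for _ in range(n):
    # Correlated standard normals -> correlated uniforms
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
    u1, u2 = phi(z1), phi(z2)
    # Map each uniform to the empirical quantile of its marginal
    a = line_a[min(int(u1 * n), n - 1)]
    b = line_b[min(int(u2 * n), n - 1)]
    total.append(a + b)

total.sort()
# Aggregate VaR@99.5% sits below the sum of stand-alone VaRs;
# the gap is the diversification benefit.
var_total = total[int(0.995 * n) - 1]
var_sum = line_a[int(0.995 * n) - 1] + line_b[int(0.995 * n) - 1]
```

Allocating the diversified total back to groups of contracts is a separate, and harder, question.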
58. More Key Points
Measurable
All risks need to be measurable and quantified.
Cost effective
There is no use if calculating the Risk Adjustment requires an unreachable budget for the company. But reasonable budgets should also be there, otherwise patchworks can mean quality suffers. The person who buys expensive cries once, but the person who buys cheap cries ten times.
Multiple lines
The RA needs to work across very diverse lines of business including motor, medical, short-term life, long-term life, marine, engineering, liabilities and so on.
59. Lessons to Live by
Pragmatic Vision and Budgets
IFRS17 is unlike normal work like reserving or pricing, which actuaries have repeated thousands of times. This is being done for the first time worldwide and no one has done all of it from A to Z before, so it is better to over-prepare than under-prepare, as the consequences of under-preparation are far worse than those of over-preparation. That vision needs to be backed up by reasonable budgets. Going for unreasonably low budgets means lots of pain afterwards.
Deep Expertise
Deep expertise is needed in order to implement solutions that are technically sound and in line with the principles of IFRS17, instead of simple patchworks.
Quality
The binary view that an insurer is either compliant with IFRS17 or not compliant is misleading, as the quality of compliance differs drastically across different insurers and markets.
Leadership
Unless the top management of the company takes IFRS17 seriously, implementation will suffer drastically. It has been noticed across various markets that 90% or more of the work is done by consultants but there is extremely low ownership and knowledge of IFRS17 among company employees.
Collaboration
Collaboration across different segments of the business, from Finance to underwriting to IT and Actuaries, is crucial.
Communication
Communication tailored to specific stakeholders is key.
60. Recap - What we covered in this presentation
01 IFRS17 Phases: a phased approach of 1) gap analysis, 2) financial impact assessment, 3) system design and methodology, and 4) implementation.
02 IFRS17 Risk Adjustment Definition & Description: current practice under IFRS4; detailed description of RA requirements under IFRS17; one-year view vs ultimate view.
03 Regional Regulators' review points: review points of SAMA (regulator of KSA) and the IA (regulator of the UAE insurance market), and what we can learn from them.
04 Exploratory Data Analysis: EDA of the data used in the calculation of the Solvency 2 Risk Margin and IFRS17 Risk Adjustments.
05 Solvency 2 Risk Margins: risk metrics - 1) analytical Mack method, 2) bootstrap, 3) simulated CDR method; cost of capital - 1) analytical Mack, 2) simulated CDR, 3) VaR.
06 IFRS17 Risk Adjustment Calculations: risk metrics - 1) VaR, 2) TVaR, 3) PHT; cost of capital - 1) analytic, 2) simulated, 3) VaR.
61. THANK YOU!
Any Questions?
SYED DANISH ALI