A transition matrix is a square matrix describing the probabilities of moving from one state to another in a dynamic system. Each row contains the probabilities of moving from the state represented by that row to each of the possible states. Thus each row of a transition matrix sums to one.
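A minimal sketch in Python (the three-state matrix is hypothetical) makes the row-sum property and its use concrete:

```python
import numpy as np

# Hypothetical 3-state transition matrix (states: Good, Delinquent, Default).
# Each row holds the probabilities of moving from that row's state to every
# state, so each row must sum to one.
P = np.array([
    [0.90, 0.08, 0.02],   # from Good
    [0.30, 0.55, 0.15],   # from Delinquent
    [0.00, 0.00, 1.00],   # Default is absorbing
])

# Row-stochastic check: every row sums to one.
assert np.allclose(P.sum(axis=1), 1.0)

# Two-step transition probabilities are given by the matrix power P @ P.
P2 = P @ P
print(P2[0, 2])  # probability of Good -> Default over two steps
```

Multi-step probabilities follow from matrix powers, which is what makes transition matrices useful for building PD term structures.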
In our earlier blog, we discussed PD terminology and PD calibration approaches as applicable to the IFRS 9 framework. In this blog, we discuss methodologies for adjusting PDs for 'forward-looking' macroeconomic scenarios and for developing the PD term structure.
A key metric that summarizes the creditworthiness of a bank's obligor is the Probability of Default (PD). Besides creditworthiness assessment and capital computation under IRB, PD is one of the key metrics required in the updated IFRS 9 accounting standards. At present, many PD-related terminologies are used in the banking industry, such as PIT PD, TTC PD, 12-month PD and so on. Such a wide spectrum of terminologies has led to confusion among users, especially when it comes to IFRS 9, which lays special focus on PIT PD and lifetime PD. This blog intends to clarify these key terminologies.
As discussed in our previous blog, a PIT PD describes an expectation of the future, starting from the current situation and integrating all relevant cyclical changes and all values of the obligor's idiosyncratic effect with appropriate probabilities. A PIT PD mimics the observed default rates over a period of time. TTC PDs, in contrast, reflect circumstances anticipated over an extremely long period, and thus nullify the effects of the credit cycle. Based on these definitions, the current article focuses on a range of PD calibration approaches for aligning internal rating model output with actual default rates.
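One common way to turn a TTC PD into a PIT PD is to condition it on a systemic factor in a Vasicek one-factor setting. The sketch below assumes that formula and hypothetical parameter values; the blog itself does not prescribe a specific method:

```python
import numpy as np
from scipy.stats import norm

def pit_pd(ttc_pd, rho, z):
    """Condition a TTC PD on a standard-normal systemic factor z
    (Vasicek one-factor model). Negative z = downturn -> higher PD."""
    return norm.cdf((norm.ppf(ttc_pd) - np.sqrt(rho) * z) / np.sqrt(1.0 - rho))

# A 2% TTC PD conditioned on a moderate downturn (z = -1) with rho = 0.12
# rises; conditioned on a benign state (z = +1) it falls below 2%.
print(round(pit_pd(0.02, 0.12, -1.0), 4))
print(round(pit_pd(0.02, 0.12, 1.0), 4))
```

In an IFRS 9 context, z would typically be driven by the forward-looking macroeconomic scenario rather than chosen directly.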
Counterparty Credit Risk and CVA under Basel III – Häner Consulting
Financial institutions applying for an IMM waiver under Basel III need to fulfill a broad set of requirements. We present the quantitative, organizational and operational implications and provide some hands-on guidance on how to fulfill the regulatory requirements.
Continuing with our updates on the key aspects of IFRS 9 implementation, our current post (attached) discusses Exposure at Default (EAD), highlighting possible uses and nuances of business interpretation of terms linked to EAD. The post describes the computation methods for EAD and the modeling approaches available for each method, with key consideration points from the Basel and IFRS 9 perspectives highlighted along the way.
We look forward to your valuable feedback on the current article, and on the challenges you face in IFRS 9 implementation.
Liquidity risk is normally a crucial issue in a banking crisis; however, during the 2007-2010 period, liquidity has not been as difficult for us as we may have thought. There are many reasons for this, but number one is the fact that today's community bankers simply have a better understanding of the various techniques for raising both retail deposits and wholesale funds. What does make this crisis a bit different is the relative pricing efficiency in the wholesale or non-core funding arena these days, and our session will focus on how bankers can avoid those difficult examiner discussions about the use of FHLB Advances and Brokered Deposits. It's all about process, and we will provide guidance on what needs to be in your ALCO Policy as it relates to wholesale funding. We will also explore the April 2010 Liquidity and Funds Management Guidance to ensure your bank is up to speed on those requirements. Finally, we will provide specific guidance on both ratio analysis and creating your Contingency Funding Plan, and will review a sample CFP.
Fundamental Review of the Trading Book (FRTB) – Data Challenges – Accenture
In this Accenture Finance & Risk presentation we explore the challenges facing banks responding to the new Fundamental Review of the Trading Book (FRTB) rules and offer guidance on how to respond to these. http://bit.ly/2fojCKB
Modern credit risk modeling (e.g., Merton, 1974) increasingly relies on advanced mathematical, statistical and numerical techniques to measure and manage risk in credit portfolios. This gives rise to model risk (OCC 2011-16) and the possibility of understating inherent dangers stemming from very rare yet plausible occurrences perhaps not in our reference data-sets. International supervisors have recognized the importance of stress testing credit risk in the Basel framework (BCBS, 2009).
It can be, and has been, argued that the art and science of stress testing has lagged in the credit domain relative to other risk types (e.g., market risk), and our objective is to help fill this vacuum. We aim to present classifications and established techniques that will help practitioners formulate robust credit risk stress tests.
Overview of the Basel Committee's revised "Minimum capital requirements for market risk" (formerly FRTB), with notes and tips for technical implementation.
Our latest analysis of the readiness and maturity of intraday liquidity management shows that many financial institutions risk failing to meet payment and settlement obligations if they do not manage their intraday liquidity effectively. The necessary investments can be recouped by optimizing intraday liquidity management.
Default Probability Prediction using Artificial Neural Networks in R Programming – Vineet Ojha
The objective of the project is to analyze the ability of the Artificial Neural Network model developed to forecast the credit risk profile of retail banking loan consumers and credit card customers. From a theoretical point of view, this project presents a literature review on the detailed workings and the application of Artificial Neural Networks for credit risk management. Practically, the aim of this project is to present a model for estimating the Probability of Default using an Artificial Neural Network, in order to accrue the benefits of non-linear models.
Banks are scrambling to meet IFRS 9 guidelines and are setting out on the path of implementing various ECL estimation methodologies and models. But a topic that hasn't been given enough attention is the need for governance of these models and the attendant model risk management framework that must be set up to lend credibility to the model estimates. This blog touches upon the need for model validation and how model risk governance has become paramount in view of the new guidelines.
A Tale of Two Risk Measures: Economic Capital vs. Stress Testing and a Call f... – Xiaoling (Sean) Yu, Ph.D.
In this presentation, which I gave at the 9th Annual Capital Allocation and Stress Testing Conference, I compared and contrasted Economic Capital and Stress Testing, and advocated for a coherent risk management framework that integrates the two.
In this presentation, which I gave at the 7th Annual Risk Americas Conference, I first discussed the inconsistency of CECL from a risk philosophy perspective, and then shared some thoughts on key aspects of CECL modeling, i.e., the reasonable and supportable period, leveraging CCAR models for CECL, and model performance testing.
Are CCPs here to manage risk or instead to cause it?
• Changing environment: security vs capital efficiency
• Diversity killed by regulation?
• CCP default scenario: recovery vs resolution
Credit Risk Losses | Real Losses – Are they inconsistent? – László Árvai
Focusing on:
• Contracting a new deal
• Maintaining a good and prosperous customer relationship
• Having a nice conversation
• Learning all the needs of the client
• Writing a loan application
• Filling in the forms of the system
• Coping with all the compliance „handicaps“
• Analysing the business plan, the forecast, …
Building out a Robust and Efficient Risk Management - Alan Cheung – László Árvai
Credit derivatives are off-balance-sheet financial instruments that permit one party to transfer the credit risk of a reference asset, which it typically owns, to another party (the guarantor) without actually selling the asset.
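As a toy illustration of such risk transfer, the sketch below prices the par spread of a credit default swap, the most common credit derivative, under a flat hazard rate; all parameter values are hypothetical:

```python
import math

def par_cds_spread(hazard, recovery, r, maturity, freq=4):
    """Par spread of a CDS under a flat hazard rate (toy model).
    Premium leg: s * sum(dt * D(t) * Q(t)); protection leg:
    (1 - R) * sum(D(t) * (Q(t_prev) - Q(t)))."""
    dt = 1.0 / freq
    annuity, protection = 0.0, 0.0
    q_prev = 1.0
    for i in range(1, int(maturity * freq) + 1):
        t = i * dt
        disc = math.exp(-r * t)            # risk-free discount factor
        q = math.exp(-hazard * t)          # survival probability
        annuity += dt * disc * q
        protection += disc * (q_prev - q)
        q_prev = q
    return (1.0 - recovery) * protection / annuity

# With a 2% hazard and 40% recovery, the spread lands close to the
# rule of thumb s ~ hazard * (1 - recovery) = 120 bp.
spread_bp = par_cds_spread(0.02, 0.4, 0.03, 5.0) * 1e4
print(round(spread_bp, 1))
```

The buyer pays the spread on the notional; the seller (the guarantor) compensates the loss given default, so the reference asset's credit risk changes hands without a sale.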
Danske Bank — Vision and strategy
The Risk function in Personal Banking
Building an Oprisk framework
How do you influence the risk culture
Improving risk culture through 1:1 risk attention
Improving risk culture through measurement
Improving risk culture — Empowerment & consequences
We are all creative. It is our training, our career paths, and our roles that lock us inside a box. Creative thinking complements analytical thinking to find new solutions.
This course, given at EM Lyon, presents the mindset and methods of creative thinking.
G20 (2009): strengthen loan loss accounting using a broader range of information, aiming at greater stability.
IFRS 9: the three-stage approach replaces the Incurred Loss (IL) approach.
Basel Committee guidelines: are the claims justified?
• Higher model quality and backtesting
• Macroeconomic projections
• Denouncing shortcuts (e.g., 30 days past due)
For lack of empirical evidence, let's use simulations: a revolving 10-year loan portfolio, infinitely granular, following Moody's US-Corp migration statistics, with transfer from Stage 1 to Stage 2 on a 3-notch downgrade.
(papers.ssrn.com/sol3/papers.cfm?abstract_id=2187515)
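A minimal version of such a migration-driven stage-transfer simulation might look as follows; the 5-grade matrix is an illustrative toy, not the Moody's US-Corp statistics used in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 5-grade annual migration matrix (grade 0 = best, grade 4 = default).
# Illustrative numbers only; each row sums to one.
M = np.array([
    [0.90, 0.07, 0.02, 0.009, 0.001],
    [0.05, 0.85, 0.07, 0.02,  0.01 ],
    [0.02, 0.08, 0.80, 0.07,  0.03 ],
    [0.01, 0.03, 0.10, 0.76,  0.10 ],
    [0.00, 0.00, 0.00, 0.00,  1.00 ],
])

def simulate(origin_grade, years):
    """Simulate one rating path under the migration matrix."""
    path = [origin_grade]
    for _ in range(years):
        path.append(rng.choice(5, p=M[path[-1]]))
    return path

# Stage 2 transfer rule: a 3-notch downgrade from origination (pre-default).
paths = [simulate(0, 10) for _ in range(5000)]
stage2 = sum(any(g - p[0] >= 3 and g < 4 for g in p) for p in paths) / len(paths)
print(f"share ever transferred to Stage 2 (3-notch rule): {stage2:.2%}")
```

Replacing the toy matrix with an empirical one and the rule with the chosen staging criterion reproduces the kind of experiment the slide describes.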
How do you build a new business? How do you keep the project from hitting the wall?
We address these topics in a course given at EM Lyon. Contents:
- Why projects so often hit the wall
- No business plan survives the first customer meeting
- How to de-risk an innovative project
- A start-up does not get off the ground in one go
- Be business designers
- Take the time to imagine and test several avenues
- Get out of your office to test your hypotheses
- Once your business model is validated, you can build your activity
- Steer your learning
- Build rather than study
Affine cascade models for term structure dynamics of sovereign yield curves – LAURAMICHAELA
Rafael Serrano, professor at the Universidad del Rosario.
Abstract:
In the first part of the talk, I will present an introduction to stochastic affine short-rate models for the term structure of yield curves. In the second part, I will focus on a recursive affine cascade with persistent factors, for which the number of parameters, under certain specifications, is invariant to the size of the state space and converges to a stochastic limit as the number of factors goes to infinity. The cascade construction thereby overcomes dimensionality difficulties associated with general affine models. We contrast two specifications of the model using a linear Kalman filter on a panel of Colombian sovereign yields.
Multi-dimensional time series based approach for Banking Regulatory Stress Te... – Genpact Ltd
Under the regulatory paradigm of banking risk management, banks are required to stress-test internally computed risk parameters to ensure they hold an adequate amount of capital to offset the effects of downturn events. For this purpose, most contemporary stress-testing practices are limited to a one-dimensional calculation, in which endogenous risk parameters are predicted by modeling scenario-based values of exogenous parameters (macroeconomic variables).
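A one-dimensional stress test of the kind described can be sketched as a regression of an endogenous parameter on macro drivers; the history, logit link and scenario values below are made-up illustrations, not the Genpact methodology:

```python
import numpy as np

# Hypothetical history: annual default rate vs GDP growth and unemployment (%).
gdp   = np.array([2.5, 1.0, -1.8, 0.5, 3.0, 2.0, -0.5, 1.5])
unemp = np.array([5.0, 6.0,  9.0, 8.0, 5.5, 5.0,  7.5, 6.5])
dr    = np.array([1.2, 1.8,  4.5, 3.2, 1.1, 1.3,  2.9, 2.0]) / 100

# Link the logit of the default rate to the macro drivers via OLS.
X = np.column_stack([np.ones_like(gdp), gdp, unemp])
y = np.log(dr / (1 - dr))
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def stressed_dr(gdp_s, unemp_s):
    """Default rate implied by a macro scenario (inverse logit)."""
    z = beta @ np.array([1.0, gdp_s, unemp_s])
    return 1.0 / (1.0 + np.exp(-z))

base, adverse = stressed_dr(2.0, 5.5), stressed_dr(-2.0, 9.5)
print(f"baseline: {base:.2%}  adverse: {adverse:.2%}")
```

The multi-dimensional extension the abstract argues for would model the joint time-series dynamics of the endogenous and exogenous variables rather than a single conditional equation.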
Dynamic asset allocation under regime switching: an in-sample and out-of-samp... – Andrea Bartolucci
My work consists of a comparative study of the performances of the multivariate regime switching model against the single regime model in terms of portfolio returns in the context of dynamic asset allocation.
The study was conducted through the practical application, both in-sample and out of-sample, of the two models under various portfolio optimization approaches.
In the first part of the asset allocation exercise, for each asset pricing model, both in-sample and out-of-sample, I constructed two dynamic recursive efficient portfolios that maximize the Sharpe ratio among portfolios on the efficient frontier (one with an open budget constraint permitting between 0% and 100% in the riskless asset, and one whose weights must sum to 1); in addition, short selling, and thus negative asset class weights, is not allowed. The other three dynamic recursive portfolios I constructed were chosen as those that maximize the investor's utility function with three different risk aversion coefficients, subject to non-negative weights and an open upper budget constraint.
The second part of the asset allocation exercise focuses only on the out-of-sample period. Here the Copula-Opinion Pooling approach is applied to incorporate into the asset pricing model views on asset returns produced by both the single-regime model and the regime-switching model. The purpose of this section is to compare the behavior of the regime-switching model and the single-state model in the COP framework, in terms of both expected and realized portfolio returns and Sharpe ratio, in the context of mean-variance and conditional value-at-risk (CVaR) portfolio optimization. Therefore, in addition to the five recursive optimal portfolios chosen with the same portfolio selection process as in the first part, here, using conditional value-at-risk as the risk exposure constraint, I derived the dynamic optimal weights of five further portfolios equally distributed, in terms of CVaR, along the time-dependent efficient frontier for different values of the confidence in the views.
The outperformance can be achieved through the more efficient and desirable risk-reward combinations on the state-dependent frontier, which can be obtained only by systematically altering portfolio allocations in response to changes in investment opportunities as the economy switches back and forth among different states. An investor who ignores regimes sits on the unconditional frontier; an investor can thus do better by holding a higher-Sharpe-ratio portfolio when the low-volatility regime prevails. Conversely, when the bad regime occurs, the investor who ignores regimes holds too high a risky asset weight; she would have been better off shifting into the risk-free asset when the bear regime hit. As a consequence, the presence of two regimes and two frontiers means that the regime-switching investment opportunity set dominates the investment opportunity set offered by a single frontier.
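The regime-dependent reallocation argument can be illustrated with unconstrained tangency (max-Sharpe) weights per regime; the two-regime inputs below are hypothetical, and the thesis's no-short-sale and budget constraints would require a numerical optimizer rather than this closed form:

```python
import numpy as np

def tangency_weights(mu, cov, rf=0.0):
    """Unconstrained max-Sharpe (tangency) weights: w proportional to
    inv(cov) @ (mu - rf), normalized to sum to one."""
    w = np.linalg.solve(cov, mu - rf)
    return w / w.sum()

# Hypothetical monthly moments for [stocks, bonds] in two regimes.
mu_calm = np.array([0.009, 0.003])
cov_calm = np.array([[0.0010, 0.0001],
                     [0.0001, 0.0002]])
mu_bear = np.array([-0.004, 0.004])
cov_bear = np.array([[0.0060, -0.0004],
                     [-0.0004, 0.0003]])

w_calm = tangency_weights(mu_calm, cov_calm)
w_bear = tangency_weights(mu_bear, cov_bear)
print("calm regime:", np.round(w_calm, 2))
print("bear regime:", np.round(w_bear, 2))
```

With these inputs the stock weight collapses in the bear regime, which is exactly the reallocation a single-regime investor forgoes.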
In spite of the large volumes of Contingent Credit Lines (CCL) in all commercial banks, the paucity of Exposure at Default (EAD) models, the unsuitability of external data, and inconsistent internal data with partial draw-downs have been major challenges for risk managers as well as regulators in managing CCL portfolios. The current paper is an attempt to build an easy-to-implement, pragmatic and parsimonious yet accurate model to determine the exposure distribution of a CCL portfolio. Each credit line in a portfolio is modeled as a portfolio of a large number of option instruments that can be exercised by the borrower, determining the level of usage. Using an algorithm similar to the basic CreditRisk+ and Fourier transforms, we arrive at a portfolio-level probability distribution of usage. We perform a simulation experiment using data from Moody's Default Risk Service, historical draw-down rates estimated from the history of defaulted CCLs, and a current rated portfolio of such lines.
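A sketch of the CreditRisk+-style Fourier inversion the paper alludes to, with a made-up portfolio (the Poisson approximation and FFT inversion are standard; the draw probabilities and amounts are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical portfolio of 200 credit lines: draw probability p_i and
# integer drawn amount v_i (in units of a chosen exposure band).
n = 200
p = rng.uniform(0.01, 0.05, n)
v = rng.integers(1, 6, n)

# CreditRisk+-style Poisson approximation: the PGF of total usage is
# G(z) = exp(sum_i p_i * (z**v_i - 1)). Evaluating G on the unit circle
# and inverting with the FFT recovers the full distribution.
M = 1 << 11                                  # grid larger than max total usage
z = np.exp(2j * np.pi * np.arange(M) / M)
pgf = np.exp(np.sum(p[:, None] * (z[None, :] ** v[:, None] - 1.0), axis=0))
dist = np.fft.fft(pgf).real / M              # P(total usage = k)

mean = (dist * np.arange(M)).sum()
print(f"mean usage: {mean:.2f} (analytic {np.dot(p, v):.2f})")
print("99.9% quantile:", int(np.searchsorted(np.cumsum(dist), 0.999)))
```

The appeal of the transform route, as in the paper, is that the full usage distribution comes out in one pass instead of requiring Monte Carlo.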
Research on the Trading Strategy Based On Interest Rate Term Structure Change... – inventionjournals
Bond pricing errors exist universally in the bond market, and the reasons for their formation have been controversial. In this paper, in order to obtain the pricing error, the authors first estimate the term structure of interest rates in China's interbank market using the cubic spline model and the Svensson model. Then, the authors use a moving average model and a time series model to build a bond trading strategy based on the pricing error. Through simulated trading of a bond portfolio, the results show that trading on bond pricing errors can obtain about 11 basis points of annual excess return, and that this excess return is not caused by differences in bond liquidity or risk characteristics, but rather reflects the effective economic information contained in the bond pricing error.
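The Svensson curve estimation step can be sketched as a least-squares fit; the observed yields below are hypothetical, and the paper's spline model is not reproduced here:

```python
import numpy as np
from scipy.optimize import curve_fit

def svensson(t, b0, b1, b2, b3, tau1, tau2):
    """Svensson (1994) zero-coupon yield curve as a function of maturity t."""
    x1, x2 = t / tau1, t / tau2
    f1 = (1 - np.exp(-x1)) / x1
    f2 = (1 - np.exp(-x2)) / x2
    return b0 + b1 * f1 + b2 * (f1 - np.exp(-x1)) + b3 * (f2 - np.exp(-x2))

# Hypothetical observed zero yields (maturity in years, yield in percent).
t_obs = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 15, 20], dtype=float)
y_obs = np.array([5.1, 5.3, 5.6, 6.0, 6.3, 6.7, 6.9, 7.1, 7.3, 7.4])

p0 = [7.0, -2.0, 1.0, 1.0, 1.5, 8.0]            # starting guess
lb = [-15, -15, -15, -15, 0.05, 0.05]           # keep taus positive
ub = [15, 15, 15, 15, 30.0, 30.0]
params, _ = curve_fit(svensson, t_obs, y_obs, p0=p0, bounds=(lb, ub))

fit_err = np.abs(svensson(t_obs, *params) - y_obs).max()
print(f"max fitting error: {fit_err * 100:.1f} bp")
```

The pricing error for each bond is then its market price minus the price implied by discounting its cash flows off the fitted curve, which is the signal the paper's strategy trades on.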
International Journal of Engineering and Mathematical Modelling, Vol. 1, No. 1, 2015 – IJEMM
Default risk has always been a matter of importance for financial managers and scholars. In this paper we apply an intensity-based approach to default estimation with a software simulation of the Cox-Ingersoll-Ross model. We analyze the possibilities and effects of a non-linear dependence between economic and financial state variables and the default density, as specified by the theoretical model. Then we perform a test to verify how simulation techniques can improve the analysis of such complex relations when closed-form solutions are either unavailable or hard to come by.
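A minimal Monte Carlo of a CIR default intensity, in the spirit of the paper's simulation (the full-truncation Euler scheme and all parameter values are assumptions, not the paper's calibration):

```python
import numpy as np

rng = np.random.default_rng(1)

def cir_survival(lam0, kappa, theta, sigma, T, n_paths=20000, n_steps=250):
    """Monte Carlo survival probability E[exp(-integral of lambda dt)]
    for a CIR default intensity d(lam) = kappa (theta - lam) dt
    + sigma sqrt(lam) dW, discretized with full-truncation Euler."""
    dt = T / n_steps
    lam = np.full(n_paths, lam0)
    integral = np.zeros(n_paths)
    for _ in range(n_steps):
        integral += lam * dt
        shock = rng.standard_normal(n_paths)
        lam = lam + kappa * (theta - lam) * dt \
              + sigma * np.sqrt(np.maximum(lam, 0.0) * dt) * shock
        lam = np.maximum(lam, 0.0)       # truncation keeps intensity >= 0
    return np.exp(-integral).mean()

q5 = cir_survival(lam0=0.02, kappa=0.3, theta=0.03, sigma=0.1, T=5.0)
print(f"5y survival: {q5:.4f}  default prob: {1 - q5:.4f}")
```

Because CIR is affine, this survival probability also has a closed form; comparing the Monte Carlo estimate against it is one way to run the kind of verification test the paper describes.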
Medicines Verification Systems in Europe – a perspective from wholesale distr... – László Árvai
The Falsified Medicines Directive and its Delegated Regulation on Safety Features
European Medicines Verification Organisation (EMVO) and the roll-out of National Medicines Verification Systems
GIRP and wholesale distributors perspectives on medicines verification systems
Impact on the actors in the supply chain
Best Practices in the Field of Serialization and Safe Supply Chain – László Árvai
• GS1 and global standards
• ABC – Argentina, Brasil, China and other countries – what is the world doing beyond Europe?
• Serialisation – how and when?
• Visibility in the supply chain – reality or myth
• Patient Safety and the “Level below the Each”
The transparency of securities financing transactions in the EU – László Árvai
Capital Markets Union (CMU)
• EU capital markets are global
• Considerable progress has been made
• But inefficiencies, legal barriers, insufficient competition remain.
• Markets need to work better for the economy, delivering growth and jobs.
Potential career path between profiles – László Árvai
Group IT – job profiles: Architects, Business Development, Governance, Infrastructure, IT Development, IT Operations (development), IT Service Management, Management, Project Management, Security, Staff, Test Management
Agenda:
• Presentation of speaker
• Danske Bank IT – A Global Workforce
• Paradigm shifts of IT Competencies
• Career Path project vision and overview
• Job profile example
CSDR Technical Standards and Technical Advice – László Árvai
Presentation to explain ESMA's ongoing involvement in the drafting of the CSDR Level 2 requirements:
• Outline of ESMA's role in the Level 2 process
• Reference to main stakeholder feedback to the ESMA consultation papers and the related ESMA responses
• Questions
What affects an interview
• Overconfidence!
• Biases
- First impressions
- Stereotypes
- Like me/I like you
- Halo/horns
• Poor planning for interview (lack of knowledge of role and/or criteria)
• Poor decision making
• Lack of interviewing skills (questioning, listening, note taking, evaluating)
Why could this presentation be relevant for you?
If you have:
- duplications in your organisation
- unclear processes
- unclear responsibilities
Or if:
- it is not clearly defined who your internal consumers are
- you don't know what the real added-value activities are
- you do not know exactly how to measure effectiveness
CHALLENGES FOR A CRO IN A NEARLY GONE CONCERN OPERATING ENVIRONMENT – László Árvai
Banca Marche is a very typical commercial retail bank, offering a wide range of products and services. At the end of 2012 (the last official figures available), it was among the top 20 Italian banks in terms of total assets (23 bln), had a very high market share in the Marche region (25%, with 23% of deposits and 20% of branches), and held about 1% of the whole Italian market (1.0% loans, 0.8% deposits and 1.0% branches).
The post trade challenges of implementing CSDR settlement discipline: Mandato... – László Árvai
What is a buy-in, and how do they work?
What is cash compensation?
The challenges of buy-ins and cash compensation
CSDR Level 1 and mandatory buy-ins
The Level 2 ESMA Consultation Paper: the 3 options for a buy-in mechanism
The challenges with the options
Conclusion (the need to amend the Level 1 text)
CSDR Mandatory Buy-ins
1. How did we get into this mess?
• Ukraine is one of only two countries of the FSU yet to recover to its 1991 level of GDP
  – Total reform failure
  – Got democracy, but Yanukovych was a bad choice
• In the last decade Russia has been transformed into a “normal” country (albeit with a lot of problems)
100 mln clients, 6 countries, 70,000 employees
• TOP-3 best employers
• 85% of vacancies are closed by internal candidates
• TOP-10 best companies for leaders in Russia (2014)
A new European framework for resolution cases: the BRRD – László Árvai
1. The control of State aid during the financial crisis
2. A new comprehensive framework: the BRRD
3. Implementation of the BRRD and upcoming challenges
This presentation reflects the view of the author and does not express the view of the European Commission.
how to sell pi coins on Bitmart crypto exchange – DOT TECH
Yes. Pi Network coins can be exchanged, but not on the Bitmart exchange, because Pi Network is still in the enclosed mainnet. The only way pioneers are able to trade pi coins is by reselling them to verified pi merchants.
A verified merchant is someone who buys Pi Network coins and resells them to exchanges looking to hold them until the mainnet launch.
I will leave the Telegram contact of my personal pi merchant to trade with.
@Pi_vendor_247
how to sell pi coins in South Korea profitably – DOT TECH
Yes. You can sell your Pi Network coins in South Korea or any other country by finding a verified pi merchant.
What is a verified pi merchant?
Since Pi Network has not yet launched on any exchange, and no pre-sale or ICO offering has been done for pi, the only way you can sell pi coins is by selling to a verified pi merchant.
Since there is no pre-sale, the only way exchanges can get pi is by buying from miners. A pi merchant facilitates these transactions by acting as a bridge between the two sides.
How can I find a pi vendor/merchant?
For those who haven't traded with a pi merchant or don't already have one, I will leave the Telegram ID of my personal pi merchant, who I trade pi with.
Tele gram: @Pi_vendor_247
#pi #sell #nigeria #pinetwork #picoins #sellpi #Nigerian #tradepi #pinetworkcoins #sellmypi
how to sell pi coins at a high rate quickly – DOT TECH
Where can I sell my pi coins at a high rate?
Pi has not yet launched on any exchange, but one can easily sell his or her pi coins to investors who want to hold pi until the mainnet launch.
This means crypto whales want to hold pi, and you can get a good rate for selling pi to them. I will leave the Telegram contact of my personal pi vendor below.
A vendor is someone who buys from a miner and resells to a holder or crypto whale.
Here is the telegram contact of my vendor:
@Pi_vendor_247
what is the best method to sell pi coins in 2024 – DOT TECH
The safest way to sell your pi coins is through an exchange, but since pi has not launched on any exchange, the second option is through a VERIFIED pi merchant.
Who is a pi merchant?
A pi merchant is someone who buys pi coins from miners and pioneers and resells them to investors looking to hold large amounts before the mainnet launch in 2026.
I will leave the Telegram contact of my personal pi merchant to trade pi coins with.
@Pi_vendor_247
The Evolution of Non-Banking Financial Companies (NBFCs) in India: Challenges... – beulahfernandes8
Role in Financial System
• NBFCs are critical in bridging the financial inclusion gap.
• They provide specialized financial services that cater to segments often neglected by traditional banks.
Economic Impact
• NBFCs contribute significantly to India's GDP.
• They support sectors like micro, small, and medium enterprises (MSMEs), housing finance, and personal loans.
how can I use my mined pi coins? I need some funds – DOT TECH
If you are interested in selling your pi coins, I have a verified pi merchant who buys pi coins and resells them to exchanges looking to hold them until the mainnet launch.
Because the core team has announced that Pi Network will not be doing any pre-sale, the only way exchanges like Huobi, Bitmart and Hotbit can get pi is by buying from miners.
A merchant stands between these exchanges and the miners as a link to make transactions smooth, because right now, in the enclosed mainnet, you can't sell pi coins yourself; you need the help of a merchant.
I will leave the Telegram contact of my personal pi merchant below. 👇 My friends and I have traded more than 3000 pi coins with him successfully.
@Pi_vendor_247
Abhay Bhutada Leads Poonawalla Fincorp To Record Low NPA And Unprecedented Gr...Vighnesh Shashtri
Under the leadership of Abhay Bhutada, Poonawalla Fincorp has achieved record-low Non-Performing Assets (NPA) and witnessed unprecedented growth. Bhutada's strategic vision and effective management have significantly enhanced the company's financial health, showcasing a robust performance in the financial sector. This achievement underscores the company's resilience and ability to thrive in a competitive market, setting a new benchmark for operational excellence in the industry.
what is the future of Pi Network currency.DOT TECH
The future of the Pi cryptocurrency is uncertain, and its success will depend on several factors. Pi is a relatively new cryptocurrency that aims to be user-friendly and accessible to a wide audience. Here are a few key considerations for its future:
Message: @Pi_vendor_247 on telegram if u want to sell PI COINS.
1. Mainnet Launch: As of my last knowledge update in January 2022, Pi was still in the testnet phase. Its success will depend on a successful transition to a mainnet, where actual transactions can take place.
2. User Adoption: Pi's success will be closely tied to user adoption. The more users who join the network and actively participate, the stronger the ecosystem can become.
3. Utility and Use Cases: For a cryptocurrency to thrive, it must offer utility and practical use cases. The Pi team has talked about various applications, including peer-to-peer transactions, smart contracts, and more. The development and implementation of these features will be essential.
4. Regulatory Environment: The regulatory environment for cryptocurrencies is evolving globally. How Pi navigates and complies with regulations in various jurisdictions will significantly impact its future.
5. Technology Development: The Pi network must continue to develop and improve its technology, security, and scalability to compete with established cryptocurrencies.
6. Community Engagement: The Pi community plays a critical role in its future. Engaged users can help build trust and grow the network.
7. Monetization and Sustainability: The Pi team's monetization strategy, such as fees, partnerships, or other revenue sources, will affect its long-term sustainability.
It's essential to approach Pi or any new cryptocurrency with caution and conduct due diligence. Cryptocurrency investments involve risks, and potential rewards can be uncertain. The success and future of Pi will depend on the collective efforts of its team, community, and the broader cryptocurrency market dynamics. It's advisable to stay updated on Pi's development and follow any updates from the official Pi Network website or announcements from the team.
Yes of course, you can easily start mining pi network coin today and sell to legit pi vendors in the United States.
Here the telegram contact of my personal vendor.
@Pi_vendor_247
#pi network #pi coins #legit #passive income
#US
What price will pi network be listed on exchangesDOT TECH
The rate at which pi will be listed is practically unknown. But due to speculations surrounding it the predicted rate is tends to be from 30$ — 50$.
So if you are interested in selling your pi network coins at a high rate tho. Or you can't wait till the mainnet launch in 2026. You can easily trade your pi coins with a merchant.
A merchant is someone who buys pi coins from miners and resell them to Investors looking forward to hold massive quantities till mainnet launch.
I will leave the telegram contact of my personal pi vendor to trade with.
@Pi_vendor_247
when will pi network coin be available on crypto exchange.DOT TECH
There is no set date for when Pi coins will enter the market.
However, the developers are working hard to get them released as soon as possible.
Once they are available, users will be able to exchange other cryptocurrencies for Pi coins on designated exchanges.
But for now the only way to sell your pi coins is through verified pi vendor.
Here is the telegram contact of my personal pi vendor
@Pi_vendor_247
how to swap pi coins to foreign currency withdrawable.DOT TECH
As of my last update, Pi is still in the testing phase and is not tradable on any exchanges.
However, Pi Network has announced plans to launch its Testnet and Mainnet in the future, which may include listing Pi on exchanges.
The current method for selling pi coins involves exchanging them with a pi vendor who purchases pi coins for investment reasons.
If you want to sell your pi coins, reach out to a pi vendor and sell them to anyone looking to sell pi coins from any country around the globe.
Below is the contact information for my personal pi vendor.
Telegram: @Pi_vendor_247
Overview of transition matrices applications in Risk Management

• New impairment model (IFRS 9): lifetime EL (PD term structure on the basis of internal data).
• Pricing and Fair Value: continuous PD term structure on the basis of internal data.
• Stress test model: historical series of internal one-year transition matrices, allowing us to estimate a relationship with macro variables.
• Backtesting: infra-annual transition matrices on the basis of internal data.
• Maturity treatment in portfolio models: one-year internal transition matrices; PD term structure consistent with internal default rates (DRs).

Some points should be respected:
• at each bank level, the methodology should be the same for different purposes and entities;
• the proposed methodology should guarantee consistency between matrices and cumulative PDs.

Common ground: historical series of internal matrices for the different segments, and cumulative default probabilities on multi-year periods.

Methodology:
• Markov homogeneous chain / non-homogeneous chains
• Vintage analysis
• Macro linkage
What are transition matrices? What are PDs’ term structures?
Rating transition matrices show the probability of a company migrating from one rating
category to another during a certain period of time. They are based on historical data. The
rating categories can be either those used internally by the financial institution or those
produced by rating agencies such as Moody’s, S&P, or Fitch.
A transition matrix is a square matrix describing the probabilities of moving from one state
to another in a dynamic system. In each row there are the probabilities of moving, from the
state represented by that row, to the other states. Thus each row of a transition matrix adds
to one.
The term structure of default probabilities is the set of (cumulative or conditional) default
probabilities for future time periods.
Let p_t denote the probability of default during period t: PD(t−1, t) = p_t. This is called the conditional or marginal default probability, since it is the probability that the firm defaults at time t given that it has survived until t−1.

The cumulative survival probability from now until period k is:

PS(0, k) = P[survival from 0 to k] = (1 − p_1)(1 − p_2)…(1 − p_k)

The cumulative default probability until time k is the probability of defaulting at any point in time until time k: PD(0, k) = 1 − PS(0, k).
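As a minimal sketch of these relations, the cumulative default probability can be built up period by period from the marginal PDs (the marginal PDs below are illustrative numbers, not taken from the text):

```python
# Sketch: cumulative survival / default probabilities from marginal PDs.

def cumulative_default(marginal_pds):
    """PD(0,k) = 1 - prod_{t=1..k} (1 - p_t), for each horizon k."""
    out = []
    surv = 1.0
    for p in marginal_pds:
        surv *= 1.0 - p          # PS(0,k) = PS(0,k-1) * (1 - p_k)
        out.append(1.0 - surv)   # PD(0,k) = 1 - PS(0,k)
    return out

pds = cumulative_default([0.02, 0.03, 0.04])
print([round(x, 6) for x in pds])  # increasing term structure of cumulative PDs
```

Note that the cumulative PD is always non-decreasing in the horizon k, since each factor (1 − p_t) shrinks the survival probability.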
‘Well-behaved’ matrices
A ‘well-behaved’ matrix should respect some monotonicity requirements:
a) The worst rating classes should present higher default probabilities (data increasing in the
last column of the transition matrix).
b) Transition probabilities should decrease with the increase of the number of notches from
the initial rating class (decreasing probabilities from the principal diagonal to the extremes
of each row). This does not hold for the default event, which is normally more likely than a
downgrade to the worst ratings.
c) Probability of migrating towards a certain rating should be higher for the nearest classes (transition matrix columns decreasing from the principal diagonal to the extremes).

[Figure: a transition matrix with rows indexed by initial rating and columns by final rating, annotating where requirements a), b) and c) act.]
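A minimal sketch of checking the monotonicity requirements a) and b) on a transition matrix; the 3-rating-plus-default matrix below is illustrative, not taken from the text:

```python
import numpy as np

# Illustrative matrix: rows = initial rating (A, B, C), last column = default.
M = np.array([
    [0.90, 0.07, 0.02, 0.01],   # rating A
    [0.05, 0.85, 0.07, 0.03],   # rating B
    [0.02, 0.08, 0.80, 0.10],   # rating C
])

rows_sum_to_one = np.allclose(M.sum(axis=1), 1.0)

# a) default probabilities (last column) increase with worse initial rating
a_ok = np.all(np.diff(M[:, -1]) > 0)

# b) within each row, probabilities decrease moving away from the diagonal;
#    the default column is excluded, as the text notes
def row_monotone(row, i):
    left = row[:i + 1]     # from the best class up to the diagonal
    right = row[i:-1]      # from the diagonal to the last non-default class
    return np.all(np.diff(left) >= 0) and np.all(np.diff(right) <= 0)

b_ok = all(row_monotone(M[i], i) for i in range(M.shape[0]))

print(rows_sum_to_one, a_ok, b_ok)
```

The same pattern extends to requirement c) by scanning columns instead of rows.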
Markov transition matrices for different periods of time

Transition matrices can be used to calculate a transition matrix for periods other than the base one. The n-year transition matrix is in fact calculated as the n-th power of the one-year matrix. Not surprisingly, the probability of a company keeping the same credit rating over n years is much less than it is over one year, and default probabilities over n years are much higher than over one year.
Transition matrices for periods shorter than a year are not so easy to calculate. For example, estimating a six-month transition matrix involves taking the square root of the one-year matrix; estimating the three-month matrix involves taking its fourth root; and so on.
A stochastic process has the Markov
property if the conditional probability
distribution of future states of the
process depends only upon the present
state; that is, given the present, the
future does not depend on the past
(transitions are dependent only on the
values of the current state, and not on
the previous history of the system up
to that point).
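The powers and roots above can be sketched with numpy/scipy; the two-state (performing / absorbing default) matrix is illustrative, not from the text:

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

# Illustrative one-year matrix: performing / default, default absorbing.
M1 = np.array([[0.95, 0.05],
               [0.00, 1.00]])

M5 = np.linalg.matrix_power(M1, 5)          # 5-year transition matrix
M_half = fractional_matrix_power(M1, 0.5)   # 6-month transition matrix

print(round(M5[0, 1], 6))      # 5-year PD, much higher than the one-year PD
print(round(M_half[0, 1], 6))  # 6-month PD, lower than the one-year PD
```

One caveat, echoed later in the slides on the generator: a fractional matrix power of an empirical transition matrix is not guaranteed to be a valid transition matrix (entries can turn negative), so infra-annual matrices obtained this way may need regularization.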
The generator of a transition matrix: the homogeneous case

Markov approaches to the estimate of the PD term structure, under the homogeneity hypothesis (the probability to migrate from one state to another does not depend on time):

• discrete case: the k-periods transition matrix is M^k, so the default rate term structure for rating class R is read off the default column, DR_R(0, k) = [M^k]_{R,D}, k = 1, 2, 3, …
• continuous case: the transition matrix from 0 to t is exp(Qt), so DR_R(0, t) = [exp(Qt)]_{R,D}, t ≥ 0.

If, for a discrete time chain defined by a one-year migration matrix M, a generator Q can be found such that

M = exp(Q)

then the discrete time chain can be embedded into a continuous time chain.

Q can be defined as a generator if:
• q_ij ≥ 0 for i ≠ j
• q_ii ≤ 0
• Σ_{j=1}^{N} q_ij = 0 for each row i
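A minimal sketch of the continuous-time term structure from a generator Q, using scipy's matrix exponential; the 3-state generator (A, B, default absorbing) is illustrative, not from the text:

```python
import numpy as np
from scipy.linalg import expm

# Illustrative generator: rows sum to zero, off-diagonals non-negative,
# last state (default) absorbing.
Q = np.array([[-0.10,  0.08,  0.02],
              [ 0.05, -0.15,  0.10],
              [ 0.00,  0.00,  0.00]])

# Cumulative PD term structure for class A: [exp(Qt)]_{A, D}
pd_curve = [expm(Q * t)[0, 2] for t in (1, 2, 5)]
print([round(p, 4) for p in pd_curve])  # increasing in t, as default is absorbing
```

Because default is absorbing, the resulting curve is automatically monotone in t, which is exactly the consistency between matrices and cumulative PDs asked for in the overview slide.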
Some considerations on matrices time homogeneity

The homogeneity hypothesis for transition matrices can be rejected for a number of reasons:
• rating models (especially the external agencies' ones) tend not to be dynamic enough: a worsening of credit quality is generally not immediately acknowledged, but taken into account with a time lag;
• Lando et al. observe a lower upgrade probability for counterparties that have suffered a downgrade in preceding periods;
• persistence of positive and negative phases of the economic cycle.

If the generator is estimated on internal data and used to build the cumulative and forward probability curves, it is fundamental to test the homogeneity hypothesis.
The generator of a transition matrix: the non-homogeneous case

A generator that approximates well on a one-year time horizon can sometimes generate PD term structures that significantly deviate from the frequencies observed on longer time horizons. In that case the homogeneity hypothesis is left apart and a time-dependent generator is used:

Q(t) = Φ(t) Q

where Φ(t) is a diagonal matrix in which each element depends on 2 parameters. For each couple of parameters it is possible to generate a term structure of cumulated PDs by calculating the migration matrices

M(0, t) = exp(Q(t) t), t ≥ 0

and optimising the parameters in order to obtain a good fit to the observed term structures.

[Figures: example of a term structure under the homogeneity hypothesis versus the non-homogeneity hypothesis.]
C. Bluhm and L. Overbeck, “Calibration of PD Term Structures: To Be Markov Or Not To Be”, 2006
Vintage analysis

Long term default curves can be derived from observed data. Vintage curves, built from the analysis of registered defaults divided by cohort (time from origination), can trace the evolution of portfolio riskiness and highlight its changes, linked to different credit policies and economic cycle fluctuations.

Vintage curves need long historical time series, but they are more precise than the Markovian approach.

[Figure: vintage curves for a mortgage, plotting the lifetime (LT) default curve against time from origination for five risk classes, from Low Risk to High Risk.]

A refinement of vintage analysis can be realized by differentiating the analysis by risk class, in order to build the long period default curves needed for both IFRS 9 and the pricing model. Marginal default curves are typically strongly differentiated as risk increases.
AIFIRM NEWSLETTER RISK MANAGEMENT MAGAZINE year 8 n. 3-1
Vintage analysis vs. Markov approach

Markov approach:
• The Markov approach implies a mean reversion phenomenon such that, independently of the starting rating class, the long term PD tends to an average: for higher rating classes the PD increases over time.
• In the Markov approach the conditional probability of entering default at time t, being performing at time t−1, depends only on the stochastic transition process.
• For corporate portfolios, characterized by a higher proportion of revolving credits and by multiple loans to the same counterparty (which smooths the vintage effect), Markov chains seem more appropriate.

Vintage approach:
• In the vintage analysis the typical bell-shaped form is generally maintained for all rating classes.
• In the vintage approach the conditional probability of entering default at time t, being performing at time t−1, strongly depends on the credit maturity at time t−1.
• The vintage approach captures risk dynamics linked to maturity, which is particularly relevant for mortgages (and for other retail products). Mortgages, being both retail and long term, show a different behaviour according to the vintage of the loan.

[Figure: low-risk and medium-high-risk default curves against time from origination, under the vintage approach and under the Markov approach.]
New impairment model
The credit impairment process in IAS 39

[Flowchart: credits are first split by the presence of objective impairment evidence (loss event) and, for impaired credits, by the significance of the exposure; the branches lead to analytical or collective evaluation, with the possibility to evaluate by homogeneous risk classes.]
New impairment model
The 3 buckets approach

• Bucket 1: at origination, credits are evaluated on the basis of the one-year EL. The one-year EL can be treated with Basel 2 methodologies, conveniently integrated.
• Bucket 2: in case of «significant deterioration» of credit quality, we shift to the lifetime EL. The lifetime EL is the present value of the expected losses up to the end of the credit commitment; its computation requires a PD term structure.
• Bucket 3: impairment (defaulted credits): no changes.
Pricing model and full fair value
Outline of the models

Starting from known values of PD, LGD and expected return it is possible to apply the pricing model in a direct way and obtain the credit spread as an output:

PRICING MODEL (direct) — INPUT: PD, LGD, Raroc (Ke) → OUTPUT: credit spread

Vice versa, starting from known spread values (e.g. the market price of a CDS) referred to a certain rating class and assuming that LGD is known, it is possible to obtain the expected market return (implicit Raroc) by using the pricing model through reverse engineering:

PRICING MODEL (reverse) — INPUT: PD, LGD, credit spread (market value) → OUTPUT: implicit (market) Raroc
Pricing model and full fair value
PD in pricing

Given the counterparty rating, a PD term structure is needed in order to determine the Net Present Value. Two (related) concepts of PD are required:
• Cumulated default probability = PDCum(t) = probability that the counterparty defaults before time t.
• Forward default probability = PDFwd(t−1, t) = probability that the counterparty defaults before time t having survived until t−1.

The relationship between the two probabilities is described by

PDFwd(t−1, t) = (PDCum(t) − PDCum(t−1)) / (1 − PDCum(t−1))

or

PDCum(t) = PDCum(t−1) + PDFwd(t−1, t) · (1 − PDCum(t−1))

Starting from (monthly, in our case) forward PD matrices it is possible to determine yearly (and monthly) cumulated PDs.

• Monthly forward PDs are used to calculate each month's Expected Loss over the whole length of the operation.
• Yearly forward PDs are used in the Regulatory Capital formula or in the Economic Capital calculation, both having a yearly horizon.
• The monthly cumulated PD is used, via its complement the survival probability (1 − PDCum), as a corrective factor in the discounting formula used to calculate Capital and Revenues.
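The conversion between cumulated and forward PDs can be sketched as below; the cumulative PDs used in the example are illustrative numbers, not from the text:

```python
# Converting between cumulative and forward default probabilities,
# matching PDFwd(t-1, t) = (PDCum(t) - PDCum(t-1)) / (1 - PDCum(t-1)).

def forward_from_cumulative(pd_cum):
    """pd_cum[k-1] = PDCum(k); returns PDFwd(t-1, t) for t = 1..n."""
    fwd, prev = [], 0.0
    for cum in pd_cum:
        fwd.append((cum - prev) / (1.0 - prev))
        prev = cum
    return fwd

def cumulative_from_forward(pd_fwd):
    """Inverse map: PDCum(t) = PDCum(t-1) + PDFwd(t-1, t) * (1 - PDCum(t-1))."""
    cum, prev = [], 0.0
    for f in pd_fwd:
        prev = prev + f * (1.0 - prev)
        cum.append(prev)
    return cum

cum = [0.02, 0.05, 0.09]
fwd = forward_from_cumulative(cum)
# round-trip check: the two maps are inverses of each other
assert all(abs(a - b) < 1e-12 for a, b in zip(cumulative_from_forward(fwd), cum))
print([round(f, 6) for f in fwd])
```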
Pricing model and full fair value
Full fair value evaluation

The pricing model is also used to support the FULL FAIR VALUE evaluation, which is a fair measure of banking book assets without a market price. The estimate applies the following formula, including the credit risk premium as an important component of the discount rate:

FFV = Σ_{t=1}^{n} CF_t / (1 + RF_t + RP_t)^t

where:
• FFV = Full Fair Value;
• CF_t = (assumed) cash flow at time t;
• RF_t = risk-free market rate for length t;
• RP_t = CREDIT RISK PREMIUM for length t.

The pricing model allows us to determine the RP_t component of the FFV discount rate, that is the credit spread.
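The FFV formula is a straightforward discounted sum; in this sketch the cash flows, risk-free rates and credit risk premia are illustrative numbers, not from the text:

```python
# FFV = sum_t CF_t / (1 + RF_t + RP_t)^t

def full_fair_value(cash_flows, risk_free, risk_premium):
    """Discount each cash flow at the risk-free rate plus the credit spread."""
    return sum(
        cf / (1.0 + rf + rp) ** t
        for t, (cf, rf, rp) in enumerate(
            zip(cash_flows, risk_free, risk_premium), start=1
        )
    )

ffv = full_fair_value(
    cash_flows=[5.0, 5.0, 105.0],        # coupon, coupon, coupon + principal
    risk_free=[0.01, 0.012, 0.015],      # RF_t
    risk_premium=[0.008, 0.009, 0.010],  # RP_t (credit spread from the pricing model)
)
print(round(ffv, 4))
```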
Pricing model and full fair value
A view on pricing model in ISP
Corporate and Financial Standard & Poor’s matrices, calculated on 1980-2013 data, were used for the
estimate.
Matrices were adjusted to eliminate the unrated column (i.e. credits that were set out of the ratings
sample, e.g. for mergers, debt repayment, etc.).
The transition matrix logarithm was calculated and then a regularization algorithm was applied, imposing that negative elements off the diagonal are set to 0 and that each diagonal element equals minus the sum of the non-diagonal elements of its row.
Parameter optimization was done so that the default column of multi-year transition matrices could
approximate the empirical term structure of S&P’s default rates.
Internal PDs term structure was calculated through an interpolation among term structures referred to
different S&Ps ratings.
The generator calculated as the logarithm of an empirical matrix poses some problems:
• existence: there are no general conditions assuring that a generator exists and transition matrices
typically have properties precluding its existence (e.g. the presence of elements equal to zero)
• uniqueness: more than one generator can originate the same transition matrix
The matrix obtained as the logarithm of the starting transition matrix needs to be ‘regularized’ so that it
respects the structure a generator should have (sum of each row elements equal to zero and non-negative
out of diagonal numbers): some methods exist in practice to satisfy this purpose.
The same problems of existence and uniqueness arise in the discrete case if we raise the matrix to a power less than one in order to obtain infra-annual matrices.
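The regularization step described above can be sketched as follows; the input transition matrix is illustrative, not the S&P matrix used in the text:

```python
import numpy as np
from scipy.linalg import logm, expm

def regularized_generator(M):
    """Matrix log of an annual transition matrix, regularized so it
    satisfies the generator structure: non-negative off-diagonal
    elements, each row summing to zero."""
    Q = np.real(logm(M))
    off = ~np.eye(len(M), dtype=bool)
    Q[off] = np.maximum(Q[off], 0.0)        # zero out negative off-diagonals
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))     # diagonal = minus row sum
    return Q

# Illustrative annual matrix (2 ratings + absorbing default).
M = np.array([[0.92, 0.06, 0.02],
              [0.04, 0.90, 0.06],
              [0.00, 0.00, 1.00]])

Q = regularized_generator(M)
print(np.allclose(Q.sum(axis=1), 0.0))   # generator row sums
print(np.max(np.abs(expm(Q) - M)))       # approximation error after regularization
```

The truncation introduces a (usually small) discrepancy between exp(Q) and the original matrix, which is why the parameters are then optimized against the empirical default rate term structure, as the slide describes.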
Stress test
The adopted approach

[Diagram: macro variables feed an econometric credit risk model that forecasts default rates (DR); a PD model then translates the default rates into PDs through the migration matrices.]

The model consists in the estimate of the relationship between the system default rates (divided by sector) and the macro variables, through satellite models:

TD_{t,s} = f(MacroVariables)

The credit quality change (PD) is determined by establishing a link between the internal one-year migration matrix (differentiated by sector) and the change in the default rates subject to the stress:

PD_{t,s} = f(TD_{t,s})

In the stress model punctual internal transition matrices are used, the latest available for the different segments.
The starting point is the observed matrix (for the segments Corporate, Mortgage, SME retail) M_{t−1,t}. The matrix is transformed into M^S_{t,t+1}, guaranteeing that the annual change in the portfolio default rate is equal to the system default rate change (forecast by the econometric model).

[Diagram: the departure matrix M_{t−1,t}, with rating classes A–F plus default, is shocked by a change in a risk factor common to all rating classes, producing the arrival matrix M^S_{t,t+1}.]

The shock parameter k is optimized in order to satisfy the condition

DR^p_{t+1} / DR^p_t = 1 + Δ%DR^S_{t+1}
Stress test
Transmission of Default Rates to Default Probabilities
The matrix stress is endogenous to the model and depends only indirectly on the macroeconomic variables: the matrix moves on the basis of a single underlying factor, chosen so that the percentage change in the portfolio default rate equals the one in the stress scenario.
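The calibration of the common factor k can be sketched as below. The slides do not spell out the shock mechanism, so as a stand-in assumption the shock here simply multiplies each default probability by k and rebalances the diagonal so rows still sum to one; the matrix and portfolio weights are illustrative:

```python
import numpy as np

def stressed_matrix(M, k):
    """Assumed shock: scale the default column by k, rebalance the diagonal."""
    S = M.copy()
    S[:-1, -1] = np.minimum(k * M[:-1, -1], 1.0)
    diag = np.arange(len(M) - 1)
    S[diag, diag] += M[:-1, -1] - S[:-1, -1]   # keep rows summing to one
    return S

def portfolio_dr(M, weights):
    return float(weights @ M[:-1, -1])

def calibrate_k(M, weights, target_ratio, lo=0.1, hi=10.0):
    """Bisect k until DR(t+1)/DR(t) matches the stress scenario ratio."""
    base = portfolio_dr(M, weights)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        ratio = portfolio_dr(stressed_matrix(M, mid), weights) / base
        lo, hi = (mid, hi) if ratio < target_ratio else (lo, mid)
    return 0.5 * (lo + hi)

M = np.array([[0.93, 0.05, 0.02],
              [0.06, 0.86, 0.08],
              [0.00, 0.00, 1.00]])
w = np.array([0.7, 0.3])                       # illustrative portfolio weights
k = calibrate_k(M, w, target_ratio=1.25)       # stress: DR up 25%
print(round(k, 4))
```

With this simple proportional shock the calibration is trivial (k equals the target ratio); a real implementation would shock a latent risk factor and re-derive all migration probabilities, making the bisection genuinely necessary.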
Backtesting
How matrices can be used

Continuous transition matrices are useful to study rare events. If we use a Markov chain in continuous time, it is possible to catch, inside an established time window (e.g. one year), the existence of indirect defaults reached through a downgrade sequence (yielding positive default rates also for investment grade classes).

[Figure: migration paths over a one-year window, with defaults reached through intermediate downgrades; six-month PDs are compared with one-year PDs.]

Using an infra-annual migration matrix allows us to correctly check the number of defaults which originate in each rating class.
Maturity
Its treatment in Economic Capital (ISP case)

In order to determine the component of economic capital which is due to the longer-than-one-year maturity, in Intesa Sanpaolo:
• Migrations among internal rating classes were used.
• A smoothing procedure was applied, in order to guarantee the second requirement of a 'well-behaved' matrix, which is not respected by internal matrices. The third requirement, also not respected, was instead bypassed.
• To approximate migrations (as a function of the notching distance from the main diagonal) an exponential function was used. Migration probabilities were then obtained through minimization of the squared error with respect to the empirical matrix.
• In the process of smoothing the matrix the following constraints were imposed:
  – given an initial rating, the probability to remain in the same class at period end, the upgrade probability and the downgrade probability were preserved;
  – default probabilities (set equal to the Master scale ones) were not modified.
• On the basis of the one-year matrices, cumulative default probabilities were calculated through the application of a homogeneous Markov process. There is therefore no guarantee that these probabilities correctly approximate the cumulative default rates.
A note on internal availability of data

For the various applications we would need:
• a historical time series of internal migration matrices, divided by segment;
• cumulative default rates on multi-year periods, divided by segment.

The available historical time series are relatively short, as:
• in the specific Intesa Sanpaolo case the merger caused a discontinuity in both rating models and default definition (which is quite common in the banking context);
• the change of the regulatory default definition also caused a break;
• the frequent (as it should be) revision or update of rating models makes comparison over time difficult. This can be more or less evident for the different rating segments.
Some further work on matrices and term structures

The work on transition matrices is in progress. In this process some points deserve attention:
• methods should be found to reasonably combine internal and external data, allowing us to use the maximum available information without neglecting personalisation; this is useful also for the retail segment, which in principle should be richer in data (but rating model changes and perimeter changes can pose comparability problems);
• as the time horizon is generally quite long, it is important to forecast default rates precisely, in order to avoid mistakes that can become relevant over a broad period.