Shelly Biggs, Senior Business Consultant—Risk Management,
Teradata; Alex Depaoli, Sales Support Engineer, Fuzzy Logix;
SVP of CCAR for a large bank
7.16 EB9456 FINANCIAL SERVICES
Allowance for Loan and Lease Losses:
Organizing Data & Analytics for Success
Beyond Data Aggregation & Compliance Reporting
BANKING ON INFORMATION
The biggest challenge banks face in computing ALLL is
gathering data from disparate sources or spreadsheets
through manually-intensive, error-prone, and time-
consuming processes. All too often, ALLL production
demands an excessive effort from bank staff—requiring
80-hour work weeks.
Teradata and Fuzzy Logix can help banks greatly
improve the efficiency, accuracy, and regulatory
compliance of their methodology. By organizing the
bank’s ALLL data in a central repository with embedded
analytics, they can enhance the quality, consistency, and
timeliness of their calculations.
ALLL Defined
Expected Losses = Exposure at Default x
Probability of Default x Loss Given Default
The Allowance for Loan and Lease Losses (ALLL)
consists of two pieces: calculated components and
judgmental components. Many institutions use their ALLL
models in business-as-usual planning as well as for DFAST/
CCAR/BCAR and BCBS 239 requirements. IFRS9
compliance places additional requirements on
loss recognition.
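The expected-loss identity above can be written directly as code. This is a minimal sketch with invented figures, not values from the paper:

```python
# Expected loss for a single exposure, per the identity above:
# EL = Exposure at Default x Probability of Default x Loss Given Default.

def expected_loss(ead: float, pd_: float, lgd: float) -> float:
    """Expected loss in currency units for one exposure."""
    return ead * pd_ * lgd

# A $1,000,000 loan with a 2% one-year probability of default
# and a 40% loss given default carries $8,000 of expected loss.
print(expected_loss(1_000_000, 0.02, 0.40))  # 8000.0
```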
Executive Summary
The Allowance for Loan and Lease Losses (ALLL) is one of the most critical line items on a bank’s financial
statement—and a focus of intense regulatory scrutiny. Examiners seek to ensure that a bank’s ALLL method-
ology is compliant with a number of key regulatory provisions, including:
• Staying abreast of ALLL regulatory changes and their impact on the reserve
• Evaluating impaired loans according to ASC 310-10-35 (FAS 114) and applying the correct impairment analysis
• Applying collateral value, present value of cash flows, or loan pricing methods
• Ensuring appropriate ASC 450-20 (FAS 5) methodology, with accurate loss rates and qualitative factor
adjustments for each pool
• Ensuring compliance with IFRS9 loss recognition
• Providing adequate documentation, and satisfying ASU 2010-20 disclosure reporting requirements
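The ASC 450-20 pooled-reserve idea in the bullets above can be sketched numerically. Pool names, balances, loss rates, and qualitative adjustments below are invented for illustration:

```python
# Illustrative ASC 450-20 (FAS 5) pool reserve: a historical loss rate
# per pool plus a judgmental qualitative-factor adjustment, applied to
# the pool balance. All figures are hypothetical.

def pool_reserve(balance: float, hist_loss_rate: float, q_factor: float) -> float:
    """Reserve = balance x (historical loss rate + qualitative adjustment)."""
    return balance * (hist_loss_rate + q_factor)

pools = {
    "commercial_re": (50_000_000, 0.0120, 0.0030),  # balance, loss rate, Q-factor
    "consumer":      (20_000_000, 0.0250, 0.0010),
}
total = sum(pool_reserve(*p) for p in pools.values())
print(round(total, 2))  # 50M x 1.5% + 20M x 2.6% = 1,270,000
```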
Scope of Data Required
Original transaction data at the atomic loan level, along
with slowly changing data (e.g., payment history and risk
ratings) across all portfolios, must be used for regulatory
ALLL calculations. They must also be used as the
foundation for other regulatory calculations, including
BCBS 239 and Comprehensive Capital Analysis and Review
(CCAR). Unfortunately, this data is usually distributed
across multiple databases and divisions within the bank,
making it difficult to integrate for timely calculations.
Additionally, reconciliation with the General Ledger is
difficult since it provides only aggregated views, and is
rarely able to trace back to full data detail at the source.
The regulatory burden of data quality and data
governance on this foundational regulatory and
transactional data is high, because high-quality models
depend on it. Once data quality is sufficient, models can
be developed that stack upon results from other models.
Judgment factors can also be applied with reasonable
confidence in capital adequacy, and in the consequent
government approvals.
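The General Ledger reconciliation problem described above can be reduced to a simple check: does the sum of atomic loan records tie out to each aggregated GL balance? A minimal sketch, with hypothetical account codes and amounts:

```python
# Sketch of GL-to-detail reconciliation: the General Ledger holds only
# aggregated balances, so the check is whether loan-level records sum
# to the GL figure per account. Accounts and amounts are invented.
from collections import defaultdict

gl_balances = {"1100-CRE": 300_000.0, "1200-CONSUMER": 150_000.0}

loans = [  # atomic loan-level records: (GL account, balance)
    ("1100-CRE", 180_000.0), ("1100-CRE", 120_000.0),
    ("1200-CONSUMER", 90_000.0), ("1200-CONSUMER", 59_000.0),
]

def reconcile(gl, detail, tolerance=0.01):
    """Return per-account differences (GL minus detail sum) above tolerance."""
    sums = defaultdict(float)
    for acct, bal in detail:
        sums[acct] += bal
    return {a: gl[a] - sums[a] for a in gl if abs(gl[a] - sums[a]) > tolerance}

print(reconcile(gl_balances, loans))  # {'1200-CONSUMER': 1000.0}
```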
Scope of Models Required
Given the present portfolio, ALLL models must reflect the
expected scenario over the next one-year horizon.
New requirements under the principles-based IFRS9
intensify the requirement for strong internal data for all
components of the balance sheet. Loss forecasting
will require more complex assessment of cash-flow
calculations, asset-class categorization, balances, and loss
measurements (including lifetime expected credit losses).
For use in Dodd Frank Annual Stress Testing (DFAST)
and CCAR, regulators provide scenarios with multiple key
economic variables to reflect the implications of various
risk factors on capital security. Bankers calculate the
impact of the variables on key metrics, such as starting
balances and exposure on revolving credit. ALLL models
must perform under multiple scenarios, including multiple
baseline and stress scenarios.
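The multi-scenario mechanics described above can be illustrated with a toy model. The linear sensitivity of the default rate to unemployment, and all numbers, are assumptions for the sketch, not a supervisory model:

```python
# Toy scenario runner: the same loss model evaluated under several
# supervisory-style scenarios keyed by an economic variable
# (unemployment). The sensitivity factor beta is invented.

def loss_forecast(balance, base_pd, lgd, unemployment, beta=0.5):
    """PD scales with unemployment via an assumed linear stress factor."""
    stressed_pd = base_pd * (1 + beta * (unemployment - 0.05) / 0.05)
    return balance * stressed_pd * lgd

scenarios = {"baseline": 0.05, "adverse": 0.08, "severely_adverse": 0.10}
results = {name: loss_forecast(1_000_000, 0.02, 0.40, u)
           for name, u in scenarios.items()}
for name, loss in results.items():
    print(f"{name}: {loss:,.0f}")
```

Running all scenarios through one parameterized function, rather than separate spreadsheets, is what makes the ad hoc reruns mentioned above cheap.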
To meet the many new regulatory requirements, ALLL
model development requires a long history of reliable
data—and the necessary variables for loss estimation and
loss drivers that are responsive to economic scenarios.
Additionally, a nimble infrastructure is required for quick
updates of loss forecasting to address ad hoc requests
from management or regulators, or to quickly reassess
loss forecasts due to economic or market shifts. The
process may include the following steps:
• Identify model development data requirements and
available data sources
• Assess data sources and potential uses in model
development
• Model development includes several key steps that
are dependent on having sufficient, complete, and
accurate data—as well as reliable and efficient model
development platforms:
–	 Identification of appropriate model segmentation
(e.g., along lines of asset classes) that is supported
by the historical data, statistical analysis, and
business segments
–	 Definition and calculation of the outcome to be
measured (e.g., lifetime losses) on historical data,
and where proxies are required
–	 Model methodology selection, such as the type of
regression, simulation approaches, use/combination
of multiple models, and use of expert judgment
–	 Gathering of reliable candidate variables by class,
such as customer characteristics/performance,
portfolio characteristics, and economic, sector, or
market drivers
Data Aggregation Process and
System Constraints
Data—when aggregated and reconciled to the
General Ledger, and specific to account level—can
be useful for many business purposes. Marketing,
regulatory reporting, profiling risk by portfolio,
CCAR/DFAST/BCAR, and most other business
functions benefit from this reconciled, consistent,
and detailed data. This improves regulatory
response time, and satisfies regulator requirements
for consistent process and data quality
(BCBS 239).
–	 Variables selected among classes of variables,
including statistical selection procedures and
business review (often an iterative process)
–	 Model fit / final model design within selected
methodology, variable selection, and additional
analyses deployed for the model design process
–	 Model statistical testing for key model performance
requirements
–	 Model sensitivity analysis, outcomes analysis / back-
testing, and benchmarking of final model
–	 Final model documentation of all data attributes,
decisions, and supporting analysis
• Independent model validation, including conceptual
soundness review, data review, statistical
testing, sensitivity analysis, outcomes analysis,
and benchmarking. For independent validation,
methodology review and benchmark models are
critical to challenge the model design choices made by
the developer
• Model implementation and production, including
accurate and efficient data delivery and coding. There
are increasing requirements for:
–	 Fast implementation of model enhancements and/
or updates
–	 Movement, merging, and transforming of multiple
large sets of data
–	 Fast production of results with strong control
systems to ensure reliability of results
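One of the control systems the last bullet calls for can be as simple as reconciling record counts and totals across a data hand-off, failing fast when anything is dropped. Loan IDs and balances below are made up:

```python
# Minimal production control check: verify that moving/merging data
# preserves record counts and total balances across a hand-off.

def control_totals(records):
    """(row count, summed balance) for a list of (loan_id, balance) rows."""
    return len(records), sum(bal for _, bal in records)

source = [("L1", 100.0), ("L2", 250.0), ("L3", 75.0)]
loaded = [("L1", 100.0), ("L2", 250.0), ("L3", 75.0)]  # post-load copy

src_n, src_total = control_totals(source)
tgt_n, tgt_total = control_totals(loaded)
assert src_n == tgt_n and abs(src_total - tgt_total) < 0.01, "load failed controls"
print("controls passed:", src_n, "rows,", src_total, "total")
```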
How it Works Today
Compiling a sufficient, complete, and accurate dataset
is a fundamental problem for model development.
There are often many distributed datasets with varying
accuracy and completeness of available variables
and history.
Organizationally, there may be many locally-maintained
databases aggregating the same source data in
different manners. Additionally, different groups
or divisions may be conducting slightly different
calculations of losses—often due to different business
uses, different interpretations of the requirements,
or different uses of judgment in the final estimates/
calculations. These varied local databases and
calculations are not easily integrated or reconciled,
making the presentation of a final “correct” answer
(or a single version of the truth) challenging or even
impossible under the current architecture.
Vision for a Streamlined ALLL Process
The ideal state is the integration of data in a dynamic
manner from source systems that make it easily
accessible for real-time calculations—while supporting
enhancements to calculations and models, as needed.
This provides loss models and forecasting that
are based on well-defined business and regulatory
requirements; can be quickly adapted, redeveloped, or
enhanced in response to changing requirements or
business needs (on efficient, controlled, and reliable
platforms); can be quickly implemented and aggregated
into production systems; and produce reliable results
with limited overhead.
Data modeling, forecasting, and reporting need to be nimble
with respect to changes in a wide variety of regulatory,
business, market, and economic factors, including:
• Lending policies and procedures, including changes
in underwriting standards and collection, charge-off,
and recovery practices not considered elsewhere in
estimating credit losses
• International, national, regional, and local economic
and business conditions and developments that
affect cash flows and the collectability of a portfolio,
including the condition of various market segments
• Nature and volume of the portfolio, the terms of
loans, and the distribution across different segments
and asset classes, which can be increasingly
complex under IFRS9
• Experience, ability, and depth of lending management
and other relevant staff—including their ability to
define and refine portfolio strategies and apply
expert judgment
• Volume and severity of past due loans, the volume
of nonaccrual loans, and the volume and severity of
adversely classified or graded loans
• Quality of the institution’s loan review system, rating
philosophy, and ratings migration observed historically
and currently
• Value of underlying collateral for collateral-dependent
loans, and any changes in rules that impact fair value
recognition under different scenarios or unexpected
economic or market changes
• Existence and effect of any concentrations of credit,
and changes in the level of such concentrations
• Effect of other external factors—such as competition
and economic, industry, legal, and regulatory
requirements—on the level of estimated credit losses
in the institution’s existing portfolio
Simulating results under multiple, substantially
different economic scenarios is often inefficient and
Hypothetical Bank Risk Rating Scheme
Figure 1. For illustration purposes only.
Bank Rating | Regulatory Classification | Rating Agency Equivalent | Expected Default Rate (bps)
1  | Pass            | Aaa | 0
2  | Pass            | Aa  | 2
3  | Pass            | A   | 2
4  | Pass            | Baa | 12
5  | Pass            | Ba  | 72
6  | Pass            | B   | 304
7  | Special Mention | Caa | 1335
8  | Sub-Standard    | Caa | 1335
9  | Doubtful        | Caa | 1335
10 | Loss            | Caa | 1335
prone to error, given the many manual steps often
involved in the process. When economic drivers feed
simulations, Fuzzy Logix provides the flexibility to
incorporate parameters, starting values, and adjustments
into Monte Carlo simulations that help refine human
judgment with a calculated spectrum of possible and
likely outcomes.
For example, internal ratings are developed from
internally developed models for risk ratings. These risk
ratings are correlated to reference-model Probability of
Default (PD) ratings from agencies such as Moody’s, S&P,
and Fitch.
Figure 1 illustrates a risk rating scale, how it converts to
the corresponding regulatory classification, and how it
evolves to Moody’s and other ratings through efficient
simulations based on multiple scenarios.
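The simulation idea can be sketched in a few lines: draw defaults for a small portfolio using the expected default rates (in basis points) from the hypothetical rating scheme in Figure 1. The loan balances, LGD, and trial count here are invented for illustration:

```python
# Monte Carlo sketch: simulate portfolio defaults using the expected
# default rates (bps) from the hypothetical Figure 1 rating scheme.
# Balances, LGD, and trial count are assumptions for the example.
import random

DEFAULT_RATE_BPS = {1: 0, 2: 2, 3: 2, 4: 12, 5: 72, 6: 304, 7: 1335}

portfolio = [(4, 1_000_000), (5, 500_000), (6, 250_000)]  # (rating, balance)

def simulate_losses(portfolio, lgd=0.4, trials=20_000, seed=7):
    """Mean simulated loss across independent default trials."""
    rng = random.Random(seed)  # fixed seed for a repeatable sketch
    total = 0.0
    for _ in range(trials):
        total += sum(bal * lgd
                     for rating, bal in portfolio
                     if rng.random() < DEFAULT_RATE_BPS[rating] / 10_000)
    return total / trials

print(round(simulate_losses(portfolio), 2))
```

The mean should settle near the analytic expected loss (about 4,960 for these inputs); the value of the simulation is the full spectrum of outcomes around it, which is what informs the judgmental overlay.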
For ALLL and loss forecasting, it is imperative to lock
down an end-to-end sequence of data feeds and
calculations for transparent, consistent, and validated
results. This allows banks to include historical and
current-state information in the required demonstrable
processes, while supporting consistent and timely insight
to assist in re-calibrating quantitative and judgmental
factors—and verifying qualitative factors with real-time
performance metrics. See Figure 2.
Fuzzy Logix Capabilities
Fuzzy Logix DBLytix® software is a library of analytic
functions that are installed and run in the Teradata
platform, leveraging the massively parallel processing
architecture of Teradata to deliver unsurpassed
processing times for analytics. DBLytix is invoked within
the SQL environment, making it accessible to the
organization’s business analysts who are familiar with the
world’s most popular business querying language.
Embedding analytics with the data in one environment
means that time and expense aren’t wasted on moving
data to specialized computing environments. Data
governance and security are simplified, and time isn’t
spent trying to aggregate modeling components
developed in remote environments. Layered modeling
tasks like ALLL and CCAR are performed under one
roof, meaning that downstream dependencies can
run just as soon as upstream results are computed,
eliminating the wait time for hand-offs. Models and
algorithms can be hard-coded into the Teradata
appliance to ensure that querying and analysis are
performed in a seamless process—and with guaranteed
consistency and accuracy.
Proposed Teradata Calibration Tool for
Judgmental Portion of ALLL
Discretionary or Judgmental Attributes
1. Changes in lending policies and procedures,
including changes in underwriting standards and
collection, charge-off, and recovery practices not
considered elsewhere in estimating credit losses.
2. Changes in international, national, regional, and
local economic and business conditions and
developments that affect the collectability of
the portfolio, including the condition of various
market segments.
3. Changes in the nature and volume of the
portfolio and in the terms of loans.
4. Changes in the experience, ability, and depth of
lending management and other relevant staff.
5. Changes in the volume and severity of past due
loans, the volume of nonaccrual loans, and the
volume and severity of adversely classified or
graded loans.
6. Changes in the quality of the institution’s loan
review system.
7. Changes in the value of underlying collateral for
collateral-dependent loans.
8. The existence and effect of any concentrations
of credit, and changes in the level of such
concentrations.
9. The effect of other external factors such
as competition and legal and regulatory
requirements on the level of estimated credit
losses in the institution’s existing portfolio.
Figure 2
More efficient processes mean that multiple scenarios can
be explored in the same amount of time it would take to
run just one, deepening insights and resulting in improved
final answers.
Conclusion
Teradata and Fuzzy Logix offer the tools needed to
rationalize the disparate parts of the ALLL process into
a streamlined whole, greatly improving consistency,
timeliness, and accuracy. By integrating data and
analytics into a central repository, banks have an
unmatched ability to ensure the quality, transparency,
and control of the process—pleasing regulators and
making the ALLL process more manageable for
everyone involved. Shortened process times give banks
greater flexibility to analyze emerging risks, and provide
a much needed edge for gaining regulatory approval for
annual capital plans.
Together, Teradata and Fuzzy Logix can help with
detailed solution development—either onsite or analytics-
as-a-service—including data integration, calculation
algorithms, and process management. Data movement
can be expensive and slow, and introduce error. Teradata
and Fuzzy Logix can help your organization with more
efficient and consistent data, analytics, and processes for
ALLL and related CCAR requirements.
To learn more about how your organization can leverage
a sophisticated analytics platform to uncover valuable
business insights—and simplify compliance with
regulatory demands—visit teradata.com/contact-us.
10000 Innovation Drive, Dayton, OH 45342    Teradata.com
Teradata and the Teradata logo are registered trademarks of Teradata Corporation and/or its affiliates in the U.S. and worldwide. Teradata continually improves products as
new technologies and components become available. Teradata, therefore, reserves the right to change specifications without prior notice. All features, functions and operations
described herein may not be marketed in all parts of the world. Consult your Teradata representative or Teradata.com for more information.
Copyright © 2016 by Teradata Corporation    All Rights Reserved.    Produced in U.S.A.
7.16 EB9456
About the Authors
Shelly Biggs, Senior Business Consultant—
Risk Management, Teradata
Shelly has over three decades of experience in the
financial services industry, with emphasis in Enterprise
Risk Management, including risk rating processes;
underwriting and loan review; KPI reporting; credit risk &
portfolio monitoring; data integration and decision
sciences; regulatory reporting and analytics; and
allowance for loan losses, CCAR, and Dodd Frank
Stress Testing.
Alex Depaoli, Sales Support Engineer, Fuzzy Logix
Alex has over two decades of experience in business
analytics spanning marketing, credit strategy
development, risk scorecard development, loss
forecasting, and capital markets arbitrage, with a focus
on financial services.

More Related Content

What's hot

James hall ch 13
James hall ch 13James hall ch 13
James hall ch 13
David Julian
 
Mis management information system
Mis   management information systemMis   management information system
Mis management information system
Monikadubey30
 
Regulatory and accounting compliance
Regulatory and accounting complianceRegulatory and accounting compliance
Regulatory and accounting compliance
Altan Atabarut, MSc.
 
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239
Greg Soulsby
 
CRIF IFRS9 Solution- Not just for your CFO
CRIF IFRS9 Solution- Not just for your CFOCRIF IFRS9 Solution- Not just for your CFO
CRIF IFRS9 Solution- Not just for your CFO
Lawrence Billson
 
Business Intelligence System and instrumental level multi dimensional database
Business Intelligence System and instrumental level multi dimensional database Business Intelligence System and instrumental level multi dimensional database
Business Intelligence System and instrumental level multi dimensional database
Rolta
 
Industrializing Zero Application Maintenance
Industrializing Zero Application MaintenanceIndustrializing Zero Application Maintenance
Industrializing Zero Application Maintenance
Cognizant
 
Creating a Unified Data Strategy for Risk-Adjusted Payments
Creating a Unified Data Strategy for Risk-Adjusted PaymentsCreating a Unified Data Strategy for Risk-Adjusted Payments
Creating a Unified Data Strategy for Risk-Adjusted Payments
Cognizant
 
Optimizing IT Operations with Natural Language Processing
Optimizing IT Operations with Natural Language ProcessingOptimizing IT Operations with Natural Language Processing
Optimizing IT Operations with Natural Language Processing
Cognizant
 
Data governance guide
Data governance guideData governance guide
Data governance guide
CenapSerdarolu
 
Independent models validation and automation
Independent models validation and automationIndependent models validation and automation
Independent models validation and automation
Sohail_farooq
 
Asset Information: Addressing 21st Century Challenges
Asset Information: Addressing 21st Century ChallengesAsset Information: Addressing 21st Century Challenges
Asset Information: Addressing 21st Century Challenges
Cognizant
 
Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...
Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...
Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...
Cognizant
 
Meradia investment performance_systems
Meradia investment performance_systemsMeradia investment performance_systems
Meradia investment performance_systems
Meradia Group
 
Government Financial Management System Of Tomorrow
Government Financial Management System Of TomorrowGovernment Financial Management System Of Tomorrow
Government Financial Management System Of Tomorrow
FreeBalance
 
James hall ch 2
James hall ch 2James hall ch 2
James hall ch 2
David Julian
 
Vivek cv
Vivek cvVivek cv
Vivek cv
Vivek Cholera
 

What's hot (20)

James hall ch 13
James hall ch 13James hall ch 13
James hall ch 13
 
Mis management information system
Mis   management information systemMis   management information system
Mis management information system
 
Regulatory and accounting compliance
Regulatory and accounting complianceRegulatory and accounting compliance
Regulatory and accounting compliance
 
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239
 
CRIF IFRS9 Solution- Not just for your CFO
CRIF IFRS9 Solution- Not just for your CFOCRIF IFRS9 Solution- Not just for your CFO
CRIF IFRS9 Solution- Not just for your CFO
 
Business Intelligence System and instrumental level multi dimensional database
Business Intelligence System and instrumental level multi dimensional database Business Intelligence System and instrumental level multi dimensional database
Business Intelligence System and instrumental level multi dimensional database
 
Industrializing Zero Application Maintenance
Industrializing Zero Application MaintenanceIndustrializing Zero Application Maintenance
Industrializing Zero Application Maintenance
 
Creating a Unified Data Strategy for Risk-Adjusted Payments
Creating a Unified Data Strategy for Risk-Adjusted PaymentsCreating a Unified Data Strategy for Risk-Adjusted Payments
Creating a Unified Data Strategy for Risk-Adjusted Payments
 
Optimizing IT Operations with Natural Language Processing
Optimizing IT Operations with Natural Language ProcessingOptimizing IT Operations with Natural Language Processing
Optimizing IT Operations with Natural Language Processing
 
Data governance guide
Data governance guideData governance guide
Data governance guide
 
Information System Plan
Information System PlanInformation System Plan
Information System Plan
 
Independent models validation and automation
Independent models validation and automationIndependent models validation and automation
Independent models validation and automation
 
Asset Information: Addressing 21st Century Challenges
Asset Information: Addressing 21st Century ChallengesAsset Information: Addressing 21st Century Challenges
Asset Information: Addressing 21st Century Challenges
 
Information System Plan
Information System PlanInformation System Plan
Information System Plan
 
Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...
Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...
Using Predictive Analytics to Optimize Asset Maintenance in the Utilities Ind...
 
Meradia investment performance_systems
Meradia investment performance_systemsMeradia investment performance_systems
Meradia investment performance_systems
 
Government Financial Management System Of Tomorrow
Government Financial Management System Of TomorrowGovernment Financial Management System Of Tomorrow
Government Financial Management System Of Tomorrow
 
Chapter 02
Chapter 02Chapter 02
Chapter 02
 
James hall ch 2
James hall ch 2James hall ch 2
James hall ch 2
 
Vivek cv
Vivek cvVivek cv
Vivek cv
 

Viewers also liked (7)

P7 internal audit
P7 internal auditP7 internal audit
P7 internal audit
 
Outsourcing
OutsourcingOutsourcing
Outsourcing
 
The MMS quiz prelims
The MMS quiz prelimsThe MMS quiz prelims
The MMS quiz prelims
 
P7 reports to management
P7 reports to managementP7 reports to management
P7 reports to management
 
The mms quiz prelims answers
The mms quiz prelims  answersThe mms quiz prelims  answers
The mms quiz prelims answers
 
Audit p7 completion
Audit p7 completionAudit p7 completion
Audit p7 completion
 
P7 forensic auditing
P7 forensic auditingP7 forensic auditing
P7 forensic auditing
 

Similar to ALLL FZL and TD

eBook Spreadsheet to WebAPP
eBook Spreadsheet to WebAPPeBook Spreadsheet to WebAPP
eBook Spreadsheet to WebAPP
Abhishek Ranjan
 
Financial Modeling
Financial ModelingFinancial Modeling
Financial Modeling
Abhishek Ranjan
 
Considerations for an Effective Internal Model Method Implementation
Considerations for an Effective Internal Model Method ImplementationConsiderations for an Effective Internal Model Method Implementation
Considerations for an Effective Internal Model Method Implementation
accenture
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
JAMES OKARIMIA
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
JAMES OKARIMIA
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
JAMES OKARIMIA
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
JAMES OKARIMIA
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
JAMES OKARIMIA
 
dt_mt_SREP_Pub_BCBS239_200216lo
dt_mt_SREP_Pub_BCBS239_200216lodt_mt_SREP_Pub_BCBS239_200216lo
dt_mt_SREP_Pub_BCBS239_200216loMark Micallef
 
SD Basel process automation seminar presentation
SD Basel process automation seminar presentationSD Basel process automation seminar presentation
SD Basel process automation seminar presentationsarojkdas
 
James Okarimia - Aligning Finance , Risk and Data Analytics in Meeting the R...
James Okarimia -  Aligning Finance , Risk and Data Analytics in Meeting the R...James Okarimia -  Aligning Finance , Risk and Data Analytics in Meeting the R...
James Okarimia - Aligning Finance , Risk and Data Analytics in Meeting the R...
JAMES OKARIMIA
 
James Okarimia - Aligning Finance, Risk and Data Analytics in Meeting the Req...
James Okarimia - Aligning Finance, Risk and Data Analytics in Meeting the Req...James Okarimia - Aligning Finance, Risk and Data Analytics in Meeting the Req...
James Okarimia - Aligning Finance, Risk and Data Analytics in Meeting the Req...
JAMES OKARIMIA
 
James Okarimia Aligning Finance , Risk and Compliance to Meet Regulation
James Okarimia   Aligning Finance , Risk and Compliance to Meet RegulationJames Okarimia   Aligning Finance , Risk and Compliance to Meet Regulation
James Okarimia Aligning Finance , Risk and Compliance to Meet Regulation
JAMES OKARIMIA
 
James Okarimia Aligning Finance , Risk and Compliance to Meet Regulation
James Okarimia   Aligning Finance , Risk and Compliance to Meet RegulationJames Okarimia   Aligning Finance , Risk and Compliance to Meet Regulation
James Okarimia Aligning Finance , Risk and Compliance to Meet Regulation
JAMES OKARIMIA
 
Leveraging Data in Financial Services to Meet Regulatory Requirements and Cre...
Leveraging Data in Financial Services to Meet Regulatory Requirements and Cre...Leveraging Data in Financial Services to Meet Regulatory Requirements and Cre...
Leveraging Data in Financial Services to Meet Regulatory Requirements and Cre...
Perficient, Inc.
 
PwC CECL Overview Placemat_Final
PwC CECL Overview Placemat_FinalPwC CECL Overview Placemat_Final
PwC CECL Overview Placemat_FinalBen Havird
 
Pragmatic Approach to Datawarehouse Testing_.docx
Pragmatic Approach to Datawarehouse Testing_.docxPragmatic Approach to Datawarehouse Testing_.docx
Pragmatic Approach to Datawarehouse Testing_.docx
shankarmani
 
rahul cv modified
rahul cv modifiedrahul cv modified
rahul cv modifiedRahul Patil
 

Similar to ALLL FZL and TD (20)

eBook Spreadsheet to WebAPP
eBook Spreadsheet to WebAPPeBook Spreadsheet to WebAPP
eBook Spreadsheet to WebAPP
 
Financial Modeling
Financial ModelingFinancial Modeling
Financial Modeling
 
Considerations for an Effective Internal Model Method Implementation
Considerations for an Effective Internal Model Method ImplementationConsiderations for an Effective Internal Model Method Implementation
Considerations for an Effective Internal Model Method Implementation
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
Aligning finance , risk and compliance
 
Aligning finance , risk and compliance
Aligning finance , risk and complianceAligning finance , risk and compliance
adjustments for each pool
• Ensuring compliance with IFRS9 loss recognition
• Providing adequate documentation, and satisfying ASU 2010-20 disclosure reporting requirements
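As a toy illustration of the formula defined earlier (Expected Losses = Exposure at Default x Probability of Default x Loss Given Default), the calculated component can be sketched as a sum of per-loan expected losses, with the judgmental component applied as an overlay. The loan records, field names, and 10 percent overlay below are hypothetical, for illustration only:

```python
# Minimal sketch of the calculated component of ALLL:
# Expected Loss = EAD x PD x LGD, summed over the portfolio.
# Loan records and the judgmental overlay are hypothetical.

def expected_loss(ead: float, pd_rate: float, lgd: float) -> float:
    """Expected loss for a single exposure."""
    return ead * pd_rate * lgd

def calculated_allowance(loans) -> float:
    """Sum of per-loan expected losses: the calculated component."""
    return sum(expected_loss(l["ead"], l["pd"], l["lgd"]) for l in loans)

loans = [
    {"ead": 1_000_000, "pd": 0.02, "lgd": 0.45},  # pass-rated exposure
    {"ead": 250_000,   "pd": 0.10, "lgd": 0.60},  # criticized exposure
]
base = calculated_allowance(loans)   # 9,000 + 15,000 = 24,000
overlay = 0.10 * base                # judgmental component (management overlay)
allowance = base + overlay
```

In practice the judgmental component is set per pool from the qualitative factors discussed later, not as a flat percentage.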
Scope of Data Required

Original transaction data at the atomic loan level, along with slowly changing data (e.g., payment history and risk ratings) across all portfolios, must be used for regulatory ALLL calculations. It must also serve as the foundation for other regulatory calculations, including BCBS 239 and the Comprehensive Capital Analysis and Review (CCAR). Unfortunately, this data is usually distributed across multiple databases and divisions within the bank, making it difficult to integrate for timely calculations. Reconciliation with the General Ledger is also difficult, since the ledger provides only aggregated views and can rarely be traced back to full detail at the source.

To build high-quality models, the regulatory burden of data quality and data governance on this foundational regulatory and transactional data is very high. Once data quality is sufficient, models can be developed that stack upon results from other models, and judgment factors can be applied with reasonable confidence in capital adequacy and the consequent regulatory approvals.

Scope of Models Required

Given the present portfolio, ALLL models must reflect the expected scenario over the next one-year horizon. New requirements under the principles-based IFRS9 intensify the need for strong internal data covering all components of the balance sheet. Loss forecasting will involve more complex assessments of cash flow calculations, asset class categorization, balances, and loss measurements (including lifetime expected credit losses).

For use in Dodd-Frank Act Stress Testing (DFAST) and CCAR, regulators provide scenarios with multiple key economic variables to reflect the implications of various risk factors on capital security. Bankers calculate the impact of these variables on key metrics, such as starting balances and exposure on revolving credit.
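The effect of scenario variables on a key metric can be sketched as follows. The scenario moves, sensitivity coefficients, and linear functional form are all invented for illustration; real DFAST/CCAR models are far richer:

```python
# Hypothetical sketch: scale a base PD by regulator-scenario moves in
# economic drivers, using made-up linear sensitivities (betas).

def stressed_pd(base_pd, scenario, betas):
    """Apply scenario shocks to a base PD; clamp the result to [0, 1]."""
    shock = sum(betas[k] * scenario.get(k, 0.0) for k in betas)
    return min(1.0, max(0.0, base_pd * (1.0 + shock)))

# Illustrative severely adverse scenario: unemployment up 3 points,
# GDP growth down 2.5 points.
scenario = {"unemployment_change": 3.0, "gdp_growth": -2.5}
betas = {"unemployment_change": 0.15, "gdp_growth": -0.10}

pd_stressed = stressed_pd(0.02, scenario, betas)  # 0.02 * (1 + 0.70) = 0.034
```

The same stressed PD would then feed the expected-loss calculation for each scenario the regulators supply.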
ALLL models must perform under multiple scenarios, including multiple baseline and stress scenarios. To meet the many new regulatory requirements, ALLL model development requires a long history of reliable data, including the variables needed for loss estimation and the loss drivers that respond to economic scenarios. A nimble infrastructure is also required to quickly update loss forecasts in response to ad hoc requests from management or regulators, or to reassess loss forecasts after economic or market shifts. The process may include the following steps:

• Identify model development data requirements and available data sources
• Assess data sources and potential uses in model development
• Model development, which includes several key steps that depend on sufficient, complete, and accurate data, as well as reliable and efficient model development platforms:
– Identification of appropriate model segmentation (e.g., along lines of asset classes) that is supported by the historical data, statistical analysis, and business segments
– Definition and calculation of the outcome to be measured (e.g., lifetime losses) on historical data, including where proxies are required
– Model methodology selection, such as the type of regression, simulation approaches, use or combination of multiple models, and use of expert judgment
– Gathering of reliable candidate variables by class, such as customer characteristics/performance, portfolio characteristics, and economic, sector, or market drivers

Data Aggregation Process and System Constraints

Data that is aggregated, reconciled to the General Ledger, and traceable to the account level is useful for many business purposes. Marketing, regulatory reporting, profiling risk by portfolio, CCAR/DFAST/BCAR, and most other business functions benefit from this reconciled, consistent, and detailed data. This improves regulatory response time and satisfies regulator requirements for consistent process and data quality (BCBS 239).
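The reconciliation step described above can be sketched as follows, with hypothetical portfolio names and balances: sum loan-level detail by portfolio and flag any breaks against the General Ledger:

```python
# Hypothetical sketch: reconcile loan-level detail to aggregated
# General Ledger balances by portfolio and report breaks for review.
from collections import defaultdict

def reconcile(loan_detail, gl_balances, tolerance=0.01):
    """Return {portfolio: detail - GL} wherever the break exceeds tolerance."""
    detail_totals = defaultdict(float)
    for loan in loan_detail:
        detail_totals[loan["portfolio"]] += loan["balance"]
    breaks = {}
    for portfolio, gl_total in gl_balances.items():
        diff = detail_totals.get(portfolio, 0.0) - gl_total
        if abs(diff) > tolerance:
            breaks[portfolio] = diff
    return breaks

detail = [
    {"portfolio": "CRE", "balance": 100.0},
    {"portfolio": "CRE", "balance": 50.0},
    {"portfolio": "C&I", "balance": 200.0},
]
gl = {"CRE": 150.0, "C&I": 210.0}
breaks = reconcile(detail, gl)  # C&I detail is 10 short of the GL
```

A production process would run this per reporting period with full traceability back to the atomic source records.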
– Variable selection among classes of variables, including statistical selection procedures and business review (often an iterative process)
– Model fit and final model design within the selected methodology, the variables selected, and additional analyses deployed in the design process
– Model statistical testing against key model performance requirements
– Model sensitivity analysis, outcomes analysis/back-testing, and benchmarking of the final model
– Final model documentation of all data attributes, decisions, and supporting analysis
• Independent model validation, including conceptual soundness review, data review, statistical testing, sensitivity analysis, outcomes analysis, and benchmarking. For independent validation, methodology review and benchmark models are critical to challenge the design choices made by the developer
• Model implementation and production, including accurate and efficient data delivery and coding. There are increasing requirements for:
– Fast implementation of model enhancements and/or updates
– Movement, merging, and transformation of multiple large sets of data
– Fast production of results, with strong control systems to ensure reliability of results

How it Works Today

Compiling a sufficient, complete, and accurate dataset is a fundamental problem in model development. There are often many distributed datasets with varying accuracy and completeness of available variables and history. Organizationally, there may be many locally maintained databases aggregating the same source data in different ways. Additionally, different groups or divisions may be conducting slightly different calculations of losses, often due to different business uses, different interpretations of the requirements, or different uses of judgment in the final estimates or calculations. These varied local databases and calculations are not easily integrated or reconciled,
making the presentation of a final “correct” answer (or a single version of the truth) challenging or even impossible under the current architecture.

Vision for a Streamlined ALLL Process

The ideal state is the dynamic integration of data from source systems so that it is easily accessible for real-time calculations, while supporting enhancements to calculations and models as needed. This provides for loss models and forecasting that are based on well-defined business and regulatory requirements; can be quickly adapted, redeveloped, or enhanced for changing requirements or business needs (on efficient, controlled, and reliable platforms); can be quickly implemented and aggregated into production systems; and produce reliable results with limited overhead.

Data modeling, forecasting, and reporting need to be nimble with respect to changes in a wide variety of regulatory, business, market, and economic factors, including:

• Lending policies and procedures, including changes in underwriting standards and in collection, charge-off, and recovery practices not considered elsewhere in estimating credit losses
• International, national, regional, and local economic and business conditions and developments that affect cash flows and the collectability of a portfolio, including the condition of various market segments
• The nature and volume of the portfolio, the terms of loans, and the distribution across different segments and asset classes, which can be increasingly complex under IFRS9
• The experience, ability, and depth of lending management and other relevant staff, including their ability to define and refine portfolio strategies and apply expert judgment
• The volume and severity of past due loans, the volume of nonaccrual loans, and the volume and severity of adversely classified or graded loans
• The quality of the institution’s loan review system, its rating philosophy, and the ratings migration observed historically and currently
• The value of underlying collateral for collateral-dependent loans, and any changes in rules that impact fair value recognition under different scenarios or unexpected economic or market changes
• The existence and effect of any concentrations of credit, and changes in the level of such concentrations
• The effect of other external factors, such as competition and economic, industry, legal, and regulatory requirements, on the level of estimated credit losses in the institution’s existing portfolio

Hypothetical Bank Risk Rating Scheme

Bank Rating   Regulatory Classification   Rating Agency Equivalent   Expected Default Rate (bps)
1             Pass                        Aaa                        0
2             Pass                        Aa                         2
3             Pass                        A                          2
4             Pass                        Baa                        12
5             Pass                        Ba                         72
6             Pass                        B                          304
7             Special Mention             Caa                        1335
8             Sub-Standard                Caa                        1335
9             Doubtful                    Caa                        1335
10            Loss                        Caa                        1335

Figure 1. For illustration purposes only.

Simulations based on multiple and substantially different economic scenarios are often inefficient and
prone to error, given the many manual steps often involved in the process. When economic drivers feed simulations, Fuzzy Logix provides the flexibility to incorporate parameters, starting values, and adjustments that feed Monte Carlo simulations, helping refine human judgment with a calculated spectrum of possible and likely outcomes.

For example, internal risk ratings are developed using internally built models. These risk ratings are then correlated to reference Probability of Default (PD) ratings from agencies such as Moody’s, S&P, and Fitch. Figure 1 illustrates a risk rating scale, how it converts to the corresponding regulatory classification, and how it maps to Moody’s and other agency ratings through efficient simulations based on multiple scenarios.

For ALLL and loss forecasting, it is imperative to lock down an end-to-end sequence of data feeds and calculations for transparent, consistent, and validated results. This allows banks to include historical and current-state information in the required demonstrable processes, while supporting consistent and timely insight to assist in re-calibrating quantitative factors, verifying qualitative factors with real-time performance metrics, and re-calibrating judgmental factors. See Figure 2.

Fuzzy Logix Capabilities

Fuzzy Logix DBLytix® software is a library of analytic functions that are installed and run in the Teradata platform, leveraging the massively parallel processing architecture of Teradata to deliver unsurpassed processing times for analytics. DBLytix is invoked within the SQL environment, making it accessible to the organization’s business analysts, who are familiar with the world’s most popular business querying language. Embedding analytics with the data in one environment means that time and expense aren’t wasted on moving data to specialized computing environments.
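The illustrative scheme in Figure 1 can be expressed as a simple lookup table that converts a bank rating into a usable default probability. The values below are the figure's own, which are for illustration purposes only:

```python
# Figure 1 as a lookup table:
# bank rating -> (regulatory classification, agency equivalent,
#                 expected default rate in basis points).
RATING_SCHEME = {
    1: ("Pass", "Aaa", 0),
    2: ("Pass", "Aa", 2),
    3: ("Pass", "A", 2),
    4: ("Pass", "Baa", 12),
    5: ("Pass", "Ba", 72),
    6: ("Pass", "B", 304),
    7: ("Special Mention", "Caa", 1335),
    8: ("Sub-Standard", "Caa", 1335),
    9: ("Doubtful", "Caa", 1335),
    10: ("Loss", "Caa", 1335),
}

def default_probability(bank_rating: int) -> float:
    """Expected default rate as a decimal (basis points / 10,000)."""
    return RATING_SCHEME[bank_rating][2] / 10_000
```

A real mapping would be re-estimated periodically from internal performance data and agency reference rates rather than hard-coded.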
Data governance and security are simplified, and time isn’t spent trying to aggregate modeling components developed in remote environments. Layered modeling tasks like ALLL and CCAR are performed under one roof, meaning that downstream dependencies can run as soon as upstream results are computed, eliminating the wait time for hand-offs.

Models and Proposed Teradata Calibration Tool for Judgmental Portion of ALLL

Discretionary or Judgmental Attributes

1. Changes in lending policies and procedures, including changes in underwriting standards and in collection, charge-off, and recovery practices not considered elsewhere in estimating credit losses.
2. Changes in international, national, regional, and local economic and business conditions and developments that affect the collectability of the portfolio, including the condition of various market segments.
3. Changes in the nature and volume of the portfolio and in the terms of loans.
4. Changes in the experience, ability, and depth of lending management and other relevant staff.
5. Changes in the volume and severity of past due loans, the volume of nonaccrual loans, and the volume and severity of adversely classified or graded loans.
6. Changes in the quality of the institution’s loan review system.
7. Changes in the value of underlying collateral for collateral-dependent loans.
8. The existence and effect of any concentrations of credit, and changes in the level of such concentrations.
9. The effect of other external factors, such as competition and legal and regulatory requirements, on the level of estimated credit losses in the institution’s existing portfolio.

Figure 2.
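The Monte Carlo refinement described earlier can be sketched as follows: draw default outcomes per loan under a scenario-shocked PD and build a distribution of portfolio losses. The portfolio, shock, trial count, and seed are illustrative; this is not a representation of the Fuzzy Logix implementation:

```python
# Minimal Monte Carlo sketch: simulate portfolio losses under a PD
# shock. All parameters are illustrative.
import random

def simulate_losses(loans, pd_shock=1.0, trials=10_000, seed=42):
    """Return the simulated loss distribution (one portfolio total per trial)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        loss = 0.0
        for loan in loans:
            # Bernoulli default draw under the shocked PD, capped at 1.
            if rng.random() < min(1.0, loan["pd"] * pd_shock):
                loss += loan["ead"] * loan["lgd"]
        totals.append(loss)
    return totals

# 50 identical illustrative loans under a 2x PD shock.
losses = simulate_losses(
    [{"ead": 100_000, "pd": 0.02, "lgd": 0.45}] * 50, pd_shock=2.0
)
stressed_mean = sum(losses) / len(losses)  # close to 50 * 0.04 * 45,000 = 90,000
```

From the same distribution one can read off tail percentiles for stress reporting, and the analytic expected loss gives a sanity check on the simulated mean.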
Algorithms can be hard-coded into the Teradata appliance to ensure that querying and analysis are performed in a seamless process, with guaranteed consistency and accuracy. More efficient processes mean that multiple scenarios can be explored in the time it would otherwise take to run just one, deepening insights and improving final answers.

Conclusion

Teradata and Fuzzy Logix offer the tools needed to rationalize the disparate parts of the ALLL process into a streamlined whole, greatly improving consistency, timeliness, and accuracy. By integrating data and analytics into a central repository, banks gain an unmatched ability to ensure the quality, transparency, and control of the process, satisfying regulators and making the ALLL process more manageable for everyone involved. Shortened process times give banks greater flexibility to analyze emerging risks, and provide a much-needed edge in gaining regulatory approval for annual capital plans.

Together, Teradata and Fuzzy Logix can help with detailed solution development, either onsite or as analytics-as-a-service, including data integration, calculation algorithms, and process management. Data movement can be expensive and slow, and can introduce error. Teradata and Fuzzy Logix can help your organization achieve more efficient and consistent data, analytics, and processes for ALLL and related CCAR requirements.

To learn more about how your organization can leverage a sophisticated analytics platform to uncover valuable business insights and simplify compliance with regulatory demands, visit teradata.com/contact-us.

10000 Innovation Drive, Dayton, OH 45342    Teradata.com

Teradata and the Teradata logo are registered trademarks of Teradata Corporation and/or its affiliates in the U.S. and worldwide. Teradata continually improves products as new technologies and components become available. Teradata, therefore, reserves the right to change specifications without prior notice.
All features, functions, and operations described herein may not be marketed in all parts of the world. Consult your Teradata representative or Teradata.com for more information. Copyright © 2016 by Teradata Corporation. All Rights Reserved. Produced in U.S.A. 7.16 EB9456

About the Authors

Shelly Biggs, Senior Business Consultant—Risk Management, Teradata

Shelly has over three decades of experience in the financial services industry, with an emphasis in enterprise risk management, including the risk rating process; underwriting and loan review; KPI reporting; credit risk portfolio monitoring; data integration and decision sciences; regulatory reporting and analytics; and allowance for loan losses, CCAR, and Dodd-Frank stress testing.

Alex Depaoli, Sales Support Engineer, Fuzzy Logix

Alex has over two decades of experience in business analytics spanning marketing, credit strategy development, risk scorecard development, loss forecasting, and capital markets arbitrage, with a focus on financial services.