The document discusses challenges banks face in calculating Allowance for Loan and Lease Losses (ALLL) due to disparate data sources. It proposes integrating ALLL data into a central repository with Teradata and Fuzzy Logix tools to improve efficiency, accuracy, and regulatory compliance. This would enhance ALLL calculations, models, and reporting to satisfy increasing regulatory demands through consistent, flexible, and reliable processes and data.
Allowance for Loan and Lease Losses: Organizing Data & Analytics for Success
Beyond Data Aggregation & Compliance Reporting

By Shelly Biggs, Senior Business Consultant, Risk Management, Teradata; Alex Depaoli, Sales Support Engineer, Fuzzy Logix; and the SVP of CCAR for a large bank
The biggest challenge banks face in computing ALLL is gathering data from disparate sources or spreadsheets through manually intensive, error-prone, and time-consuming processes. All too often, ALLL production demands an excessive effort from bank staff, requiring 80-hour work weeks.

Teradata and Fuzzy Logix can help banks greatly improve the efficiency, accuracy, and regulatory compliance of their methodology. By organizing the bank's ALLL data in a central repository with embedded analytics, they can enhance the quality, consistency, and timeliness of their calculations.
ALLL Defined

Expected Losses = Exposure at Default × Probability of Default × Loss Given Default
The Allowance for Loan and Lease Losses (ALLL) consists of two pieces: calculated components and judgmental components. Many institutions use their ALLL models in business-as-usual planning as well as to meet DFAST, CCAR, BCAR, and BCBS 239 requirements. IFRS 9 compliance places further requirements on loss recognition.
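As a minimal illustration of the formula above, here is a sketch in Python; the loans and their EAD/PD/LGD values are hypothetical, not figures from this paper:

# Expected Losses = Exposure at Default x Probability of Default x Loss Given
# Default, applied loan by loan and summed. All numbers are made up.
loans = [
    {"id": "L-001", "ead": 250_000.0, "pd": 0.020, "lgd": 0.45},
    {"id": "L-002", "ead": 1_000_000.0, "pd": 0.003, "lgd": 0.40},
]

def expected_loss(ead: float, pd: float, lgd: float) -> float:
    """Expected loss for a single exposure."""
    return ead * pd * lgd

portfolio_el = sum(expected_loss(l["ead"], l["pd"], l["lgd"]) for l in loans)
print(f"Portfolio expected loss: {portfolio_el:,.2f}")  # 2,250.00 + 1,200.00 = 3,450.00

The judgmental components mentioned above would then be applied on top of this calculated base.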
Executive Summary

The Allowance for Loan and Lease Losses (ALLL) is one of the most critical line items on a bank's financial statement, and a focus of intense regulatory scrutiny. Examiners seek to ensure that a bank's ALLL methodology is compliant with a number of key regulatory provisions, including:

• Staying abreast of ALLL regulatory changes and their impact on the reserve
• Evaluating impaired loans according to ASC 310-10-35 (FAS 114) and applying the correct impairment analysis
• Applying collateral value, present value of cash flows, or loan pricing methods
• Ensuring appropriate ASC 450-20 (FAS 5) methodology, with accurate loss rates and qualitative factor adjustments for each pool
• Ensuring compliance with IFRS 9 loss recognition
• Providing adequate documentation, and satisfying ASU 2010-20 disclosure reporting requirements
Scope of Data Required

Original transaction data at the atomic loan level, along with slowly changing data (e.g., payment history and risk ratings) across all portfolios, must be used for regulatory ALLL calculations. It must also serve as the foundation for other regulatory calculations, including BCBS 239 and the Comprehensive Capital Analysis and Review (CCAR). Unfortunately, this data is usually distributed across multiple databases and divisions within the bank, making it difficult to integrate for timely calculations. Additionally, reconciliation with the General Ledger is difficult, since the ledger provides only aggregated views and can rarely be traced back to full data detail at the source.
The regulatory burden of data quality and data governance is very high for this foundational regulatory and transactional data, because high-quality models depend on it. Once data quality is sufficient, models can be developed that stack upon results from other models, and judgment factors can be applied with reasonable confidence in the resulting assessment of capital adequacy, and in the regulatory approvals that follow.
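The General Ledger reconciliation described above can be expressed as an automated check. The following Python sketch (using pandas) rolls loan-level detail up to GL accounts and compares it against ledger balances; the table layouts, account numbers, and balances are hypothetical:

import pandas as pd

# Hypothetical loan-level detail and General Ledger balances; real schemas
# and account structures will differ by institution.
loan_detail = pd.DataFrame({
    "gl_account": ["1100", "1100", "1200"],
    "balance":    [250_000.0, 175_000.0, 900_000.0],
})
general_ledger = pd.DataFrame({
    "gl_account": ["1100", "1200"],
    "gl_balance": [425_000.0, 901_000.0],
})

# Aggregate atomic loan records up to the GL account level...
detail_totals = loan_detail.groupby("gl_account", as_index=False)["balance"].sum()

# ...and flag accounts where detail does not tie to the ledger within tolerance.
recon = detail_totals.merge(general_ledger, on="gl_account")
recon["difference"] = recon["gl_balance"] - recon["balance"]
print(recon[recon["difference"].abs() > 1.00])  # account 1200 is off by 1,000.00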
Scope of Models Required

Given the present portfolio, ALLL models must reflect the expected scenario over the next one-year horizon. New requirements under the principles-based IFRS 9 intensify the need for strong internal data covering all components of the balance sheet. Loss forecasting will involve more complex assessment of cash flow calculations, asset class categorization, balances, and loss measurements (including lifetime expected credit losses).

For use in Dodd-Frank Act Stress Testing (DFAST) and CCAR, regulators provide scenarios with multiple key economic variables to reflect the implications of various risk factors on capital security. Bankers calculate the impact of the variables on key metrics, such as starting balances and exposure on revolving credit. ALLL models must perform under multiple scenarios, including multiple baseline and stress scenarios.
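To make the multi-scenario requirement concrete, the sketch below runs a toy segment-level loss forecast under baseline and stress scenarios. The scenario variables, sensitivity coefficient, and segment figures are assumptions for illustration, not supervisory values:

# Run one segment's expected-loss calculation under several scenarios.
scenarios = {
    "baseline":         {"unemployment": 0.045},
    "adverse":          {"unemployment": 0.070},
    "severely_adverse": {"unemployment": 0.100},
}

BASE_PD = 0.02           # assumed through-the-cycle PD for the segment
UNEMP_SENSITIVITY = 0.5  # assumed PD uplift per point of unemployment above 4.5%
SEGMENT_EAD = 10_000_000.0
SEGMENT_LGD = 0.40

def scenario_pd(unemployment: float) -> float:
    """Shift the segment PD with the scenario's unemployment path."""
    return BASE_PD + UNEMP_SENSITIVITY * max(0.0, unemployment - 0.045)

for name, drivers in scenarios.items():
    pd_s = scenario_pd(drivers["unemployment"])
    el = SEGMENT_EAD * pd_s * SEGMENT_LGD
    print(f"{name:>16}: PD={pd_s:.3%}  expected loss={el:,.0f}")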
To meet the many new regulatory requirements, ALLL model development requires a long history of reliable data, with the necessary variables for loss estimation and loss drivers that are responsive to economic scenarios. Additionally, a nimble infrastructure is required for quick updates of loss forecasting to address ad hoc requests from management or regulators, or to quickly reassess loss forecasts after economic or market shifts. The process may include the following steps:
• Identify model development data requirements and available data sources
• Assess data sources and their potential uses in model development
• Model development, which includes several key steps that depend on having sufficient, complete, and accurate data, as well as reliable and efficient model development platforms:
– Identification of appropriate model segmentation (e.g., along lines of asset classes) that is supported by the historical data, statistical analysis, and business segments
– Definition and calculation of the outcome to be measured (e.g., lifetime losses) on historical data, including where proxies are required
– Model methodology selection, such as the type of regression, simulation approaches, use or combination of multiple models, and use of expert judgment
– Gathering of reliable candidate variables by class, such as customer characteristics and performance, portfolio characteristics, and economic, sector, or market drivers
– Variable selection among classes of variables, including statistical selection procedures and business review (often an iterative process)
– Model fit and final model design within the selected methodology, variable selection, and additional analyses deployed for the model design process
– Model statistical testing against key model performance requirements
– Model sensitivity analysis, outcomes analysis and back-testing, and benchmarking of the final model
– Final model documentation of all data, attributes, decisions, and supporting analysis
• Independent model validation, to include conceptual soundness review, data review, statistical testing, sensitivity analysis, outcomes analysis, and benchmarking. For independent validation, methodology review and benchmark models are critical for challenging the model design choices made by the developer (see the back-testing sketch following this section)
• Model implementation and production, including accurate and efficient data delivery and coding. There are increasing requirements for:
– Fast implementation of model enhancements and updates
– Movement, merging, and transformation of multiple large sets of data
– Fast production of results, with strong control systems to ensure reliability of results

Data Aggregation Process and System Constraints

Data that is aggregated, reconciled to the General Ledger, and specific to the account level can be useful for many business purposes. Marketing, regulatory reporting, profiling risk by portfolio, CCAR/DFAST/BCAR, and most other business functions benefit from this reconciled, consistent, and detailed data. This improves regulatory response time and satisfies regulator requirements for consistent process and data quality (BCBS 239).
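Returning to the outcomes analysis and back-testing called for in the validation bullets above, the sketch below compares each rating band's predicted default rate with the rate actually observed, flagging bands whose ratio falls outside a tolerance range. The counts are invented, and the predicted rates loosely echo the hypothetical scheme shown later in Figure 1:

# Back-test: observed default rate vs. predicted PD, by rating band.
observed = {  # band -> (loans at start of period, defaults observed)
    "Rating 4": (5_000, 2),
    "Rating 5": (2_000, 18),
    "Rating 6": (800, 30),
}
predicted_pd = {"Rating 4": 0.0012, "Rating 5": 0.0072, "Rating 6": 0.0304}

for band, (n, defaults) in observed.items():
    obs_rate = defaults / n
    ratio = obs_rate / predicted_pd[band]
    flag = "ok" if 0.5 <= ratio <= 2.0 else "investigate"
    print(f"{band}: predicted={predicted_pd[band]:.2%} "
          f"observed={obs_rate:.2%} ratio={ratio:.2f} ({flag})")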
How it Works Today

The problems of compiling a sufficient, complete, and accurate dataset are fundamental to model development. There are often many distributed datasets with varying data accuracy and completeness of available variables and history.

Organizationally, there may be many locally maintained databases aggregating the same source data in different manners. Additionally, different groups or divisions may be conducting slightly different calculations of losses, often due to different business uses, different interpretations of the requirements, or different uses of judgment in the final estimates and calculations. These varied local databases and calculations are not easily integrated or reconciled, making the presentation of a final "correct" answer (or a single version of the truth) challenging or even impossible under the current architecture.
Vision for a Streamlined ALLL Process

The ideal state is the integration of data in a dynamic manner from source systems, making it easily accessible for real-time calculations while supporting enhancements to calculations and models as needed. This provides for loss models and forecasting that are based on well-defined business and regulatory requirements; can be quickly adapted, redeveloped, or enhanced in response to changing requirements or business needs (on efficient, controlled, and reliable platforms); can be quickly implemented and aggregated into production systems; and produce reliable results with limited overhead.

Data modeling, forecasting, and reporting need to be nimble with respect to changes in a wide variety of regulatory, business, market, and economic factors, including:
• Lending policies and procedures, including changes in underwriting standards and in collection, charge-off, and recovery practices not considered elsewhere in estimating credit losses
• International, national, regional, and local economic and business conditions and developments that affect cash flows and the collectability of a portfolio, including the condition of various market segments
• Nature and volume of the portfolio, the terms of its loans, and its distribution across different segments and asset classes, which can be increasingly complex under IFRS 9
• Experience, ability, and depth of lending management and other relevant staff, including their ability to define and refine portfolio strategies and apply expert judgment
• Volume and severity of past due loans, the volume of nonaccrual loans, and the volume and severity of adversely classified or graded loans
• Quality of the institution's loan review system, rating philosophy, and ratings migration observed historically and currently
• Value of underlying collateral for collateral-dependent loans, and any changes in rules that impact fair value recognition under different scenarios or unexpected economic or market changes
• Existence and effect of any concentrations of credit, and changes in the level of such concentrations
• Effect of other external factors, such as competition and economic, industry, legal, and regulatory requirements, on the level of estimated credit losses in the institution's existing portfolio
Simulation of results based on multiple and substantially different economic scenarios is often inefficient and prone to error, given the many manual steps the process typically involves. When economic drivers feed simulations, Fuzzy Logix provides the flexibility to incorporate parameters, starting values, and adjustments that feed Monte Carlo simulations, helping to refine human judgment with a calculated spectrum of possible and likely outcomes (sketched in code after Figure 1).

For example, internal ratings are developed based on internally developed models for risk ratings. These risk ratings are correlated to reference-model Probability of Default (PD) ratings from agencies such as Moody's, S&P, and Fitch. Figure 1 illustrates a risk rating scale, how it converts to the corresponding regulatory classification, and how it maps to Moody's and other agency ratings through efficient simulations based on multiple scenarios.

Hypothetical Bank Risk Rating Scheme

Bank Rating    Regulatory Classification    Rating Agency Equivalent    Expected Default Rate (bps)
1              Pass                         Aaa                         0
2              Pass                         Aa                          2
3              Pass                         A                           2
4              Pass                         Baa                         12
5              Pass                         Ba                          72
6              Pass                         B                           304
7              Special Mention              Caa                         1335
8              Sub-Standard                 Caa                         1335
9              Doubtful                     Caa                         1335
10             Loss                         Caa                         1335

Figure 1. For illustration purposes only.
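As a concept-level illustration of the Monte Carlo approach described above (this is plain Python, not the in-database DBLytix implementation; the portfolio composition and LGDs are assumed, with PDs taken from the hypothetical Figure 1 scheme):

import random

random.seed(42)  # reproducible illustration

# (EAD, PD, LGD): 200 rating-4 loans at 12 bps and 50 rating-6 loans at 304 bps.
portfolio = [(500_000.0, 0.0012, 0.40)] * 200 + [(250_000.0, 0.0304, 0.55)] * 50

def one_trial() -> float:
    """One draw of portfolio loss: each loan defaults independently with its PD."""
    return sum(ead * lgd for ead, pd, lgd in portfolio if random.random() < pd)

trials = sorted(one_trial() for _ in range(10_000))
mean_loss = sum(trials) / len(trials)
p99 = trials[int(0.99 * len(trials))]
print(f"mean simulated loss: {mean_loss:,.0f}; 99th percentile: {p99:,.0f}")

The spread between the mean and the tail percentiles is the "calculated spectrum of possible and likely outcomes" that supports, rather than replaces, human judgment.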
For ALLL and loss forecasting, it is imperative to lock down an end-to-end sequence of data feeds and calculations so that results are transparent, consistent, and validated. This allows banks to include historical and current-state information in the required demonstrable processes, while supporting consistent and timely insight that assists in re-calibrating quantitative and judgmental factors and in verifying qualitative factors against real-time performance metrics. See Figure 2.
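Judgmental attributes like those listed in Figure 2 (shown later) are typically expressed as basis-point adjustments layered onto a pool's calculated loss rate. A minimal sketch of that calibration arithmetic, with invented factor values:

# Qualitative/judgmental adjustments (in basis points) on top of a pool's
# calculated loss rate; attribute names follow Figure 2, values are invented.
CALCULATED_LOSS_RATE = 0.0150  # 150 bps from the quantitative models
qualitative_bps = {
    "lending_policies": 5,
    "economic_conditions": 10,
    "portfolio_nature_volume": -3,
    "management_experience": 0,
    "past_due_severity": 8,
}

adjusted_rate = CALCULATED_LOSS_RATE + sum(qualitative_bps.values()) / 10_000
pool_balance = 50_000_000.0
print(f"adjusted loss rate: {adjusted_rate:.2%}; "
      f"pool reserve: {pool_balance * adjusted_rate:,.0f}")  # 1.70%; 850,000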
Fuzzy Logix Capabilities

Fuzzy Logix DBLytix® software is a library of analytic functions that are installed and run in the Teradata platform, leveraging Teradata's massively parallel processing architecture to deliver unsurpassed processing times for analytics. DBLytix is invoked within the SQL environment, making it accessible to the organization's business analysts, who are already familiar with the world's most popular business querying language.

Embedding analytics with the data in one environment means that time and expense aren't wasted on moving data to specialized computing environments. Data governance and security are simplified, and time isn't spent trying to aggregate modeling components developed in remote environments. Layered modeling tasks like ALLL and CCAR are performed under one roof, meaning that downstream dependencies can run as soon as upstream results are computed, eliminating the wait time for hand-offs. A client-side sketch of this in-database pattern appears after Figure 2.
Proposed Teradata Calibration Tool for Judgmental Portion of ALLL

Discretionary or Judgmental Attributes

1. Changes in lending policies and procedures, including changes in underwriting standards and in collection, charge-off, and recovery practices not considered elsewhere in estimating credit losses.
2. Changes in international, national, regional, and local economic and business conditions and developments that affect the collectability of the portfolio, including the condition of various market segments.
3. Changes in the nature and volume of the portfolio and in the terms of loans.
4. Changes in the experience, ability, and depth of lending management and other relevant staff.
5. Changes in the volume and severity of past due loans, the volume of nonaccrual loans, and the volume and severity of adversely classified or graded loans.
6. Changes in the quality of the institution's loan review system.
7. Changes in the value of underlying collateral for collateral-dependent loans.
8. The existence and effect of any concentrations of credit, and changes in the level of such concentrations.
9. The effect of other external factors, such as competition and legal and regulatory requirements, on the level of estimated credit losses in the institution's existing portfolio.

Figure 2
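To ground the in-database pattern from the Fuzzy Logix Capabilities section, the sketch below shows a Python client issuing SQL that runs inside Teradata, so the calculation executes next to the data rather than after an extract. The connection details, schema, and table are hypothetical placeholders, and no real DBLytix function names are used; in practice a DBLytix analytic function call would appear directly in the SELECT:

import teradatasql  # Teradata's Python DB-API driver

SQL = """
SELECT portfolio_id,
       SUM(ead * pd * lgd) AS expected_loss  -- a DBLytix function would be invoked here
FROM alll.loan_level_scores                  -- hypothetical table name
GROUP BY portfolio_id
"""

# Hypothetical host and credentials; the point is that only results, not
# loan-level data, leave the warehouse.
with teradatasql.connect(host="tdprod", user="analyst", password="...") as con:
    with con.cursor() as cur:
        cur.execute(SQL)
        for portfolio_id, expected_loss in cur.fetchall():
            print(portfolio_id, expected_loss)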