1. Model validation aims to identify and document model assumptions and limitations to help mitigate model risk arising from pricing models.
2. A robust model validation process focuses on assessing model assumptions, limitations, and potential impact given model usage.
3. Effective model validation is firmly embedded within a comprehensive model governance framework involving collaboration between model validation, model owners, risk managers, controls functions, and audit.
The document provides a mark scheme for assessing customer needs. It outlines several key points:
1) Methods that companies can use to reduce costs such as training, Just-in-Time processes, and motivational techniques.
2) BMW should consider either quality control or quality assurance to ensure quality for their new Mini model. Both approaches have benefits and drawbacks depending on the workforce and costs.
3) Meeting customer expectations and dealing with complaints are important for good customer service, as are reliability and clear communications. Failing to meet customer needs can damage reputation and profits.
This document discusses the roles and responsibilities of various parties in the pricing and validation process at a bank. Finance has overall responsibility for financial reporting and delegates pricing and valuation control to other functions. Risk is responsible for approving models and setting market parameters. Front office is responsible for deal execution while middle office validates standard parameters and produces P&L statements. Operations ensures accurate deal representation in systems.
This document provides guidance on developing a solicitation for a FAR 15 procurement. It discusses reviewing customer requirements, developing evaluation criteria, and proposal preparation instructions. Key steps include setting up an acquisition team, conducting exchanges with industry, performing market research, and determining appropriate evaluation factors and a scoring system. The evaluation criteria and proposal instructions must be aligned with the solicitation's terms and conditions and statement of work. An evaluation plan should also be developed to guide the source selection process.
Let us delve into the various skill or knowledge levels used in the testing industry, designated as K-Levels.
What are K-Levels of knowledge?
K-Levels, or “Knowledge Levels”, refer to the prescribed upper limit of skills or knowledge essential for a particular certification.
The hierarchy of K-Levels is described in the globally recognized Bloom’s Taxonomy of learning. Reaching a particular K-Level means that the individual has successfully achieved some measurable and meaningful objectives.
This document tests three option pricing models - binomial, Black-Scholes, and trinomial - against covered warrant prices for three companies. It is expected that the trinomial model will be most statistically reliable since it includes an additional condition of stationary stock price. Simple regressions are performed to test the models, followed by tests for serial correlation, heteroskedasticity, and multicollinearity. The models are then modified using ARMA structures based on a journal article. The modified models are used for static and dynamic forecasts to compare their reliability based on root mean squared errors against actual warrant prices.
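As a sketch of how the forecast comparison above works, the snippet below computes root mean squared error against observed prices for two candidate models. The price series and model values are invented for illustration, not taken from the study:

```python
import math

def rmse(actual, forecast):
    """Root mean squared error between actual and forecast price series."""
    assert len(actual) == len(forecast)
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

# Hypothetical warrant prices and forecasts from two models
actual = [10.0, 10.5, 11.2, 10.8]
black_scholes = [9.8, 10.9, 11.0, 10.5]
trinomial = [10.1, 10.4, 11.3, 10.9]

# The model with the lower RMSE tracks observed prices more closely
print(rmse(actual, black_scholes))
print(rmse(actual, trinomial))
```

Static forecasts would use observed values at each step, while dynamic forecasts feed earlier forecasts forward; either way, the RMSE comparison is the same.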
This document appears to be a thesis submitted by Amit Kumar Sinha to the National University of Singapore's Risk Management Institute to earn a Master of Science in Financial Engineering degree. The thesis focuses on pricing and exposure measurement of interest rate derivatives using a short rate model approach. It discusses motivation for counterparty exposure measurement, defines quantitative measures of counterparty exposure at the trade, counterparty, and portfolio levels. It also covers credit risk mitigation techniques like netting and collateral agreements and their impact on exposure measurement. The document outlines the implementation of short rate models like Cox-Ingersoll-Ross and Hull-White 1-factor for term structure modeling, derivatives pricing, and simulation of risk factors to measure potential future exposure.
- Outsourcing has become common for many U.S. businesses as a way to reduce costs, though it also carries risks that must be carefully considered.
- A modified failure mode and effects analysis (FMEA) can help businesses evaluate potential risks of outsourcing options. Risks are rated based on their opportunity, probability, and severity to calculate a risk priority number.
- Analyzing risks using the FMEA process and Pareto chart allows companies to identify high-risk failures and develop actions to mitigate those risks, helping improve decision making around outsourcing.
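The RPN calculation described above can be sketched as follows. The 1-10 factor scale, the risk names, and the ratings are hypothetical; the three factors are simply those named in the summary (opportunity, probability, severity):

```python
# Each outsourcing risk is rated 1-10 on the three factors named in the
# summary (opportunity, probability, severity); the risk priority number
# (RPN) is their product. Names and ratings here are illustrative.
risks = {
    "loss of in-house expertise": (7, 5, 8),
    "supplier quality lapse":     (6, 4, 9),
    "hidden transition costs":    (5, 7, 4),
}

def rpn(ratings):
    opportunity, probability, severity = ratings
    return opportunity * probability * severity

# Pareto-style ranking: highest RPN first, so mitigation effort goes
# to the few risks that dominate the total.
ranked = sorted(risks, key=lambda r: rpn(risks[r]), reverse=True)
for name in ranked:
    print(name, rpn(risks[name]))
```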
The document discusses Porter's framework for industry analysis and defining an industry's scope. It addresses three issues in defining an industry's scope: horizontal scope across product markets, vertical scope along the value chain, and geographic scope across boundaries. Porter's guiding question is what happens to the potential profit or value created by a product - is it bargained away by suppliers/customers, dissipated in rivalry, or limited by substitutes. The document also discusses strategic groups as groups of firms following similar strategies.
This document summarizes the services of a company that specializes in designing and building custom automated systems, including deposition, laser integration, vision systems, software validation, and assembly of complex components. They have 25 years of experience in bespoke automation for applications such as medical devices, diagnostics, and biotechnology. Their services include mechanical and electrical design, software development, machine building, testing, validation, documentation, and distribution partnerships.
Software validation do's and don'ts, May 2013, John Cachat
The document summarizes best practices for software validation based on a presentation by John Cachat. It discusses that software validation should be based on risk, that not all software needs to be validated, and that validation requires evidence collected throughout the software development lifecycle. It also notes that software differs from hardware due to its complexity and the ease with which changes can impact the system. Finally, it provides an overview of typical tasks involved in software validation, such as requirements, design, testing, and maintaining validation when software changes.
The document outlines the software validation process which includes test development to check if software meets customer specifications. It describes the human and material resources needed for testing as well as constraints like limited resources and budget. The validation process involves documentation, test setup, development and execution, reporting, result analysis, defect retesting, regression testing, and closure to archive results for future projects.
The document discusses documentation principles and practices. It introduces terminology used in documentation. It emphasizes that documentation ensures consistency, communication and regulatory compliance. It outlines the document hierarchy and types of documents like SOPs, forms and records. It discusses preparing, revising, reviewing and controlling documents. It also provides seven good documentation practices.
The document summarizes an executive education program offered jointly by Georgetown University's McDonough School of Business and ESADE Business School. The program, called GAMP, provides a 3-week modular experience in Washington D.C., Shanghai, and Madrid focused on leadership development, global business operations and innovation, and decision-making for senior executives. The program utilizes dynamic learning through interactions with peers and draws on faculty expertise and real-world case studies and simulations.
The V-model in software testing stands for the Verification and Validation model. Like the waterfall model, the V-shaped life cycle is a sequential path of execution of processes: every stage must be completed before the next stage begins. In the V-model, testing of the product is planned in parallel with the corresponding stage of development.
The document proposes reclaiming and reframing the V-Model for software testing by removing outdated notions like the concept of time and instead focusing on validation and verification as processes of questioning and checking. It suggests ordering testing activities to meet specific needs and using one's own language and models rather than rigidly following the V-Model. The goal is to help testers operate effectively in the real world rather than just pass a certification.
Computer System Validation (CSV) is a core requirement for several industries. The aim of Computer System Validation is to ensure, through documentation, that computer systems function the way they are intended to: consistently, repeatedly, and reproducibly, somewhat in the manner expected of scientific experiments. Validation, in the sense of authentication or corroboration, must therefore begin at the start, with the definition of the computer system, continue through its use, and extend all the way to the time the system is retired.
The document discusses the V-model of the system development life cycle (SDLC). It begins by defining the SDLC as a structured process or framework for developing software. It then describes the key phases of the V-model - requirements analysis, design, implementation, unit testing, integration testing, system testing, and acceptance testing. Each phase in the development process (left side of the V) has a corresponding testing phase (right side of the V) to validate the work. The V-model aims to ensure quality at each stage and prevent defects from propagating through the lifecycle.
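The left/right pairing of the V described above can be written out as a simple mapping. The exact granularity of the design phases varies between presentations of the model, so this split is just one common convention:

```python
# Development phases (left side of the V) paired with the testing phase
# (right side) that validates their output. Descending the left side
# plans each test; ascending the right side executes them in reverse
# order of development.
V_MODEL_PAIRS = {
    "requirements analysis": "acceptance testing",
    "high-level design":     "system testing",
    "detailed design":       "integration testing",
    "implementation":        "unit testing",
}

for dev_phase, test_phase in V_MODEL_PAIRS.items():
    print(f"{dev_phase:>22} <-> {test_phase}")
```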
This document describes the Maven release process which involves checking in code, preparing a release by updating POMs and tagging the code, performing the release by deploying artifacts, and rolling back a release if needed. The release:prepare goal updates POMs, runs tests, and commits/tags the code. The release:perform goal checks out the tagged code, runs tests, and deploys artifacts. A single command can prepare and perform the release. Best practices include doing a dry run to simulate SCM operations before an actual release.
This document provides an overview of computer validation and compliance with regulatory guidance. It discusses the need for computer validation and outlines key principles from guidance documents such as software validation, use of off-the-shelf software in medical devices, and validation of electronic records and signatures. Validation approaches for different systems and software are covered, including spreadsheets. The document provides references to FDA and international regulatory guidance on these topics.
This document outlines the "V" model approach to system development. It discusses the key stages of the "V" model including requirements elicitation, system design, and testing phases. It provides an illustration of the "V" model workflow. The document also covers advantages of the "V" model like defined goals for each phase and early test planning. Disadvantages discussed are difficulty changing requirements late and limitations for complex projects. Finally, it provides examples comparing the suitability of the "V" and waterfall models for different problem scenarios.
The V-Model is a software development lifecycle model that structures testing activities in parallel to steps in the design process. It extends the waterfall model by performing testing at each development stage in both forward and reverse direction. This allows bugs to be found early and defects to be tracked proactively. The V-Model follows a strict process to develop quality software and constantly measures productivity through test case creation and coverage. However, it requires significant resources and money, and changes midway require updating documentation.
The V-Model is a software development model that depicts the relationships between system requirements, design, and testing. It emphasizes testing at each development stage, with testing performed in the reverse order in which modules are developed (top-down). The V-Model includes requirements analysis and system design at the beginning, followed by module design, integration, validation, and operation. Each stage establishes entry and exit criteria and is tested in turn before proceeding to the next stage. Benefits include reduced faults, improved quality, and validation at each stage, while disadvantages include high costs and rigidity.
The V Model outlines a process for determining objectives at different levels to ensure business alignment. It begins with (1) setting reaction objectives to define user preferences then progresses to (2) learning objectives, (3) application objectives, (4) impact objectives and ends with the highest level of (5) ROI objectives which are analyzed in relation to initial business payoff needs and goals. Each level of objectives must be clearly defined and linked to the next to ultimately measure the project's return on investment.
1. The document outlines the functional organization and delegation principles for model validation within a bank.
2. Risk has responsibility for controlling the fair value of financial instruments, approving models, and establishing reserves policies. Finance is responsible for financial reporting and delegates model controls to Risk.
3. Other functions like Front Office, Operations, Middle Office, and Back Office play defined roles in deal execution, market parameter validation, and P&L production according to the responsibilities chart.
Model risk and validation are important processes for banks that rely on models. There are several potential sources of model risk over a model's lifecycle from data issues to changes over time that impact applicability. Effective validation ensures models are performing as intended and identifies limitations. It should include independent review and testing using quantitative and qualitative techniques on a regular basis to verify models continue to meet requirements.
Article, published on IJOnline, highlighting the importance of choosing the right model auditor, the risks of de-scoping the model audit, and how Navigant Consulting's approach is different and less risky.
Xiaorong Zou has over 10 years of experience in model validation and risk management. She currently works as a Senior Manager at BMO Financial Group, where she manages a team that validates market risk models. Prior to this role, she worked as a lecturer teaching mathematics and finance courses. She has a PhD in Mathematics and masters degrees in Electrical Engineering, Actuarial Science, and Applied Math.
The document describes an 8-phase statistical approach to developing a customer risk rating model. The model assesses money laundering risk for a bank's customers based on their profiles. Phase 1 defines risk categories like geography, customer, product/account, and transaction attributes. Phase 2 analyzes data completeness and variation to select meaningful attributes. Phase 3 tests attribute correlations. Phase 4 samples customer profiles for subject matter expert risk ratings used to calibrate the model in Phase 5. Phase 6 evaluates the model's performance and uncertainty. Phase 7 implements the optimized model to automatically rate customer risk.
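A minimal sketch of the kind of scoring such a model might use, assuming a weighted-sum form: the category weights, profile scores, and band thresholds below are invented for illustration, and in practice would come from the Phase 5 calibration against SME-rated sample profiles.

```python
# Illustrative customer risk score: a weighted sum of category scores
# for the four categories named in Phase 1 (geography, customer,
# product/account, transaction). Weights and thresholds are invented.
WEIGHTS = {"geography": 0.35, "customer": 0.25, "product": 0.20, "transaction": 0.20}

def risk_score(profile):
    """profile maps each category to a score in [0, 1]."""
    return sum(WEIGHTS[cat] * profile[cat] for cat in WEIGHTS)

def risk_band(score, low=0.33, high=0.66):
    return "low" if score < low else "medium" if score < high else "high"

customer = {"geography": 0.9, "customer": 0.4, "product": 0.2, "transaction": 0.5}
s = risk_score(customer)
print(round(s, 3), risk_band(s))
```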
Independent models validation and automation, Sohail_farooq
This deck describes our service approach for independent third party validation of risk and capital models, and automation of validation tests for 2nd and 3rd lines of defense.
The rest of this deck is organized as follows:
- Independent validation and description of validation tests and reporting
- Automation of validation tests and a case study on cost savings
The document discusses validation of economic capital models from a regulatory perspective. It outlines a range of qualitative and quantitative validation approaches used in practice to assess different properties of economic capital models, from integrity of implementation to predictive ability. While individual tests have limitations, a layered approach using multiple validation techniques can provide more robust evidence of a model's fitness for its intended purposes. Key challenges include validating conceptual soundness and assumptions given many are untestable, as well as assessing accuracy, particularly in tail distributions where data is scarce.
The document discusses various project selection models that can be used to evaluate and select projects. It describes both numeric and non-numeric models. Numeric models discussed include profitability models like net present value (NPV), internal rate of return (IRR), payback period, and return on investment (ROI). Scoring models are also described, including unweighted and weighted factor scoring models. The advantages and disadvantages of both profitability and scoring models are provided. An example is also included to demonstrate the calculation and use of NPV, IRR, and payback period to evaluate and select between two potential projects.
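The three profitability measures named above can be sketched in a few lines. The project cash flows are hypothetical, and the IRR solver is a simple bisection that assumes a single sign change in the cash-flow series:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) initial outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
    """Internal rate of return by bisection: the rate where NPV = 0."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid  # NPV still positive: discount rate is too low
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cashflows):
    """Years until cumulative (undiscounted) cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None  # outlay never recovered

project = [-1000, 400, 400, 400, 400]  # hypothetical cash flows
print(round(npv(0.10, project), 2))
print(round(irr(project), 4))
print(payback_period(project))
```

Accept-if rules differ: NPV should be positive at the hurdle rate, IRR should exceed the hurdle rate, and the payback period should fall within a policy limit.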
Model risk management aims to identify and address risks from model failures. It evaluates models across their lifecycle from development to usage. A three lines of defense approach is used with model owners, a validation team, and internal audit each providing oversight. Regular model validation is important to assess performance, assumptions, and risk. Agile validation processes that provide ongoing feedback can help address increasing model volumes and changing regulatory needs.
The document discusses the impact of Basel II on supervisory authorities and risk model validation. It outlines how Basel II will increase transparency and accountability in the financial system. It also describes the roles of home and host supervisors in risk model validation and how supervisors will need to work together cooperatively under Basel II.
Model Risk for Pricing and Risk Models in FinanceHäner Consulting
This document discusses model classes, credit risk measures, model implementation, back testing, and model control. It covers various types of models including trade models, pricing models, and risk models. It also discusses different ways of measuring credit risk including measures of severity such as exposure at default and loss given default, as well as measures of frequency like probability of default. Finally, it discusses pricing models that incorporate credit risk.
This document discusses model risk, sampling error, and scenario reduction techniques for stochastic modeling. It begins by defining model risk and its components. It then discusses general principles for managing model risk, including assigning model owners, vetting models, segregating duties, maintaining model inventories and documentation, identifying and controlling risks, reviewing models, and backtesting. It also addresses how sampling error can impact stochastic models and the benefits and limitations of using representative scenarios to reduce computational requirements.
The document discusses several capital budgeting techniques: sensitivity analysis, which examines how changes in assumptions impact NPV; scenario analysis, which considers multiple forecasts simultaneously; and break-even analysis, which determines the sales needed to cover costs. It also discusses real options, noting that NPV underestimates a project's value since managers can adjust in response to changes. Decision trees are presented as a tool to analyze projects with uncertain outcomes.
The document discusses several capital budgeting techniques: sensitivity analysis, which examines how changes in assumptions impact NPV; scenario analysis, which considers multiple forecasts simultaneously; and break-even analysis, which determines the sales needed to cover costs. It also discusses real options, noting that NPV underestimates a project's value since managers can adjust in response to changes. Decision trees are presented as a tool to analyze projects with uncertain outcomes.
This document discusses how option theory and real options analysis can be applied to pharmaceutical research and development (R&D) to maximize value. It asserts that R&D projects are best viewed as real options due to their flexibility. The value of options increases with uncertainty, flexibility to adapt to new information, and efficient learning to resolve uncertainties over time. The document recommends that companies pursue riskier projects with high upside potential and manage their portfolios proactively according to option theory to create more value than traditional discounted cash flow approaches.
FitchLearning QuantUniversity Model Risk PresentationQuantUniversity
In this lecture, we discuss the importance of a framework driven model risk approach and discuss 4 aspects required to operationalize model risk including quantifying uncertainty, quantifying model risk, importance of model verification in model risk management and leveraging technology to scale stress and scenario testing
Model Risk Management: Using an infinitely scalable stress testing platform f...QuantUniversity
Model risk and the importance of model risk management has gotten significant attention in the last few years. As financial companies increase their reliance on quants and quantitative models for decision making, they are increasingly exposed to model risk and are looking for ways to mitigate it. The financial crisis of 2008 and various high profile financial accidents due to model failures has brought model risk management to the forefront as an important topic to be addressed. Many regulatory efforts (Solvency II, Basel III, Dodd-Frank etc.) have been initiated obligating banks and financial institutions to incorporate formal model risk management programs to address model risk.
In this talk, we will discuss the key aspects of model verification and validation and introduce a novel approach to do stress and scenario tests leveraging parallel and distributed computing technologies and the cloud. The platform leverages cloud based technologies to run stress tests on a massive scale without having to invest in fixed in-house architectures. Through a case study, we will illustrate best practices for stress and scenario testing for model verification and validation. These best practices meant to provide practical tips for companies embarking on a formal model risk management program or enhancing their model risk methodologies to address the new realities.
This document discusses considerations for building out model risk management (MRM) frameworks for qualitative models at banks. It begins by defining qualitative models as those where the functional specification is determined primarily by expert judgment or assumptions rather than quantitative methodologies.
It notes that while qualitative models pose model risk, approaches to managing this risk may differ from quantitative models due to different risk sources. Specifically, staffing, scheduling, scope and inventory size of MRM programs may vary significantly between large global banks and regional banks based on factors like resources. Regional banks especially may need to validate qualitative and quantitative models using the same team.
The document provides examples of how existing risk management processes at regional banks could take on aspects of qualitative model validation to
Similar to Model+Risk+Validation+Raphael+Albrecht (20)
1. Model Risk & Validation
Determining the Expectations of a Model Validation Function – Best Practice
CFP Events
QUANT RISK MANAGEMENT CONGRESS 2014
Presented by Raphael Albrecht
London, October 7-8, 2014
Independent Validation Unit
Barclays
2. Disclaimer
All of the opinions expressed in this presentation are solely those
of the speaker and should under no circumstances be taken to
represent those of any bank, regulatory agency or other
institution, financial or otherwise.
In particular, any views regarding “best practice of model
validation” are personal opinions of the speaker and DO NOT
necessarily comply with any particular internal or regulatory
guidance – please consider them as idealised statements about a
“model validation heaven”.
All of the quoted examples should be considered hypothetical.
3. Model Risk – a working Definition
Model Risk is the potential for adverse consequences (financial
or other) from decisions based on incorrect or misused model
outputs.
This can arise from fundamental model flaws leading to
inaccurate outputs, errors in implementation, or
incorrect/inappropriate use.
Quantitative Model Validation can help mitigate model risk
in pricing models by identifying and documenting model
assumptions and limitations.
4. Elements of Model Risk Governance
The Three Lines of Defence
1. Model Owner – Business, Developer (QA) and IT perform
UAT Testing
2. Approver – Internal Control Functions
Model Validation: confirms conceptual soundness and
documents limitations
Market Risk Manager: confirms risk representation is
adequate; if necessary, defines RNIVs or Add-Ons
Product / Valuation Controls:
mitigate limitations through reserves (FVAs and PVAs)
set up price testing (input & output testing), monitor input data and
calibration performance
3. Audit
validates governance process and double-checks reviews
5. Desirable Aspects of Governance
The three lines should be independent of one another
Model owner (FO) needs to be incentivised to provide support to
Model Validation (MV) during review
MV should be part of the trade approval process but not own it!
It makes more sense to approve trades rather than models
Trade Approval process should go far beyond what model review
can achieve
MV should avoid making any model-related recommendations, as
this would make them into part-owners of the models
Ideally, the model review process should be based on priorities,
objectives and procedures, not on deadlines
6. Scope of a Pricing Model Review?
Sophisticated pricing models that are used:
To value trades with exotic payoffs if they are marked to a model
For metrics (sensitivities) used by the internal VaR model or other risk metrics
For adjustments and reserves
In curve builds and vol-surface builds
Less likely candidates for a review:
Vanilla payoffs (vanilla swaps, simple options, CDS, etc.)
Industry-wide standard pricing models
Example: CDS swaptions priced on credit-adjusted Black
Note: the underlying (forward) credit curve build should still be subject to a review
Approximate booking (if the target model was either subject to a review or is not in scope)
Example: amortising CSO booked to standard CSO
Trader tools used only to estimate some model inputs (unless those tools are part of the pricing
algorithm)
Example: cash flow profiles of mortgage-related securities generated using Intex from input
prepayment rates and indices
7. Key Objectives of a Model Review
To identify and document assumptions and limitations of
the model used for pricing a given payout
Focus on adequacy of the risk representation rather than
pricing accuracy
Prices of exotic derivatives are given by trader marks on key
input parameters
Any price (within a range) can be matched by shifting the
inputs
The pricing range is model-dependent; particular values are not
Risk representation is determined by the quality and number
of risk factors and their calibration – this is model-dependent
8. When is a model ready for review?
1. Has the model been prioritised for review?
2. Is the model properly documented?
Is the theory fully documented with all reasoning (non-standard derivations)
and detailed references to accessible sources?
Is the implementation fully documented?
Examples:
PDE solver scheme
Calibration routines
Integration and interpolation schemes
Random number generator (if relevant)
3. Is a testable implementation available?
Are interfaces to intermediate values available?
to facilitate implementation testing
Does the prototype allow easy scenario runs?
Classical horror scenario: command-line pricer with xml configuration-file
9. Theoretical Review
Identify assumptions and limitations of the model used for pricing/risking a given family
of payouts
Are model assumptions adequate for the given payout?
Are all key risk factors for the payout being modelled? – discuss with the RM!
Example: a 1F HJM IR model may be appropriate for range accruals, but missing
de-correlation of different tenors might have a more significant impact for
spread options
Is the calibration set-up reasonable?
I.e. can we expect to get an adequate risk representation?
Investigate conceptual soundness of the model
Check consistency with other models used for a similar purpose
Does the model represent best practice among industry peers?
Counterexample: CDO pricing with Gaussian copula and constant recovery
(post 2008)
Apply Occam’s razor: Is this the simplest solution to the given pricing
problem?
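The spread-option example above can be made concrete with a toy calculation. The sketch below is an illustration only (all parameter values are invented): it prices a spread option on two lognormal forwards by Monte Carlo and shows that the one-factor limit (perfect correlation between tenors, rho = 1) understates the value relative to a two-factor set-up with partial de-correlation.

```python
import numpy as np

def spread_option_mc(f1, f2, vol1, vol2, rho, strike, T,
                     n_paths=200_000, seed=42):
    """MC price of max(F1 - F2 - K, 0) on two correlated lognormal
    forwards. A one-factor model corresponds to rho = 1."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    s1 = f1 * np.exp(-0.5 * vol1**2 * T + vol1 * np.sqrt(T) * z1)
    s2 = f2 * np.exp(-0.5 * vol2**2 * T + vol2 * np.sqrt(T) * z2)
    return np.maximum(s1 - s2 - strike, 0.0).mean()

# De-correlation widens the spread distribution and raises the option value
p_1f = spread_option_mc(0.03, 0.025, 0.2, 0.2, 1.0, 0.005, 5.0)  # 1F limit
p_2f = spread_option_mc(0.03, 0.025, 0.2, 0.2, 0.7, 0.005, 5.0)  # rho = 0.7
print(f"1F (rho=1.0): {p_1f:.6f}   2F (rho=0.7): {p_2f:.6f}")
assert p_2f > p_1f  # the 1F model understates spread-option value
```

The same sweep over rho is a cheap way to quantify, for the RM discussion, how material the missing risk factor is for a given payout.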
10. Testing Design
Philosophy: Establish results and let them speak for themselves!
Amount and detail of testing should be commensurate with the potential
model risk
Test design should follow standardised review templates
Design your standard templates and make them part of governance
Often, it makes sense to consolidate common, repetitive elements into a
single doc referred to by other reviews
Model Reviews for workhorse models used for a number of payouts
Examples: N-Factor-HJM, LMM, LSV-engine
Focus on calibration and re-pricing of vanillas
Payout Reviews
Examples: various range-accrual types and spread options on N-Factor-HJM would
have separate reviews
Focus on payout implementation, behavioural tests & convergence
Curve / Vol Build Reviews
Focus on repricing input vanillas, build accuracy and stability
11. Implementation Testing
To verify the model is implemented in agreement with documentation
Standard procedure: comparison to independent implementation
(replica)
applied to all key pricing elements individually
Are different pricer versions used for various sub-tasks?
Example: pricing on PDE, calibration on analytic approximation
If the implementation is analytical, agreement should be to within
numerical accuracy
Otherwise it should be 100% clear that diffs are “small” and “random”
(i.e. no systematic bias); in the worst case, argue by showing a regression
Benchmarking: if the model is too complex to replicate (ex: general
PDE solver) or if natural benchmark models are readily available
and were previously reviewed
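As a minimal illustration of replica testing (hypothetical numbers; Black–Scholes is chosen only because its closed form makes the comparison clean), the sketch below compares an analytic pricer against an independent Monte Carlo replica and checks that the difference is small relative to the Monte Carlo standard error, i.e. no systematic bias.

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, r, vol, T):
    """Analytic pricer 'under review' (Black-Scholes call)."""
    d1 = (log(s / k) + (r + 0.5 * vol**2) * T) / (vol * sqrt(T))
    d2 = d1 - vol * sqrt(T)
    return s * norm_cdf(d1) - k * exp(-r * T) * norm_cdf(d2)

def bs_call_mc(s, k, r, vol, T, n_paths=1_000_000, seed=7):
    """Independent MC replica; returns (price, standard error)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    st = s * np.exp((r - 0.5 * vol**2) * T + vol * sqrt(T) * z)
    payoff = np.exp(-r * T) * np.maximum(st - k, 0.0)
    return payoff.mean(), payoff.std(ddof=1) / sqrt(n_paths)

analytic = bs_call(100, 100, 0.02, 0.25, 1.0)
mc, se = bs_call_mc(100, 100, 0.02, 0.25, 1.0)
# Diffs should be small and unbiased: 4-sigma tolerance on the MC error
assert abs(analytic - mc) < 4 * se
```

In practice the same comparison would be run per pricing element (calibration, payout, discounting) and across a grid of market scenarios, not just a single point.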
12. Model and Calibration Analysis
To verify the model behaviour is as expected from theory
Impact on model and calibration when varying important input
parameters
Input parameters are market data (e.g., yield curve, vols, etc.) or
model parameters (ex: IR-CR correlation in a hybrid model)
Stress tests – to see if and when the model breaks down / fails to
calibrate when wide ranges of parameters are used
Note that it is not the primary intention of MV testing to determine
precisely when the model will break (this is generally impossible!)
Only if we see that it breaks would we like to investigate why this is the
case and determine whether in that particular case this can be expected or not
ex. when a no-arbitrage bound is violated
Depending on the specific model/calibration in question, it may not be
possible/necessary to perform all of the above tests
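A stress sweep of this kind can be sketched as follows. Here a deliberately crude pricer (the Brenner–Subrahmanyam ATM approximation, chosen purely for illustration) is pushed across a wide vol range until it violates the upper no-arbitrage bound C <= S; the point is not to locate the breakdown precisely, but to detect that it occurs and investigate where.

```python
import numpy as np

def atm_call_approx(s, vol, T):
    """Brenner-Subrahmanyam ATM approximation, C ~ 0.4 * S * vol * sqrt(T).
    Used here as a stand-in for a model whose breakdown point we probe."""
    return 0.4 * s * vol * np.sqrt(T)

def stress_no_arb(s=100.0, T=1.0, vols=np.linspace(0.05, 4.0, 80)):
    """Sweep vol over a deliberately wide range; flag every point where
    the price violates the upper no-arbitrage bound C <= S."""
    return [(v, atm_call_approx(s, v, T)) for v in vols
            if atm_call_approx(s, v, T) > s]

breaches = stress_no_arb()
# The approximation breaks the bound roughly once vol * sqrt(T) > 2.5;
# in the stressed region this is expected, not an implementation bug.
assert breaches
assert min(v for v, _ in breaches) > 2.4
```

A real MV version would sweep calibration inputs (curves, vol surfaces, correlations) and record calibration failures and bound violations into the review evidence.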
13. Testing related to Limitations and Approximations
To investigate the impact of model- and implementation-related
assumptions and limitations on pricing and risk for
representative trades
Examples:
Impact of counter-intuitive model assumptions (ex. negative
hazard rates) on pricing and sensitivities
Convergence of PV and sensitivities with number of MC
runs, number of grid points in a lattice
Impact of using a simplified analytic pricer for calibration on
pricing accuracy
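The MC convergence test mentioned above can be sketched like this (toy ATM call, zero rates, invented parameters): re-price with increasing path counts and confirm the standard error shrinks roughly as 1/sqrt(N).

```python
import numpy as np

def mc_pv(n_paths, vol=0.2, T=1.0, s=100.0, k=100.0, seed=0):
    """PV and standard error of an ATM call (zero rates) from an
    n_paths Monte Carlo run."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    st = s * np.exp(-0.5 * vol**2 * T + vol * np.sqrt(T) * z)
    payoff = np.maximum(st - k, 0.0)
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(n_paths)

# Convergence table: SE should shrink roughly as 1/sqrt(N)
for n in (1_000, 10_000, 100_000, 1_000_000):
    pv, se = mc_pv(n)
    print(f"N={n:>9,d}  PV={pv:7.4f}  SE={se:.4f}")

_, se_small = mc_pv(10_000)
_, se_big = mc_pv(1_000_000)
assert se_big < se_small / 5  # 100x more paths, roughly 10x smaller SE
```

The same loop applied to bumped-input reprices gives the convergence of sensitivities, which is typically the binding constraint in practice.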
14. Additional Testing
Fixed slot for any testing not fitting into any of the standard
categories above
Possible tests would include:
Thought experiments – E. Mach’s “Gedankenexperiment”
Pricing model example: ad-hoc scenarios to show the effect of
missing correlation
Comparison to alternative models (usually ad-hoc)
Consistency among various payoff variants
Example:
CLN = CDS + recovery Digital + risky Zero
on risky credit curve build (in terms of PVs)
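The CLN consistency check can be illustrated on a stripped-down zero-coupon variant (flat hazard curve, zero rates, invented parameters; the coupon/CDS premium legs are omitted for brevity): the payoff is priced once directly, by simulating the default time, and once via the risky Zero + recovery Digital decomposition, and the two PVs must agree within Monte Carlo noise.

```python
import numpy as np

# Flat hazard-rate credit curve; zero interest rates for brevity
H, R, T = 0.03, 0.4, 5.0   # hazard rate, recovery, maturity

def cln_direct(n_paths=2_000_000, seed=1):
    """Full pricer: simulate the default time tau, pay 1 at T if
    tau > T, else recovery R at tau (zero rates, so no discounting)."""
    rng = np.random.default_rng(seed)
    tau = rng.exponential(1.0 / H, n_paths)
    return np.where(tau > T, 1.0, R).mean()

def cln_decomposed():
    """Same payoff priced analytically as risky Zero + recovery Digital."""
    surv = np.exp(-H * T)        # risky zero: 1 * Q(tau > T)
    digital = R * (1.0 - surv)   # recovery digital: R * P(default by T)
    return surv + digital

# The two variants must agree in PV, well within Monte Carlo noise
assert abs(cln_direct() - cln_decomposed()) < 1e-3
```

Run on the production credit curve build instead of a flat hazard rate, a check of this shape verifies the curve build and the payout implementation jointly, which is the point of the slide's example.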
15. The Write-Up – Elements of good style
Provide all relevant details in the model description, but try to be more concise
than QA and avoid any verbatim repeats; refer to derivations or give them in an
appendix
Be matter-of-fact and to the point!
Stick closely to the template for the given review type
workhorse model, payout, curve build, ...
Make sure
Front end – conclusions and executive summary are crystal-clear with extremely
user-friendly wording
Conclusions flow smoothly from testing observations and are worded in a matter-
of-fact way
Any limitations are referred to testing demonstrating potential impact where
possible
Wording and references are consistent with all other existing model reviews
Use internal peer review to sanity-check before circulating
16. After the Review is Completed
Walk your counterparts in Trading, Risk and PC through the
conclusions of the review and note their responses/suggested
mitigants
Desks might volunteer to run periodic tests (possibly inspired by
your testing) for the sake of monitoring the impact of more
critical limitations
Measures are pre-agreed to be taken if impact estimates exceed thresholds
Discuss with other control functions (Risk and PC) their
suggestions for model-related Fair Value Adjustments and how
those reflect your testing results
Again, those might be inspired by your testing
17. Conclusion
The Model Validation Function can help mitigate model risk through:
Following a robust model validation process focussed on
Model Assumptions & Limitations and their potential impact
in the context of the model’s usage
The MV process being firmly based within a well-established
model governance process encompassing all stakeholders
Including required “fringe” measures (price testing,
FVAs, RNIVs, model risk estimates) provided by other
functions (PCG, RMs, other control functions and Audit)