
Operational Risk Management Data Validation Architecture

This describes a structured approach to validating data used to construct and use an operational risk model. It details an integrated approach to operational risk data involving three components:

1. Using the Open Group FAIR (Factor Analysis of Information Risk) risk taxonomy to create a risk data model that reflects the required data needed to assess operational risk

2. Using the DMBOK model to define a risk data capability framework to assess the quality and accuracy of risk data

3. Applying standard fault analysis approaches - Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) - to the risk data capability framework to understand the possible causes of risk data failures within the risk model definition, operation and use

Operational Risk Management Data Validation Architecture

  1. Operational Risk Management Data Validation Architecture. Alan McSweeney. http://ie.linkedin.com/in/alanmcsweeney https://www.amazon.com/dp/1797567616
  2. Risk Management Data Validation Architecture (April 6, 2021)
     • This describes a structured approach to validating data used to construct and use an operational risk model
     • The operational risk model used is based on the Open Group FAIR (Factor Analysis of Information Risk) risk taxonomy - https://www.opengroup.org/forum/security/riskanalysis
       − Open Risk Taxonomy Technical Standard (O-RT) – defines a risk taxonomy - http://www.opengroup.org/library/c20b
       − Open Risk Analysis Technical Standard (O-RA) – describes the approach to performing risk analysis using FAIR - http://www.opengroup.org/library/c20a
  3. Integrated Approach To Operational Risk Management Data Architecture (diagram: DMBOK Data Validation Framework)
  4. Integrated Approach To Operational Risk Management Data Architecture
     • The integrated approach to operational risk data involves:
       1. Using the FAIR risk taxonomy to create a risk data model that reflects the required data needed to assess operational risk
       2. Using the DMBOK model to define a risk data capability framework to assess the quality and accuracy of risk data
       3. Applying standard fault analysis approaches - Fault Tree Analysis (FTA) and Failure Mode and Effect Analysis (FMEA) - to the risk data capability framework to understand the possible causes of risk data failures within the risk model definition, operation and use, and to ensure both quality risk data and a quality risk model
  5. Open Group FAIR (Factor Analysis Of Information Risk) Risk Taxonomy – Extended View (diagram: Risk decomposes into Loss Event Frequency - covering Threat Event Frequency (Contact Frequency, Probability of Action), Vulnerability (Threat Capability, Resistance Strength) and Loss Prevention Controls - and Loss Magnitude - covering Primary Loss Magnitude with Asset, Threat and Organisational Loss Factors, and Secondary Loss Events with External Loss Factors and Loss Mitigation Controls; Threat Agents from Threat Communities act against the Assets of Primary and Secondary Stakeholders in Threat Events, Loss Scenarios and Loss Events)
  6. Open Group FAIR (Factor Analysis Of Information Risk) Risk Taxonomy – Key Concepts And Relationships (diagram)
     • A Primary Stakeholder owns or is responsible for an Asset; Controls are applied to Assets
     • A Threat is anything that can harm an Asset; a Threat Agent may belong to a Threat Community
     • A Contact Event occurs when a Threat Agent makes contact with an Asset; Contact Frequency is the rate of contact a Threat Agent makes with an Asset
     • A Contact Event can become a harmful Action; Probability of Action is the probability that contact will become a harmful action
     • A Contact Event may become a Threat Event; Threat Event Frequency is the probability a Threat Agent will act against an Asset
     • A Threat Event becomes a Loss Event when the Asset is harmed; Vulnerability is the probability a Threat becomes a Loss
     • Threat Capability is the level of harm a Threat Agent can apply to an Asset; Resistance Strength is the strength of a Control
     • Loss Event Frequency is the probability a Threat Agent will harm an Asset; a Loss Event has a Magnitude
     • Loss Prevention Controls reduce Loss Event Frequency; Loss Mitigation Controls reduce Loss Magnitude
     • Risk is the probable frequency and magnitude of loss
  7. Many Types Of Operational Risk: People and Conduct Risk, Systems and Process Risk, Technology Risk, Reputational Risk, Legal Risk, External Cyber Risk, Regulatory Risk, Compliance Risk, Fraud Risk, Security Risk
  8. FAIR (Factor Analysis Of Information Risk) Data Model
     • A data model can be created from the FAIR risk taxonomy
     • This defines a set of data structures that can then be populated with risk-related data
     • Having a rigorous data model allows the data required for the risk framework to be defined and understood in more detail
     • Quality data is a prerequisite for effective operational risk management
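As an illustration of how the taxonomy might translate into data structures, here is a minimal Python sketch. The class and field names follow the FAIR factor names from the slides but are assumptions for illustration, not the O-RT standard's schema.

```python
from dataclasses import dataclass

# Illustrative sketch only: field names mirror the FAIR taxonomy factors,
# they are not taken from the Open Group O-RT data definitions.

@dataclass
class LossEventFrequency:
    threat_event_frequency: float  # expected threat events per year
    vulnerability: float           # probability a threat event becomes a loss event

    def per_year(self) -> float:
        # Loss Event Frequency = Threat Event Frequency x Vulnerability
        return self.threat_event_frequency * self.vulnerability

@dataclass
class LossMagnitude:
    primary_loss: float    # direct loss per loss event
    secondary_loss: float  # expected secondary loss per loss event

    def per_event(self) -> float:
        return self.primary_loss + self.secondary_loss

@dataclass
class Risk:
    lef: LossEventFrequency
    lm: LossMagnitude

    def annualised_loss_exposure(self) -> float:
        # Risk combines the frequency and magnitude of loss
        return self.lef.per_year() * self.lm.per_event()
```

A structure like this makes the data the model needs explicit: every field is a value that must be sourced, validated and maintained, which is exactly what the later data capability and fault analysis slides address.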
  9. FAIR (Factor Analysis Of Information Risk) High-Level Data Model (diagram)
  10. FAIR (Factor Analysis Of Information Risk) High-Level Data Model
     • The high-level data model translates the FAIR risk taxonomy into a data structure where risk data can be maintained and analysed
     • The data model can then be used to identify the required data
     • This in turn can be used to identify the data path and data provenance
  11. Risk Data Path And Data Provenance (diagram: ensure quality along the data path)
     • Structured Accurate Risk Model - a model that reflects the relationships and dependencies between entities
     • Risk Model Translated Into Rigorous Data Model - create and validate the data model from the risk model that captures risk data
     • Risk Data Model Populated With Accurate Data - collect risk data and enter it into the risk data model
     • Structured Data Validation Approach - validate the data and its quality and identify and resolve any data faults
     • Effective Risk Data And Model Usage - ensure the model is used effectively
     • Effective Risk Management - operational risk management is optimised
  12. Data Quality Characteristics
     • Accuracy and Correctness - without errors and unambiguous, having the appropriate and necessary accuracy and detail for its intended purpose
     • Reliability and Consistency - data must be consistent across all data stores and must not contradict data held in other stores; the process for collecting, transforming and storing the data must be reliable
     • Currency and Relevance - data must be up-to-date and not lag behind data held in source stores; data changes must be reflected with the necessary timeliness; the data must be relevant
     • Availability and Usability - the data must be available to the target user population, in a format that is accessible and usable
     • Completeness and Thoroughness - the data must be complete and sufficiently comprehensive for its intended use, with no gaps
     • Auditability and Traceability - the data path from source to target must be traceable and data operations should be auditable
     • Utility and Objectivity - the data must meet its desired purpose and must not be biased
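Two of these characteristics, completeness and currency, lend themselves to simple automated checks. The sketch below is illustrative: the field names and the 30-day staleness threshold are assumptions, not values from the deck.

```python
from datetime import date, timedelta

def check_completeness(record: dict, required_fields: list) -> list:
    """Completeness: return the required fields that are missing or empty."""
    return [f for f in required_fields if record.get(f) in (None, "")]

def check_currency(last_updated: date, max_age_days: int = 30) -> bool:
    """Currency: True if the record was refreshed within the allowed window.
    The 30-day default is an illustrative threshold, not a standard."""
    return (date.today() - last_updated) <= timedelta(days=max_age_days)
```

Checks like these can run wherever data enters the risk data model, turning the quality characteristics from aspirations into testable rules.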
  13. Data Iceberg – Hidden Operational Processes Needed To Ensure Data Usability And Utility (diagram: to be able to take action based on reliable information, you need to measure what is important; which requires knowing what is important, defining measurements, consistent units of measurement and measurement processes; which in turn needs an operational framework, a collection process, a data storage model, and transformation and standardisation; and so on down through installing data collection facilities, collecting, monitoring and managing data collection, validating and storing data, defining and running reports and analyses, providing realtime access to collected data, and defining the supporting data tools and infrastructure)
  14. DMBOK (Data Management Body Of Knowledge) Data Capability Structure
     • The DMBOK data capability framework can be used to provide a structure for analysing data process validity
     • Data Capability Framework areas: Data Governance, Data Architecture, Data Modelling and Design, Data Storage and Operations, Data Security, Data Quality, Data Integration, Flow and Interoperability, Reference and Master Data, Data Warehousing and Business Intelligence, Documents and Content, Metadata
  15. DMBOK Data Capability And Competence Areas
     • Data Governance - standards and their enforcement, planning, supervision, control and usage of data resources, the design and implementation of data management processes, and data ownership
     • Data Architecture - overall data architecture and data technology standards, and the design and implementation of data infrastructure technology solutions
     • Data Modelling and Design - analysis, design, implementation, testing, deployment and maintenance
     • Data Storage and Operations - support for physical data assets across the spectrum of data activities from data acquisition to storage, backup, recovery, availability, continuity, retention and purging, and capacity planning and management
     • Data Security - data resource security, privacy, confidentiality and access control; this is becoming ever more important in the context of data privacy initiatives such as GDPR
     • Data Quality - defining, monitoring, maintaining and improving data quality and ensuring data integrity
     • Data Integration, Data Flow and Interoperability - data resource integration, extraction, transformation, movement, delivery, replication, transfer, sharing, federation, virtualisation and operational support
     • Reference and Master Data - managing master versions of shared data resources to reduce redundancy and maintain data quality through standardised data definitions and use of common data values
     • Data Warehousing and Business Intelligence - enabling data reporting and analysis, decision support, visualisation and supporting technologies
     • Documents and Content - acquisition, storage, indexing of and access to unstructured data resources such as electronic files and paper records, and the integration of these resources with structured data resources
     • Metadata - creation of data description standards and the collection, categorisation, maintenance, integration, application, use and management of data descriptions
  16. Structured Data Framework Enables The Definition Of The Necessary Risk Data Foundation (diagram: you cannot have quality, actionable and actioned data without a solid data management foundation and framework covering data governance, data architecture, data modelling and design, data operations, data quality, data security, data integration and interoperability, reference and master data, data warehousing and business intelligence, documents and content, and metadata)
  17. Operational Risk Data Issues
     • Data issues with the operational risk model can arise in three areas:
       1. Data Framework Definition, Structure and Configuration Failures - the risk data model and the associated risk data framework and supporting tools must be defined correctly
       2. Data Management and Operations Failures - the model must be populated with correct and accurate data values, and the processes to support the operationalisation of risk data must be implemented
       3. Data Usage Failures - the model and its data, including data from activity, event and alert monitoring and results, must be used effectively
     • This material is concerned with defining a structured approach to addressing these potential data issues
  18. Data Failure Analysis Approaches
     • Two standard, well-proven techniques for failure analysis can be applied to validate and avoid failures in risk data models and ensure quality:
       − Fault Tree Analysis (FTA) - a top-down, deductive approach, reasoning from effect to cause
       − Failure Mode and Effect Analysis (FMEA) - a bottom-up, inductive approach, reasoning from cause to effect
  19. Fault Tree Analysis
     • The core analysis involves identifying the combinations of causes that lead to a data failure
     • The extended analysis consists of:
       − Assigning a probability to the occurrence of events
       − Identifying critical event combinations
       − Defining actions to avoid or reduce failures
     • A fault tree consists of AND and OR combinations of independent events that combine to cause the ultimate failure
  20. Fault Tree Analysis – Simple Example
     • Output Failure Event 3 occurs if Output Failure Event 1 AND Output Failure Event 2 occur
     • Output Failure Event 1 occurs if Independent Failure Event 1 OR Independent Failure Event 2 occurs
     • Output Failure Event 2 occurs if Independent Failure Event 3 AND Independent Failure Event 4 occur
     • The fault tree can be broken down into multiple levels of decomposition to identify detailed contributions to the ultimate data failure
     • Adding more levels adds complexity and effort to the risk data fault tree but provides more detail on what might cause a data failure
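The extended analysis step of assigning probabilities to events can be sketched as the standard AND/OR gate arithmetic for independent events. The event probabilities used in the test values are made up for illustration; the deck does not assign any.

```python
# AND/OR gate probability arithmetic for a fault tree, assuming all
# basic events are independent (the assumption the slides state).

def p_and(*probs: float) -> float:
    """AND gate: the output event occurs only if all inputs occur."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(*probs: float) -> float:
    """OR gate: the output event occurs if at least one input occurs."""
    none_occur = 1.0
    for p in probs:
        none_occur *= (1.0 - p)
    return 1.0 - none_occur
```

For the simple example above, the probability of the top event is `p_and(p_or(e1, e2), p_and(e3, e4))`, which shows why decomposing further levels gives a more precise, but more labour-intensive, picture of what drives a data failure.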
  21. Data Fault Tree Analysis Level 1 Decomposition
     • Fault Tree view of data failures: a data failure can occur if any one of the following data failure events occurs
       1. Data Structure and Configuration Failures - relates to the definition and implementation of the framework of key risk data capabilities
       2. Data Management and Operations Failures - relates to risk data operations failures
       3. Data Usage Failures - relates to failures in using the collected and processed risk data, including data from monitoring and auditing
  22. Data Fault Tree Analysis Level 2 Decomposition
     • A sample level 2 data capability and process decomposition identifies key subject areas with topics that apply to the level 1 data failure event groups
     • Elements of key data capabilities can apply to more than one level 1 data failure event group
  23. Data Fault Tree Analysis Level 3 Decomposition
     • A sample level 3 decomposition of the level 2 Data Governance capability area within the level 1 Data Structure and Configuration Failures group of data failure events would be:
       − Data Governance Stewardship Failure
       − Data Governance Strategy Failure
       − Data Governance Policies Failure
       − Data Governance Architecture Failure
       − Data Governance Standards and Procedures Failure
       − Data Governance Regulatory Compliance Failure
       − Data Governance Issue Management Failure
       − Data Management Services Failure
       − Data Governance Asset Valuation Failure
       − Data Governance Communication and Promotion Failure
  24. Data Fault Tree Analysis Decomposition Levels
     • It may not be necessary or useful to perform the fault analysis down to all levels of detail
     • A level 1 or level 2 analysis may be sufficient to identify and resolve faults to ensure quality
  25. Risk Data Failure Event Information Definition
     • For each level of data failure decomposition, define the following items of information:
       − Failure Description
       − Cause of Failure Event
       − Effect of Failure Event
       − Event Probability
       − Event Severity
       − Event Identification/Detection/Handling Process
       − Event Identification Metrics
       − Event Identification Data
       − Event Identification Lag
       − Event Risk Level
       − Event Mitigation/Avoidance Control
       − Event Failure Data
       − Event Failure Processing Audit Log
     • This includes both quantitative and qualitative measures
     • This information can be defined and collected at each fault tree decomposition level
     • This provides a checklist to ensure that operational risk data and its usage are accurate and reliable and that quality is ensured
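The checklist above maps naturally onto a record type. The sketch below is a hypothetical Python representation: the field names mirror the listed information items, and the severity scale and defaults are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record for one failure event at one decomposition level.
# Field names follow the checklist on the slide; types are assumptions.

@dataclass
class RiskDataFailureEvent:
    description: str                 # Failure Description
    cause: str                       # Cause of Failure Event
    effect: str                      # Effect of Failure Event
    probability: float               # Event Probability (quantitative, 0..1)
    severity: int                    # Event Severity (qualitative, e.g. 1-5)
    detection_process: str           # Identification/Detection/Handling Process
    identification_metrics: list = field(default_factory=list)
    identification_lag_days: Optional[int] = None   # Event Identification Lag
    mitigation_control: Optional[str] = None        # Mitigation/Avoidance Control
```

Capturing each failure event in a uniform record like this is what makes the checklist auditable: missing fields are immediately visible, and the same records feed the FMEA prioritisation later in the deck.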
  26. Applying FTA To Risk Model Data (diagram: the FAIR key concepts and relationships model from slide 6, identifying the risk model data elements to which the fault analysis is applied)
  27. Applying FTA To Risk Model Data
     • The combined structured approach of a data framework (DMBOK) and a data fault analysis methodology (FTA/FMEA) for validating risk data can be applied to each risk model data element
  28. Failure Mode and Effect Analysis (FMEA)
     • FMEA allows for the identification and elimination of data problems early in the risk data definition and collection process
     • FMEA is a bottom-up analysis approach that analyses the ways in which a data element and the process for its collection and usage could fail to perform its planned purpose
       − Design FMEA analyses the design and the data components
       − Process FMEA analyses the processes for data design and collection
  29. Operational Risk Data FMEA
     • The FMEA risk data analysis steps are:
       1. For each data element, identify the ways in which failure could occur
       2. For each failure mode identified, detail its effects and consequences
       3. Identify the possible causes of the failure
       4. Identify the likely rate of occurrence of the failure cause
       5. For each effect of each failure mode, identify its severity
       6. Define controls for each failure cause
       7. Define how readily the failure can be detected by each control
       8. Calculate a risk priority (called a Risk Priority Number - RPN) based on a combination of severity, rate of occurrence and ease of detection
       9. Address risks in priority order
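Steps 8 and 9 can be sketched as follows. Conventionally the RPN is the product of severity, occurrence and detection ratings, each on a 1-10 scale; the deck does not specify a scale, so the 1-10 range and the example failure modes below are illustrative assumptions.

```python
# Steps 8-9 of the FMEA process: compute an RPN per failure mode and
# address failure modes in descending RPN order. The 1-10 rating scale
# is a common FMEA convention, assumed here rather than taken from the deck.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number = severity x occurrence x detection."""
    for name, value in (("severity", severity),
                        ("occurrence", occurrence),
                        ("detection", detection)):
        if not 1 <= value <= 10:
            raise ValueError(f"{name} must be rated 1-10, got {value}")
    return severity * occurrence * detection

def prioritise(failure_modes: list) -> list:
    """Sort (name, severity, occurrence, detection) tuples by RPN, highest first."""
    return sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
```

Note that a higher detection rating here means the failure is harder to detect, so hard-to-detect failures correctly rank higher; this is the usual FMEA convention for the detection scale.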
  30. Operational Risk Data FMEA Concepts And Relationships (diagram: a Data Element has Failure Modes; each Failure Mode has Failure Effects with a Severity and Failure Causes with a Probability or Rate of Occurrence; Failure Cause Controls have a Detection Effectiveness; severity, occurrence and detection combine into a Risk Priority Number that drives a Prioritised Action Plan)
  31. Operational Risk Data Complexity Balance
     • You need to balance the complexity of the risk data model and data analysis framework, and the level of detail they contain, against the effort required to populate and maintain them and the return on that effort
     • The objective of an operational risk model is to understand and address real risks that can be mitigated, circumvented or addressed, rather than theoretical risks
     • Any risk model and associated data validation approach needs to contribute to the effectiveness of the operational risk model
  32. More Information: Alan McSweeney. http://ie.linkedin.com/in/alanmcsweeney https://www.amazon.com/dp/1797567616
